I built my own 16-Bit CPU in Excel [video] (youtube.com)
110 points by SushiHippie 11 months ago | 17 comments



I built AES and DES in Excel about a decade ago. Note that these are combinational circuits, not sequential, so no feedback or clock is required. https://www.nayuki.io/page/aes-cipher-internals-in-excel , https://www.nayuki.io/page/des-cipher-internals-in-excel
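To make the combinational/sequential distinction concrete, here is a rough sketch in Python rather than spreadsheet formulas (an illustration of the idea, not the actual workbook):

    # A combinational circuit is a pure function of its current inputs,
    # which is why it maps cleanly onto ordinary spreadsheet formulas.
    def combinational_xor(a: int, b: int) -> int:
        return a ^ b  # output depends only on the current inputs

    # A sequential circuit carries state from one clock tick to the next,
    # which a plain formula grid cannot express without feedback
    # (e.g. Excel's iterative-calculation mode).
    class Counter16:
        def __init__(self) -> None:
            self.state = 0  # register contents

        def tick(self) -> int:  # one rising clock edge
            self.state = (self.state + 1) & 0xFFFF
            return self.state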


I am super disappointed in the lack of evolution of dataflow, but I am encouraged to see things like Airtable, and I guess Blender and others, using node-based interfaces for functional logic.

I did my senior thesis/project in CS (we had to do several, it was anticlimactic) on visual programming and the basic paradigms that might be the future.

I ended up writing a missive about LabVIEW holding people back, because 2D planes suck at communicating information to people who otherwise read books and blogs and C# code.

My conclusion 15 years later is that we’ll talk to LLMs and their successors rather than invent a great graphical user interface that works like a desktop or a <table> or even a repl.

Star Trek may have inspired the ipad and terrible polygon capacitive touchscreens… but we all know that “Computer, search for M-class planets without fans of Nickelback’s second album living there as of stardate 2024” is already basically a reality.

EDIT: I like this CPU experiment too! It is a great example of the thing I’m talking about. Realized after the fact that I failed to plant my context in my comment, before doing my graybeard routine.

So. Food for thought: our LLM overlords are just unfathomable spreadsheets.


Graphical programming just does not work; it has been tried often enough. As soon as you step beyond toy examples, you need hierarchical organization, functions calling functions calling functions. How do you represent that graphically? You put additional graphs side by side or allow some kind of drill-down. Now all your graphs are pretty trivial and you have not gained much over a handful of lines of code, but you have reduced the density a lot with all the space between nodes and all the arrows.

Natural language programming is not going to happen either, because natural languages are too ambiguous. You can probably write code iteratively in a natural language in some kind of dialog, clarifying things as ambiguities arise, but using that dialog as the source of truth and treating the resulting code as a derivative output does not sound very useful to me. So if I had to bet, I would bet that text-based programming languages are not going anywhere soon.

Maybe one day there will be no code at all; everything will just contain small artificial brains doing the things we want them to do, without anything we would recognize as a program today. But who knows; it does not really seem worth speculating about to me.

In the nearer term I could see domain-specific languages becoming more prevalent. A huge amount of the code we write is technical detail, because we have to express even the highest-level business logic in terms of booleans, integers and strings. If we had a dozen different languages tailored to different aspects of an application, we could write a lot less code.

We have this to a certain extent: a language for code in general, one for querying data, one for laying out the user interface, one for styling it. But they are badly integrated and not customizable. The problem is of course that developing good languages and evolving them is hard, and lowering them into some base language is tedious work. But in principle I could imagine that progress is possible on this front and that this becomes practical.
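To illustrate the "badly integrated" point with a minimal, hypothetical Python example: the query, layout, and styling languages usually live inside the general-purpose one as opaque strings, so the host language cannot check or customize them.

    # Hypothetical sketch: three embedded "languages" for one feature, glued
    # together as strings inside a general-purpose host. None of them is
    # checked by the host; a typo in any of them only shows up at runtime.
    QUERY = "SELECT name, total FROM orders WHERE total > 100"  # data
    LAYOUT = "<ul id='big-orders'>{items}</ul>"                 # interface
    STYLE = "#big-orders li { font-weight: bold; }"             # styling

    def render_big_orders(db) -> str:
        # db is assumed to be e.g. an sqlite3 connection
        rows = db.execute(QUERY).fetchall()
        items = "".join(f"<li>{name}: {total}</li>" for name, total in rows)
        return f"<style>{STYLE}</style>" + LAYOUT.format(items=items)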


> If we had a dozen different languages tailored to different aspects of an application, we could write a lot less code.

I think this has potential. As we all know, natural language is a weak tool for expressing logic. On the other hand, programming languages are limited by their feature set and paradigmatic alignment. But whatever language we use to express a particular software product, the yield for the end user is virtually the same. I mean, how the logic is laid out and worked out has practically no effect on the perceived functionality; a button programmed to display an alert on the screen can be programmed in numerous languages, but the effect is always the same.

If, however, we had drivers and APIs for everything we could possibly need in the course of designing a program, then we could just emit structured data to endpoints in a dataflow fashion, such that the program is manifested as a managed activation pattern. In this scenario, different APIs could have different schemas, and those could effectively be synthesized through specialized syntax, hence nano DSLs for each task. It would not be so different, conceptually, from the ISAs embedded in processors: each instruction has its own syntax and semantics, it is just very regular and simple. But for this scenario of pure composability to work at a high level, we would need to fully rework the ecosystem and platforms. I mean, in this context a single computer would need to work like a distributed system with homogeneous design and tightly integrated semantics for all its resident components.
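As a very loose sketch of that "structured data to endpoints" idea, with invented message types and drivers (nothing here corresponds to a real platform):

    # Loose sketch: the "program" is just structured messages dispatched to
    # typed endpoints, each endpoint acting like a tiny instruction format.
    from dataclasses import dataclass

    @dataclass
    class ShowAlert:            # schema for a hypothetical UI endpoint
        text: str

    @dataclass
    class PlaySound:            # schema for a hypothetical audio endpoint
        name: str
        volume: float

    DRIVERS = {
        ShowAlert: lambda m: print(f"[ui] alert: {m.text}"),
        PlaySound: lambda m: print(f"[audio] {m.name} at {m.volume:.0%}"),
    }

    def emit(message) -> None:
        """Dispatch a structured message to the driver that owns its schema."""
        DRIVERS[type(message)](message)

    emit(ShowAlert("Order saved"))
    emit(PlaySound("chime", 0.5))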


I don’t anticipate “natural language programming.” I anticipate systems that maintain a declarative state of running services, based on requirements and iteration/adversarial chore-checking.

You won’t be using ChatGPT to write source code that you copy, paste, and debug.

You’ll be saying “no she was wearing a shorter dress, with flowers,” like Geordi LaForge using the holodeck to solve mysteries.

The boilerplate below won’t even be necessary. Here’s how I see it working though:

“Hey, welcome to Earth. So here’s the deal. You maintain my website that drop-ships pokemon cards using paypal merchant integration. You will have a team of AI designers who you will hire for specific skills by designing a plan with detailed job descriptions.

I want one guy to just make funny comments in the PR history. Make it look like cyberpunk. Respect EU privacy laws by maintaining regional databases, and hire another agent who has a JD to follow similar regulatory requirements in the news.

I hate oracle databases and j2ee, use anything else ;)”


>> you need a hierarchical organization, functions calling functions calling functions. How do you represent that graphically?

Not saying graphical programming is a good idea, but the basic abstraction mechanism is to define new boxes, which you can look inside by opening them (like some VHDL modeling tools do). Even SICP says it is a bad idea and does not scale. But it is clear that the primitive is "the box", the means of combination is the lines between boxes, and the means of abstraction is making new boxes.

I think the real problem is that there is exactly one primitive, one means of abstraction, and one means of combination, and that does not seem to be enough.


What else could you have? Whatever you are building, you will always have some primitives and ways of combining them, and that's practically it. To make things more manageable, you start abstracting: give names to things and refer to them by name instead of by their structure. The next level up would probably be parameterization: instead of having a name for a thing, you have a name plus parameters for a family of things. Maybe before that you could get a bit more fancy with instantiation and allow things like repetition. But that again is pretty much it; make parameterized instantiation a bit more fancy and you will quickly create a Turing-complete meta layer capable of generating arbitrary constructs in the layer we started with.
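Written out in code, those layers look roughly like this (an illustrative Python sketch, not any particular tool's API): primitives, combination by wiring, abstraction by naming, and parameterization as a named family of things.

    # Primitives ("boxes") and combination (wiring outputs into inputs).
    def add(a: float, b: float) -> float:
        return a + b

    def mul(a: float, b: float) -> float:
        return a * b

    # Abstraction: naming a wired-up sub-graph so it can be reused as a box.
    def axpy(a: float, x: float, y: float) -> float:
        return add(mul(a, x), y)

    # Parameterization: a name plus parameters for a whole family of boxes.
    def make_scaler(k: float):
        def scale(x: float) -> float:
            return mul(k, x)
        return scale

    double = make_scaler(2.0)
    print(axpy(3.0, 4.0, 5.0))  # 17.0
    print(double(21.0))         # 42.0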


I work on a product for building user interfaces for hardware devices. All the state management is done via incrementally updated, differential DataFlow systems. The interface is defined in code instead of graphically, but I think that's a feature, so that code can be version controlled.

I think there has been evolution on the underlying data-computation side of things, but there are still unsolved questions about the 'visibility' of graphical node-based approaches. A node-based editor is easy to write with, hard to read with.
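A toy version of that kind of incrementally updated cell, sketched in Python (real differential-dataflow engines are far more sophisticated; this only shows the shape of the idea):

    # A derived cell recomputes only when an input actually changes, then
    # pushes the new value to its own dependents.
    class Cell:
        def __init__(self, value=None, compute=None, inputs=()):
            self.value = value
            self.compute = compute        # None for source cells
            self.inputs = list(inputs)
            self.dependents = []
            for cell in self.inputs:
                cell.dependents.append(self)

        def set(self, value):
            """Update a source cell and propagate only real changes."""
            if value != self.value:
                self.value = value
                for dep in self.dependents:
                    dep.refresh()

        def refresh(self):
            new = self.compute(*(c.value for c in self.inputs))
            if new != self.value:
                self.value = new
                for dep in self.dependents:
                    dep.refresh()

    temperature = Cell(value=20)
    label = Cell(compute=lambda t: f"{t} °C", inputs=[temperature])
    label.refresh()      # initial evaluation -> "20 °C"
    temperature.set(21)  # label recomputes; setting 21 again would be a no-op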


Does dataflow necessarily require a graphical interface? My experience with LabVIEW was simply the sheer amount of manual labor required to write more than a trivial program, and the eyestrain headaches that came with it.

I did a huge amount of Excel with elaborate VB macros. Thinking back, it strikes me as odd that a dataflow programming tool used a conventional language as its macro language.


I am working on a side project related to dataflow and would like some input. Is there a simple way to get in contact?


The video is already 1 day old. Where is the DOOM port?


But can you now run it in a homebrew “tiny sheet” running on a tiny sheet version of your 16-Bit CPU?

I love that people do these ridiculous but inventive things. Never would have imagined this one.


That reminds me of the metapixel in the Game of Life, which is a large structure that emulates one cell. https://conwaylife.com/wiki/OTCA_metapixel


"Is it even possible? It's the best kind of possible: theoretically possible."


Or as it was known from that day forward: Windows EX, the final evolution of Microsoft's Operating System, running natively in Microsoft's Excel.


Impressive. Thanks for sharing!


TBH, the oooonly question is:

Can it run Crysis?

:D



