This reminds me of Dave Taylor's "Pinky Processor." [1] Crack dot Com has been shut down for years, so I couldn't find a better source, but the idea was to have single-bit processors that could be linked into a "Pinky Farm" to work in parallel. Every instruction would be this type of branching instruction, and calculations would be handled by performing table lookups. 15+ years ago people had even written assemblers and emulators to demonstrate the theory. Shortly after publishing his paper, Dave Taylor went to work for Transmeta on the Crusoe processor, and research into Pinky processors faded out of existence.
This is a very sweet and cool design; it is nice to see this theoretical idea applied to a real use case. I think it should be demonstrable that for any Turing machine with a size-bounded input state space, there is a data bus width (with matching EPROM and register) for this kind of machine such that it emulates that Turing machine. Also, the EPROM implements a non-invertible function, but there is always a wider bus, register, and EPROM, this time with a reversible (injective) action, that would have, nontrivially, equivalent behaviour. In this reversible case, the action of the EPROM could be expressed as a unitary transformation acting on a quantum register, so the traces of the register values over time would be a subset of the valid quantum-state dynamical trajectories. In that case the two I/O chips below are not required, and the system would evolve autonomously from an initial state. So look: this computer is a cousin of a quantum circuit.
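The classical half of that claim is easy to demonstrate with the standard reversible-computing trick: any non-injective next-state function f on an n-bit register lifts to a self-inverse (hence bijective) action (x, y) -> (x, y XOR f(x)) on a 2n-bit register. A minimal C sketch, where the 4-bit f() is a made-up stand-in for the EPROM lookup rather than anything from the article:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical 4-bit next-state function standing in for the EPROM
     * lookup; it is deliberately non-injective (only 4 possible outputs). */
    static uint8_t f(uint8_t x) { return (x * 3) & 0x0C; }

    /* Reversible lift on a doubled register: (x, y) -> (x, y ^ f(x)).
     * Applying it twice restores (x, y), so it is its own inverse. */
    static uint8_t lift(uint8_t xy) {
        uint8_t x = xy >> 4, y = xy & 0x0F;
        return (uint8_t)((x << 4) | (y ^ f(x)));
    }

    int main(void) {
        for (int xy = 0; xy < 256; xy++) {
            if (lift(lift((uint8_t)xy)) != xy) { puts("not reversible"); return 1; }
        }
        puts("lift is a bijection on the 8-bit register (self-inverse)");
        return 0;
    }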
My first instinct in this situation (adding some automation to a printing press) would be to reach for a microcontroller. I always overlook how much can actually be done without one.
Yes, certainly! And I admire the author for having put the idea into production.
For even more inspiration, the "Application Handbook" linked from the Wikipedia page for the MC14500B has a lot more examples similar in spirit: http://en.wikipedia.org/wiki/Motorola_MC14500B
But then, in an industrial job, you'd be scolded for the implementation, because only a few of your colleagues would be able to follow this line of thought or make modifications to the logic stored in the 2 kByte ROM.
I don't know exactly when the article's author put his logic into use, but PALs (the predecessors of FPGAs) have been available since about 1980; they should have been sufficient to implement the "printing press" logic mentioned, and they meet the "programmable" requirement.
Assuming the logic was specified in the form of a state diagram or flowchart, transforming that into the contents of the ROM is pretty straightforward: you assign memory addresses to states and then follow the arrows.
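As a concrete illustration of the "assign addresses, follow arrows" step, here is a small C sketch that packs a three-state, one-input controller into a ROM image. The state names, output bits, and address/data layout are all invented for the example, not taken from the article:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical 3-state controller with one input bit. Each table row
     * is one arrow in the state diagram: (state, input) -> (next, outputs).
     * ROM address = state in the high bits, input bit in A0; ROM data =
     * next state in the low nibble, output lines in the high nibble. */
    enum { IDLE, FEED, PRINT, NSTATES };

    struct arrow { uint8_t state, input, next, outputs; };

    static const struct arrow diagram[] = {
        { IDLE,  0, IDLE,  0x0 },
        { IDLE,  1, FEED,  0x1 },  /* start signal -> engage feeder    */
        { FEED,  0, FEED,  0x1 },
        { FEED,  1, PRINT, 0x2 },  /* paper sensed -> fire print cycle */
        { PRINT, 0, IDLE,  0x0 },
        { PRINT, 1, PRINT, 0x2 },
    };

    int main(void) {
        uint8_t rom[NSTATES << 1] = {0};
        for (size_t i = 0; i < sizeof diagram / sizeof diagram[0]; i++) {
            const struct arrow *a = &diagram[i];
            rom[(a->state << 1) | a->input] = (uint8_t)((a->outputs << 4) | a->next);
        }
        for (size_t addr = 0; addr < sizeof rom; addr++)
            printf("addr %02zx: data %02x\n", addr, rom[addr]);
        return 0;
    }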
A PAL seems to be the natural choice for this, but:
* this circuit has more state than would fit into a PAL22V10 (I don't know whether all of it is really necessary), which is the largest common PAL device (it would not fit even into a 26V12, and might just fit into a GAL6001)
* rewriting a state machine description into logic equations for a PAL compiler is significantly more work than creating the contents of the ROM
* a PAL programmer is a more specialized piece of hardware than an (E)PROM programmer
The original meaning of "microcontroller", that is, "microprogrammed controller", is a circuit similar to what is described in the article. There were even chips containing a bunch of PROM and a latch designed for exactly this kind of use, and the original PICs are only a more involved implementation of the same idea. The circuitry in microcoded CISC CPUs that drives all the control signals according to microcode is also essentially the same thing.
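For readers who haven't met the term: a microprogrammed controller in this sense is just a PROM whose latched output drives the control signals and partly feeds back as the next address. A toy C model, with a control-word layout invented for this sketch:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy microprogrammed controller: each PROM word holds the raw
     * control signals plus the address of the next word.
     * Field layout (invented for this sketch):
     *   bits 0-3  next microaddress
     *   bit  4    branch (take next+1 if the condition input is set)
     *   bits 5-7  control signals driven out to the rest of the circuit */
    static const uint8_t uprom[16] = {
        /* 0 */ (0x1 << 5) | 0x01,        /* assert signal A, go to 1      */
        /* 1 */ (0x2 << 5) | 0x10 | 0x02, /* signal B, branch on condition */
        /* 2 */ (0x4 << 5) | 0x00,        /* signal C, loop back to 0      */
        /* 3 */ (0x0 << 5) | 0x00,        /* branch taken: back to 0       */
    };

    int main(void) {
        uint8_t uaddr = 0;                   /* the "latch" */
        int condition[] = { 0, 0, 0, 1, 0 }; /* sampled external input */
        for (int clk = 0; clk < 5; clk++) {
            uint8_t word = uprom[uaddr];
            printf("clk %d: uaddr %x, control signals %x\n",
                   clk, uaddr, word >> 5);
            uaddr = word & 0x0F;
            if ((word & 0x10) && condition[clk])
                uaddr++;                     /* conditional branch */
        }
        return 0;
    }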
So true. Arduino and Raspberry Pi have made it too easy and cheap to use way too much computing power any time we want to interact with "the real world". Not a bad thing, but we're collectively losing some skills in the process.
I would say that a solution can never be "too easy and cheap." I argue with engineers about some solutions that they feel are "cheating" because they seem too easy. No such thing. You're paid to solve the problem, not to be clever about it.
What skills do you feel are lost by doing things the "easy" way?
That's more the feeling I had. I just feel silly booting a whole Linux kernel on an RPi to read a few pins and trigger a light. I totally agree that engineers should do things the easiest way; it's just that sometimes you don't have the easy tools, and you realize you've never done it the hard way.
Kinda like how I can get away with Java and Python 99% of the time, but I'm glad I know systems-level C for the times I have to dive deep.
I would label this a finite state machine more than a computer. Still, if it got the job done, great. It doesn't say when he created it, but if it was any time in the past 15 years, a microcontroller would have been smaller, cheaper, and easier.
Judging from the 74C logic and the 2716, this was designed more than 20 years ago. Today, the only expensive component in the whole thing is the 2716 (or just about any (E)PROM), and I believe that even a few years ago this thing would, in single quantities, have been cheaper than using a microcontroller (although not significantly).
"One bit" refers to it having exactly one bit of what could be called "architectural state" or "datapath": the second channel of the lower 74C374, going from the input to A0 of the program store. An "instruction" has two phases, and the result of the first phase determines what happens next by means of this one bit (by changing which pointer to the next instruction will be read from memory).
That all makes it a one-instruction-set CPU (and since there is only one possible instruction, it need not be stored in the program).
But the one bit is also a reference to memory: it has an address space of 2^1 (Q1 and Q2 on the 4099 output register file) and a word size of 1, which, as you point out, allows it to store only a binary state.
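Putting the two comments above together, here is a simplified C model of the machine. It collapses the two instruction phases into a single lookup, replaces the 4099 with one output bit, and invents the address/data layout, so take it as an illustration of the feedback-bit principle rather than the actual circuit:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy model: the tested one-bit flag lands on A0, so each
     * instruction occupies two adjacent EPROM locations holding the
     * two possible successors. Layout invented for this sketch:
     *   EPROM address: (pc << 1) | flag
     *   EPROM data:    bit 7 = value latched into the output register
     *                  bits 0-6 = next pc                             */
    static uint8_t eprom[256];

    int main(void) {
        /* Two-instruction program: pc 0 tests the input flag; if 0 we
         * spin at pc 0 with the output low, if 1 we jump to pc 1 and
         * raise the output, then fall back to pc 0. */
        eprom[0] = 0x00;        /* flag=0: output 0, stay at pc 0 */
        eprom[1] = 0x80 | 1;    /* flag=1: output 1, go to pc 1   */
        eprom[2] = 0x00;        /* pc 1 always returns to pc 0    */
        eprom[3] = 0x00;

        uint8_t pc = 0, out = 0;
        int input[] = { 0, 1, 0, 0, 1, 0 };   /* sampled external pin */
        for (int clk = 0; clk < 6; clk++) {
            uint8_t word = eprom[(pc << 1) | input[clk]];
            out = word >> 7;
            pc  = word & 0x7F;
            printf("clk %d: input %d -> pc %d, output %d\n",
                   clk, input[clk], pc, out);
        }
        return 0;
    }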
[1] http://tech-beta.slashdot.org/story/98/06/06/136239/pinky-pr...