The Megaprocessor is one of my all-time favorite computers, along with the Magic-1: https://homebrewcpu.com/
The Megaprocessor is just absolutely wonderful in how it bridges from 'here is a transistor, it lights an LED' to 'here is a computer, it plays Tetris'. I always struggled to unwind the layers of abstraction in a modern computer, from atoms in the CPU up to running Python, but being able to just look at a bunch of literal transistors (with LEDs on each gate!) wired up and playing Tetris shows how a computer really works in a profound and awe-inspiring fashion.
Magic-1 is sort of the next level up in complexity: it is made out of very simple TTL (the most complicated chip function is the ALU, a circuit I had to build as an EE undergrad out of OR and AND gates) and it hosts a webpage. The site currently seems to be down, but you can see it on the Wayback Machine:
https://web.archive.org/web/20210815180101/http://www.magic-...
I will never forget when I came across that site and realized that I was interacting with a wire-wrapped pile of RAM and NOR gates over the internet. There was even a time when you could telnet in and play some retro text-based adventure games. To this day, the only time I have played Adventure was on Magic-1.
I'm partial to the Gigatron[0] myself. Built entirely from a mere 34 TTL ICs available in the '70s (930 logic gates), it's capable of driving a VGA monitor and 4-bit sound while running at 6.25 MHz. In my opinion, it is beautifully simple and elegant.
Charles Petzold's book Code: The Hidden Language of Computer Hardware and Software explains a computer from the ground up.
I don't know if the ideas still apply to modern computers, but it's pretty cool understanding how things like addresses are decoded and instructions are constructed and executed at the gate level in a very basic microprocessor.
I have often wondered whether I could build some sort of general computing machine if we were pushed back to the dark ages or something. I guess you have to define exactly what level of technological achievement we were pushed back to. But with the knowledge we have today, without ICs (or advanced manufacturing facilities) and with only "simple electronics" (whatever that would be), would this be possible? Fun stuff to think about!
First gen transistor computers often used standard functional units - gates, flip flops, and such - packaged into small modules with edge connectors and wired together with wire wrap on a backplane. Like this DEC PDP-8.
Later TTL/CMOS designs replaced the packaged modules with much smaller 74xx/40xx ICs.
You can make basic logic gates with just diodes and resistors, but you need transistors for inversion, buffering, and a usable flip flop.
That's probably the minimum level for useful computing/calculating. If civilisation has ended and you have no transistors you probably don't have the resources to make glass valves either, so that's going to be a problem.
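The asymmetry mentioned above — diodes and resistors get you AND and OR, but never NOT — can be sketched as a toy boolean model. The circuit descriptions in the comments are real diode-resistor logic; the Python functions are, of course, just illustrations:

```python
# Toy boolean model of diode-resistor logic (DRL).
# Diodes can steer a node toward a high input (OR) or let any low input
# pull a node down (AND), but a passive network can never output the
# *opposite* of its input -- inversion needs an active device.

def diode_or(a: bool, b: bool) -> bool:
    # a diode from each input to a common node with a pull-down resistor
    return a or b

def diode_and(a: bool, b: bool) -> bool:
    # pull-up resistor on the output node; a diode to any low input pulls it down
    return a and b

def transistor_not(a: bool) -> bool:
    # an NPN transistor shorts the output to ground when the input is high
    return not a

# Once inversion is available, NAND -- and therefore any logic function -- follows:
def nand(a: bool, b: bool) -> bool:
    return transistor_not(diode_and(a, b))
```

This is also why the transistor is the minimum bar: NAND alone is functionally complete, and DRL alone is not.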
These modules seem to be the primary influence on sci-fi movie computer design, starting with HAL in "2001".
When sci-fi writers need to create some plot tension around getting a computer either up and running or down and disarmed, the characters will inevitably be plugging/unplugging colorful modules at some point.
(I googled this to make sure I wasn't misremembering what I read 40 years ago in an already outdated book at the library and I was suddenly filled with a sense memory of the smell of the interiors of old electric appliances loaded with tubes and dust.)
Yes...there were a couple of generations of what we would recognize as vaguely 'modern' computers (say...roughly ENIAC to the IBM 704/709) built completely out of stuff that looked like this:
Yes, that was the first all-electronic generation after the very earliest relay designs.
They were shockingly unreliable and incredibly expensive. Tubes have a very short mean time between failures, so any design that uses tubes exclusively can't work for more than short periods without breaking down: possibly minutes, maybe hours, probably not days, and absolutely not months or years.
And each failure means a cycle of fault finding, which can take hours or days in turn.
As a technology it sort of works in a prototype way - you can get some work done until you can't. But the unreliability means it's qualitatively different to a modern laptop or server farm.
The wonderful thing about integration on silicon is that it's the opposite - it's incredibly reliable, as long as you keep the thermals reasonable.
Well...certainly true of the original tube computers (ENIAC was famously temperamental), but that module comes from an IBM 700-series, which was a production product. Tube machines from IBM, Burroughs, Univac, Bendix, Ferranti and many others were in no way mere prototypes; hundreds were built. The tube-based AN/FSQ-7 was for years the basis of the USAF SAGE air defense network.
Tube reliability improved radically over the 15-20 years tube computers were a thing; it had to. And just as you point out about silicon, reasonable thermal management became recognized as important to tube reliability, and designs changed accordingly. MTBF was lower than a modern computer's, but they certainly ran for days or weeks and more. And debugging was usually fairly quick: you ran some diags that pinpointed the module (not the single tube) that failed and replaced the whole thing.
I have an acquaintance with a Bendix G15 that still runs. Admittedly, the G15 is much simpler than an IBM 700, but it's a nearly 65-year-old tube machine.
I forget which book it was (maybe "The Three-Body Problem"?), but there was a science fiction story in which a Chinese king makes his soldiers act as logic gates and his army becomes a computer. I was like, wow, I never thought about that, but it totally makes sense!!
In that case, if you want a somewhat entertaining very-high-level overview of what would need to be done, then there's a manga that showed this off a few chapters ago, it's called Dr. Stone. What stuck with me the most was that the purity needed for the silicon used in processors was absurdly high, so much so that they couldn't quite do that just yet, so they made a processor out of parametrons and used magnetic core memory. I knew semiconductors had to be very pure, but it was a bit discouraging to realize just how much effort it would take if you started from zero.
Dr. Stone is great but I also found it to be a bit too hand-wavy. In real life you can't just build steam engines with a small village worth of labor + a "master craftsman". Mining, transporting, and refining iron ore alone is a huge task that could easily consume every drop of the village's labor resources and still not produce much iron. Fuel is also a huge task. Unless you have a high quality coal mine nearby, you have to create charcoal which is also very labor intensive (see: https://www.youtube.com/watch?v=GzLvqCTvOQY). I just can't fathom how Senku realistically makes processors unless he has a nation state worth of labor at his disposal.
But yeah, it is a fun "what if".
"What if a super genius with the entirety of wikipedia in his brain were sent back to the stone age? Could he rebuild modern society?"
IIRC a key reason why steam engines were not used earlier, despite the concept being known for at least a millennium, was the requirement for quite advanced metallurgy. You can make a nifty proof of concept from copper or iron, but a useful steam engine needs to be (a) relatively high pressure and (b) large, so you can do it only if you can reliably and cheaply make large quantities of decent steel. If you can't make large quantities of steel, your steam engine doesn't work; if your steel-making process has unpredictable results, then your boiler blows up at a weak spot; and if that steel is expensive, you're better off having the same people work a literal treadmill instead of building a steam engine.
At least with iron, you'd have the benefit of the existing refined ore lying all around you in a post-apocalyptic setting. There's little need for actually mining iron ore anymore if your population has been reduced by 99% or more. You can walk down any abandoned street and find sources of iron and other metals. Now, there's still the refining process (but it would be shorter from something already processed) and fuel to contend with.
Also, making glass is not just combining sand and seashells and fire.
I don't doubt that they could have made glass plates or something, but they start turning out vacuum tubes and borosilicate beakers next to each other like it was all a matter of knowing the recipe.
>the purity needed for the silicon used in processors was absurdly high
Yes. Silicon wafers are cut from a monocrystalline boule: a single flawless silicon crystal with no defects or inclusions. A big chunk of silicon atoms and nothing else at all (doping happens later). To the extent any physical object can be called "perfect", a semiconductor wafer is perfect.
(Of course after manufacturing it will start picking up embedded hydrogen and helium atoms from cosmic rays and alpha particle background radiation.)
The first computing machines used relays, which are electromechanical switches. Current flows into an electromagnet, which pulls a switch closed and completes a circuit, thereby switching something "on." By placing these switches together in different configurations you can form the equivalent of logic gates.
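The configurations are easy to model: contacts in series give AND, contacts in parallel give OR, and — unlike diode logic — a relay's normally-closed contact gives you inversion for free, which is part of why relay computers worked. A sketch with the relay reduced to a pure boolean switch:

```python
# A relay modeled as a boolean switch: the coil decides whether a contact
# is closed. Real relays have both contact types on one armature.

def no_contact(coil: bool) -> bool:
    # normally-open contact: closed only while the coil is energized
    return coil

def nc_contact(coil: bool) -> bool:
    # normally-closed contact: opens while the coil is energized -> inversion
    return not coil

def relay_and(a: bool, b: bool) -> bool:
    # two normally-open contacts in series
    return no_contact(a) and no_contact(b)

def relay_or(a: bool, b: bool) -> bool:
    # two normally-open contacts in parallel
    return no_contact(a) or no_contact(b)

def relay_xor(a: bool, b: bool) -> bool:
    # the classic changeover wiring: each path goes through one NO and one NC contact
    return (no_contact(a) and nc_contact(b)) or (nc_contact(a) and no_contact(b))
```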
Sometimes insects or moths would get stuck in the relays which would screw up the system. This is the origin of the word "bug."
Prior to incorporating logic into electronics, computing machines were hand-cranked or motor-driven gear machines. See: https://www.youtube.com/watch?v=fhUfRIeRSZE. The video shows a literal hand-cranked portable calculator.
The use of the term "bug" in engineering predates automatic computers by nearly a century; the Wiki article [1] on the topic gives a pretty good summary of its history.
Then the next question is: what would you do with it?
You need a source of problems to solve, and until you've bootstrapped the rest of society at least to the point where things like high-resolution trigonometric tables, desktop publishing, or high-speed accounting (for example) are needed, the effort isn't going to keep you fed...
I agree it wouldn't be high on the list, but I also imagine there would be practical needs, like command and control. So, voice-only radios first, but some sort of messaging that doesn't need a live listener on the radio would be a nice next step. And that could be done with a simple computer.
Looking historically, you have a bunch of options for a pre-IC computer; there were lots of them. Transistors, of course, or vacuum tubes give you a useful computer. You can build a computer from relays, but the performance is pretty bad. Memory is also very important: magnetic core memory is the way to go if you don't have ICs. None of this is going to help you if you were sent back to the dark ages, though.
As far as mechanical devices, mechanical calculating machines didn't arise until the late 1600s and weren't reliable for many years. It's unlikely that you'd be capable of building a mechanical computer until the industrial revolution. Note that Babbage was unsuccessful in building his machines even in the late 1800s.
If your goal is to build a Turing-complete machine of some sort, even if totally impractical, you could push the date back a lot. But that would be more of a curiosity than a useful computer.
For arithmetic, pinwheel calculator (aka "Odhner's arithmometer") [0] is a pretty decent and reliable mechanical device. You can even give it an electric motor for doing the rotations for you and a numerical keyboard.
On that note, I have wondered on several occasions whether it would have been technologically possible to build neon-lamp logic circuits in Babbage's time. Aside from the problem of building an air liquefier a few decades early, I don't see any really major technological hurdles there. That would have nicely solved his problems with mechanical manufacturing...
I used to play that same thought experiment with more basic utilities, like my toaster with its various settings and electronic controllers. Then I was given a Dualit. No more philosophical dilemmas!
Kidding aside, it's always staggering how far removed we really are from operating on (humanly) first principles. Humbling.
There are many people on the internet researching how basic things can be made in a low-tech fashion. I particularly enjoy https://simplifier.neocities.org/ for example.
But if you read those blogs you still notice the mind-boggling height of the giants on whose shoulders the bloggers stand. Having access to simple chemicals like acids or various salts, for example, is huge. I wouldn't even know where to start if I had to bootstrap a high school chemistry kit starting with nothing but my hands and my knowledge.
The catch is that there are tolerance issues. Doron Swade's account of building the two existing Difference Engine #2 models (http://www.amazon.com/exec/obidos/ASIN/0670910201/donhosek) is a good example of where the challenges lie. It was just barely possible to do with 19th century technology. Physical mechanisms deviate from theory by quite a bit.
The Antikythera mechanism was built well before the Dark Ages, so if he just wanted to go back to the Dark Ages, precision work was already possible by then; he could probably build something like a battleship fire-control system.
I think the minicomputers of the 70s well-represent the halfway point between there and what we have today.
At Basic Four Corporation I worked on systems built from 8"x11" circuit boards. A CPU might consist of two such cards joined on the front by a couple flat 50-pin cables and to the other components by a backplane.
Disk Controller: 1 board
Terminal controller: 1 board
etc
Chuck was president of ICS, which was acquired by Basic Four in like the mid-'70s. I only met him long after, but he told me stories, including one about how he wheeled an Apple II into the Basic Four boardroom and demonstrated it, saying in effect, "this is the future, and if you're not on board with the microcomputer revolution you'll be left behind." They decided to pass and continue figuring out ways to sell $50,000 hard disks to existing customers. And that's why most of Hacker News hasn't heard of Basic Four :)
I do know that MAI ended up selling microcomputer based products eventually, but by that time they were well into day-late-dollar-short territory and would continue to lose ground along with all the other minicomputer vendors like PRIME that hardly anyone these days has heard of.
Basic Four was about 300 employees when I landed there and having spent most of my time in manufacturing I didn't rub elbows with upper management. Although they did rub elbows with me once when they thought I was stealing their operating system. But that's a whole nother story. :)
This is a visual representation of about as much as I understand about a processor, and it's still beyond what I could actually build without a lot of reference material.
Inspired by this great submission, I ended up at http://visual6502.org/JSSim/index.html - a 6502 simulator in HTML5 with visual changes on the virtual circuitry.
The quality of not only the product but also the accompanying explanations is outstanding. I think it's a work of art, because not only is it visually impactful (especially at 1 Hz, as in the demo), it also uses the medium to convey an idea that would be difficult to convey any other way.
I'm interested in making (stochastic) algorithms fast, which always seems to eventually lead back to looking at code in Compiler Explorer. The extent of my knowledge there is basically "short assembly good, long assembly bad". But I've always lacked some "tactile" feeling (for lack of a better phrase) for what a register like "eax" or "rax" is. I hope that learning more about the Megaprocessor might help me get a glimpse of this.
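As one small piece of that tactile feeling: on x86-64, "eax" is just a view of the low 32 bits of the 64-bit register "rax", and writing eax zero-extends into rax. That much is documented architecture behavior; the Python below is only a toy model of it, not how hardware stores anything:

```python
MASK32 = 0xFFFFFFFF

rax = 0x1122334455667788

# Reading eax just views the low 32 bits of rax
eax = rax & MASK32           # 0x55667788

# Writing eax (e.g. `mov eax, 0xDEADBEEF`) zero-extends on x86-64:
# the upper 32 bits of rax become zero.
def write_eax(value):
    return value & MASK32

rax = write_eax(0xDEADBEEF)  # rax is now 0x00000000DEADBEEF
```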
If the ISA is sufficiently efficient, 8 kHz is fast enough to run interpreters. An 8 kHz machine can be useful as a calculator, running things similar to FORTRAN, and, if it has suitable I/O, maybe run a BASIC or CHIP-8 interpreter.
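To see why that's plausible: the core of any interpreter is a small fetch-decode-execute loop, and at 8 kHz you still get a few thousand trips through it per second. A sketch for a made-up accumulator machine (the opcodes are hypothetical, not the Megaprocessor's actual instruction set):

```python
# Minimal fetch-decode-execute loop for a hypothetical accumulator machine.
# Opcodes (invented for illustration): 0 = LOAD imm, 1 = ADD imm,
# 2 = STORE addr, 3 = HALT.

def run(program, memory):
    acc, pc = 0, 0
    while True:
        op, arg = program[pc]           # fetch
        pc += 1
        if op == 0:                     # LOAD immediate
            acc = arg
        elif op == 1:                   # ADD immediate
            acc = (acc + arg) & 0xFFFF  # 16-bit wraparound
        elif op == 2:                   # STORE to memory
            memory[arg] = acc
        elif op == 3:                   # HALT
            return memory

mem = [0] * 16
run([(0, 40), (1, 2), (2, 5), (3, 0)], mem)  # compute 40 + 2, store in mem[5]
```

Each iteration is only a handful of machine instructions, which is the same budget a CHIP-8 or tiny BASIC interpreter has to live within.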
This is so good. I've just watched his 8 videos explaining everything from transistors to logic and memory. I wonder why he unfortunately stopped at SS8: Time and Memory...
It seems to have an address width of 16 bits, there's no mention of a memory management unit, and the website lists its RAM size as 256 bytes, so I'm going to say no, regardless of speed.
The RAM size is of course the most limiting out of those three, but even if it were larger (and one could somehow build it without resorting to integrated circuits), you'd probably run into problems with the other fundamental limits if you wanted fancy things like memory in the order of megabytes.
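The back-of-the-envelope arithmetic behind those limits: 16 address bits can distinguish at most 2^16 locations, so no clock speed gets you to megabytes without a wider bus or banking/MMU tricks. Worked out:

```python
import math

address_bits = 16
max_addressable = 2 ** address_bits                 # 65536 locations = 64 KiB

# Bits needed to directly address 1 MiB:
bits_for_1mib = math.ceil(math.log2(1024 * 1024))   # 20 bits
```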
Of course that's not the point, though, because building an entire general-purpose CPU from scratch at such a human-visible scale and from basic components is a feat in itself.
Hah, of course not. The PC acts as a terminal/controller for this machine. Running Windows 7 on an 8 kHz CPU is impossible, even on an x86-compatible one. Windows XP has been shown to run on an extremely underclocked 8 MHz Pentium CPU, booting in half an hour: https://winhistory.de/more/386/xpmini_en.htm
It definitely does not run Windows. It has 32 kB of RAM, 16-bit registers and a custom instruction set. That will never run Windows 1, let alone Windows 7 :)