Unfortunately, domestic Soviet computers were effectively killed by a 1970 decision to base all future efforts on a "Unified System" which was really a clone of IBM System/360:
https://en.wikipedia.org/wiki/ES_EVM
Playing it safe by shooting yourself in the head...
I had always wondered if someone had ever built a ternary computer. Would it even be practical to build one today? Given that binary computers rule the computing world these days, I'm not sure there would really be much use for it.
I pulled out my copy of Knuth vol 2 to answer this (pages 190-193):
* You can negate a number by interchanging 1 and 1̄ (that is, by swapping the +1 and -1 trits).
* The sign of a number is given by its most significant non-zero trit.
* Rounding to nearest integer is the same as truncation.
By the way, if we had balanced ternary computers today, you wouldn't have programming languages with signed and unsigned integers. They'd be the same thing.
Of course the above might be academic if it's more difficult to implement in CMOS ...
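To make those properties concrete, here's a tiny Python sketch of balanced-ternary conversion and negation (digits are -1/0/+1, least significant trit first; purely illustrative, nothing to do with how any real ternary machine was built):

    def to_balanced_ternary(n):
        # digits are -1, 0, +1, least significant first
        if n == 0:
            return [0]
        digits = []
        while n != 0:
            r = n % 3
            if r == 2:        # a digit of 2 becomes -1 with a carry into the next trit
                r = -1
                n += 1
            digits.append(r)
            n //= 3
        return digits

    def from_balanced_ternary(digits):
        return sum(d * 3**i for i, d in enumerate(digits))

    def negate(digits):
        # negation really is just swapping +1 and -1
        return [-d for d in digits]

    assert from_balanced_ternary(negate(to_balanced_ternary(42))) == -42

Note there's no sign bit anywhere: the same digit string covers positive and negative values, which is why the signed/unsigned distinction disappears.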
Representing binary numbers with an explicit sign bit and the rest unsigned gets you an even better version of that; this is how floating point handles the sign.
I'm skeptical of the value of ternary. The key behind the success of binary computers is that binary has great noise immunity; having a _single_ threshold is way simpler to get working than two or more. (Yet we do go there for NAND flash, but note how speed and endurance worsen dramatically as thresholds are added.)
The "most economical radix" argument is often cited, but it's a red herring - it ignores practical concerns of greater importance.
Suppose we had robust, practical 3-level logic elements: then potentially we'd need fewer of them for the same bit-width of computation, hence less energy consumption, and multiplication/division could probably be done in fewer cycles.
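A quick back-of-the-envelope on the "fewer elements" part (my own numbers, just to illustrate the information-density argument): each trit carries log2(3) ≈ 1.585 bits, so matching an n-bit word takes roughly n / 1.585 trits.

    import math

    def trits_for_bits(n_bits):
        # each trit carries log2(3) ~ 1.585 bits of information
        return math.ceil(n_bits / math.log2(3))

    print(trits_for_bits(32), trits_for_bits(64))  # -> 21 41

Whether 21 three-level elements are actually cheaper in area, energy, and noise margin than 32 two-level ones is exactly the part that hinges on getting those robust 3-level elements in the first place.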
Wow... the writing, the prose... wish I could put words together like that.
Sample:
""" Fame is partial with her favor, and has not seen fit to bestow any upon the creators of the panel switch, the type E relay, the crossbar marker circuit. There are no biographical anecdotes we can summon to illuminate the lives of these men; the only readily available remains of their lives are the stark fossils of the machines they created.
"""
Number Five Crossbar is well remembered in telephony, and studied by people who do high-reliability systems. It was one of the first systems that was more reliable than its components. No one person is associated with it, though; it took a big team to develop.
That was definitely one of the first things I noticed about the writing; the author is very skilled in this regard.
There are times when I can lapse into such prose, but I try to balance my writing with my perceived audience's expectations. One could compare my postings here with that of other forums I frequent, like /r/justrolledintotheshop, and they would definitely notice a difference.
I don't believe I could summon the level of prose this author has, though.
According to the about page, he is a software engineer that also wrote a philosophy PhD thesis at Princeton on the history of computing. Sounds pretty fascinating.
Finland had strong import controls directly after the war, so imported electronics were very expensive. Industrial automation still relied on legacy hydraulic logic (fluidic logic) to control complex pulp mill processes into the '70s. It was very steampunk.
Pneumatic signaling has been commonplace too... set points for valves, etc. can be sent by varying the (low) pressure in an air line to the valve itself.
One downside to this is the case where there's condensation in the line and temperatures go below freezing. You can lose control thanks to the ice that forms. I remember this causing trouble with power plants in the '90s during an ice storm in Houston (which is usually quite warm and humid, so they didn't think too much about icing).
Yes, there's lots of building automation / HVAC stuff that uses pneumatic signalling/actuation (like the infamous T-4002 thermostat, look up images of it, you've probably seen it before!).
Amusingly enough, lots of those pneumatic systems use 3-15 PSI signalling, which works the exact same way that 4-20 mA signalling works -- with a live zero so breaks in circuits can be detected!
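For anyone who hasn't run into "live zero" before, the scaling is easy to picture in code (a toy sketch; 4-20 mA is the standard range, the fault threshold is my own arbitrary pick):

    def loop_to_percent(milliamps, fault_threshold=3.5):
        # 4 mA = 0%, 20 mA = 100%; anything near 0 mA means the loop is broken
        if milliamps < fault_threshold:
            raise ValueError("current below live zero - broken wire or dead transmitter?")
        return (milliamps - 4.0) / (20.0 - 4.0) * 100.0

3-15 PSI works the same way: 3 PSI is 0%, 15 PSI is 100%, and a reading near 0 PSI tells you the air line itself has failed.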
Early in my career, I did some work on what was essentially a Foundation Fieldbus to 4-20 mA adapter. To get it to control our pneumatic valve for a demo, we wired it to a 4-20 mA to 3-15 PSI converter and then to the valve itself. It was a bit Rube Goldberg-esque, but it served our purpose well.
Automatic transmissions in American cars used hydraulic-fluid-based computers into the 1980s - I recall taking apart a transmission with my dad and looking at the hydraulic controls. I don't know when they changed over to electronics; my guess is the late 1980s. I suspect all auto manufacturers stuck with hydraulic controls for a while - transmissions are ultimately controlling hydraulics, so electronics didn't save complexity (CPUs give other advantages that make them worth it in the long run).
I accidentally discovered a year or so ago that Thomas L. Dimond (working at Bell) invented core rope memory (which the wikipedia page says was "first used in the 1960s") in the 1940s. His memory got used in phone switches as a fast and reliable read-only lookup table. Here is a paper about the "Dimond ring translator": http://etler.com/docs/Crossbar/articles/30-AMATranslator.pdf
"Writing" was done by manually stringing wires though those big rings. Those could be changed, but it was a big headache.
It's like classic reverse DNS. There was already a mapping from A to B (phone number to outgoing wires) and a mapping from B to A was needed for billing purposes. But B to A info couldn't be obtained from the switch fabric. A physically separate B to A mapping had to be built and maintained in sync. There was nothing which inherently made the two match. That was all done by hand.
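In software terms it's the difference between deriving the reverse mapping from the forward one and keeping two independent tables that humans have to hold consistent. Something like this (names made up, purely illustrative):

    # forward mapping, analogous to "phone number -> outgoing equipment"
    number_to_trunk = {"555-0101": "trunk-17", "555-0102": "trunk-3"}

    # in software we can simply derive the reverse mapping, so it can never drift out of sync
    trunk_to_number = {v: k for k, v in number_to_trunk.items()}

The switch fabric offered no such derivation, so the B-to-A table was a second physical artifact - wires strung by hand - with nothing but procedure keeping it in agreement.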
The early history of computing was a struggle to find a usable memory device. Relays were very bulky as memory devices, and none of the relay computers had much memory.
My favorite on this subject is "Reckoners: The Prehistory of the Digital Computer, from Relays to the Stored Program Concept, 1935-1945" by Paul Ceruzzi - sadly long out of print.
I worked on a project that replaced an electromechanical Telex switch. Four racks of microcomputer based hardware replaced an entire floor of mechanical relays.
However, it was a marvel to see and hear that thing click and clack all day, with technicians whose ears were trained to detect issues and replace relays just by listening to the switch.
Of course, billing was done by a guy on top of a ladder taking large-format B&W pictures of the bank of users' (mechanical) counters.
I wish I had. That was my first job out of school, and I developed the UI for commands and reports, all through the BAUDOT-code based terminal. Our field test switch sat on a corner, with some more modern electronic telex terminals (they were blue), along with EPROM burners for patching, etc. The big switch was the rest of the floor. So many things I participated in 30+ years ago and didn't have the idea of taking at least a couple of pictures. Such is life.
Even though I only scanned the article, many of those computers appear in Turing's Cathedral by G. Dyson [1], a book which I enjoyed, given that there weren't a lot of history lessons at the universities (at least the one I went to in Buenos Aires... UBA).
I always thought it would be neat to have a CS History lab where every couple of weeks the students would program with a particular generation of computer, moving up through time and getting a true sense of the ingenuity at each step.
Might be expensive to house and maintain something like the SSEM or the CSIRAC, but universities have spent money on sillier things.
Emulators are good enough for most students. I did "circuit design" in an entirely emulated environment and got what I needed out of it as a computer programmer. It would not have been adequate for a real EE, because for instance it offered unlimited fan-out and fan-in, but from my perspective I wasn't losing much to abstract that away.
True. Mixing emulation in would probably save a lot of work and resources. I feel like having to do things like manually punch up your programs on cards and manage with tiny amounts of storage would be a great experience and really force students to work towards elegant solutions.
I love getting my Dad talking about some of the labs he's worked in over his career and kind of wish I could get a sense of how much has changed over such a short time.
The other nice thing about emulation is that you get to use modern IO hardware, particularly keyboards. I can imagine having fun programming on a Commodore 64; I have a much harder time imagining having fun doing it with an actual Commodore 64. The keyboards were not all that great by modern standards when they were brand new, don't get better with age, and certainly don't get better in a student lab environment :)
At my high school circa 1980, the introductory computer course (on a PDP-11/34 running RSTS/E) had us write our first BASIC programs onto mark-sense cards (the same size and shape as punch cards, but you marked them with a pencil). I recall a field for the line number, another for a keyword (pick from a list), and then the remaining part of the card was for variables and operators.
After a couple go-rounds with that, we were then allowed to use the terminals, mostly Visual 200, with a few VT100s, an LA36, and an LA120. The Visual 200 wasn't a particularly good terminal; it seems there was always at least one out of order at any given time.
It's kind of scary that a $5 Raspberry Pi Zero is much faster and has vastly more RAM than the 11/34, which cost well north of $100K back then. Not to mention that there was a grand total of 28 MB of storage on that 11/34 in the form of two RK07 disk packs - you could fit thousands of RK07 images on a single 64 GB MicroSD card.
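(Quick sanity check on "thousands", taking the quoted figures at face value - 28 MB across two packs, so roughly 14 MB per RK07 image: 64 GB / 14 MB is on the order of 4,500 images.)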
At least from an EE perspective it's interesting to contemplate "here's the highest performance discrete RTL technology that could be designed in 1960 using 1960 transistors... You have free rein to design an RTL NAND gate using any 2017 production transistor you want. In the spirit of IBM mainframes, no power limit. Can your NAND gate break 150 MHz? Can your NAND gate break 15 GHz?"
"You can buy FETs that have 1970s performance specs in 2017, very cheaply in fact, but in 2017 you can also buy FETs that output many watts of power at 10 GHz x-band microwave freqs. Here's a DEC flip-chip module containing two flipflops that topped out around 30 or so MHz in 1965, your assignment is to use modern transistors in the lab to make a modern work-alike operate over 10 GHz"
At that speed the simple fact of using discrete components kills it, surely? Multiple picofarads of parasitic capacitance on the leads, plus gate capacitance itself?
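Rough numbers to put that in perspective (my own back-of-the-envelope, not from the thread): 5 pF of lead plus gate capacitance driven through a 50-ohm source gives RC ≈ 250 ps, i.e. a bandwidth of 1/(2*pi*RC) ≈ 640 MHz - more than an order of magnitude short of 10 GHz, before you even count wiring inductance.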
> buy FETs that output many watts of power at 10 GHz x-band microwave freqs.
Part number? Gate drive for that must be 'interesting'.
Agree that this is a good idea, although not necessarily on real hardware. The CapEx and OpEx needed for the hardware would be large enough that it would mainly tend to restrict access to the course.
That said, on emulation, it's a great idea. It's easy to forget how many of the assumptions of C-like languages and Unix-like operating systems are built into the hardware, particularly when there are other ways to go about the task. There are reasons the industry has gone the way it has, but it's worth keeping those ideas around for posterity.
If this interests you, I've got to throw in a plug for the Museum of Communications in Seattle. They've got a bunch of old mechanical telephone switches running. There's something truly awesome -- in the correct sense of the word -- to be standing inside a computer and hearing the signals passing around you. It is a unique experience that I strongly recommend.
I built a relay computer for a science fair in the 60's. The power supply was from a pinball machine. The relays were mercury-sealed, from an aircraft.
It had seven words of five bits each (that's all the relays we had). It could add, subtract and store results.
I did that, too.[1] 10 words of 7 bits, addressed with a Strowger switch. Add, subtract, shift, branch, conditional branch, so it was Turing-complete. The "paper tape" reader had so much drag I had to use vinyl seat-cover material for tape.
BTW - how old were you (if you care to tell the world!) when you created this machine?
What made you take pictures of it? Did you keep any part of it, or was it all relegated to the "junk bin"?
I have found that it is extremely rare that machines like yours ever have pictures, much less a writeup like the one you have created. It doesn't appear many people created such computers back then, and I don't know of any who published how they created them. For instance, I have yet to see any old "Popular Mechanix" style article from the 1960s on "Build Your Own Electronic Brain" (as I would imagine it would be titled) - but such an electro-mechanical project would certainly fit those kinds of pulp magazines.
Which I find odd. I don't know why these machines - few as they were - were never publicized; perhaps there wasn't an audience, or perhaps, because there were so few of them, those with the ability to write such an article were fewer still. I do know there was some interest in computing at a "lay-person's" level, because there were several books on contemporary forms of computing and programming available in the 1960s (most had enough information to allow a person with sufficient skills and knowledge to design and build a simple machine - perhaps you got inspiration for yours from such a source?).
These early "hobbyist" computers, along with early hobbyist robots - represent the very earliest dawn building toward the microcomputer revolution of the later 1970s - but the vast majority, if not all of them, are lost to time, unfortunately.
Fascinating read, especially the part about the Bell Model III.
The machine had lookup tables, which were actually tab-separated paper. It was an actual table. They further implemented the functionality to jump forward and backward in that table (they called it "hunting").
I don't have much experience in assembly, but reading this article made me appreciate it more. Assembly basically operates on the same principles as the first computers.
There's another kind of "lost" computer that many don't know about. Actually, I hesitate to call it a computer, as it didn't compute anything, and none of the actual machines had anything like a conditional branch operation that I am aware of...
...they're called "reproducing pianos" (also "reproducing player pianos" and "reproducers"). Not many were manufactured, due to their complexity, need for a lot of maintenance, and sheer cost.
Basically, they were a kind of player piano that strived to reproduce the actual mechanics and technique of the person who "recorded" the original paper roll. They did this by having additional tracks which handled certain nuances of the player's performance, such that when the roll was played back, the piano could play in the same manner.
These player pianos were much more mechanically sophisticated than regular player pianos, and those extra tracks acted like a form of control structure for the notes being played. I believe that on some of the models meant for public performances, you could select the song (and it would "wind" itself to the song, sensing when it had located the piece), and I think they also had an auto-rewind function - but that was about the limit of their operations.
I've always thought of a CPU - in its simplest form - as nothing more than a sophisticated and fast "player piano", with memory being the roll, the word at an address being the holes in the roll at a certain point, and the CPU being that which controls the operations and is instructed by those same holes. This was in fact essentially how some early electronic computers (known as "drum-based" computers) were implemented.
The history of computers and computation is a fascinatingly deep and varied field of study; I encourage everyone to delve into it a bit.
We built relay computers in HS in the early '60s. There was this street in London: Lisle St, in the Soho district. One could buy surplus PO 3000 multi-pole relays there real cheap - up to 10 poles IIRC. These were the relay type used on the Post Office telephone exchanges, pre-electronics (up to #5?). In '66 I also discovered a recently published book by Russian authors, "Introduction to the Theory of Finite Automata" by Trachtenbrot and Kobrinskji https://g.co/kgs/EqZYC2 , which taught me the formal theory of the topic using relay logic. I believe the robustness of Russian space tech was due to its use of relays as well as tubes. That book effectively gave me my career; it turned out to be equally appropriate for electronic logic circuitry.
These types of articles always put the amount of computing power I spend watching videos of a cat jumping into a box and falling over into perspective.
Around the time of ENIAC, there was a quote to the effect that you could do a lot with ten million multiplies. I loved the fact that they were thinking in terms of absolute number of operations (as opposed to rate), and that the scale is just so vastly different.
https://dev.to/buntine/the-balanced-ternary-machines-of-sovi...
Programming with trits and trytes!