Scientific calculators usually provide constants such as e and π but there was no space in the ROM for these constants. The Sinclair Scientific used the brilliant solution of printing the constants on the calculator's case.
The Visual 6502 group enjoys dissolving old chips in acid, photographing the die, and reverse-engineering them.
Unfortunately (or not, depending on your perspective) it's not just the Visual 6502 group that enjoys that. Any leading chip in the marketplace these days is almost immediately reverse engineered in this way. I haven't been in the chip business for a few years but I recall a company which offered this as a complete service - they'd send you detailed circuit schematics and so forth.
The difference is the Visual 6502 team does it for fun. It makes me wonder what cutting-edge technology from today will be hobby projects 40 years from now. My predictions: scanning tunneling microscopy, lots of genomics / molecular biology, "big data", some current cryptography, a lot of AI.
And to reply to the parent comment - designing the calculator in the first place is much more amazing.
If you go back through their blog posts, they have some cool techniques for bypassing "security" metal meshes to read out the secure FLASH/ROM sections of chips.
IIRC, one of the guys behind FlyLogic was involved in the high-profile reverse engineering of satellite receivers (i.e. satellite piracy).
Unfortunately, as calculator prices collapsed, so did Sinclair Radionics' profits,
and the company was broken up in 1979 after heavy losses.
He fought Moore's law, and the law won.
But this whole thing reminds me of Woz's work in the first Apples. Why wasn't his genius work similarly wiped out? Soon after the Apple, there were dozens - hundreds - of new personal computer manufacturers.
I think it's software. The value of a platform is what you can do with it. Software increases what you can do, therefore increases the value of the platform. There's increasing returns, so once it gets started, it gets harder and harder to stop.
Branding is also important (Jobs), which is why the Apple eventually fell to the "PC" - because the strongest computer brand in the world for decades was IBM.
One reason for Sinclair Radionics' falling apart that goes unmentioned was their digital watch, the Black Watch, which was the ultimate example of Sinclair's infamous cost-cutting and lack of quality control. http://www.nvg.ntnu.no/sinclair/other/blackwatch.htm
At the time of the C5 I worked for one of the then government-owned utility companies, which had shops selling the C5. I was able to get a staff discount, but was told of the 2-seater version with a roof that was planned for release, and saw pictures and specs etc., so I waited.
Great idea, ahead of its time in engineering and attitude alike. It didn't even have rare-earth magnets; I've seen C5s fitted with rare-earth-magnet motors in place of the stock motor and they are dangerously fast. Today we are slowly getting there, though cars are still designed as if everybody had a 4-5 member family and a boot full of shopping all the time, with a motorbike at the other end of the spectrum. Electric car/bike combinations do appear, albeit not mainstream, and most people are put off by the danger of getting hit by others, since you are smaller on the road. Which makes you think about the limitations of a vehicle surrounded by taller and larger ones, and why we now start to believe in bike lanes.
That was a factor back then, and why the C5 never took off enough for the C10 and C15 to go into production beyond the prototypes, sadly. C5s started coming with a flag on a pole, akin to a football/soccer corner post.
30 years later, and the electric car market is still minuscule.
There might start to be a market demand for them when petrol is at least twice as expensive as it is now. But there's some wiggle room to reduce the tax on it a lot.
Fair point, given that in the RC model market users end up going for petrol-based models over electric ones, based on run time per charge, charge time, and the convenience of charging. Maybe when that area changes things will progress, but that needs a big leap in battery technology.
I still feel, though, that the whole electric vehicle area needs more cheap personal options; sadly, even the electric bicycle options out there are not exactly cheap, nor are they flooding the market.
When gas/petrol goes to 2x what it is now, it'll be on a journey that very quickly shoots to 3x, 4x and beyond. This will cause huge economic shocks, and there won't be time then to start building electric cars.
You could be right. For most European cities this wouldn't have a terrible effect, as everything is close together and we have things within walking distance.
I know this is a predictable remark, but I miss the good old days. The Apple II was so simple and understandable that I was able to write a program that played two musical notes at once (I called it the "Electric Duet"), in tune and with a fairly tolerable sound, even though the speaker was driven by a TTL output (meaning no analog level control).
If the Apple II had had interrupts, my program would have stopped working. I know this because an Apple II successor supported a mouse, the mouse required an interrupt, and it ruined the performance of my program.
Oh, I almost forgot -- Tom Clancy wrote "The Hunt for Red October" on Apple Writer.
That program inspired me to (cough) take it apart and understand how something so cool could come out of the Apple II's speaker. That exercise taught me 6502 opcodes and I've been working in embedded systems ever since.
Looking back, I realize I must have spent two weeks just writing the Electric Duet player, because the time through the player loop had to be constant. The loop read through the notes to be played, synthesized them, and changed their duty cycle, but had to use the exact same number of machine cycles regardless of what it did, so the music stayed in tune.
Near the end of the project, I had two problems: the music was slightly off-pitch -- middle A wasn't exactly 440 Hz -- and the number of machine cycles wasn't quite constant (it varied between 40 and 41 cycles depending on which branches the code took).
Finally I realized I could insert a single NOP (no-operation) instruction at a critical location. The NOP forced the number of cycles to be a constant, regardless of the path through the loop, and I then realized that it also slightly changed the output frequency to be tuned exactly right.
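A quick back-of-the-envelope check of why one cycle is audible (a Python sketch; the ~1.023 MHz clock figure is standard for the Apple II, but the toggle-per-loop structure and iteration count are my assumptions, not details of the actual player):

    # Rough arithmetic on why a single cycle shifts the pitch. Assumptions
    # (mine, not from the original program): the standard ~1.023 MHz
    # Apple II clock, a speaker toggle once per half period, and an
    # inner loop that runs a fixed number of iterations per toggle.

    CLOCK_HZ = 1_022_727  # approximate Apple II 6502 clock rate

    def tone_hz(cycles_per_loop: int, loops_per_toggle: int) -> float:
        """Output frequency if the speaker toggles every half period."""
        half_period_s = cycles_per_loop * loops_per_toggle / CLOCK_HZ
        return 1.0 / (2.0 * half_period_s)

    # A 41-cycle loop is 40/41 ~ 2.4% flatter than a 40-cycle loop,
    # about 43 cents -- audibly out of tune, as the story describes.
    print(tone_hz(40, 29))  # ~440.8 Hz
    print(tone_hz(41, 29))  # ~430.1 Hz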
Horribly off topic, but what is the current status of the Electric Duet player routine? I read somewhere that you had placed it in the public domain in the 90s, but I can't find any confirmation of that.
I didn't really "place" it in the public domain, I just let nature take its course. The inevitable result is that there are now only versions of the program online in which my name and copyright notice have been removed.
I searched for the original program online, but couldn't find it -- only the pirated version. By "pirated" I mean someone has removed my copyright notice and offered the program as their own, successfully replacing the original in all online archives.
Yeah, the Apple II era seems to have been particularly destructive with removing copyright notices in favor of "crack screens" and other drivel. Bleah.
More recently, I've been working on my own Apple II game (really). I'd like to release it fully open source, which makes it particularly challenging to use third-party libraries since open source wasn't really a "thing" when they were written. Some of them (notably old Beagle Bros. disks) have since been explicitly placed in the public domain.
So... I would very much like to have some Electric-Duet-based music and sound effects in my upcoming open source Apple II game. (Yes, I just typed that.) Would you be willing to relicense your amazing player routine under an open source license, or to place it in the public domain? I would, of course, give you full copyright attribution in the scrolling credits screen or wherever you'd like.
The original was copy-protected, I believe, which means it cannot be archived in the standard .dsk format (there are some more complicated formats that represent a disk at a lower level, but they are not very well supported by emulators). In a few decades, the only surviving copies of many 80s software programs will be the ones that were cracked and distributed widely. In this way, crackers and pirates acted as inadvertent archivists.
For what it's worth, the pirated version I had (back in the old days) had your name on it (plus a crack group's name and probably BBS phone number). I remember thinking "Hey, that's the Apple Writer guy, what else did he write?" and that's how I found GraFORTH. Bought Leo Brodie's book and had a blast learning the language, then actually implementing small Forth-like languages for the fun of it. There was a Byte book, "Threaded Interpretive Languages" that gave me several epiphanies about programming in general and language design in particular.
I used Apple Writer for years, until eventually my typing speed increased so much that the program couldn't keep up with me when typing long paragraphs (the word-wrap algorithm, which ran after each character, apparently took time proportional to the length of the current paragraph, with a big enough constant that my 1 MHz Apple II started dropping keystrokes regularly despite the program's 32-character buffer). I eventually resorted to disabling word wrap while typing and re-enabling it prior to printing (Control-Z, I think it was). Later, I bought a Zip Chip that increased the computer speed to about 3.5 MHz, and the problem disappeared.
I wasn't using Apple Writer by then, because I had discovered
another word processor, a weird, Rube Goldberg contraption called Gutenberg Sr. that used double-high-resolution graphics at a time when few programs did, had a more powerful markup language (troff-inspired, I found out later), could do two-column printing and had some page layout capabilities, and had great support for printer-downloadable fonts, including user-defined ones. The interface was atrocious but the software itself was powerful.
I've had similar situations, but never as victorious as that. That's awesome. Thanks for sharing that story....(now I want to get that code back somehow and look for that NOP)
I think I said this before, but thanks for writing GraForth and TransForth. Besides limiting the Dijkstra damage from my exposure to Basic, it probably inspired the processor I designed in college - it was a stack machine and its assembly language was very Forth-like.
A good question. I'd say that the Apple II family survived longer than it deserved to because the hardware was so nice. Woz combined classical hacker insights with a look-and-feel quality ethic that the Commodores and Ataris couldn't live up to, being designed to target a price point beyond all else. Sure, those computers could do tricks with graphics and sound that the Apple couldn't match -- but if you wanted sharp, readable 80-column text for word processing or VisiCalc work, your only options were an Apple //e at $1500 or a similarly-equipped Tandy or IBM PC for well over $4000.
Apple's huge array of (relatively) low-cost peripherals also helped it compete against other platforms in the early days. These aspects weren't sexy but they were important to lots of home and small-business users.
Famously, the company was also smart enough to target K-12 education, which cultivated their marketplace from the ground up. Once again, the impression of quality versus the low-cost 8-bit competitors would have helped them do that.
I agree Apple did well by getting entrenched in the education market in the US.
Combined with Tandy's failure to follow up, and Commodore's incessant bungling in the US market (such as repeatedly pissing off their dealer network by letting their dealers absorb unannounced price changes) and decision to send most of its stock to Europe where they got higher margins, that explains more of it than lack of alternatives.
In Europe, compared to Commodore, Atari and the homegrown brands, Apple remained virtually unheard of until the Mac.
Largely thanks to its European market, Commodore still massively outsold Apple in units shipped until well after the Mac got popular - it's just that by the time the IIe started getting traction, Commodore had abandoned the high end and gone for volume, and by the time they went after the high end again, it was 1985, and Commodore's brand and sales channel in the US were even more tied to the cheap gaming image.
But if you wanted sharp, readable 80-column text, you had several alternatives at the time of the IIe launch, in around the same price range, from Commodore and others (from Commodore, the later entrants in the PET range - one is actually on display in my local library - as well as the ill-fated Commodore B128-80).
Most of the other entrants that could compete for productivity applications in the same price range were from small unknown companies or insufficiently compatible with anything to get enough software, though, and so had little chance.
Apple seems to be still around. You can google it to check. Apple's problem with the PC was initially simply that IBM sold the PC. For many years PCs were comparably expensive or more so, and much less useful than Macs, but they sold because they were backed by IBM. And Apple did very well despite its lower market share until John Sculley started listening to pundits and lowered prices in the early 90s. Even so, that worked OK until Spindler listened to more pundits and started allowing clones.
Sinclair's branding was huge in the UK, but you can only sell so many calculators and the leapfrog in sophistication in those days was breathtaking. Technologies plummeted in price within months.
But Sinclair went on to kick-start the UK microcomputer industry. The ZX range (aka the TS range in the States) was one of the most popular and influential home computer lines of the 8-bit era. I think only the C64 beat it worldwide. It was also one of the most cloned systems, as Brazil, Russia and many other Eastern European countries cloned it and extended the original design.
Apple was niche and mostly unknown in the UK at the time. The Apple II never really made it here.
> But Sinclair went on to kick start the UK Microcomputer industry.
Not even remotely true. There was a large homegrown industry (I worked in Cambridge at the time). Sinclair was a well-known early player, but many others grew up at the same time and independently. For a time it was known as the "Cambridge Phenomenon". What did spring from Sinclair was Acorn. If anything kick-started the UK microcomputer industry, it was the Cambridge University Computer Lab, where Sinclair and others got much of their talent.
Not as a home computer, as it was expensive. It did have a good run in the business market, but in the UK the PET was probably the no. 1 business PC.
I can still remember humping various PETs in and out of Olympia for trade shows - and I still recall the various tricks you used to make sure that disks copied correctly.
> Branding is also important (Jobs), which is why the Apple eventually fell to the "PC" - because the strongest computer brand in the world for decades was IBM.
IBM's deeply entrenched position supplying office equipment (pre-PC computers, sure, but lots besides) to business played a role in securing the PC's dominance over Apple in the personal computing market, but so did the fact that IBM didn't protect the PC from clones (IIRC, this was not intended) or acquire exclusive rights to DOS, which resulted in both more variety and more low-price options in the "IBM PC-compatible" family.
Good point about how calculators differed from computers. Branding surely helped IBM establish the PC, but against Apple what mattered was Apple's years-late entry into the 16-bit world, with a relatively closed system and high margins. Apple had a shot at 16-bit dominance but missed. (They came out with the Apple III and Lisa first.)
While clever and inexpensive, the low speed and low accuracy of this device made it unusable. It was billed as "3-figure accuracy", but in fact it only achieved that in some cases. A bright undergrad or grad student could quickly uncover useful problems that it flat out couldn't solve.
In short, it was a toy. Anyone basing one's academic grade on this thing was a fool. You really did have to spend the money for an HP-35, or the later Texas Instruments SR-5x calculators that were less expensive.
The bit that is probably most striking to modern eyes is the data representation. With 320 instructions there's simply no room for the "obvious" code to translate to and from a display representation. So everything was stored in BCD and operated on one (decimal!) digit at a time using a 4-bit ALU.
You're right that BCD is very common for calculators. BCD was also commonly used in microcomputers, since you save all the binary-to-ASCII code for I/O. This is why x86 has a bunch of BCD instructions like AAA (ASCII adjust after addition), which was important enough to be a single-byte opcode.
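To make the digit-at-a-time idea concrete, here's a minimal sketch of digit-serial BCD addition in Python (the list-of-digits representation is mine for illustration, not any particular chip's register layout):

    # Minimal sketch of digit-serial BCD addition, the way a 4-bit ALU
    # works through a number one decimal digit at a time. Digits are
    # stored least significant first.

    def bcd_add(a: list[int], b: list[int]) -> list[int]:
        """Add two equal-length BCD numbers stored as digit lists, LSD first."""
        result, carry = [], 0
        for da, db in zip(a, b):
            s = da + db + carry                      # binary add of one digit pair
            carry = 1 if s >= 10 else 0
            result.append(s - 10 if s >= 10 else s)  # the "decimal adjust" step
        result.append(carry)
        return result

    # 275 + 138 = 413, computed a digit at a time:
    print(bcd_add([5, 7, 2], [8, 3, 1]))  # [3, 1, 4, 0], i.e. 0413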
The 6502 is notable for its highly-efficient and patented (https://www.google.com/patents/US3991307) decimal arithmetic mode. I'll write up its interesting circuits sometime. One consequence of the patent is that the processor in the NES video game console is a 6502 clone that lacks decimal mode.
I had thought the 2A03 didn't have BCD to save silicon -- but apparently they just crudely disabled it by removing 5 transistors[1]! Would Ricoh's second-source 6502 license have been more expensive if they included BCD?
"You might be surprised to learn that the calculator chip cannot perform multiplication natively. There's no floating point unit to multiply two numbers."
Nowadays the silicon real-estate cost for floating point math is trivial, and chip area is filled out with RAM cache for lack of anything better to do ...
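For anyone curious what "no native multiply" looks like in practice, here's a sketch of multiplication using only addition and digit shifts, roughly the long-multiplication approach such chips used (plain Python integers stand in for the BCD registers; this is an illustration of the technique, not the actual microcode):

    # Multiplication by repeated addition and digit shifts.

    def multiply(a: int, b: int) -> int:
        product = 0
        addend = a                   # a shifted copy of the multiplicand
        while b > 0:
            for _ in range(b % 10):  # repeated addition, 0-9 times per digit
                product += addend
            b //= 10                 # shift the multiplier right one digit
            addend *= 10             # shift the addend left one digit (a BCD shift)
        return product

    print(multiply(123, 456))  # 56088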
Nicely done, especially compared to the fits that HP went through trying to figure out how they could prove or disprove that all 11 digits of their calculation were correct.
40 years ago these calculators were looked upon with a bit of skepticism. Engineers that were used to seeing the log tables with their own eyes and hand-manipulating slide rules were being asked to trust the results coming out of these calculators.
And lives depended on it, really. If you were a civil engineer designing a bridge and you needed to be absolutely sure the numbers you were using were accurate to 7 places and totally correct, would you suddenly put all your faith in this small brown box with no way to examine the inner workings?
HP went through a lot to build that trust and it was rightly earned. HP's reputation for building solid accurate calculators kept that business going for decades to come.
Trusting the numbers coming out of a calculator was ALWAYS a bad idea, especially if lives depended on it. It wasn't just the calculator making a mistake - you could have made a mistake keying in the numbers.
This is dealt with by:
1. running the results through the inverse equations to verify that you get the inputs back again
2. calculating the results using an independent method to verify them
3. having a different group of people independently check your numbers
4. having your results pass a "reasonableness" test, i.e. do they make sense
5. putting the resulting design on a test rig and verifying the numbers experimentally
How do I know this? I worked on critical flight control systems for Boeing.
Any engineer who just punches numbers into a calculator and bets lives on the results ought to be fired.
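To illustrate check 1 above, a toy round-trip verification in Python (the function, its inverse, and the tolerance are all invented for the example):

    # Toy illustration of check 1: run the result back through the
    # inverse equation and confirm the input comes back.
    import math

    def forward(x: float) -> float:
        return math.sqrt(x)          # the calculation under test

    def inverse(y: float) -> float:
        return y * y                 # its inverse

    for x in [2.0, 144.0, 1e6]:
        assert math.isclose(inverse(forward(x)), x, rel_tol=1e-9), x
    print("round-trip check passed")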
These days, we can also have our calculators keep track of the units associated with a quantity, which helps greatly with sanity-checking. If the units come out wrong or mismatched anywhere in the calculation, you'll know immediately.
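A minimal sketch of that kind of unit tracking in Python (the exponent-vector representation is my own illustration, not how any shipping calculator implements it):

    # Carry a vector of base-unit exponents (metre, kilogram, second)
    # with each value, and refuse to add mismatched units.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Quantity:
        value: float
        units: tuple  # exponents of (m, kg, s)

        def __add__(self, other):
            if self.units != other.units:
                raise ValueError(f"unit mismatch: {self.units} vs {other.units}")
            return Quantity(self.value + other.value, self.units)

        def __truediv__(self, other):
            units = tuple(a - b for a, b in zip(self.units, other.units))
            return Quantity(self.value / other.value, units)

    distance = Quantity(100.0, (1, 0, 0))  # 100 m
    time = Quantity(9.58, (0, 0, 1))       # 9.58 s
    print(distance / time)                 # units (1, 0, -1), i.e. m/s
    # distance + time  -> raises ValueError: the mistake surfaces immediately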
It's nice to have extra checks, but that changes nothing about what I said. If you are making designs that lives depend on, it is not HP's responsibility, it is YOURS and you should be using the techniques I outlined.
I'm sorry to be so blunt about this, but I feel strongly about it. I too often run into engineers that either argue that they can design a perfect system that is not subject to human error, or they try to shift responsibility onto other people or organizations.
BTW, I think every engineer should watch the series on the Smithsonian channel "Air Disasters". Each episode chronicles a particular disaster, and then painstakingly goes through what went wrong and how the problems were fixed. One can learn a lot about systems design from these case studies on how and why things go wrong.
In your opinion, if doing step 5 is cheap enough, do you still need 1-4? (That is, is it "okay"--in a pragmatic sense--to have sloppy thinking, if you can just throw numbers at the wall to see if they stick?)
A very old school surveying teacher (head of department as it happens) wasn't entirely convinced about calculators in the late eighties. I showed him how to get an extra decimal place of seconds of arc by subtracting the whole number of degrees.
The real sticking point was getting a real arctan2 function (returning the angle in the range [0, 360) degrees). He really liked some BBC Basic programs I hacked up. Fun times.
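For reference, the wrap-around is a one-liner today; this Python version is my reconstruction of the idea, not the original BBC Basic:

    # arctan2 mapped onto the [0, 360) range.
    import math

    def atan2_deg(y: float, x: float) -> float:
        """Angle of (x, y) anticlockwise from the +x axis, in [0, 360) degrees."""
        return math.degrees(math.atan2(y, x)) % 360.0

    print(atan2_deg(1, 1))    # 45.0
    print(atan2_deg(-1, -1))  # 225.0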
I owned Sinclair's programmable scientific calculator from a few years later, which was similarly "interesting". It had 36 program steps (vs. 72 for the HP-65, IIRC), no storage system, and only one memory. Each constant in a program counted as N+1 steps, where N was the number of button presses needed. So it taught me about multiply re-entrant code.
I also owned a Sinclair ZX-80 (aka Timex) computer, later upgraded to a ZX-81 ROM with 16kB of memory.
Overall, Sinclair's problem seems to have been going a little too far in the "worse is better" direction with every product.
If you need to explain this to less geeky friends, try
telling them that this amount of storage is less than a single letter on a modern display: a single 12x12 character at 32 bits per pixel takes 12 x 12 x 4 = 576 bytes, while the calculator's entire ROM of 320 11-bit words is only 320 x 11 / 8 = 440 bytes.
This makes chess on a 1K ZX81, including the display, seem like bloatware now :).
Nowadays we have more storage in keyboard controller chips; heck, the older ones in the '90s had 4 KB of storage, so almost 10x more ROM alone to work with - for a keyboard.
Given that Sprite managed to get Linux booting on a hard disk's processor [0], I'm increasingly aware that the embedded devices we use are overpowered. Makes for some interesting hacking though; I bet you could run a webserver on my washing machine.
So is 'reversing' a common lazy shorthand for 'reverse engineering'? The title confused me until I realized what it was about. Feeling like an old fart...
I really wish someone would make a modern scientific calculator - imagine what would be possible on a modern ARM processor. Even the new HP calculators use old ~70 MHz ARM processors emulating the even older Saturn HP-48 code...
You could build it on top of Android, and have the software be open-source while making money from selling the hardware (so you could run it on a touchscreen, but if you wanted a keyboard you'd buy the calculator).
Processing power hasn't been a bottleneck for a long time. Even the 4MHz Saturn processors were able to do reasonably fast 3d wireframe graphing. The only reason graphing calculators and scientific calculators still exist outside of schools is because they can provide a specialized user interface: a physical keyboard with a layout designed around calculation tasks, and a software environment that is similarly optimized for that narrow range of use cases. Neither of those aspects benefits significantly from Moore's Law. Instead, they can only benefit from the complementary products that other gadgets have made commodity components: high-resolution displays, lithium-ion batteries, SoCs that replace ASICs and make features like USB and SD support free.
There are modern calculators: Mathematica, MATLAB, and the most used of all, Excel.
For tasks that need to be done away from a computer, sure, a calculator is handy, but that has rather dwindled down to basic calculations. For schoolwork, sure, calculators are great, but I never needed more than my HP-48, and when I did, it was usually an assignment meant more for MATLAB/Mathematica.
Now, for your point about building it on top of Android: what I want is not another device but a new calculator built for Android. Everyone has a smartphone these days; why should I have to buy a calculator when I have a powerful computer in my pocket? We just need an interface for smartphones and an appropriately powerful (programmable) back end.
What's more interesting than how the calculator works is how Sinclair was able to write the code for such a chip, which the article doesn't attempt to guess at. I wonder if he used some kind of bootstrap on paper, looking at the algorithms.
I don't know about the Sinclair development specifically, but for the HP-35, they wrote a register-level simulator in software that let them develop the algorithms and single-step through for debugging. My favorite quote from the article: "Correcting a problem was a simple matter of changing a punched card or two."
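In that spirit, here's a toy register-level simulator in Python that you can single-step through; the two-instruction "instruction set" is entirely invented and far simpler than the HP-35's:

    # Digit registers plus a trace of every step, in the style described.

    def run(program, trace=True):
        regs = {"A": [0] * 5, "B": [0] * 5}  # 5-digit registers, LSD first
        for pc, (op, *args) in enumerate(program):
            if op == "load":                 # load a digit list into a register
                regs[args[0]] = list(args[1])
            elif op == "add":                # A <- A + B, digit-serial with carry
                carry = 0
                for i in range(5):
                    s = regs["A"][i] + regs["B"][i] + carry
                    regs["A"][i], carry = s % 10, s // 10
            if trace:                        # the single-step view
                print(f"{pc}: {op:<5} A={regs['A']} B={regs['B']}")
        return regs

    run([("load", "A", [5, 7, 2, 0, 0]),     # 275
         ("load", "B", [8, 3, 1, 0, 0]),     # 138
         ("add",)])                          # A ends up holding 413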
Rather Ian with harder/stronger I than John.
Well, Polish is not that hard.
It's only the second-hardest language in the world (Chinese, I'm looking at you).
If it's hard for you, well, try to spell these correctly:
"Zażółć żółtą jaźń" or "Chrząszcz brzmi w trzcinie"
Sorry for OT.
It all looks really amazing. I'd love to see more on this topic.
After I got used to RPN (I had a lovely old HP-21) I found it much more intuitive. Stack-based thinking. No parentheses! (And I've used LISP for years, but that's entirely different.)
RPN vs Infix/Algebraic used to be a holy-war topic. Like emacs vs vi, or iPhone vs Android.
The one issue I've had with RPN is that you require a separator between numbers anyway - in RPN it's too easy to parse "23 4 +" as "2 34 +", for example, especially if you're writing quickly.
On HP calculators the separator is usually the Enter key. I think the old ones displayed just the top entry of the stack; on the newer ones with bigger displays you see the stack with each entry on its own line.
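That separator is essentially all the parsing RPN needs, which a minimal evaluator makes clear (a Python sketch; whitespace stands in for the Enter key):

    # A minimal RPN evaluator: once the separator is entered,
    # "23 4 +" and "2 34 +" can never be confused.
    import operator

    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def rpn(expr: str) -> float:
        stack = []
        for token in expr.split():       # the separator does the parsing
            if token in OPS:
                b, a = stack.pop(), stack.pop()
                stack.append(OPS[token](a, b))
            else:
                stack.append(float(token))
        return stack.pop()

    print(rpn("23 4 +"))     # 27.0
    print(rpn("2 34 +"))     # 36.0
    print(rpn("3 4 + 2 *"))  # 14.0 -- no parentheses needed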
The joke has been going on for quite some time now - Lisp and related languages use Polish Notation everywhere.
(The main benefit is that (R)PN is parenthesis-free, so it makes it easier for a calculator to process expressions with multiple operators. In programming languages the benefit is nonexistent, except to make clear that +, *, etc. are just like any other function call, where you would naturally use Polish notation.)
Forth is arguably the best language in terms of expressive power/language complexity, though that's largely due to the denominator.
It's a great choice for tightly constrained devices, though far fewer devices have such constraints now. The mental overhead of having to track stack-effects for each function makes it difficult to scale to large systems and maintain productivity, but if you're trying to eke every iota of power out of a chip with a tiny amount of ROM, it's hard to beat a direct-threaded (or token-threaded...) Forth.
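To make "threaded" concrete, here's a toy token-threaded inner interpreter in Python: colon definitions are just lists of tokens, and the inner loop walks them (recursion stands in for a real Forth's return stack; this is an illustration, not real Forth):

    # Words are either Python callables (primitives) or lists of
    # tokens ("threads") that the inner loop descends into.

    stack = []

    words = {
        "+":      lambda: stack.append(stack.pop() + stack.pop()),
        "dup":    lambda: stack.append(stack[-1]),
        ".":      lambda: print(stack.pop()),
        "double": ["dup", "+"],   # a colon definition is just a token list
    }

    def execute(thread):
        for token in thread:
            word = words[token]
            if callable(word):    # primitive: run it directly
                word()
            else:                 # colon word: descend into its thread
                execute(word)

    stack.append(21)
    execute(["double", "."])      # prints 42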
I don't know about "less ambiguous"; a grammar can still require only single-token lookahead and be context-free without any parentheses. But Lisp-ish languages are easier to parse because you have fewer productions to worry about than in more complex languages.