and people still reach for their phones to use the calculator even when they're already sitting in front of a mind-bogglingly quick calculating machine. Or even get up to go find a desk calculator!
Yep. For any calculation of up to a dozen or so numbers it's far easier, quicker and more pleasant to use my HP than the computer, and even more so when in a meeting or on the phone. UI matters, physical interfaces matter.
I used to use a TI-83+, but now I've gotten into the habit of opening a Python or Node.js REPL for math. That way, I can copy-paste values to and from emails or chat. It also helps that I already know the syntax and math libs.
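To give a rough idea of the kind of throwaway session I mean (the numbers and variable names here are made up, not taken from any real email):

    # quick one-off arithmetic in a Python REPL
    import math

    readings = [12.4, 15.9, 9.7]   # values pasted in from a chat message
    mean = sum(readings) / len(readings)
    rms = math.sqrt(sum(r * r for r in readings) / len(readings))

    print(f"mean={mean:.2f}  rms={rms:.2f}")

The nice part is that the result can be copied straight back into the email or chat reply.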
I tried bc[1] as well, but it has some really annoying defaults. For example, while it's capable of arbitrary precision, it calculates out to zero decimal places unless you tell it otherwise (so 3/2 prints 1 until you set scale).
I've tried that, but for me a REPL for calculating occupies an annoying middle ground: not as efficient as a calculator for quick calculations, and not as powerful as, say, MathCAD for exploratory work. IPython is a step in the right direction.
Pilots lament their passing in the cockpit. Plenty of readings and buttons that used to be available instantly are now reachable only by digging through menus that multiplex a single display or button across many readings or actuators.
Actually, microprocessors and microcontrollers were pretty obvious ideas at the time; engineers knew they were coming, and they didn't even deserve a patent.
It's probably just that, at that point, the technology and design tools had matured enough to handle that complexity, and the potential market was ready.
The start of a new era. I was just an infant when it came out. By the time I got into computers, the 6502 and 8080 were out; by the time I was a teenager, the 8086/8088 had come out, but all my family could afford was a Commodore 64.
I learned the 8088/8086 instruction set in college when I learned assembly language. I was too young for the 4004 chip. But it changed everything.
http://www.vintagecalculators.com/html/busicom_141-pf_and_in...