
Not new at all. Remember that Commodore disk drives had their own 6502s in them, for example (some of the big PET drives had two). A joke at CBM was that the 1541 was the best computer they ever made.



Ironically, in the early 80s, we did it mainly because of the shortage of computing power, so we needed lots of co-processors. And now we are still doing it, but mainly due to our excessive computing power. Want to blink an LED? Let's put Linux, Apache and Ethernet in the circuit. ;-)

Well, on the other hand, it's unfair to call it an abuse of computing power: if you do have the power, making devices autonomous makes a system much more elegant and manageable, to say nothing of the advantages of a general-purpose processor over specialized chips.

Take another look: it also tells us something about early home computer designs. The actual CPUs were usually the easy part - there was no technical difficulty in the CPUs themselves, and you could easily put TWO of them in a disk drive. Unlike in modern computers, the CPU was not even the most expensive part, and the choice was limited: a cheap "trainer" computer with only a hexadecimal keypad and an expensive home computer with a color CRT often shared the same CPU. The vast majority of the cost and trouble was system design - turning those parts into a usable computer, with your own solutions for input, storage, interfaces, graphics, software, and even the chassis. The standardized IBM PC hadn't arrived yet, and one needed a solution for every subsystem. Every computer was a unique design, often with custom ASIC chips for the peripherals.


> Ironically, in the early 80s, we did it mainly because of the shortage of computing power, so we needed lots of co-processors.

I'm not sure I entirely believe that explanation.

Back in 1982, the Commodore 1541 [1] was basically another C=64 with no keyboard and a built-in floppy drive. It was big and heavy and hot and not terribly reliable.

Four years earlier, Apple's Disk II [2] was tiny, light, and had only 5 ICs [3] -- it was described as "a state machine made of a prom and a latch" [4]. The computer's CPU drove it -- it's not like the 6502 would otherwise be running other threads or processes during I/O!

[1]: https://en.wikipedia.org/wiki/Commodore_1541#/media/File:Com...

[2]: https://www.youtube.com/watch?v=ESDANSNqdVk&t=28m30s

[3]: https://www.folklore.org/StoryView.py?project=Macintosh&stor...

[4]: https://www.folklore.org/StoryView.py?project=Macintosh&stor...


Apple didn't run a microchip fab manufacturing 80% of the chips inside the 1541, hence no incentive to sell you overpriced garbage (like the C128D with 3 CPUs, two of them doing nothing all the time).


Yet Commodore's "overpriced garbage" was sold at retail prices consistently lower than Apple's, and had graphics and sound capabilities that far exceeded the comparable Apple machines.


Um, the C64 sold 25 million units to the Apple II's 6 million. The VIC 20 was the first computer to sell a million units.

The problem was that the C64 never really sold to businesses. If you were buying Visicalc, you bought an Apple II.


Also, Apple made sure they got into schools and managed to push out the PETs with Apple IIs. Kind of too bad, because the Commodore (C64, post PET) and Atari computers of the day were cheaper and (arguably) better in many ways.


I have such fond memories of my kindergarten and first grade computer lab being full of PETs! The Apple IIs started coming in a year later, and I loved those too.


To be fair, something like an ESP32 runs circles around the PCW1512 we had at the school computer club, where we spent countless hours playing Defender of the Crown.

So we don't necessarily have to constrain ourselves to Assembly and C on those tiny devices.

Naturally it depends on the use case.


> So we don't necessarily have to constrain ourselves to Assembly and C on those tiny devices.

Good point.

> something like an ESP32 runs circles around the PCW1512 we had at the school computer club

I think as long as people are having fun with them, the hardware already served a good purpose. Even if the LED blinking is implemented by running Linux.

But in my opinion, people who are interested in these popular embedded boards should also learn the basics of electronics, or at least be aware of them, and understand what one can do without buying a premade "board" - for instance, that it's possible to blink an LED with a few transistors. Well, I think most will find out by themselves; it's just a matter of time and guidance.


My problem with using things like the RPi for such simple LED blinking projects is that it contributes to the cult of overengineering that plagues the industry. People seem to grow up thinking they need innumerable layers of abstraction to do things, when often they have actually made their solution not only inefficient, but less stable and more difficult to develop.


+1. Another criticism is consumerism - the hype around some of these devices was more about purchasing boards, plugging them in, and calling a library, without actually helping hobbyists learn something.

Or, as John Carmack has said, low-level programming is good for the programmer's soul - it's hard to imagine that programmers have souls if they have never been exposed to low-level programming. Unfortunately, in the modern computing world, the chances are becoming fewer and fewer; most systems are far beyond complete understanding for most people. I think I'll never completely understand the x86_64 instruction set (w/ SSE, SSE2, FMA, AVX), or the signal path of the PCI-E interface on my motherboard.

Recently, I think some popular microcontroller projects may be the solution for a bare-metal programming experience.

But we can also look at it from another perspective - the proliferation of mobile devices changed everything. Nowadays, even using a general-purpose computer is not an experience that many members of the new generation have (there was a Hacker News article about it). Some don't understand a hierarchical filesystem of files and directories anymore, and others have trouble typing on a physical keyboard. In that sense, blinking an LED on a Raspberry Pi is already a big step forward, and that was exactly the motivation behind the Raspberry Pi.


Yes, it is unfortunate that in the pursuit of both compatibility and performance we now live in a world where even the CPU is so maddeningly complex that it is basically impossible to reason about how it will perform on any given piece of code. Even writing assembly leaves you a few layers of abstraction above where it would have 30 years ago.

I like to think that it is possible we'll one day develop a CPU architecture that is both simple enough to reason about and also highly performant.


Yes, and that's also my reason for mentioning microcontroller projects. For example, some find the AVR instruction set clear and powerful, optimized for both assembly programming and C compilers, and it can serve as a good introduction to both hardware and assembly.
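For illustration, here's how little a bare-metal blink takes in C - a minimal sketch assuming an ATmega328P at 1 MHz with an LED on PB0, built with avr-gcc and avr-libc (chip, clock, and pin are all my assumptions, nothing from the thread):

  /* Bare-metal AVR LED blink; assumes an ATmega328P at 1 MHz with an
     LED wired to PB0. Build with avr-gcc and avr-libc. */
  #define F_CPU 1000000UL        /* must be defined before util/delay.h */
  #include <avr/io.h>
  #include <util/delay.h>

  int main(void) {
      DDRB |= _BV(DDB0);         /* make PB0 an output */
      for (;;) {
          PORTB ^= _BV(PORTB0);  /* toggle the LED */
          _delay_ms(500);
      }
  }

The whole program manipulates two I/O registers directly - no OS, no runtime - and the disassembly maps almost one-to-one onto what the core executes.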


> I like to think that it is possible we'll one day develop a CPU architecture that is both simple enough to reason about and also highly performant.

RISC-V is getting there, with open-hardware cores such as Rocket (in-order) and BOOM (out-of-order). Too bad that many and perhaps most of the peripheral components even on a general-purpose SiFive SoC are still closed hardware blocks. But people are working on opening these up as well.


I don’t see a problem with this because this only happens in individual projects. Who cares if some teenager wants to control her reading lamp with a RPi instead of an ESP8266? Or if some dad wants to make a noise detection circuit for his nursery using an old laptop instead of an Arduino? What’s likely to happen is that they will both learn a lot of new things, including about the inefficiency of this. Just because you know better doesn’t mean it is bad that they did this. Besides, when you go for controlling your lamp with an ESP32, someone can always point to a simpler and cheaper processor to use instead. Then we can run into circuits designed without any software at all. Eventually you’ll be left with a MOSFET and a proximity sensor and your inner purist will be happy.

I say the point is that you should try to do trivial things with complex control systems and vice versa. You’ll do it better and more efficiently the second time. And the third. And if you decide to scale it, you will quickly learn the cheapest way to blink 10,000 LEDs.


Last year I saw a project that used two RPis for a fairly simple industrial automation task. After I pointed out that the whole thing was huge overkill, the original author did some optimizations and a redesign, which resulted in a design that still had two RPis - one of them used as a one-bit 3.3V logic inverter and nothing else.

Needless to say, the whole thing was replaced by a single ~$20 Chinese FX1N clone (which also neatly solved how to drive 24V industrial loads from an RPi/Arduino/ESP...).
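For perspective, the entire job of that second RPi - a one-bit inverter - comes down to something like this sketch in C against the legacy Linux sysfs GPIO interface (GPIO 17 in, GPIO 27 out are made-up pin numbers, and both pins are assumed already exported and configured):

  /* One-bit logic inverter on a Raspberry Pi via the legacy sysfs GPIO
     interface. Assumes GPIO 17 (input) and GPIO 27 (output) were set up
     beforehand, e.g.:
       echo 17 > /sys/class/gpio/export
       echo 27 > /sys/class/gpio/export
       echo out > /sys/class/gpio/gpio27/direction */
  #include <stdio.h>

  int main(void) {
      for (;;) {
          FILE *in  = fopen("/sys/class/gpio/gpio17/value", "r");
          FILE *out = fopen("/sys/class/gpio/gpio27/value", "w");
          if (!in || !out) return 1;
          fputc(fgetc(in) == '0' ? '1' : '0', out);  /* invert the bit */
          fclose(in);
          fclose(out);
      }
  }

A busy loop re-opening sysfs files is about as inelegant as it sounds, which rather underlines the original point.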


Could probably have been solved with an ATtiny and a breadboard. Even cheaper and even more lo-tech.


For this kind of problem you don't want a breadboard. But on the other hand, a board with an ATtiny, a 12-bit PIC, or the smallest MSP430, plus a few relays and optocouplers, would solve the problem. And on the gripping hand, random Chinese PLCs exist and don't involve the few man-days of NRE it takes to design such a thing.


"My problem with using things like the RPi for such simple LED blinking projects is that it contributes to the cult of overengineering that plagues the industry."

I disagree a bit with you (though I do agree that if all the RPi ends up doing is flashing an LED, that is a waste). You have to start somewhere, and if that's flashing an LED with a Pi, then cool. The path to an ESP32 or ESP8266 is not far away, and there is a lot, lot more to discover, play with, learn, and frankly have fun with - and maybe do something useful, or change the world (or a tiny bit of it).

I had some of the best opportunities available to me in IT, electronics, and electrics - schools, teachers, etc. - but I didn't have today's opportunities. If I were growing up today, I think I would probably have started with a Pi and blinked some LEDs.
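To make the "path to ESP32" concrete, here is roughly what the same blink looks like on that side - a minimal sketch assuming a recent ESP-IDF framework and an LED on GPIO 2 (the framework choice and pin are assumptions; boards differ):

  /* Minimal ESP32 LED blink under ESP-IDF (assumed toolchain); GPIO 2
     is an arbitrary pin choice and depends on the board's wiring. */
  #include "freertos/FreeRTOS.h"
  #include "freertos/task.h"
  #include "driver/gpio.h"

  void app_main(void) {
      gpio_reset_pin(GPIO_NUM_2);
      gpio_set_direction(GPIO_NUM_2, GPIO_MODE_OUTPUT);
      for (;;) {
          gpio_set_level(GPIO_NUM_2, 1);   /* LED on */
          vTaskDelay(pdMS_TO_TICKS(500));
          gpio_set_level(GPIO_NUM_2, 0);   /* LED off */
          vTaskDelay(pdMS_TO_TICKS(500));
      }
  }

Conceptually it's the same register-twiddling as on a Pi, just with a small RTOS underneath instead of a full Linux stack.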


But on the other hand Arduino is still doing just fine for non-networked projects, so it's not all just comically oversized hammers.


I fondly remember the 555, but nowadays it's actually cheaper to use a super-low-end microcontroller.


Gently, friend, the 555's less than 10 cents at Digikey in medium quantities [0]. One of my hardware co-workers happily placed one on the same board alongside a sub-dollar ST ARM MCU. Sometimes the 555's still the best option.

[0] https://www.digikey.com/product-detail/en/texas-instruments/...



Disclaimer: I'm not very knowledgeable about DIY electronics and pretty bad with a soldering iron (shaky hands). Regardless, I think it's super interesting and cool - hence my question:

I checked that site and "medium quantities" is 5000 of them :) (for a total price of $493). Somewhere on the page it says minimum amount is 2500, but I tried and wasn't able to successfully add that to my cart (only 5000).

So I guess that's nice, but how are you going to put even 2500 of these chips to use? That's a LOT of soldering to do. Unless you have some kind of machine to do that for you? But even then what are you going to use 2500 of them for, if not for reselling?


I believe you're asking a genuine question. It's not DIY (well, not once we're past prototyping) - these components are elements of goods for sale. Specifically, the 555 was used as a watchdog. Runs of 5000 are barely enough to get the attention of stateside distributors, so long as they arrive with orders for everything else on the board. As to how they're used: someone sticks a reel into a pick-and-place robot that populates the board, and the assembly shuffles into a wave-soldering machine for the dirty work. It's very conventional stuff, to the extent that $DAY_JOB's factory manages it from a third-world country.


Oh damn, this was the first desktop we had at home :)

I remember struggling to find the right settings to play Monopoly and Double Dragon. And IIRC there was a version of MS Works for it too.


An impressive trick at Commodore users group meetings was to run Fast Hack'em for the 1541. This was a low-level floppy copier that ran on one or more 1541s, without the need for a host C64 once the process started.

i.e. you'd start connected like this (drives were daisy-chained):

  [C64]--[1541]--[1541]
Then remove the C64 and dupe away for the rest of the evening, swapping in disks as they completed.

  [C64]  [1541]--[1541]


And the original Apple LaserWriter was the company's most powerful computer at its launch!

"With its 12 MHz 68000 CPU, the LaserWriter was 50% more powerful than existing Macs, and with the 1.5 MB of memory necessary to download Postscript fonts and print a page at full 300 dpi resolution, it had three times as much memory as the top-end Mac."

http://lowendmac.com/1985/apple-laserwriter/



