
I love it. I have noted this article for my talk about how CPUs are free. To appreciate that, you have to understand that when the first microcomputers came out, engineers were still in "compute" mode[1]: we were lectured that you wouldn't use a hard-coded loop to check for a switch closure, you had to use interrupts, because otherwise you were wasting all those CPU clocks. And computing at the time was often billed in weird units like "kilocore-seconds" (the number of seconds your program ran multiplied by the number of 1024-word pages of core (RAM) it consumed).

The logical extreme end of Moore's law was that you could put a CPU into a very, very small amount of silicon and that meant they were essentially free. (Chips cost by die area & layers). Another article like this is Bunnie Huang's discussion of the ARM CPU in flash chips[2].

There have always been jokes that it is cheaper/easier to use an 8-pin microcontroller than a 555 timer, and the argument has usually come down to the fact that a 555 works over different current and voltage ranges. But at some point I expect to finally see the "blending" of analog/digital chips that allow for a wide range of voltages (an on-board switching PMIC) and analog pins that have few if any compromises for being either digital or analog.
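For concreteness, here is roughly what that MCU-as-555 joke looks like in practice: an 8-pin AVR doing the 555's classic astable job. This is only an illustrative sketch of mine (the part, pin, and frequency are arbitrary assumptions, nothing from the article), built with avr-gcc/avr-libc:

    /* Sketch: an ATtiny13-class 8-pin AVR standing in for a 555 astable,
     * putting a ~1 kHz square wave on PB0. Assumes avr-gcc + avr-libc and
     * the chip's default internal oscillator. */
    #define F_CPU 1200000UL          /* ATtiny13 internal RC osc / 8 */
    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= _BV(PB0);            /* PB0 as output */
        for (;;) {
            PORTB ^= _BV(PB0);       /* toggle every 500 us -> ~1 kHz */
            _delay_us(500);
        }
    }

(Of course the real argument is about the analog corner cases - output drive and supply range - which is exactly where the 555 still wins.)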

[1] The Chip Letter -- https://thechipletter.substack.com/p/tiny-computers-from-tex...

[2] On Hacking MicroSD Cards -- https://www.bunniestudios.com/blog/?p=3554




> at some point I expect to finally see the "blending" of analog/digital chips that allow for a wide range of voltages

The irony is that the developments which made dirt-cheap MCUs possible have at the same time basically ruled this out.

Digital logic is almost trivial to scale down. With Moore's Law the compute core itself is indeed becoming basically free. However, IO does not scale down: modern chips have far fewer analog pins, far lower current limits, lower voltages, and are increasingly sensitive to ESD & over-voltage events.

An ATmega32u4 from 2008 is designed to operate on 5V, can handle 40mA per pin, and has 13 analog pins. It's rather sturdy and can take quite a beating. On the other hand, the RP2040 from 2021 runs its core on 1.1V, although IO is 3.3V. It can only handle 12mA per pin, with a chip total of 50mA. It has only 4 analog pins, which lack a lot of the protection present on the digital pins. Basically, you'll damage it if you look at it funny.

I think it's best summarized by a somewhat-recent change in the USB 2.0 specification: originally the data pins were supposed to handle a 24-hour short to 5V without any issues. This requirement was dropped because such a short is incredibly rare in practice, and dropping that single requirement led to a 16% reduction in silicon area for the transceiver and a 3x standby power reduction.

In today's world of ever-shrinking transistors, dealing with (relatively) high voltages and analog voltages is getting more and more expensive.


I don't disagree at all; this analysis is spot on. The size of discretes on silicon has not shrunk nearly as much as that of transistors. However, where I am coming from is that, because transistors have shrunk so dramatically, it becomes possible to put an entire CPU in the "left over" space after you've placed the discretes.

There was a talk at either Hot Chips or ISSCC around 2011 about a mixed-signal chip where the die was 2/3rds analog and 1/3 digital. Xilinx, the FPGA maker, came out with the "RFSoC", which has a "huge" analog section with multiple high-speed ADCs and DACs and analog reference logic, plus an FPGA fabric, plus a quad-core AArch64 CPU. As I recall Cypress had something similar but the part family is escaping me at the moment.

But I am still looking for a chip that integrates an SMPS so that it can run on a very wide range of voltages like the CD4000 series did (and still does), combined with the ability to source tens of milliamps like the ATmega and PIC chips did (and still do).


Plenty of chips integrate a voltage regulator so you can run the digital logic from as much as 300 volts. They're used in the controllers for some dimmable AC LEDs, for example.

I believe the linear regulator is implemented as a large resistor followed by some zener diode or something. I assume that's so the high voltage doesn't need to touch the silicon, merely the other end of some not-very-good insulator put on top.


The Cypress part family you are likely thinking of is the PSoC line; it's an MCU mixed with a very configurable analog front end.


Yeah, that was what I was looking for. Apparently it has an Infineon[1] part number now: https://www.infineon.com/cms/en/product/microcontroller/32-b... which is kind of cool.

[1] Apparently Infineon closed their acquisition of Cypress in 2020.


Made me realize that IO is the fundamental bits-atoms interface, thank you for writing.


> I have noted this article for my talk about how CPUs are free.

If teeny computers are free, and if I want to re-program them for my own use cases and personal applications, then why do I still have to spend nearly a thousand dollars or two on embedded systems development equipment like microcontroller development boards, JTAGs, ICEs, ROM flashers, UART-based bootloading solutions, and other delicate programming interfaces for small microchips, microcontrollers, and tiny computers? And don't forget microscopes to do power analysis for reverse engineering some old toy that was made to emulate or fake a real-life candle's intractable flame properties.[1]

If you can't write code, then what's the point? How would no code be an agent of freedom and expression?

Reprogramming a microcontroller unit with a USB cable connected between it and a laptop computer is convenient. But too bad that's not really the standard for old technology and resources laying around the planet, isn't it? You have to basically be uncanny like MacGyver or inhumanly intelligent like Tony Stark to reprogram the apparently free teeny computers laying around the world.

[1]: https://cpldcpu.wordpress.com/2024/01/14/revisiting-candle-f...


Perhaps this helps, perhaps not, but the Cortex-M architecture from ARM defines, as required, a built-in debug unit. I can build a standalone JTAG/development tool for it[1] on a $3 breakout board, and program/debug it for free using GCC.

It has been a pleasant side effect of competition in the embedded space that proprietary (and expensive) tooling has become a problem for getting a chip adopted and so there is more pressure to support open source solutions.

[1] Blackmagic Probe -- https://black-magic.org/


First: don't conflate NRE and tooling with the cost of something. Plastic spoons are close to free, but making a plastic spoon factory would be expensive.

Second: you don't need most of that stuff. Dev boards that are a few bucks and debug probes for under $20 are credible and usable; fairly good compilers are free.

> But too bad that's not really the standard for old technology and resources laying around the planet, isn't it? You have to basically be uncanny like MacGyver or inhumanly intelligent like Tony Stark to reprogram the apparently free teeny computers laying around the world.

USB DFU is pretty dang common. The parts that have it aren't the absolute lowest-end stuff, but they're still pretty dang close to free.

Compare to doing all of this ages ago, where you'd have an 8051 with an expensive, crummy compiler and need a lot more tooling to do anything.


> Compare to doing all of this ages ago, where you'd have an 8051 with an expensive, crummy compiler and need a lot more tooling to do anything.

That depends... back in the day, I could buy an (UV) EPROM programmer for several hundred $$. Or I could study the datasheets, build my own EPROM programmer for a fraction of that, and write some software. Guess which route I took.

With uCs it wasn't much different, and it still isn't. Vendor-supplied programmers, debug probes, etc. are just a quick & easy way to get started.

What is different these days is that a lot of those 'vendor' tools are (more or less) generic, applicable to a whole class of devices (e.g. JTAG), often available as cheap 3rd-party clones, and come with free software to use them.

So personally I don't understand the parent's "1000s of dollars" complaint. That only applies when using niche products, outfitting a pro-level electronics lab, or plain doing it wrong / being uninformed of the wealth of stuff out there.


> back in the day, I could buy an (UV) EPROM programmer for several hundred $$. Or I could study the datasheets, build my own EPROM programmer for a fraction of that, and write some software. Guess which route I took.

Even in 80's dollars, that's a big opportunity cost as a grownup. Now you can buy a $3 STLink and call it good. It's changed.

He said nearly a thousand dollars, which isn't that hard to get to-- but it means that you're doing a pretty wide variety of stuff.


Yeah. There's a small set of prepackaged micro- or teeny- computer programming interfaces. Or the plug-and-play if you will. In fact, that small set of convenience products only serves a market of kids that want to play with toys. They're literally toys. Ten or fifty dollar ARM microprocessors or microcontrollers coming in a box with integrated debugging features and integrated WiFi modules. And their complementary three dollar programming link handhelds. All from off the digital Amazon.com or AliExpress shelf. The "in-band" programming interface at accessible prices and stores.

And that's fine.

It's just that for me, on the other end of the spectrum, I prefer a little more adventure. Fewer constraints. So I need an "out-of-band" microchip programming solution for my aims.

Outside the kid world, you're required to be more knowledgeable about the way the world really works. You learn a whole lot more with out-of-band computer modifications than if you were to just plug and play some prepackaged handheld programming device into a little chip. You get more intimate with the microchip and its internals. You get concerned about its voltages and current needs, in order to achieve a proper relationship between your curiosity and the microchip's capabilities.

I want to dig into the raw power contained and hidden in unimposing millimeter (or centimeter) wide circuits. The re-programmability of microcontrollers or teeny-tiny computers, specifically.

There is no current documented solution for that. Beyond going your own way in a very long study and practice of electronics engineering and salvaging.


I'm going to translate your comment in how it sounds to me:

"I've spent a lot on embedded development. In large part, I've done this because I've sought to make things unnecessarily complicated and because I like playing with this stuff. I will deride the typical tools used today by most embedded developers as toys. I will use these views to try and support an assertion that computing isn't effectively 'free' in a monetary sense"

It's not like any of this is that complicated. I've spent plenty of time building my own programmers for things; I've bitbanged SWDIO, programmed EPROMs and micros with a parallel port and shift register or GPIOs on other micros; made pogo pin things, etc. If I were looking to get things done, odds are I can spend a few tens of dollars and just get going, and design in a part that costs a few tens of cents for a whole lot of computing in historical terms.

> I want to dig into the raw power contained and hidden in unimposing millimeter (or centimeter) wide circuits. The re-programmability of microcontrollers or teeny-tiny computers, specifically.

Very little of this is arcane on modern devices. Even a couple of decades ago the "hardest" thing in common use was the need for higher voltages for EEPROM erasure. IMO, where things get interesting is where you abuse peripherals to do things they weren't intended to do, but even that isn't usually equipment intensive-- a 4 channel oscilloscope and a debug probe will get you a long ways.


Yeah, you're right. Someone can spend at most five hundred or seven hundred dollars on a complete embedded systems development combination-set which maybe consists of something like one or five or twenty ARM microcontrollers and the convenient hardware application programming interfaces that are compatible with them, the small computers. Micro computers?

Anything else, anything outside this standard specification you've shared with me, is where some hardcore hacking goes on, in my opinion.


> Yeah, you're right. Someone can spend at most five hundred or seven hundred dollars on a complete embedded systems development combination-set which maybe consists of something like one or five or twenty ARM microcontrollers and the convenient hardware application programming interfaces that are compatible with them, the small computers.

You can literally become equipped to develop for microcontrollers and have a bunch of boards for less than $50 (excluding a laptop).

If you want to go a bit further, you can get a lab supply and a halfway decent DSO with logic analyzer capability for under $500.

> Anything else, anything outside this standard specification you've shared with me, is where some hardcore hacking goes on, in my opinion.

I've done plenty of hardcore hacking, and ... even then, not really. FPGAs? You can get ICE40 boards for <$20. A microscope is nice, but $40. Soldering iron? Pinecil is pretty great for $40.

I spent so much money on equipping myself to do EE stuff 25 years ago. Now you can do much more than I did back then for peanuts. Heck: I just built 20 little embedded ARM computers for students with LCD, cherry switches, and a debug monitor for $300, and the biggest expense was the blank keycaps. It was trivial to get going. That includes manufacturing. We are spoiled. https://github.com/mlyle/armtrainer

Where things get expensive is doing anything fancy analog, RF, very high speed (which is really also analog ;). Computing itself is cheep cheep cheep.


Impressive looking development board. It's beautiful even.

But a feature rich integrated development environment would be better. Especially if it, both a hardware accommodating and software accommodating development environment, operates on more than just something like the toy ARM Thumb instruction set and its ARM based microcontrollers.

After all, you don't need an architecture like ARM or even x86 to do some simple things that should be as accessible as alternating current mains electricity or sunlight from the Sun.

Computing is cheap, but only because it's easy to clear the low bar for having a Turing machine. Turing machines even occur naturally. Conway's Game of Life is Turing complete, and consequently you can build a computing machine with it. No ARM or x86 emulation or JTAG-ing necessary. Here, it's unnecessary to even summon a UART-to-USB adapter.

So, although computing is cheap, it's being locked behind some proprietary bars right now. I'm just looking for the keys to free some computers. Particularly I wanna free very teeny sized computers like microcontrollers.


> operates on more than just something like the toy ARM Thumb instruction set

This is about the most common instruction set in common embedded use in the world. IMO not too toyish.

> Especially if it, both a hardware accommodating and software accommodating development environment,

I'm a bit confused as to your point-- you seem to be simultaneously arguing for "more capabilities" but "smaller computer".

If you're saying "more integrated development" -- and mean specifically self-hosting-- the $1.75 microcontroller running in there is capable of being an 80's era DOS computer with all of those tools, etc, self-hosted. Playing with this is on my todo list for the luls. If you just want open-source development, GNU-arm-embedded is built on eclipse and gcc.

If you're saying smaller computers: STM8, 8051, etc, are easy, too. But there's really not a whole lot of reason to design below the $5 point unless you're mass producing something. The developer's time is worth something, too. Having a big pile of RAM to not even think about putting things on stack, etc, is nice.

If you're saying "free as in freedom" (you were responding to someone making a cost argument with the word "free") -- you can go ICE40 and put any number of open source hardware computing designs on it, and control every bit. Indeed, I had a high school class just build microprocessors from scratch on this approach.


Yeah. A 1980s era Microsoft DOS computer capability running on a $1.75 microcontroller is exactly what I want, on one hand. I'm not greedy or needy, after all. A microcontroller with WiFi connectivity built in (or easily attached, imported, or included) for building a networked system of these smart little computers and hardware part controllers too. Like, a washing machine that sends a "done washing" message to a headless server sitting in the home.

But I do kinda need them to be expendable. So designs that are priced at less than $5 are kinda a requirement for me. Because I sincerely believe I'm stepping into new and unexplored territory. A lot of experiments will be done with this information technology system. Which means there needs to be a massively productive facility for having a swarm of microcontrollers. Hence, the need for turning any microcontroller encountered in the wild into a controlled and compliant robot brain for my heterogeneity of devices and home appliances.

I don't mind thinking about how to not bust open a stack that can only fit three variables on it or something. In comparison to the simple architecture which includes parsimonious memory modules or only two registers total, for example, what's complex will be the total assembly and combinations of Turing machine based codes made possible by teeny microcontrollers/computers doing simple things. Like receiving temperature levels and then relaying or sending packets of temperature or heat data to a server. Acquiring x86 instruction sets is definitely unnecessary here. Or, rather, I only need x86 code execution for not re-inventing things like WiFi. ARM or x86, for example, then, should be seen as just an imported (think Python) or included (think C) module.


> Yeah. A 1980s era Microsoft DOS computer capability running on a $1.75 microcontroller is exactly what I want, on one hand. I'm not greedy or needy, after all. A microcontroller with WiFi connectivity built in (or easily attached, imported, or included) for building a networked system of these smart little computers and hardware part controllers too. Like, a washing machine that sends a "done washing" message to a headless server sitting in the home.

OK, that's an ESP8266, then. Here's a module for $2.11.

https://www.aliexpress.us/item/3256805440432225.html

They're far more capable than you're describing -- capable of emulating a PC-XT at close to 80% speed. For throwaway stuff you could use MicroPython.

They're cheaper than thinking about how to use random micros you find.


So basically we're shopping for fingernail sized motherboards?


That's the starting point. If you want to design boards, you can put down ESP8266 castellated modules (easy) or the ESP8266 chip itself (somewhat harder).

Because of issues with electronics supply chains, complete boards are often cheaper than buying the modules and chips at low quantities (things are really optimized for selling a thousand units or more). Even buying blank keycaps at low quantities was very expensive compared to finished, printed sets of keys.


> A microcontroller with WiFi connectivity built in

ESP32 modules are $2 on LCSC and come with a built-in wifi antenna

> So designs that are priced at less than $5 are kinda a requirement for me. Because I sincerely believe I'm stepping into new and unexplored territory

No, your requirements are the same ones that every cheap IoT device has. Open up a $5 smart switch and see how they manufactured it for $1

> simple architecture which includes parsimonious memory modules or only two registers total, for example

What are you on about? Using an unusual instruction set will increase NRC, cost per MCU, and power consumption. Low power ISAs are a scam. Race to sleep if you wanna save resources


So many products would be better if they just used a Wemos S2 mini or similar $5 microcontroller board.

I understand why everything uses custom designed stuff for cost, but I don't get why people think it's somehow the better or more "professional" approach to do everything yourself.

Modular stuff with common high level modules is just so much easier to repair, modify, and recycle.

We need an ISO standard for these little modules so we can get back to vacuum tube era level of repairability.

Nearly all modern gadgets (general-purpose computers and phones aside) could be made from a common set of 30 or so modules plus minimal custom stuff.


> I have to still spend nearly a thousand dollars or two on embedded systems development equipment

wat

> JTAGs, ICEs, ROM flashers, UART-based bootloading solutions

Dude, all the popular chips these days are ARM MCUs with SWD. They can be programmed with a $3 ST-Link V2. The most you'd spend on the stuff you listed is $75 for a Black Magic Probe, but of course you can build your own for 1/10th the price.

$0.03 MCUs are the exception to the rule since they use proprietary protocols and OTP memory, but their programmers are still in the $100 range


The future of everything is basically "the upfront and ongoing costs of the tooling are infinite, while the physical deployment is free".


From the modern MCU perspective, the reason not to busy-wait is to avoid burning the battery: the chip should stay in a low-power mode while waiting for an interrupt.

Very small controllers in a high-power device, like on a motor or an LED, don't have this limitation.
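To make the contrast concrete, here is a minimal Cortex-M-style sketch (my own illustration, not from the article): the core sleeps in WFI until a pin-change interrupt fires, instead of spinning on the pin. The device header and the ISR wiring are assumed to be set up elsewhere.

    /* Hedged sketch: interrupt + sleep instead of a busy-wait loop.
     * "device.h" is a placeholder for the vendor's CMSIS header, and
     * button_isr() is assumed to be installed in the vector table for
     * the relevant GPIO/EXTI interrupt. */
    #include <stdbool.h>
    #include "device.h"

    static volatile bool button_pressed = false;

    void button_isr(void)                 /* runs on the pin-change interrupt */
    {
        button_pressed = true;
    }

    int main(void)
    {
        for (;;) {
            while (!button_pressed) {
                __WFI();                  /* CMSIS: halt core until an interrupt */
            }
            button_pressed = false;
            /* ...handle the switch closure... */
        }
    }

On a battery-powered part this is roughly the difference between microamps and milliamps of average current; on a mains-powered motor or LED controller it barely matters, which is the point above.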


We're almost at (or are already at) the stage where the packaging of the chip costs more than the computer inside it.


we're two orders of magnitude past that stage. a ryzen 7 is ten billion cmos transistors for a hundred dollars, ten nanodollars per transistor. so how much does a minimal computer cost at ten nanodollars per transistor?

the intersil im6100 was a pdp-8 cpu in 4000 cmos transistors, and the 6502 was comparable, but that's with no memory other than the cpu registers. for a useful microcontroller you probably need about 8192 bits of instruction memory and a few bytes of ram, so let's round up to 16384 transistors for a whole computer. an 8051, with built-in instruction eprom and 128 bytes of ram, was 50k. the arm2 without ram was 27k. an avr, with built-in ram and flash, is 140k. https://en.wikipedia.org/wiki/Transistor_count

at that price, counts of 16384 to 131072 transistors work out to 0.016¢ to 0.13¢. but the cheapest computer you can buy today is a padauk pms150c https://www.lcsc.com/product-detail/Microcontroller-Units-MC... which is 4.2¢ in onesies (and 2.43¢ in quantity) with 64 bytes of ram and 1024 13-bit words of one-time-programmable prom for the program https://free-pdk.github.io/chips/PMS150C. that's 150× more; in the day of moore's law doublings every two years that would have been 14 years, but now it's probably longer. (incidentally this same blog looked at them a few years ago https://cpldcpu.wordpress.com/2019/08/12/the-terrible-3-cent...)

ergo we've been at the stage where the packaging of the chip costs more than the computer inside it since about 02009

(obviously the ryzen 7 cpu costs a great deal more than its packaging, though, because that's what you have to do when you're competing on computrons per dollar rather than gpios per dollar. in theory for 2.43¢ you should be able to get 2.4 million transistors, enough for about 300 kilobytes of rom or 50 kilobytes of sram, or half that together with a 486 or a quad-core arm3. presumably padauk is not doing this because they're using long-obsolete semiconductor process nodes, which is also why their chips are so power-hungry)
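a quick way to sanity-check the arithmetic above (same estimates as inputs, nothing new measured):

    /* reproduces the cost-per-transistor figures from the comment above */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double ryzen_price = 100.0;            /* dollars */
        double ryzen_transistors = 10e9;       /* ~ten billion */
        double per_t = ryzen_price / ryzen_transistors;

        printf("per transistor: %.0f nanodollars\n", per_t * 1e9);
        printf("16384-transistor computer: %.3f cents\n", 16384 * per_t * 100);
        printf("131072-transistor computer: %.2f cents\n", 131072 * per_t * 100);

        double pms150c = 0.0243;               /* 2.43 cents in quantity, as dollars */
        double ratio = pms150c / (16384 * per_t);
        printf("pms150c vs ideal: %.0fx = ~%.0f years of doublings\n",
               ratio, 2 * log2(ratio));
        return 0;
    }

which prints 10 nanodollars, 0.016¢, 0.13¢, and roughly 148x / 14 years, matching the figures above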


What a complicated way of saying 'yes' ;)


i thought it was an interesting enough question to merit deeper analysis; perhaps someone else who thinks so will read it


I agree with OP/OP/OP. Thanks for the initial point, breakdown and the entertainment :p

Can't wait for us to start seeing authentic 3d silicon tho. Gimme that 3mm^3 cube of magic. And beyond that, computronium.


I have had Microchip FEs say as much to me.

Will be interesting when chiplet technology gets to the point where multiple chips are installed in a plastic package (there are thermal expansion/contraction issues that make this a hard problem but still ...)


If and when the cooling problem is solved and we can just stack layers at will computronium will truly have arrived. Adding another dimension will unlock hardware potential that we can only dream of today.


chiplets still lose area to scribing and dicing


You can tell the age of an embedded programmer by whether they consider sampling an input to be "polling" (which to me implies blocking, but that's another discussion) and then look for silver bullets for interrupt storms.


Polling has never implied blocking. It’s actually a way to avoid blocking. I think you’re thinking of “busy-wait loops”.

The difference between “polling” and “busy wait” is whether or not the CPU is doing other unrelated things between samples.

The difference between polling and interrupts is that with interrupts the CPU can halt entirely while waiting rather than having to take those samples in the first place.
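A minimal sketch of the three patterns, for anyone following along (read_switch(), do_other_work(), sleep_until_interrupt(), and handle_switch() are placeholders, not any real API):

    /* Illustration only: busy-wait vs. polling vs. interrupt-driven input.
     * The extern functions are placeholders for whatever the hardware provides. */
    #include <stdbool.h>

    extern bool read_switch(void);
    extern void do_other_work(void);
    extern void sleep_until_interrupt(void);
    extern void handle_switch(void);

    /* 1. busy-wait: the CPU does nothing but sample until the event arrives */
    void busy_wait(void)
    {
        while (!read_switch()) { }        /* spins, burning every cycle */
        handle_switch();
    }

    /* 2. polling: the switch is sampled as one task among others */
    void poll_loop(void)
    {
        for (;;) {
            if (read_switch())
                handle_switch();
            do_other_work();              /* CPU is not dedicated to waiting */
        }
    }

    /* 3. interrupt-driven: the CPU may halt; hardware wakes it on the event */
    static volatile bool switch_flag;

    void switch_isr(void)                 /* invoked by the interrupt controller */
    {
        switch_flag = true;
    }

    void interrupt_loop(void)
    {
        for (;;) {
            sleep_until_interrupt();      /* e.g. WFI on a Cortex-M */
            if (switch_flag) {
                switch_flag = false;
                handle_switch();
            }
        }
    }

The third form is what lets the core halt entirely, which is the distinction being drawn above.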


You're not entirely wrong, and thanks for pointing this out, but polling is also frequently used to mean "busy-wait loops" - you can look for it if you doubt me. I didn't really want to get into that conversation.

The other thing that polling implies is that there will not always be data present when you poll (think of a UART driver), that you might keep servicing it as long as there is, and that you may or may not be polling at a deterministic rate. Sampling unambiguously implies that a sample is always present and handled. It does not preclude any use of interrupts - you can sample with a timer interrupt or handle a sample with an A/D interrupt.

Many embedded programmers of a certain age have a nearly fixed mental model of a microcontroller as a sort of headless VIC-20 and a certain horror at sampling techniques that derives from enduring crummy polling peripheral drivers/handlers from the early days of personal computing.



