The chip shortage could lead to an era of hardware innovation (staceyoniot.com)
220 points by cofree on May 16, 2021 | 183 comments



I'd be interested in other takes on this article, I'm struggling to accept the arguments presented here.

> turning instead to software to handle functions that had historically been done in hardware

That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.

> There’s room for new startups and new ways of doing things. More of the car’s features are accessed using a touchscreen as opposed to a knob or button, for example, which turns components that once were hardware into a software program.

This has got to be one of the worst times to do a hardware startup. Doing R&D at an established company is already painful, between skyrocketing costs and 6-month-plus lead times on standard microcontrollers (not to mention more specialized ICs). If you're just starting out, instead of "only" facing trouble keeping production going, you're going to have to hope that the chips you're prototyping with will be available when you go into production. If things keep going the way they have this year, that's a very risky bet no matter what components you select.

> And now, in light of the chip shortage, suppliers to the automotive industry are going to be looking for more ways to turn something that used to be an electronic part into software.

Not in the automotive field, but from what I've heard it's the exact opposite issue: the lack of processors they'd like to use in those fancy touchscreen infotainment systems is a big part of their problem.

The central thesis is that the shortage will lead to innovation, but in my experience it has put a HUGE strain on everyone doing R&D. I suspect it'll instead lead to rising costs and stagnation as people wait for supplies. This is already basically what we have in the consumer hardware market, where last year's consoles still aren't consistently on store shelves.


> The chip shortage could lead to an era of hardware innovation

It can, but people currently busy "innovating" are at least half a decade too late. 200mm capacity shortages and year-long backlogs have been common for at least 5 years now.

The opportunity from either exploiting the shortage, or ameliorating it is now long gone.

> That’s because electric cars require a lot more computer and chip-based components than those powered by gas

Electric cars require a lot fewer electronic components!!!

> “I saw a design the other day of a lightbulb with 12 components,” said Carrieres. “[My team] asked me if this was an idea, and I had to tell them that it was a real product.”

Recently we had a rush project to redesign a control panel for a certain piece of machinery. The task was to replace or remove MCUs currently experiencing shortages.

After looking at the project for a few minutes, we realised the MCU was used basically just for blinky LEDs and an interlock.
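For a sense of how little that MCU was doing, the whole job boils down to something like this (an illustrative sketch in C, not the client's actual code; the pin names and HAL calls are made up):

    /* Illustrative sketch of everything the MCU did. gpio_read/gpio_write/delay_ms
       stand in for whichever vendor HAL was actually used. */
    #include <stdbool.h>

    extern bool gpio_read(int pin);
    extern void gpio_write(int pin, bool level);
    extern void delay_ms(unsigned ms);

    enum { PIN_LED = 1, PIN_LID_SWITCH = 2, PIN_ESTOP = 3, PIN_ENABLE = 4 };

    void panel_main(void) {
        bool led = false;
        for (;;) {
            led = !led;
            gpio_write(PIN_LED, led);                          /* blinky LED */
            gpio_write(PIN_ENABLE,
                       gpio_read(PIN_LID_SWITCH) && gpio_read(PIN_ESTOP)); /* interlock */
            delay_ms(250);
        }
    }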

Our engineer slapped a few 555s, flip-flops, and registers together in under an hour. The client engineers' jaws dropped.

While the real reason the client was at the mercy of their supply chain was not that individual MCU, it still shows just how much blind adherence to "one chip to rule them all" can eat away at supply chain resilience.

If you have a part uniquely fitted to your unique needs, the likelihood of it being some rare widget with poorer availability increases.


> Our engineer slapped a few 555s, flip-flops, and registers together in under an hour. The client engineers' jaws dropped.

Sounds like it's more an issue of this work being outside the client engineer's competence than any kind of engineering breakthrough.

I suspect replacing the MCU with discrete components would have resulted in a more expensive product if there was no shortage.


I find it a bit hard to believe that a whole microcontroller can be cheaper than a few electronic components on their own. Probably what happened before the shortage was that the MCU was cheap enough, gave you great flexibility in prototyping, and could be suited to many different tasks. After the prototyping stage, it should've been replaced by simpler parts, but since good enough was cheap enough, it was left as it was.


When it comes down to very simple microcontrollers with only a few pins, the package cost starts to dominate. Consider that each pin on an IC costs a few cents. If a small pin-count MCU is replaced with a 555, some discrete logic, and passives for each of those ICs, it certainly can be more expensive to have discrete logic as opposed to an MCU, not to mention the higher cost from the increase in PCB size for the extra components.
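A back-of-the-envelope comparison with made-up but plausible distributor prices shows why the MCU often wins on cost:

    8-pin MCU:                               ~$0.30
    + 1 decoupling cap:                      ~$0.01
                                             ------
                                             ~$0.31

    555 + dual flip-flop + shift register:   ~$0.45
    + ~10 passives (timing RC, pull-ups):    ~$0.10
    + extra PCB area and placements:         a few cents more
                                             ------
                                             ~$0.60+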


Assembly cost too, from what I understand, if you need to go up to a fancier pick-and-place machine because of the number of distinct component reels.


You’d be surprised. Some ARM M0 chips are in the $0.20 range for 1 (can’t see volume prices right now because of availability).


Yeah, I guess economy of scale means that you can manufacture 1,000,000,000 identical MCUs which are flexible and multi-purpose. Up to the point where you can't and it turns out some (most?) of them can be replaced by simple circuits with a few components.


There are MCUs costing less than a cent.

It was likely cheaper, but now they've paid likely a hundredfold of what could ever have been saved.


Also remember that a bigger bill of materials in itself potentially complicates supply chain. The other thing that comes to mind is that more components may mean more passives or a more complex circuit.


> I suspect replacing the MCU with discrete components would have resulted in a more expensive product if there was no shortage.

So what? Complexity and BOM size are not problems by themselves. You should not view them in isolation from the context.

This is exactly the thinking that got the client into this — "trying to save money at any cost."

The point I'm making is that they easily spent multiple times the price of this panel's entire production run as a punishment for that.

Blinky LEDs and an interlock between a few buttons is really overkill for an MCU, and should've been treated as such no matter how many cents could be saved.

You can save a few dollars on a screwdriver if you can put in screws with a hammer, but you are certainly better off not doing that.


There are an infinite number of ways to make a project less fragile to changing market conditions. But you can't maximize all of them; it's all trade-offs. Would making a project more resilient to semiconductor shortages increase vulnerability to other factors? Perhaps the increased electrical load pushes the battery requirements above the budgeted space. Perhaps the increased costs put the product above the budget for customers, making the project no longer viable in the marketplace. There are too many factors that I don't know about to possibly comment on all of it. The replacement of the MCU with discrete components was mentioned, and I commented on a few of the factors which could be involved in the decision; I did not claim to have made an exhaustive analysis of the product spec to determine why the choices were made.


> So what? Complexity and BOM size are not problems by themselves. You should not view them in isolation from the context.

Sure. So what's important about context? Is it impossible to have a 555 shortage?

> Blinky LEDs and an interlock between a few buttons is really overkill for an MCU, and should've been treated as such no matter how many cents could be saved.

Would you refuse to use 1% resistors if they were cheaper than 10% resistors? Overkill isn't bad by itself.

> You can save a few dollars on a screwdriver if you can put in screws with a hammer, but you are certainly better off not doing that.

It's more like a set of 100 drill bits being cheaper than the specific 20 you need all wrapped in individual boxes.


If you ever have a 555 shortage, then any EE can conjure something similar in a few days from parts at hand.


I can confirm that this is an extremely challenging time to do a hardware startup. In addition to the supply and freight shortages that are deadly to startups stuck at the back of the line:

1. Travel restrictions make it extremely challenging to evaluate suppliers, do joint development, and bring up manufacturing.

2. A bunch of recent high profile failures have caused VCs to cool on the space.

3. Google, Facebook, Amazon, and Microsoft are all hiring armies of hardware people, making the job market more competitive than it once was.

All that said:

1. There’s still no shortage of interesting and worthwhile problems to solve with hardware.

2. Like web companies that survived the dot com bubble bursting, the hardware companies that make it through 2020 and 2021 are going to have been strengthened by it.


4. Certain major contract manufacturers are imploding.

5. China-West tensions, including HK/mainland border being locked down, are creating additional difficulties.

6. Current China anti-platform/monopoly drive has cooled domestic VC activity and appetite for vertical integration.

(Full disclosure: Cross-border hardware venture founder in China.)


> 4. Certain major contract manufacturers are imploding.

Which ones?


Please provide examples of (4). I have worked for HW companies pretty much all of my career and (4) is news to me. I would very much like to understand what you're seeing.


Let's just say that, regarding a major facility of a major CM: over the last few years my agents and I have dealt with them a few times, and the first time a well-meaning manager said "stay away, this company is bad news". Everything I have since learned (a tour of the facility revealing old tech, a dinosaur client base, a partial selloff of the site, bizdev literally stating that their reason for not taking on simple projects was complexity, employees covertly selling use of software the company has licensed on the open market) reaffirms this perception. Whereas the stock price doesn't. May as well be dogecoin.


Definitely, chip and other component shortages are getting increasingly serious, and costs are going through the roof; voltage regulators that normally cost 50 cents have been selling for as much as $70.

There is indeed a lot of demand for chips and electronic components of all kinds, but I understand manufacturers claim it is not real demand, just a spike caused by uncertainty in international supply chains. https://www.taipeitimes.com/News/biz/archives/2021/04/22/200...

Just like we all saw a lot of people buying toilet paper like crazy at the beginning of the pandemic, there is also a lot of hoarding by big tech companies that makes this shortage likely to last well into 2022.

So, they're not exactly willing to expand capacity for a demand that is most likely to go away eventually.

So indeed, it is a challenging time to design and manufacture hardware; all the big players are buying it all.

The only advice I could give to anyone starting to design their electronics is to make the design as flexible as possible: leave room on the PCB for two or more alternative components.

You could even try to make two different PCB layouts and choose one or the other based on price and availability of components. Oh, and try to delay component selection as much as possible; that's the most useful advice I've read: https://titoma.com/blog/reduce-component-lead-time


> A bunch of recent high profile failures have caused VCs to cool on the space.

VCs have been "cool" on the hardware space since 2001.

Why fund hardware which can take 5-10 years to go somewhere when you can fund the latest social garbage fad and it will take off or fail completely within 36 months?


> A bunch of recent high profile failures have caused VCs to cool on the space.

Could you expand on this?


A successful startup allows others to claim that they're "Uber for pets", which encourages VC money - but if there are famous explosions in the space, all of a sudden you have to explain why you're not "WeWork for chips", which makes it harder to get funding started.

When a gold rush is on nobody gets yelled at for buying shovels even if they lose money. But going against “conventional wisdom” requires either entirely independent VC or investors who really trust you.


Can you provide examples of (2)?

If we think it's bad now, wait until the dozen or so AI chip startups have their moments of reckoning.


Yeah, I'm with you. To give the author the benefit of the doubt, maybe they were thinking of "turning to software innovation" as in optimizing / longer dev cycles that can result in less resource wastage, i.e. rather than needing the latest chip to run Electron-based applications everywhere, they'd spend a bit more time developing software that can run well on older chips.


For integrated hardware, like IoT, cars and so on, it makes sense to attempt to reduce the number of chips needed, and then glue the remaining bits together using software. I don't see how there's much else you can do to deal with a shortage, other than perhaps starting to remove many of the "smart" features and accepting that perhaps we didn't need all of them to begin with.

For computers - desktops, laptops, servers and phones - we could start focusing on more efficient software. Replacing laptops every two or three years is no longer cost effective. We're starting to look at a 5 to 8 year lifespan for phones and laptops, meaning that developers need to focus more on efficiency.

It's two different ways of dealing with rising chip prices: Use fewer chips, and use them longer. It's also a great way of shifting to greener computing.


10-year-old laptop chips work well for the vast majority of users. The problems are in shoddy cases/integrations that fail, and in lack of repairability/upgradability and planned obsolescence (even in chassis that have no thin-and-light excuse), both of which lead to waste.


One possible idea is a rebate that is given back to manufacturers if the device lasts long enough.


Had they not explicitly said hardware innovation I would have thought that as well.

I hope it does lead to debloating though; when mouse config software weighs in at over 200MB we're long overdue for some trimming of the proverbial fat.


> which turns components that once were hardware into a software program

this is a fun one, because there is no shortage of knobs and buttons, just chips, and knobs don't need chips to work. Even if you do need a chip for it to work (e.g. rotary encoder to CAN bus), most of them are still available, because the "old tech" is good enough for them (compared to the newer, smaller transistor sizes needed for modern chips that process a lot of data - e.g. graphics cards, modern CPUs and even touchscreen radios for cars).
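And the "chip needed" case really is old-node territory: decoding a quadrature knob is a few lines that any decade-old microcontroller can run. A rough sketch (the transition table is the standard Gray-code trick; which direction counts as positive depends on wiring):

    #include <stdint.h>

    /* Quadrature decode: index a table with (previous AB state << 2) | new AB state.
       Valid steps give +1/-1; no movement or a skipped (invalid) transition gives 0. */
    static const int8_t QDEC_TABLE[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0
    };

    static int32_t position = 0;
    static uint8_t prev_state = 0;

    void encoder_update(uint8_t a, uint8_t b) {  /* call from a timer tick or pin-change interrupt */
        uint8_t state = (uint8_t)((a << 1) | b);
        position += QDEC_TABLE[(prev_state << 2) | state];
        prev_state = state;
    }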


I stopped reading after the car bit. Anyone who's advocating more eye-requiring touchscreen controls for cars instead of fewer hasn't put enough thought into their opinions for anything else they've written to be worth reading.


>> turning instead to software to handle functions that had historically been done in hardware

> That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.

Possibly, but the point is to reduce the number of expensive integrated circuits. A classic example:

https://en.wikipedia.org/wiki/Integrated_Woz_Machine


You're talking about two different things: eliminating hardware by doing more in software (what the article was describing) vs consolidating discrete logic into an ASIC. Both the IWM and the discrete logic it replaced performed the same task in hardware; the IWM just reduced it to one chip, allowing them to eliminate the controller PCB.


> The IWM just reduced it to one chip, allowing them to eliminate the controller PCB

From the Wikipedia article: "The floppy drive controller was built with 8 ICs, one of which is the PROM, containing tables for the encoder and decoder, the state machine, and some code."

(My emphasis.)

What made it at all possible was moving most of the logic to the CPU:

https://retrocomputing.stackexchange.com/a/7188

The IWM chip, used in the prototype Mac with double sided "Twiggy" drives, and later in the original production Mac 128k with Sony 3.5" diskette drives, is basically Woz's original disk controller design (converted by Wendell Sander and Bob Bailey to NMOS), with the addition of buffer registers to allow the processor to run a bit more asynchronously. But, except for not having to run (* almost) synchronous with the disk data rate, the 68k CPU did all the raw data decoding (GCR nibbles to bytes, etc.) and head stepping in software, very similar to the 6502 code in the Apple II (DOS) RWTS.


No, the IWM replaced a floppy controller ASIC integrating a PLL, track and sector ID detectors, a bit shifter, FM/MFM encoders/decoders, and CRC logic. All of this was replaced by a simple bit shifter, a state machine, and software running on the computer's main CPU. The Amiga employed a similar trick, reusing the Blitter to encode/decode tracks.
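In software, the heart of that trick is very small. A simplified sketch of the idea (not the actual Apple routines, which were hand-tuned assembly; the register and table here are stand-ins):

    #include <stdint.h>

    /* Stand-in for the controller's memory-mapped shift register. */
    extern volatile uint8_t DISK_DATA_REG;

    /* 256-entry table mapping raw GCR "nibbles" back to data bits; contents depend
       on the encoding in use (6-and-2 on the Apple II and early Mac). Omitted here. */
    extern const uint8_t GCR_DECODE[256];

    /* The hardware only shifts bits in; software polls until the high bit signals a
       complete byte (valid disk bytes have the MSB set), then translates via the table. */
    static uint8_t read_gcr_byte(void) {
        uint8_t raw;
        do {
            raw = DISK_DATA_REG;
        } while (!(raw & 0x80));
        return GCR_DECODE[raw];
    }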


> The central thesis is that the shortage will lead to innovation, but in my experience it has put a HUGE strain on everyone doing R&D. I suspect it'll instead lead to rising costs and stagnation as people wait for supplies. This is already basically what we have in the consumer hardware market, where last year's consoles still aren't consistently on store shelves.

I think there's some truth to creativity being enhanced by constraints. Certainly, if supplies are limited, and especially if the limits are uneven, there's going to be incentive to design around the chips that are limited, and some of that might be innovation that could be useful even after supply goes back to normal. Of course, CPU shortages are hard to design around, especially if all CPUs are in short supply; but some applications might be able to make use of different CPUs that might have more availability.


Technically, sure. But the extent of the innovation I've seen so far is people going back to using inferior chips because that was all they could get within their budget.

Microcontrollers aren't exactly interchangeable, even within the same product line. You could design for flexibility and use the Arduino framework to run your code on most Microchip/Atmel, ST, and a million other chips, but that comes at enormous cost -- to put it nicely, that's an incredibly inefficient library if you're not doing anything demanding, and damn near worthless if you are. Any multiplatform framework that abstracts away the inner workings of microcontrollers is going to be too heavy to work for a huge percentage of people's power and complexity profiles.
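As a well-worn illustration of that overhead, compare toggling a pin through the Arduino API with writing the port register directly on a classic AVR - the portable call is what lets the same code run on many chips, and also what makes it many times slower (sketch assumes an ATmega328 board such as an Uno):

    #include <Arduino.h>   // assumes an ATmega328-based board, e.g. an Uno

    void setup() {
        pinMode(13, OUTPUT);
    }

    void loop() {
        // Portable: goes through the Arduino pin-mapping layer,
        // dozens of cycles per call on an ATmega328.
        digitalWrite(13, HIGH);
        digitalWrite(13, LOW);

        // Chip-specific: compiles down to single sbi/cbi instructions
        // (Arduino pin 13 is PB5 on the Uno).
        PORTB |= (1 << PB5);
        PORTB &= ~(1 << PB5);
    }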

It's not just MCUs and firmware either, any time you replace a component due to shortages you need to revalidate your design. Constantly redesigning and revalidating boards based on available stock is what people are doing right now to keep the lights on. It's hell.

If you don't need a microcontroller to do whatever you do, then sure. Pop it out and save a few bucks. But that's hardly innovation, it's more rectifying a mistake made when doing the initial design.


I think you're like 98% right. Swapping an MCU is a lot of work, and other chips are too... I just wonder how many people are going to have all the chips but one, figure out how to wing it, and how many of those solutions end up being interesting/useful/kept past when the missing chip becomes available.

I'm thinking of stuff like how (at least some) dishwashers with 'dirt' sensing don't actually sense dirt at all; instead the pump has a thermal overload, and they count how many times the pump cycles to indicate dirtiness.

If you used to have a dirt sensor, but it was delayed 18 months, you might figure something like that out, and maybe that's handy. Or maybe there's some other thing you'd like to measure, but there's not a good way to measure it, but it causes something to misbehave and you can measure that; but you wouldn't have thought about it except that you ran out of dirt sensors.
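The inference itself is tiny once you already have the overload signal - a hypothetical sketch of the idea (made-up names and threshold, not any manufacturer's firmware):

    /* Hypothetical sketch of the heuristic: infer a "dirty load" from how often the
       pump's thermal protector cycles, instead of using a dedicated dirt sensor. */
    #include <stdbool.h>

    static unsigned overload_trips = 0;

    void on_pump_overload_trip(void) {   /* hook this to the overload/protector signal */
        overload_trips++;
    }

    bool load_seems_dirty(void) {
        return overload_trips >= 3;      /* threshold would be tuned experimentally */
    }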


>> turning instead to software to handle functions that had historically been done in hardware

>That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.

I'm personally not at all convinced hardware accelerated GUI toolkits are more power efficient than software rendered ones. Whether it's due to the abstraction or some weird idea that drawing is now "free", you end up with way more drawing being done once something is hardware accelerated, and it tends to more than offset the efficiency gains. The only place it really works is video games, because they're actually paying attention to where the limits are (and they don't really care about power efficiency).


> I'm personally not at all convinced hardware accelerated GUI toolkits are more power efficient than software rendered ones.

It really depends on what is being drawn and how. If a GUI toolkit is all flat colors and basic shapes and doesn't use antialiasing, then it can be very power efficient on modern CPUs, even low power embedded ones. If it instead uses gradients, alpha blending, antialiasing, and complex shapes, it's going to be far less efficient all in software. The difficulty increases with increased demands for fill rate (display size x frame rate).

On modern CPUs (even embedded ones) software rendering to a QVGA display would be no problem and likely no less efficient than hardware acceleration. However, as the fill rate demand and drawing complexity increase, software rendering will quickly hit a wall.
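Rough numbers show where the wall is (illustrative resolutions and refresh rates):

    QVGA   320 x 240    at 30 fps  ->  ~2.3 Mpixels/s of fill
    720p   1280 x 720   at 60 fps  ->  ~55  Mpixels/s
    1080p  1920 x 1080  at 60 fps  ->  ~124 Mpixels/s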


The same goes for the argument about power management chips not benefiting from Moore’s law. Yes, but that’s been the case for decades. Thermal and current requirements demand a certain amount of silicon regardless of how small you can make features in a low-power chip.


>> turning instead to software to handle functions that had historically been done in hardware

> That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.

Not necessarily. For example, the original idea behind RISC was to move lots of complicated functions (and instructions) from hardware into software, so that the hardware could concentrate on doing the most common and simple functionality quicker.


Those common computationally complex functions generally get abstracted away via CMSIS or the like, I've yet to beat the hardware implementations of common signal processing algorithms in software despite actively trying. Anyone trying to fill in for missing DSP hardware is going to have a very bad time.
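For instance, running a FIR filter through CMSIS-DSP is a couple of calls, and the library dispatches to whatever FPU/SIMD the core has underneath - hand-rolled C rarely beats it (a minimal sketch; tap count, block size and coefficients are placeholders):

    #include "arm_math.h"    /* CMSIS-DSP */

    #define NUM_TAPS   32
    #define BLOCK_SIZE 64

    static float32_t fir_coeffs[NUM_TAPS];   /* filter design goes here */
    static float32_t fir_state[NUM_TAPS + BLOCK_SIZE - 1];
    static arm_fir_instance_f32 fir;

    void dsp_init(void) {
        arm_fir_init_f32(&fir, NUM_TAPS, fir_coeffs, fir_state, BLOCK_SIZE);
    }

    void dsp_process(const float32_t *in, float32_t *out) {
        arm_fir_f32(&fir, in, out, BLOCK_SIZE);   /* vectorized/MAC-accelerated internally */
    }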

I can't imagine it's much better for other hardware-->software transitions, given the cost of specific hardware implementations they're generally only used when actually necessary.


Or, we could just ban crypto exchanges in the USA (and allied nations), watch prices tank 95%, and all of the fab capacity which is now going towards Bitcoin and Ethereum mining could be restored to proper productive use (alongside a big reduction in energy expenditure and ransomware).


I think the biggest pitfall in your plan is thinking that prices would respond by falling, let alone by 95%.

There’s a long history of countries imposing capital controls. And it’s rarely favorable to the home country’s exchange rate. What you are proposing is basically just capital controls towards the crypto economy.

Yes it will make it harder for new buyers to bid up the price of Bitcoin. But it almost certainly will induce current holders to hoard any Bitcoin they own, thus reducing supply and putting upward pressure on the price. Without a liquid market, miners are likely to hoard their rewards, creating even more Bitcoin scarcity.

It also may create panic buying before the law comes in effect, or by people in foreign jurisdiction fearing the export of the policy.


It can also just as likely create panic selling as speculators look to shed an asset that's dead in the water. Remember when Robinhood banned GME shares from being purchased, and the share price cratered as speculators tried to rinse their hands of the asset?


Luckily, the US economy is not the only one out there. Europe and China will be happy to eat their lunch.


Maybe a better strategy is to discourage crypto through fiscal policy: Imagine 90% capital gains tax on crypto profits, no credits for capital losses on crypto, 10% crypto wealth tax. Put those taxes to use fighting climate change.


At 90% taxation why wouldn’t wealthy crypto holders just leave the tax jurisdiction?

Like, it’s not a bunch of gold bricks or a briefcase full of paper money...It’s not controlled by a bank who can deny your transfer. Anyone can take all of their crypto anywhere on Earth, and can do so instantaneously and virtually anywhere with an Internet connection.


If it was the US tax jurisdiction that someone is trying to leave, then that person would need to give up their US citizenship, not create any suspicion that they are leaving due to tax reasons, and also pay taxes as if they have sold off everything they own.


It’s a crime?


Argentina has implemented many of these "solutions" trying to keep the dollar under control. The result is prices keep going up, local currency keeps going down and there's an entire parallel market to avoid the taxes. I expect the same to happen to Bitcoin if that is implemented.


There is no crypto economy. It is just one giant pyramid scheme.


What's a pyramid scheme?

How do you define one? What attributes does a pyramid scheme have?

One could easily ask what's a currency that's backed by nothing, is inflated to oblivion continuously robbing marginalized peoples' life savings, is often used for illegal drug/weapons/sex-trafficking/and worse crimes, is involved in a good percentage of the world's pollution, and that may soon be worth very little?


Alternatively, by following the same logic we could:

1) ban computer games to free up people’s time, and grow the economy,

2) ban ML research to improve overall privacy.

All these will also affect GPU prices favorably.


Your two suggestions would have more positive effects on society than cryptocurrencies.


Let’s ban computers themselves

No computers, no chip shortage

Fixed :D


And nothing of value will be lost. The only issue is that the powerful that have invested in crypto will fight with their lives to prevent that.


Nothing of value to you, but thankfully nobody made you the global arbiter of what people value.


Right, otherwise people couldn't speculate on crypto prices, the horror.


Might as well make it so people can't speculate on stocks or commodities.


Pork bellies have inherent value, cryptocurrency has no inherent value.


Neither does cash, then?


What percentage of merchants accept cryptocurrency and what percentage of merchants accept cash? "Fiat currency" may not have inherent value in the same way as gold or pork bellies, but stable currencies (e.g. Euro, US dollar) are backed by governments which have incentives to be good stewards of their responsibility and if those currencies collapsed, I don't think the world would be in a state where cryptocurrency would retain any value.


Gold, and Silver, can be made into jewelry.


So can bitcoins.


Thankfully?

The problem is that we did make rich people the global arbiter of what people value, by giving them more units which can be used to value things.

This is a fundamentally undemocratic system. Ten billionaires have way more ability to decide on something's value than ten people making minimum wage.

(This is also the fallacy in "If you don't like it, why not short it and make money instead of complaining?" I don't like it, and I know my short positions aren't going to influence the market as much as some rich guy's much larger long positions.)


This is a nonsensical take on how value is determined. It's not analogous to voting (or making political decisions in general) -- it's incoherent to call it "undemocratic".

The overwhelming majority of valuation / price discovery does not take place in a coordinated or intentional manner. People buy what they want or need and in aggregate this determines price. Billionaires in aggregate are a minor portion of transactions that take place -- they can participate in larger transactions than most people but can't just will any market configuration into existence. They can create localized distortions, but if they're doing something unsustainable then they won't stay billionaires long and it will end up a blip.

However, I suspect your position may be a cryptic way of saying "There are some human activities / endeavors that I don't like and it's easy to blame the rich people invested in that / making their money off that." This would not support your point. It takes more than just some rich people to create and sustain an industry.


> They can create localized distortions, but if they're doing something unsustainable then they won't stay billionaires long and it will end up a blip.

My point is precisely that this assumption is not true.

A more precise way to say that you're "doing something unsustainable" is that you're persistently overvaluing something. But how is it determined that you're overvaluing something? The rest of the market assigns it a lower value than what you assign.

If you and your billionaire buddies just decide to value something highly, then simply because you have money already, you can do that, and you can influence enough of the market to make sure that you're not overvaluing it relative to the market, and thus, it is sustainable.

The more money you have, the more leverage you have to pull off this trick. Sometimes it works, sometimes it doesn't, but the average person on a minimum-wage strategy cannot even attempt it.

People don't solely buy what they want or need. They also buy, in very large quantities, things that they expect to be able to resell later. Sometimes this happens at small scale (think toilet paper or gasoline, or even think houses); sometimes this happens at large scale (think commodities markets or mergers and acquisitions). These sorts of secondary markets have a huge influence on price discovery.


> This is a fundamentally undemocratic system.

All modern democracies are representative. People themselves don't decide about things, their representatives do. This is the price we are willing to pay to get effective governance.


Sure, but not all representative systems are democracies. A king may represent his subjects (in foreign policy and war, for instance, and even sometimes in rituals) and generally has way more leverage to be effective at however he chooses to govern than a president or prime minister does, but it's not a democracy.

One of the usual goals of a democracy is equal representation ("one person, one vote"). You can transfer your representation to someone else - either directly, in the sense of proxy voting, or indirectly, in the sense of voting for a representative - but you don't give it up permanently. When elections next happen (and if elections don't regularly happen, it's not a democracy), you get your vote back and can freely choose what to do with it, again. If your representative isn't doing what you want, you can choose another one.

That's not true of the system of money. Even if it were theoretically true at some point, once you transfer money to someone else, the associated power of that money stays with them or whoever they choose to transfer it to. If you bought Windows 95 for $209.95 a quarter century ago, that fraction of your ability to "vote" on how society values things left you a quarter century ago, after spending it exactly once on valuing Windows 95, and it probably never returned to you (unless you became a Microsoft employee or supplier).

So, someone who makes something that is (rightly or wrongly) highly valued thereby accumulates the ability to value things in the future, and have disproportionate influence on what is then going to be highly valued. This process repeats continuously and has repeated for centuries.


Democracy does not mean "one person, one vote" -- that's tyranny of the majority, and gameable in an open system with procreation, immigration/emigration, and even murder. Democracy means "rule by the people" which is far more nuanced.

More simply, the US Constitution was not approved by "one person, one vote", not even with a dozen caveats and asterisks.


Sure. I'm not saying that a democracy always means one person, one vote. I'm saying it requires an ideal of trying to be meaningfully representative.

One person, one vote is certainly gameable. One dollar, one vote is even more so.


The problem is the representatives frequently don't represent the interests of their people; they represent the interests of whoever will donate more money to let them keep winning elections.


By that logic we should ban stock exchanges as well.


Stock markets help allocate resources to the companies that will make good use of it. We can't say something similar about crypto until speculation stops being 99.9% of the use case and it starts being used for something more practical. We may get to that point, but that's still not a certainty.


… Do they? After an IPO, how is the glorified trading card game of stocks actually helping the companies themselves?


By making stock grants valuable and thus aligning the incentives of the employees with the company?


In many cases, shareholders are represented by a board of directors, to whom the CEO is beholden. Also, executives often see much of their compensation via equity in some form.


Companies can issue stock (whether through employee grants or secondary offerings). They can also buy other companies with stock.


Plus the value can effectively act as a form of collateral for loans explicitly or otherwise.


The dream.


Maybe not ban, but heavy regulation and taxation sounds good to me. Too much of our economy is based around ensuring high numbers for the DJIA and Nasdaq, which is not healthy for society.


I can agree to that.


And watch decentralised exchanges that are even less efficient replace them. There's as much chance of successfully banning crypto as there is of successfully banning drugs (in a big part because of how useful crypto is for online drug transactions).


The US could simply ban US banks and financial institutions from doing business with crypto, and from doing business with any foreign entity that does business with crypto. That would make it mostly unusable except as just another aspect of the criminal underworld's shadow banking systems. That's based on the current state of crypto, though. If it becomes a common medium of exchange with many people having some or all of their transactions denominated in crypto, that option kind of goes off the table.


Essentially an embargo against crypto? Similarly to how the US treats a company that deals with Iran?


That's precisely what I'm thinking. Of course Iran, as a country with an economy of its own, can still function internally, and there's a black market, but there are also risks to that. Crypto certainly couldn't hit mainstream transactional usage under those conditions, and there would be safer places to keep your money legally & avoid the risk of having your assets frozen.

The window is closing rapidly on this option though. The more it becomes just another financial asset class with trillions in market cap, the harder that option becomes. However, if a country like the US ever decides it's a real threat to its sovereign monetary policy, it could take that option anyway and simply bail out those impacted financially.

I think the most likely scenario is simply that it becomes regulated like any other part of the financial system. Heck, think CTR and SAR requirements. It will be difficult to ever get Bitcoin or others to their dream of a decentralized currency unbound by governments when governments can make regulations controlling its usage. The average business owner isn't going to risk their business and jail time trying to work outside those regulations.


Regulation will never work, never. It's not like trading human organs is outlawed or anything.


If crypto only ever rises to the level of the human organ trade, it will have failed to deliver on pretty much all of its promises.


If you think the human organ trade doesn't exist, some Uighurs would like a word with you. https://en.wikipedia.org/wiki/Organ_trade#Illegal_organ_trad...

"Despite ordinances against organ sales, this practice persists, with studies estimating that anywhere from 5% to 42% of transplanted organs are illicitly purchased."


If you think outlawing that market doesn't drastically reduce its scale, your reality is so different than mine that there's nothing more to be said.


It's wild how many people on a tech forum don't understand the next major innovation of the Internet. Crypto is up there with Netscape and smartphones.


People said similar things with: AI, torrents, VR/AR, quantum computers, memristors, drones, chatbots, distributed social media, Internet of things, self-driving cars, robots, dapps.

Some of these technologies managed to produce something useful. On the other hand, I cannot use crypto for anything but speculation.


Imagine thinking that’s a large part of the chip market

It's a squeeze, no different from what happens to rental prices when there are 1% vacancy rates versus 0.5% vacancy rates.


Do you have any numbers on how much fab capacity actually goes toward crypto mining? I would think it's relatively small in the grand scheme of things.


The real answer, I think, is not to ban the nebulous idea of "cryptocurrency" but to ban (or highly tax) proof-of-work coins or similar schemes where the network requires a large amount of some physical resource to maintain its function.

That will spur innovation on approaches that don't have that requirement.


Just tax electricity and tariff countries who don't.


There’s already rapid innovation in this direction without government intervention being necessary. Ethereum is switching to proof-of-stake by the end of the year, and partially because of that is already 50% of the way to passing Bitcoin by market cap.


Why not ban military exercises and automobiles instead?


I don't think crypto hardware represents a significant portion of the hardware market. Consumer electronics alone constitute about $3 trillion.


Anybody who follows the industry knows semiconductor manufacturing is a boom-and-bust industry.

This video covers it quite well: https://www.youtube.com/watch?v=Z7QkIECEkVc

Automakers cancelled their orders during the early days of COVID-19 and lost their place in the queue (and now are crying foul).

Also many companies have double-booked orders from multiple suppliers making the shortage look worse.

There's a huge amount of semiconductor capacity coming online in the next few years, much of which will be barely profitable once this current shortage dissipates.


Double booking is a consequence of shortage. You can't separate it out. Longer lead times make it harder to predict future demand, and perception of scarcity leads to overbooking that exacerbates the negative consequences of underbooking.


You can forbid companies from canceling orders - which discourages it.


LOL, no. It's hard to implement hardware functions in software if it's impossible to source the MCUs my software stack was built on.

What's really happening is that I (and likely many others) realised that Chinese-brand chips (which are not affected by the shortage) work just as well as the Western ones.

The other thing I realised was that non-authorized sellers work just as well too. Yep, I had to implement more thorough testing/QC, but I would not call that extra work innovation.


Creation of entirely new brands of compatible hardware is innovation too - or maybe let's call it progress.


Coincidentally, you can buy all the Western chips you want from Chinese sellers in any quantity. It's just that you have to pay 10x.


My experiences are different.

I think most assembly-houses self-source some parts, and they do it in large quantities (like 1M+ parts/year). As such, they buy at extremely good prices. They overbuy themselves, and in order to get the same good pricing next year, they keep overbuying. Some of their inventory is then sold locally, and ends up in the Shenzhen market or at unauthorized resellers.

For the quantities I buy (sub-1000 pcs typ.) the pricing is usually 0.2-2.0 times the known good parts from known (authorized) sellers.

The problem is that I can not be sure what I get, and I have to implement really extensive testing for each batch.


Shortage is a driver of creativity. This is what made Picasso become famous and outstanding along the way: Picasso intentionally constrained himself to only use lines, very simple shapes, and single-colored areas as his toolbox. Although he could paint photorealistic portraits when he was a teen, he intentionally constrained himself to unleash creativity. It's the limited set of tools we have, also when programming (a few constructs like if/then, loops, booleans, integers, etc.), that makes us think "how do I express this idea best?" If you have boundless resources, there's no pressure to be creative.


Anecdotally, I've seen a lot more job postings for firmware engineers lately. I assume these are coming from companies that cannot source their preferred microcontrollers and need to port code to a new chip before inventory runs out. Hopefully this leads to some modernization of embedded software.


Modernization? The deadlines are tight and the focus will be all about getting done as fast as possible before the inventory of replacement parts dries up too.

I have ended up raiding the warranty return bin for fixable boards in my small startup. It's easier to repair the returned boards right now than to order new ones.


That's pretty awesome. I moved out of embedded development many years ago, but it's always been my true love as an engineer.


Why aren't chips recycled more? (Not that recycling could address the immediate problem.)

Or perhaps there is already a lot of recycling, but it is not apparent to me and probably not apparent to most people.

Is pulling and sorting chips from e-waste inefficient? Damages too many chips?

In China a lot of reverse engineered info on proprietary chips floats around among engineers. Are hardware engineers in the west too busy to adapt to recycled chips? Is the supply too precarious?

The only one I know of working in this area is https://en.wikipedia.org/wiki/Andrew_Huang_(hacker) I wonder if he gets lonely.


Chips are recycled, at least the valuable ones. Go on AliExpress and you'll find lots of expensive FPGAs reballed, iPhone chips for repair, classic computer chips that are no longer made, etc.


Ftlog not another round of 'let software do it', please.

How about, when a couple of dead simple switches will do it, DO NOT stuff an embedded Linux SoC in. A toaster, for example, should not require one single bit of software.

Cars. Omg. Do we have to turn them into rolling smart phones? DO we? I dread the day when my car has to OTA update every 3 weeks. There should be no need for that. Yeah, I can see you might want something to handle engine timing or active suspension. But I do not need or want all these 'apps'.

We seriously need some sanity in product design these days. Don't blame engineers. Blame the product designers.


Cars have been increasingly more computerized during the last two decades. If car firmwares were more open it would be possible to deploy software improvements to reduce emissions and increase performance.

I agree that having to update too frequently would be a hassle, but we could make far better use of existing vehicles if they were more easily upgradeable and updateable.


> If car firmwares were more open it would be possible to deploy software improvements to reduce emissions and increase performance.

Better yet - have the actual "car" layer (steering, acceleration, brakes, safety critical parts like airbags and ABS) entirely self-contained on a separate system that's written in a safety-critical language. Force all the "extras" (AC, seats, radio, windshield wipers, maybe blinkers etc.) to be controllable via a user-supplied module that might as well run on a consumer-grade electronics motherboard.

That way the part of the car that gets you from A to B gets to benefit from being self-contained and as optimised for emissions/performance as possible for the time the car was manufactured. The part of the car that merely provides comfort and convenience is something the users are free to monkey about with because it doesn't compromise any regulatory or safety standards.


Then how will you listen to Spotify while you drive? I'm not about to run a wire to the aux-in on a dumb radio like some kind of savage from 2013.


Having a dumb Bluetooth receiver wired into your stereo system is not very complicated, and is mostly foolproof. Let the end user (moreover, force them to) pick which device of their own they want to use to stream music to the car.


Can we get back to reliable, analog devices that aren't smart and don't spy on you instead?


There is a tiny but interesting community around the concept of low-tech that is about that, and more: objects you understand (since you built them) and can repair yourself.

About the industrial production of such analog objects, I have the intuition (maybe I'm wrong) that we have lost the knowledge of how to build complex mechanical objects at scale. Those objects used to require a huge amount of design work, whereas now all you have to do is make a plastic case enclosing a chip and have some software developers bring it to life.


What we lost in the existing body of knowledge, I'm sure we gained back in the ability to simulate the entire mechanical system virtually. Trial-and-error is much easier when your turn-around time is shorter, and when you don't have to contend with losses of expensive raw materials/components expended on each trial (stripped gears/unmatched springs, etc).

And this time round, the record can be made in a digital format, so we won't lose it for future generations.


Objects such as?


You know, that's a good question.

Maybe sewing machines? Typewriters? Linotype machines?

I admit that it's all a mixed bag. Common use of microcontrollers allows more cleverly designed electric hand drills at lower cost, but they are mechanically more poorly made than in days of yore.

Cars strike me as having the same problem, partly in the ability to design with computers, not just having a computer in the loop. Major components (transmissions, rear ends, etc.) used to be built in a much more beefy way partly because they were just guessing.

Luckily we are all free agents. We can buy old or we can buy new.


Try talking against big tech openly on HN and see what happens to your account.


In my experience talking against big tech openly gets upvoted to the tops of the comment threads


Any examples? I can't really think of anything analog that's more reliable (or even as reliable) as digital...


maybe because you can't remember anything analog at all?

off the top of my head, some superior analog devices:

* kitchen/body scale
* oven
* radio
* light switch
* thermostat

granted, it's mostly not because they are digital per se, but because their UI is now a touchscreen with input lag and without any tactile response.


Yeah, by digital I mean anything using electronics instead of mechanical tech.

And I really can't remember any even though I grew up with them, weird huh.

Digital scales are smaller, lighter, more reliable. Radios and thermostats, too (those bimetal plates analog ones use are pretty bad).

Touch screens... I hate them, but they're reliable, more so than hardware buttons if not stressed beyond specs.

I've never used digital light switches, analog are great.

Ovens, yeah, I have a really old one that just won't die, I've worked on hundreds of them and it's always some electronics fault (high heat and poor cooling will kill a lot of components).

Good examples, thanks.

I will never miss carburetors and non-hydraulic clutches and brakes (shudder) though :D


Digital scales have their quirks, and sometimes very poor UX. For mine, the calibrate button is placed in a way that makes it hard to depress without putting pressure on the top of the scale - which kind of defeats the point. An analog scale will not have such a quirk - there will be a single slider or knob that can be turned without affecting the system.

Mechanical buttons: sure, if you're talking about those junky, flat momentary push-buttons then yeah - they will go out of service after a number of presses. But, a proper mechanical push-button or radio button, such as on old radios and on industrial machinery? Basically indestructible. They could likely last a century, and if they go bad, you can replace them because they are generally possible to service.


I wish, but the economic incentives move towards more spying and less control from the user :(


Pre ECU cars are famous for their reliability, for example.


This. <rolls eyes> Anyone remember carb cleaner or adjustments?


and fantastic fuel economy, not to mention eco friendliness.


Here's hoping that it could at least rid us of the inefficiency that is web apps, seeing as we've seen unprecedented growth in CPU power over the last decades, yet are still very much doing the same stuff, if on other people's computers.


I'm surprised that this doesn't get brought up more often.

Somehow I have to upgrade all my hardware every few years, which of course increases the demand for chips, even though the software I use doesn't have much extra functionality over the software I used ten years ago. Nicer UI, yes, and I would argue that modern software is often more reliable than software ten years ago, but these things shouldn't require faster hardware.

This might be against the ethos here on HN, but I'm still of the opinion that 95% of the webpages would be much better if they were written in plain HTML.


I’m hoping it encourages game developers to spend a bit more engineering time on optimizing their games for a lower minspec.

Or, knowing that users won’t be purchasing new computers at as fast of a clip will encourage software made for a wider audience.


Or you know, maybe we can decide that we don't need every appliance and lightbulb in our house to have network access. Or that we can drop the price on many base-model cars by $1000 by removing the infotainment unit.


"Chip shortage" means a lot more than just microcontrollers. The ripple effects of it have affected all sorts of analog semiconductors and integrated circuits. I only do small quantities of electronics but I've seen the lead times on bog standard parts like relay controllers and op amps shoot up and stocks are starting to dwindle.


TI are in an especially bad spot and they make lots of things from microcontrollers to battery chargers to power converters. Their parts are pretty common but many of them are currently unobtainable from standard distributors.


I’ve been thinking about it. I think it could also lead to an era of developers learning the lesson of constraint and writing effective code once again :) they call me a dreamer...


Or just better compilers.


I would love to see smart household appliances vanish from the market completely. Potentially less e-waste, fewer insecure devices connected to the net. Let's hope this shortage will push towards that.


Many "smart" home appliances are also substantially more efficient... The fridge which does defrosting at night to save energy and be able to have a smaller compressor, the washer which measures the incoming water temperature and clothes weight and adjusts the water level and runtime to reduce energy use, etc. Many of these devices might overall halve energy use compared to a device from a decade ago.

Sure, some of those energy savings are a result of better design and materials rather than clever electronics and software, but often the two must be used together - for example the washing machine that can be lightweight because it uses software to rearrange the washing in the drum for the perfect balance... And a lightweight washer reduces energy consumption to heat it up...


The ones that you've described are simply more efficient. What I was referring to are devices that needlessly have integrations via apps or social media, or can be controlled by connecting them via wifi. These are needlessly more complicated, have more moving parts, and those parts are now scarce.


Smart == IoT for the current-day market. The fridge that defrosts every night does not need an Internet connection, yet today's market will make sure that it does without an actual need.


I'm really hoping this forces some "stagnation" (for lack of a better word) in the chip market. If you look at what happened on consoles like the PS4 where it was years and years before any hint of new hardware coming, developers had to get really intimate with the available capabilities and tease out performance no one thought was there.

Imagine getting software updates that make your old phone faster rather than slower?


Time to ban crypto. It's just another component of the extinction speedrun.


Banning the unbanable. How did that work out for the US War on drugs?

Crypto is here to stay, stop fighting it. Instead try to fix its issues.


Crypto's primary use is as a speculative asset, and for tax evasion/capital control evading/money laundering.

Drugs are a consumable, crypto is an item of exchange. If you cannot exchange them, then they have no value.

About 60%+ of market activity is in the USA: https://www.statista.com/statistics/1195753/bitcoin-trading-...

The USA could ban online exchanges in the same way it has banned online gambling.

Sure, trading could still occur via foreign exchanges, but the pool of people willing to put money into the network would shrink dramatically and the price - since the whole thing is a ponzi scheme - would also fall dramatically.

If the price of Bitcoin or Ethereum was 5% of its current value, the levels of electricity and advanced manufactured goods dedicated to mining them would also fall to the same extent.

Turkey (an economic minnow) banning exchanges, and a few tweets from Elon Musk have been enough to send prices tumbling 10-20%. Can you imagine if the USA banned the exchanges?

It's also likely that allied nations, e.g. European countries, Australia, Korea, Japan, India, would follow a USA ban with their own.


Exchanges are perfectly usable from Turkey.

What is banned is 1) paying for stuff with cryptocurrency and 2) sending money to exchanges via "digital wallets", you have to use bank transactions.

Looks like they are paving the ground to trace and tax crypto investments.

So yeah, this supports your point anyway.


>About 60%+ of market activity is in the USA: https://www.statista.com/statistics/1195753/bitcoin-trading-...

I don't know where that report got its numbers from but they're bullshit; the majority of trading takes place in unregulated exchanges (that explicitly ban US customers) like Binance and Huobi: https://www.statista.com/statistics/864738/leading-cryptocur.... In large part because the US exchanges charge outrageous fees. The US exchanges are popular with large corporates because there's less counterparty risk, but the majority of retail volume is traded outside the US, on exchanges not hosted by US allies.

Plus, China is actively subsidising miners, and is deliberately supporting the crypto ecosystem due to its potential to further destabilise the US dollar's role as global reserve currency.


The main issue is just-in-time delivery with very low to zero stock levels, which has been popular in many industries. Toyota started just-in-time delivery; everyone else copied. Just-in-time delivery is more vulnerable to supply chain disruptions.

Thus there is some value in having a warehouse, i.e. something between just-in-time delivery and keeping lots of stock.


This has been a relatively cheap way for the world to notice this. It could have been much more severe - people can wait an extra year for a new computer or car. Automakers also get to look at this and decide if the added cost of carrying more inventory is worth it for them. They're right to complain to pressure governments and chipmakers, but that just means they lost this bet on just-in-time delivery and cancelling their chip orders. With all the times things worked out for them, they might still come out ahead.


Replacing hardware components by software is not a new trend, it was already a common process when I was hired to work as a real-time/embedded software engineer in the automotive industry more than twenty years ago...


It already has happened.

I have been asked to optimize in order to return VMs to the hardware pool...


Not only hardware. I think one of the things that can come of this are also big software innovations, because one of the things we may see is people recycling old chips and old tech. And let's be honest here: most people don't need the latest and greatest for daily tasks. They need more efficient software and better interfaces for achieving those tasks in less time. Take a look around you: most people who are not power users can survive just fine with a 10-year-old computer. They just need access to better software. The same goes for most business cases.


It would be great to think that this would result in more recycling and use of old components, but I doubt it.

I’m fairly sure there are enough old unused Raspberry Pis in people's drawers that could be reutilised.


The problem with cannibalizing existing parts is the following scenario:

Chip X exists in Product A, which costs $10. Without Chip X, Product B cannot be sold. Chip X is suffering from a shortage. The gross margin on Product B is $1000. As a consequence, the manufacturer of Product B buys a ton of Product A on Alibaba in order to cannibalize it and harvest Chip X. Now the gross margin on Product B is $990 (not so bad), but Product A is nowhere to be found.

In other words, the shortage moves to the cheapest available finished product that has Chip X in it, until marginal profit reaches 0.

The consequence is obvious. Product C, which cannot be manufactured without Product A, eventually becomes unavailable. And so forth.

This is particularly gruesome in the semiconductor sector because supply chains for finished goods are so long.


Good point, but I was referring to recycling discarded/end of life products rather than cannibalising/harvesting from unused hardware.

Obviously this would only be viable in low end devices (toothbrushes rather than graphics cards). But there is already an industry based around desoldering old parts and selling them as 'fake' new ones - if this was done at scale, with the parts tested, and a company could order 20K, there would be a market.


You just described current Macbook charging chip situation. The only place to find Intersil isl9240 proprietary charging chip, the one Apple forbade Intersil/Renesas to sell to anybody but them, is in smart battery case for the iPhone XR (A2121). https://twitter.com/hashtag/isl9240

Every time you need a $3 charger chip (the real price of the ISL9241, the one ending in a one instead of a zero) to fix a $3000 computer, you have to buy a $130 Apple gadget, desolder it, and throw that $130 Apple gadget, including the battery, into the garbage bin.

$3000 laptop turned to paperweight due to ISL9240 unavailability. https://www.youtube.com/watch?v=HJ2jyo7pAmE


You cannot stick used chips in a new product, no?


Why not - as long as the chips are tested and product works and is guaranteed - this would actually be a selling point to many people.


Isn't it like selling a "new" car with used parts?


I think the marketing-friendly term is "recycled". About 25% of the steel and other metals in today's cars will have been used before, and we'll probably get to a point where batteries are heavily recycled in electric cars too.


Entirely different topic, but thinking about how many people got all hyped about the first pi and then realized it was just a very slow PC and not something full of magic is depressing.

There must be a huge number of Pis that ended up in some drawer after a few weeks, and because they're cheap it was never worth it to sell them. And if you wanted one, it was so cheap that it was more convenient to order a new one than to deal with private sellers on Craigslist just to save five bucks.

It would be great if we could repurpose all of them in some halfway meaningful way, but at least for commercial products it's just not feasible.


The article's lack of any assessment about the likely duration of the chip shortage seems naïve if they're trying to make a case that major change of the kind they propose will happen. The closing point about it being a tough time for chip companies also seems odd given the demand for their products and the predicted growth in revenue.


"That’s because electric cars require a lot more computer and chip-based components than those powered by gas"

EVs have fewer fundamental systems than an ICE vehicle. I can see that EV designs, being the most modern and entailing a lot of clean-sheet thinking, would result in more computers, but fundamentally I would think it takes fewer computers to run an EV.


I can see that EVs may need fewer sensors (or fewer types of sensors, anyway) because of not having to deal with hot gases and flammable liquids. But fewer computing elements, not so much.

A battery pack may be made up of 20 or more modules that each have their own charge management logic, and there are battery heating and cooling systems that need controlling.

Most systems are the same in ICEVs and EVs - cabin control, entertainment, steering, transmission, throttle and brakes, suspension, lights, safety systems, and so on. To me, the difference doesn't seem very large.


Almost makes you wonder how many old, but working desktop CPUs could be underclocked and repurposed.


Yes, except for the decade of soldered CPUs and RAM behind us.


In desktop computers?


“desktops”: it’s somewhat common with mini-ITX and Intel NUCs


That's news to me. Since when do mini-ITX mainboards have soldered in CPUs or RAM?


The first I've seen was boards for AMD APUs (2011+).

https://www.manualslib.com/manual/628934/Sapphire-Audio-Pure... (soldered CPU in the low res picture)

But it's not specific to AMD mbs, many Intel Atom have a soldered CPU.

https://www.mini-itx.com/2017/10/09/intel-atom-c3958-gets-be... (Atom C3958 [..] is a fully fledged server-grade (!!)SoC)

RAM is usually upgradable. For processors it depends on what you buy.


erm nope?

modular desktop computers were and are a niche.


Most innovation which happens due to a chip shortage is going to be along the wrong path once the shortage is alleviated. Historians will look back on the bizarre, malnourished designs of the "chip shortage era".


Scarcity breeds ingenuity. Opulence breeds complacency.


Isn't it processors that are the main focus of the chip shortage? Surely this means moving away from software, not towards it?


The biggest innovations will come when small companies (somehow) get access to IC fabs.


... or higher prices for devices making use of said chips.


Any updates on the optical CPU front?


Photonics require even more precision engineering. Binning strategies would not work.



