How will 'chipageddon' affect you? (bbc.com)
164 points by throwaway4good on Feb 5, 2021 | 226 comments



All these articles seem to miss the point about why the car industry is facing shortages, and conflate it with the shortages of high-end silicon devices (which have very different causes).

High-end devices have seen a massive bottleneck at TSMC and in some of the required materials like ABF, combined with very high demand (new console cycle, people buying new computers for work/study from home).

The automotive sector suffers from another problem. For decades all manufacturers have converted to JIT (lean manufacturing), which means that most of them counted on almost daily deliveries of parts to keep operating; typically a factory might keep only enough stock for a day of operations. With the transport restrictions and difficulties in many countries, this has become unmanageable, and factory operators have made the obvious call: increase stock margins.

A manager sees that they will have problems getting their deliveries in time, so they increase their orders to cover a week. This depletes the local distributor, which also tries to avoid keeping large stocks. Now a lot of managers at other companies see that part vanish and, fearing that they'll be out of parts, they increase their stock keeping too. With boards having from 100 to 1000 unique references, the effects propagate from one manufacturer to another, while everyone runs to get enough stock to avoid having to close.

There are no fewer parts than one year ago, but the massive "de-risking" means that there's now too much demand to keep everyone happy.
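
The amplification described above (the bullwhip effect, as a reply below notes) is easy to see in a deliberately crude toy model. A minimal Python sketch, with all numbers invented: end demand stays flat, but every tier raises its stock target from 1 day to 7 days of cover at once, and each tier sizes its orders on the (inflated) orders it receives from downstream rather than on true end demand:

    # Toy bullwhip model; all numbers are invented for illustration.
    end_demand = 100          # units/day, unchanged throughout
    new_cover_days = 7        # everyone moves from 1 day to 7 days of stock

    observed = end_demand
    for tier in ["carmaker", "tier-1", "distributor", "chip maker"]:
        placed = observed * new_cover_days   # one week's restocking orders
        print(f"{tier} orders {placed} against true demand of {end_demand}")
        observed = placed    # the next tier up sees orders, not end demand

Each tier up the chain sees a bigger spike, even though not a single extra car is being built.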


Are you really sure this is how it works even for automotive electronics? I am involved in the automotive industry, more in the technical aspects than the commercial ones, but what I know is:

- car makers don't make their electronics. I don't know of exceptions. What they normally do is design the architecture and the requirements, and buy from Tier 1 suppliers: Bosch, Continental, Vitesco, Valeo, Delphi, Magneti Marelli etc.

- the contracts, when awarded, are multi-annual and usually specify the units per year during lifetime production and in serial life (after the product is no longer produced, the supplier MUST GUARANTEE the ability to provide the part)

- these contracts usually go in numbers like N million parts (ECUs) over a 4-5 year total (SoP date to EoP date) + serial life for X years. Then they get detailed; even the plants where the parts are made are negotiated. Slips on both sides come with penalties. I don't know all the details.

- lots of electronics are safety relevant (ISO 26262). Especially in this case, once the HW design is frozen, it is frozen. You cannot change silicon components easily, as this would make the product undergo a long series of product and design validations which take many months => the Tier 1 supplier MUST SECURE its own supply from chip makers.

- the usual "suspects" supplying the Tier 1s are chip makers like Infineon, Renesas, NXP, STM. At some point, maybe TSMC comes into the picture. There are many providers of small electronic parts.


Yes, I’ve worked with a lot of these different tiers. The thing here is that these subcontractors are squeezed so hard on margins that lean manufacturing is the only way to go.

Most automotive electronics are built around COTS (components off the shelf), which means the fabs are a long way away. For instance, the manufacturers you've mentioned work with TSMC for some things, but those products are certainly not the ones for automotive. Most automotive still works on 45nm. NXP announced about 18 months ago that they had a design for an M7 processor on 28nm (i.MX RT1170, M7 at 1GHz), and for most automotive vendors this would be considered too expensive (NXP markets it for the edge computing sector).

This is a far cry from the 7nm EUV processes that are slowing down the Ryzen 5000 rollout.

This past month I've had to redesign a board myself, because we were using a component from Texas Instruments targeted at the automotive sector (a DC/DC converter). Not only have all stocks vanished, some "big client" (around my area this means automotive) has already bought the whole stock (3 consecutive shipments) until Q3. And that's from Mouser, a distributor that for a lot of these companies is a last resort (like Digi-Key, Mouser provides parts and fast delivery, which of course means that you pay a premium).


> the effects propagate from one manufacturer to another, while everyone runs to get enough stock to avoid having to close.

Sounds a lot like the bullwhip effect, https://en.wikipedia.org/wiki/Bullwhip_effect


In retrospect they should have called it the toilet paper effect.


Yes. I thought everyone learned a thing or two during the pandemic, with toilet paper and flattening the curve.

Nope. No one cared. They just decided to find an excuse or an enemy to blame and point fingers at.

No one questions why Apple didn't have the same problem, considering they are selling 200M+ iPhones (likely 220M+ this year), 20M Macs, and 50M iPads, together 300M+ units of their own silicon.

No one questions why Apple doesn't have the same problem with "their" suppliers, i.e. Broadcom with their WiFi, Qualcomm with their modem, Skyworks, the USB4 controller, etc.

Everyone thought TSMC was a 7-Eleven, or Circle K, or whatever corner shop you have in your region, and that they would somehow have an unlimited amount of shelf space for "you" and for everyone else.

Supply chain and logistics management isn't hard. All it needs is some logic and common sense. But over the past 20 years all I have seen is that common sense is rather uncommon.


>Supply chain and logistics management isn't hard. All it needs is some logic and common sense.

I find your comment very ironic, since I remember when Apple was notorious for logistical problems that led to shortages of new products. They are very strong in that respect now, and that's how they got to be a >$1T company.


>problems that led to shortages of new products.

That was because demand always exceeded supply, and Steve Jobs refused to give up on the JIT management system. After the iPhone 4 they finally gave up. Some things just don't scale.

That is not to say Apple doesn't use JIT anymore. They still do, with supply guarantees.


This is exactly what's happening currently to an extreme. If you talk to people in the semiconductor supply chain you'll find that this exact effect regularly happens, but it is currently compounded by the lack of production from the beginning of 2020.


Interesting and clear explanation, thanks! Do you also see this as a possible driver for general inflation? Or is this JIT production mostly restricted to the automotive sector?

On a tangent: is there any good place to learn about the current state of supply chains? I search YouTube occasionally, looking for talks/interviews with people who have in-depth insight into what is happening right now (/during the last year), but I could barely find anything. Shouldn't this be one of the hottest topics right now?!


> Or is this JIT production mostly restricted to the automotive sector?

JIT is a major trend everywhere logistics are involved, but it's most pronounced at car manufacturers, simply because the sheer physical size of many components required for a car means that car manufacturers would need big warehouses, which they tore down or sold many decades ago.

Instead, society has picked up the cost of storage, by building ever bigger highways for all the trucks.


Not sure about JIT requiring bigger highways. The components all need to be shipped sooner (and warehoused) or later (JIT). The same amount of matter is transported either way.


Holy sh*t. It is so obvious. Yet I have been deceived by the media that the evil autocorps jam our roads to save a few bucks. It is very tiring that at this point we have to be sceptical about every statement that is dealt to us and peel it apart from first principles. I thought that would be the job of proper journalists...


It's not that easy. In ye olde times, car manufacturers had railway links for shipping stuff around... they still have these, but trucks are (way) cheaper than rail.


But the trucks must not be delayed, so they end up stopping or driving near the manufacturer.


> Or is this JIT production mostly restricted to the automotive sector?

It has more to do with the human side of things. No one wants to take the blame. Or, as the old saying goes, no one gets fired for buying IBM. The concept is a lot easier to understand once you have worked in a large enterprise and witnessed it firsthand.

If you hold too much inventory as backup, someone will point the finger at you: why are we holding so much inventory? The world is at peace. We don't need that much.

Your CFO doesn't like your department? Great! You have now given them enough ammunition to fire the bullet in your direction.

Your inventory might also depreciate in price. How do you know your component won't drop in price in three months' time? The same applies to rising prices, but politics is more about protecting your downside, not your upside.

You want a contract that keeps supply stable for at least a year, based on an accurate forecast of your sales. Which department is in charge of that forecast? Let me tell you one little secret: there is no such thing as a "forecast", only educated guesses.

You want stable, guaranteed supply? Then your manufacturer will also have to safeguard their interests. You can't just forecast your buying volume at 10M/year and then buy only 6M, leaving 4M of capacity that was allocated for you sitting empty. If you don't fulfil those requirements, you have to pay a penalty. And if you follow news around Apple, you might have heard of Apple paying Samsung this penalty because their OLED panel orders did not meet the contract's minimum target.

In most circumstances, unless your vendor absolutely hates you, the penalty will be paid back to you, deducted from your next order over a large unit volume, i.e. you pay less for the same component next year. That way your manufacturer keeps you as a customer for a longer period of time.

None of this is specific to silicon or tech. Substitute the above with Walmart, Cargill or any player from any industry. Supply chains and logistics are pretty much the same everywhere.
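
To make the penalty-and-rebate mechanics above concrete, here is a minimal Python sketch; the volumes, unit price, and penalty rate are made-up illustration values, not anything from a real contract:

    def take_or_pay_penalty(committed, purchased, unit_price, penalty_rate=0.5):
        """Penalty owed on capacity reserved but not used (illustrative only)."""
        shortfall = max(committed - purchased, 0)
        return shortfall * unit_price * penalty_rate

    # Forecast 10M units/year, actually buy only 6M at $2 each:
    penalty = take_or_pay_penalty(10_000_000, 6_000_000, 2.00)
    print(penalty)  # 4,000,000: owed for the 4M units of idle capacity

    # A vendor that wants to keep you may return this as a rebate,
    # effectively discounting next year's order.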


I broadly agree, especially with your description of internal politics.

But I also think it's important to admit that JIT has real benefits. It's not "just" a finance or paper-magic exercise. JIT is an approach to answering the question of "how much inventory do I hold at each stage of distribution", where inventory held really IS money that you can't spend elsewhere. As in, it's money you can't spend on salary, or R&D, or line expansion, or anything.

The risk management part is where shit goes astray.


While big company politics play a role, IMO the main driver is that all of these companies have shifted their activities.

It used to be the case that these companies would build stuff (this doesn't only apply to cars) and charge for that stuff, and thus had an incentive to keep their IP well guarded. This era is long gone.

The manufacturing business has become a race to the bottom because this IP is no longer considered profitable. There's a mantra in MBA parlance: "externalize cost centers, internalize profit centers". For the manufacturing sector this meant keeping the sales business while externalizing engineering. This has gone to the point that even the subcontractors have undergone that process (that's why there's a "tiered" model). Nowadays engineering is most of the time 4 subcontractors down. This also means that most manufacturers are buying from the same tech providers, and are sharing most of the IP (and its costs).

The only reason this makes sense is that car companies don't really sell cars anymore; their real business is lending, because it turns out that manufacturing big things is an amazingly good way of generating free cash flow. Look at the earnings report of any carmaker (or GE for that matter): more than half of their earnings come from finance.


JIT is used everywhere, as it improves financial indicators because companies aren't carrying inventory on their balance sheets. It's also a big reason why companies in an industry cluster together, as proximity to the supply chain de-risks JIT.

The shift to modern inventory management coincided with offshoring. It became easier to source parts in Northern Mexico and now China as a result.


I thought it was also because they cancelled their reservations around March, when the outlook was quite pessimistic, and that resulted in their allotted capacity being picked up by others?


The car industry canceled a lot of orders during the first virus wave in Q1 2020, and when they wanted to increase their orders in Q3 and Q4 their former capacities were already sold. The German carmakers have already tried to gain more capacity through government lobbying; Taiwan answered by linking chip capacity to COVID vaccines, so it's doubtful that helps (https://www.taiwannews.com.tw/en/news/4114962). Volkswagen and BMW have already wound down some production capacity and use their chip stock primarily for the high-margin models.


Graphics card prices really are something. In late 2019 I purchased a used Nvidia GTX 1060 for $120. Just yesterday I sold that same card, with an additional year of wear on it, for $220. This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.

I'd also been keeping track of used i7-4790k CPU prices, looking to upgrade from my i5-4670k. They started out at $120-$130 earlier in 2020 and are up to $160-$170 now. For an eight-year-old CPU!

It used to be the case that computer components aged like milk, price-wise, but with the death of Moore's law in the 2010s that is less and less the case. I remember RAM prices spiking a couple of years back too.


Well, the 4790k is special in that regard. It is the best processor you can buy for the LGA1150 socket. I had my eye on it too; my plan was to get the biggest performance boost for the smallest capital expenditure. The 4790k would have been a simple swap-in, giving an at least 33% faster processor in my case.

But instead of plunking down 140 for one, I got a used workstation with a Xeon E3-1271 (the second-fastest Xeon without a GPU) for 180. It included other parts, which made the upgrade free (if you don't count the work I did) after selling my old CPU+mobo+GPU. The GPU (RX570) I had was at the same price level on eBay as when I bought it.

If you badly want to upgrade (from a 4670k the delta is way smaller than from the CPU I had), I'd suggest looking for an E3-1281 or 1271 if you have a dGPU; if you need an iGPU, replace the last digit with a 6.

https://en.wikipedia.org/wiki/List_of_Intel_Haswell-based_Xe...

(edit: clarify, add last paragraph and links)


> Well, the 4790k is special in that regard. It is the best processor you can buy for the LGA1150 socket

Little-known fact, and even less available, but: the best processor for the LGA1150 socket is not the i7-4790K but the i7-5775C. Faster IPC, significantly bigger cache and a stronger iGPU, and despite not being called a K CPU it's overclockable. https://www.pc-kombo.com/us/benchmark/games/cpu/compare?ids%... shows a comparison with many gaming benchmarks.

The Xeon route can be cheaper though, it's a good alternative.


Well, it has the faster iGPU, that's for sure. But if you use it for work it is usually slower:

https://openbenchmarking.org/vs/Processor/Intel%20Core%20i7-...

You can't just swap it in either; you need a mainboard with a 9-series chipset, which you usually don't have if you are running a Haswell.


> You can't just swap it in either; you need a mainboard with a 9-series chipset, which you usually don't have if you are running a Haswell.

There are many Z97 boards out there currently running a Haswell processor, and all of them support this Broadwell alternative.

> Well, it has the faster iGPU, that's for sure. But if you use it for work it is usually slower:

The pc-kombo collected benchmarks do not use the integrated graphics; that's all CPU performance, usually with a high-end GPU.

The 5775C not only has better IPC, it has 128 MB of eDRAM "L4" cache. That's just cool and helps some applications a lot. That it is slower in some benchmarks is just because of the lower turbo clock. Overclock it and it will always be faster than the 4790K; it likely won't reach the turbo clock of the 4790K, but it just needs to come close, and together with the higher IPC that should be enough. Then you get the performance advantages that the gaming benchmarks show, and match the performance of the 4790K in the multithreaded application benchmarks.

It's just seldom available, but sometimes when searching for used processors like the 4790K the 5775C turns up and can be had for less.


Sure, there are many available, but it adds cost. If you don't want the hassle of swapping mainboards, the 4790k is a drop-in replacement that gives you 4-4.4GHz out of the box.


If you have a Z87 board. If you have a Z97 board, it already is a drop-in replacement. Z97 sold like crazy; it's a common starting point for upgrades right now. Just to make sure that comes across :)


I bought 12 2070s for our render farm, and the account manager thought I was crazy. The 3000 series had already launched and would be back in stock next month, he said. Who's laughing now; I don't think they ever got enough stock of any card to serve our needs. A week later they didn't have any Nvidia cards in stock, none at all. Luckily I had already bought all their 2070s, or we would've run into delays.


One month of 12 2070s today produces a hell of a lot more rendered frames than waiting until next month to maybe get our hands on something. Let me just run and ask the client if we can push their deadline back while I'm at it.


Moore's Law isn't dead. Dennard scaling died, Intel's process hit a wall, and CPU architectural improvements stalled while Intel was stalling, but transistor density improvements have stayed steady year after year since 1970.
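
As a rough sanity check on that claim (a back-of-the-envelope sketch, not a fit to real data):

    # Doubling transistor count every ~2 years since the Intel 4004 (1971)
    intel_4004 = 2_300                    # transistors in 1971
    doublings = (2021 - 1971) / 2
    projected = intel_4004 * 2 ** doublings
    print(f"{projected:.1e}")             # ~7.7e10

Tens of billions of transistors is indeed where the biggest 2021 chips sit, so the density trend line holds to within a small factor.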


Moore's law is about transistors per dollar. If the cost of chips goes up, Moore's law is not holding.


I hate that you're getting downvoted for this. The original paper[1] isn't even 4 pages long, yet most haven't bothered to read it. If they did, they'd see he was speaking about fabrication costs. In particular, it's this passage that got distorted by the media:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000."

Somehow what became known as "Moore's Law" excluded any reference to costs. For a similar distortion, see the "Turing Test".

[1]: https://newsroom.intel.com/wp-content/uploads/sites/11/2018/...


That's not really true. Moore's 1965 paper was the first reference to periodic doubling in ICs, but nobody called it "Moore's Law" then. Moore put out a few other papers; then, at the conference where Dennard gave his famous presentation on scaling laws, someone coined the phrase "Moore's Law" in an interview. In that interview, and in usage since, it has always been ambiguous exactly which doubling it referred to, because until the breakdown of Dennard scaling all good things doubled together.


Moore's Law is about transistor count per unit area doubling every two years. Nothing to do with money.


No, the parent is right. Moore's law is about the transistor density of the least cost process. This is a quote from Moore's paper [1]:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year [...].".

When the cost part became tough, the "cost side" was progressively dropped by being more and more creative about what cost is considered: first ignoring the NRE part to look only at the marginal cost per transistor; then even this got problematic and the cost aspect was just ignored.

The "true" Moore low creates a virtuous cycle: everybody move to the next node as it's better and cheaper, so there's no point to stay back. Also, because cost gets down the addressable market gets bigger, sustaining more costly fabs. This virtuous cycle operated for a long while, so much that many takes it for granted now.

But if Moore's law dies by no longer lowering cost (the other way is by not increasing density, and density is still progressing, if slower than before), this virtuous cycle is broken. You can get higher density, but at a higher cost. With cost increasing, only a subset of the market will move on to higher density. Little by little, the market supporting the costlier high end will get smaller, while fab cost is still increasing, contributing to slower progress too. The virtuous cycle becomes vicious. This is happening now, although it's not too bad yet, with enough very large actors able to sustain higher prices (thanks Apple; I don't use your products but do benefit from all the money you're pouring into TSMC ;).

And in the end, if this turns into a serious slowdown, there is the specter of commoditization. It won't happen soon, only after a possibly very long transition period, as the tech is really hard. But if progress slows a lot, the laggards may eventually catch up. With more competition would come lower margins, which is definitely bad for stock valuations. We're still far away from this, but it's still a possibility. Look how the process leader has always been the biggest cheerleader for "Moore's law is still alive and well" (and let's not talk cost between friends, shall we?). See how it was Intel when they started selling their 10nm future, and look how much more discreet they are now, and how TSMC has replaced them as the "everything is fine" voice.

[1] https://hasler.ece.gatech.edu/Published_papers/Technology_ov...


No, Moore's law is transistor density (technically the ‘Number of Components Per Integrated Function’) at the minimum cost per component. Since cost per transistor is still falling on newer nodes, it holds.

Cost per transistor is still a relevant property, but it's harder to analyze since cost numbers aren't public, verifiable figures, plus the estimates I've seen were heavily criticized by people with better information, but from what I can tell it would just offset the rate of growth, not negate it.


It is dead. Best for everyone to move on to the third stage of the seven.


>Now just yesterday I sold that same card with an additional year of wear on it for $220. This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.

A 6GB version of it would generate about $3/day on Ethereum as of today, so it would be sweet even by cryptocoin standards: an ROI of ~70 days. The rise of Ethereum prices makes even the 2-3x GPU price growth of the last couple of months still a good deal. If one wonders where all the GPUs are, I think this provides a pretty good idea: https://cdn.videocardz.com/1/2021/01/GeForce-RTX-3080-Mining-
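
A quick back-of-the-envelope check on that ROI, with power draw and electricity price as rough placeholder assumptions (rates vary a lot by region):

    card_price  = 220.0      # USD, used GTX 1060 6GB (price above)
    revenue_day = 3.0        # USD/day on Ethereum (claim above)
    power_kw    = 0.12       # ~120 W while mining (rough assumption)
    rate_kwh    = 0.25       # USD/kWh, e.g. expensive residential power
    cost_day    = power_kw * 24 * rate_kwh        # ~0.72 USD/day
    print(card_price / revenue_day)               # ~73 days, ignoring power
    print(card_price / (revenue_day - cost_day))  # ~96 days net of power

So even after paying for fairly expensive electricity, the card pays for itself in roughly three months at those rates.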


I’ve been considering using my 3060 Ti for this due to it. But electricity here in Aus is kind of painfully expensive, so I dunno if it’s worth it.

That and it’s already hot enough in my office haha


Is $90 a month even worth the electricity and nuisance cost?


A few years ago I made a tool that removes the nuisances, except heat: https://losttech.software/Downloads/Mine/

It stops mining when you use the PC.
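
For anyone curious how this kind of tool can work (this is not the linked tool's actual code, just a minimal Windows-only sketch of the idea; the miner command and the 5-minute threshold are placeholder assumptions):

    import ctypes, subprocess, time

    class LASTINPUTINFO(ctypes.Structure):
        _fields_ = [("cbSize", ctypes.c_uint), ("dwTime", ctypes.c_uint)]

    def idle_seconds():
        # Time since the last keyboard/mouse input, via the Win32 API
        info = LASTINPUTINFO()
        info.cbSize = ctypes.sizeof(info)
        ctypes.windll.user32.GetLastInputInfo(ctypes.byref(info))
        return (ctypes.windll.kernel32.GetTickCount() - info.dwTime) / 1000.0

    miner = None
    while True:
        if idle_seconds() > 300 and miner is None:
            miner = subprocess.Popen(["ethminer"])  # placeholder command
        elif idle_seconds() <= 300 and miner is not None:
            miner.terminate()   # stop mining as soon as the user returns
            miner = None
        time.sleep(10)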


that depends on your personal situation obviously - for some running say a hundred of these cards makes for a nice pocket change :)


I sold a 10 year old Asus Rampage III Black X58 motherboard last month on eBay for north of $650, which is more than what I paid for it new (even after accounting for inflation.)

Granted, this was a limited edition board, but it is still 10 year old tech!


CPUs are immortal, it's the motherboards that die. If you want to run a certain chipset, after some amount of time it's motherboard availability that will keep you from doing that!


CPUs are not immortal. One source of failure is electromigration: https://en.m.wikipedia.org/wiki/Electromigration

With RAM it’s a more common problem. There’s even software to check for it (memtest86)


I read that the reason Intel switched to LGA pins from standard sockets is that the chip is more expensive and less likely to die than the motherboard, so it makes more sense to put the fragile part of the interface on the motherboard.


> CPUs are immortal, it's the motherboards that die

Can you elaborate?


A motherboard has many more separate components, with lower-quality testing than CPUs (which go through serious QA and binning). For example, capacitors on motherboards break/explode, and PCB traces degrade/crack more often than a CPU will break.


CPUs have a lower failure rate than motherboards, which makes sense if you think about it. (One is an IC, one is a complex assembly of many ICs and passive components.)


Anecdotally, I'm still running a computer I mostly bought in 2011, i5-2500K still going strong, but the motherboard has a few issues: one of the PCI-E slots works sometimes, one of the RAM slots is completely broken, and the RTC seems to lose a few minutes a week.

CPUs should go pretty much forever, provided they're run on clean power. They're a little square, entirely encased. Motherboards are bigger and with physical connectors you need to use force on.


I'm also on an i5-2500K from 2012, I only upgraded the GPU so far. I recently ordered a new pc for roughly the same price as the one i built in 2012, curious to see how much of a difference I will notice day-to-day.


Back in the day the story was that a knockoff manufacturer of capacitors used an incomplete recipe—the caps started out fine but didn’t last. Wikipedia says the story is unverified but still includes it under the “Investigation” section:

https://en.wikipedia.org/wiki/Capacitor_plague


Anecdotally, I built 4 computer labs at a college with Dell OptiPlex desktops, and a year later 30 of the 104 had had their motherboards replaced due to blown caps. And at the time, the nearest certified tech was 70 miles away, over a mountain pass.


So literally uphill both ways.


The plague is real... I was repairing motherboards by replacing VRM capacitors around 2008.


No moving parts, small and stable voltages, relatively stable temperatures, no capacitive sections to experience electromechanical strain. There's essentially nothing that can "wear out" in a CPU. Basically the only things that can degrade a CPU are running it on a noisy voltage source (unlikely behind a power supply), or regularly turning it on, heating it up a lot, and then letting it cool. But if you keep a CPU well cooled, or just run it constantly, it will last essentially forever.


This isn't true. Aging is now more of a concern at 16/7nm, to the point that there's extra margin specifically for it. But all that margin does is delay the impact beyond the realistic lifetime of the chip (normally approx. 10 years for consumer stuff). Sometimes you can't afford the extra timing margin and have to use extra anti-aging circuitry instead.

https://semiengineering.com/chip-aging-becomes-design-proble...


> There's essentially nothing that can "wear out" in a CPU

There's electromigration which is harder and harder to mitigate with smaller process nodes


I assume RAM is the same in this regard. If it passes memtest86, it's golden.


CPU pricing is weird. If you look at the 6th-10th gen Core i7s, they are not too far apart in price [1]; for example, the 6700K is currently $272 and for ~$30 more you can get the 9700K. The current generation usually commands a bit of a premium.

[1] https://www.newegg.com/p/pl?N=100007671%20601351803%20601321...


The 6700K has the same "most powerful CPU for its socket" situation going on. When I was looking to upgrade my 6600K, it was absurd how little extra it was to get a brand new Ryzen and motherboard compared to the 6700K.


I built a computer with an AMD 5600XT over a year ago and was having a hell of a time getting it not to crash, so I bought an Nvidia GeForce RTX 2060 to maybe replace it with (which apparently scared the AMD card into working properly). I feel like a fool right now for not just sticking it in my garage for a year.


Ha! I do believe they cleaned up the drivers roughly around that time. I got my 5600XT in April 2020 and have not had any issues with it. Of course there's luck of the draw, too, but I read a lot of early adopters of the 5600/5700 cards had some issues with drivers.

Looking on eBay, it looks like I could sell that $290 card for about $500...


I have been looking to upgrade my i7-4770k rig to a Ryzen 5900X, but after seeing this insanity, and paying attention to what the manufacturers are being vague about, it looks like I'll be waiting until June (or later). By then there may be an entirely new generation of chips ready.


CPUs aren't too bad. I've got a 5950X I bought at MSRP, and I just found a site with a 5800X in stock at MSRP. The 5900X on the same site is backordered; I didn't see how long the wait is, but I doubt it's months, considering the 5800X was in stock.

Intel is releasing Rocket Lake shortly, but it won't have any options with more cores than the 5800X. Zen 4 isn't launching until late fall. I'm not sure availability will be any better on either of these launches anyway.

GPUs are a bit more ridiculous. You can get them if you don't need one to show up tomorrow, but getting one at a reasonable price any time soon is much more of a challenge than doing the same for a CPU. That being said, I did manage to get a reasonably priced 3070 in a build early on, but I put that down to luck more than anything.

Motherboards I haven't had any issues with.


There's okay stock for the 5600X and 5800X, but the 59xx series is nowhere to be found, and probably won't be until later this year if you read between the lines of what the industry is saying.

I broke down waiting for a 5950X and bought one off a scalper last week :/ Luckily I got it before the recent spike in price.


I want a 5900X, but I will end up buying another Intel CPU if AMD can't get their act together and provide stock that I can buy without jumping through hoops or encouraging scalping.


That is the interesting thing, I think, covering AMD on the GPU side and Intel on the CPU side: there is so much demand versus supply that lower performance-per-price products are viable, because MSRP has gone completely out the window.


I refuse to pay/encourage scalpers (it doesn't bother me that others do), so I will wait, unless my system simply explodes.


Yeah, I went to ebay to buy a used AMD card to replace a very old Nvidia card, and I was shocked at how expensive 5 to 7 year old cards still are.


> components aged like milk

Sour milk is not worth very much but it then becomes yogurt and cheese which are much more expensive. So I guess your analogy still holds? ;)


Huh, wow. I think I have a 1060 in my closet. I bought 2 for $100 each in 2019. I was planning to use one for GPU VM passthrough and could never get it to work properly.


You might be sitting on a good chunk of change with those! Especially if they're 6GB models.


GPU prices are really insane. 13 years ago I bought a brand-new Radeon HD3850, which at the time was one of the best GPUs available; the budget version of a line where the HD3870 (and later the HD3870 X2) was the high end.

I paid €128 for it.

That's not going to get you anything today. 3 years ago I paid €350 for a GTX 1050 Ti. Now everything new starts at €500, and everything older is a waste of money, says everybody.


Mindfactory is one of the biggest hardware sellers in Germany. They always have a budget graphics card for around 50 EUR on their front page, but recently they have replaced it every week or so with an older and worse model at the same price. Right now they're featuring the 7-year-old GT 710! Two years ago I'd have thought this could only happen in a total economic collapse.


The GTX 1080 is the most bought and resold GPU around.

It's still not dying, and it was made in enormous quantities because Nvidia made a big bet on the crypto craze.


According to steam hardware surveys, the 1060 has been king for years


No need to aim for the top. A Xeon 1231v3 will run you ~$70 and be 2x faster than your current i5-4670k.


My wife uses a 1060 and runs just about everything at high/60fps. It’s still a great card


I have a 4790k; it's still a pretty good workhorse, but it's finally getting to the point where it'd be worth going up to an older Ryzen instead. The pricing is similar and you get a memory boost etc. Though you do need a new mobo/RAM, but still.


Yeah, I was weighing just upgrading the whole motherboard to a Ryzen chipset or something, but I have an ITX build, so it would be pretty expensive for the features that I want.


I have a 4770k and, because of the shortage, rock a 3060 Ti I got out of sheer luck. My friend might have snagged a 5950X, but until it ships he's not taking any chances ordering other parts. He's been trying to get one since launch.


That is really interesting. I've got a GTX 1060 3GB that I purchased for $280; it looks like I could reasonably sell it for $200 CAD on eBay. $80 for 3 years of GPU usage wouldn't be a bad return.


But what are you going to do in the meantime? It's like people happy that their houses are increasing in value, but so do all the other houses. You've still got to live somewhere.


Interest rates have gone down though. Presumably you sell your house and break the current 3.5% mortgage. Then buy a bigger / more expensive house and lock in at 1.5%. Overall your monthly payments remain similar.
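
A sketch of that arithmetic with the standard fixed-rate amortization formula (the principal amounts are made-up; 30-year terms assumed):

    def monthly_payment(principal, annual_rate, years=30):
        r = annual_rate / 12               # monthly interest rate
        n = years * 12                     # number of payments
        return principal * r / (1 - (1 + r) ** -n)

    print(monthly_payment(300_000, 0.035))  # ~1347/month at 3.5%
    print(monthly_payment(390_000, 0.015))  # ~1346/month at 1.5%

A 30% more expensive house at 1.5% costs about the same per month as the old one at 3.5%.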


> Overall your monthly payments remain similar.

This is not a good way to measure things. Did you just exchange 10 years of monthly payments for 30 years of "similar" monthly payments?

My mother protested to me over the cost of her new house, defending the idea that she could afford it by pointing out that monthly payments were similar -- property taxes on the new house were about the same size as mortgage payments on the old one. I was kind of horrified by this; conceptually, when you make a mortgage payment, your net worth goes up. When you make a tax payment, it goes down.


This is how we get a housing bubble. The expectation of never ending credit that gets cheaper and cheaper. One day interest rates will go up and your house will not be as valuable as your mortgage says it is.


In the meantime I'll continue to use my RTX 3070. The GTX 1060 is sitting in a box.


Transistors have gotten smaller and transistor counts have continued to rise over the last 10 years. You might be thinking of frequency scaling, which is not the same as Moore's law.


> This is a four-year-old card design that can barely run modern games at 1080p/60fps/medium settings.

Nonsense. It runs all games at High/60fps or better, with the possible exception of Cyberpunk and RTX marketing gimmicks. That's why it's still a relevant card 4.5 years later and goes for $200+. I own one.

> I'd also been keeping track of used i7-4790k CPU prices, looking to upgrade from my i5-4670k. They started out at $120-$130 earlier in 2020 and are up to $160-$170 now. For an eight-year-old CPU!

Funny, since I own one as well (a 4670k). And I haven't ever felt CPU-limited. Mine is mildly overclocked to 4GHz, and it's been able to crunch everything I've thrown at it since I bought it in 2014, again with the possible exception of HEVC encoding. It's still a very relevant CPU and handily beats first-gen Ryzen, for example.

It's also the last generation of Intel CPUs that lets you run a non-spyware version of Windows.


Ray tracing is a marketing gimmick? Just because high end GPUs aren't able to do it well doesn't mean it's a gimmick.


As far as it relates to gaming, it's absolutely a gimmick.

I'm sorry you fell for the marketing.


You pretty much need ray tracing for reflections of objects that are not directly visible; hacks like SSR cannot give you that. Even if you don't care about graphics, RT can absolutely have an impact on gameplay too. And if you do care about graphics, then RT makes realistic lighting much easier and with far fewer restrictions. Modern games have to fake so many effects just to make things look good, effects which come for "free" with RT.

Now, for the current iteration of RT graphics cards, it is still very limited, so calling it a gimmick is a bit more warranted. But the hardware will improve (and already has since the first RT generation), which will allow more games and engines to be built around RT instead of having it tacked on as an option.


Ray tracing is a significantly better graphical effect.

Ray traced reflections are realistic reflections and look outstanding in Control.

Because they're based on the real level geometry.

Calling it a gimmick just means that you personally don't care about graphics.

I've got a £450 PS5. I'm playing Control right now and I choose the 30fps ray tracing mode because it looks much better.

That's probably sacrilegious to you.


> It's also the last generation of Intel CPUs that lets you run a non-spyware version of Windows.

Can you elaborate on this please?


I think they mean it is the last Intel CPU generation supported by Windows 7.


It's stressing me out. I'm an EE at a startup, and I'm scrambling to pre-purchase whatever's left on the market to get us through the next 6-12 months. Thank goodness I already laid in a huge stock of processors; the parts I'm scrounging for right now are for battery management, signal isolation, and switching voltage regulator control.


I was doing electronics design and prototyping a couple of years ago when MLCC capacitors were getting impossible to find in common values. I feel your pain.


I remember that; there was a shortage of caps and shortages of low-end CPUs across manufacturers.

Placing planned purchase orders for your current products helps a lot to weather supply problems.


What are you making, if not a secret?


The shortage is amplified by the US-China tech war, which forced key supplier ASML to limit deliveries of chip-making equipment.

Also, TSMC cancelled EUV equipment orders due to the Huawei ban:

https://semiwiki.com/forum/index.php?threads/asml-reports-20...


Current shortages impact processes that aren't EUV. For example, the consoles, AMD CPUs and GPUs all use non-EUV N7, and Nvidia is on Samsung's 8nm, also not EUV. There are delays and shortages all over the industry if you read DigiTimes, packaging in particular. Capital- and lead-time-sensitive industries like this suffer greatly when demand increases unexpectedly.

Second, that linked forum thread is really unclear. Some even think that Intel was the customer who delayed EUV machine orders (which would make sense given their 7nm problems).


Current shortages are not even about 300mm foundries.

200mm is what most automotive chips are made on.

The cheapest 300mm equipment is many times more expensive than 200mm.


[edit: this post appears to be getting a lot of downvotes. please consider replying if you disagree]

> The shortage is amplified by the US-China tech war

I keep hearing this vague accusation, over and over, with no evidence provided.

Taiwan is not part of China -- at least as far as the sanctions are concerned.

China exports lots of finished electronics but very, very few semiconductors, and virtually none for the automotive sector. SMIC is not competitive. They're kept afloat by the Chinese government and supply almost exclusively domestic products.

> which forced key supplier ASML to limit deliveries of chip-making equipment.

Source?

> Also, TSMC cancelled EUV equipment orders due to the Huawei ban:

How would that make any sense? TSMC is completely booked to the hilt right now. Would any sane person think "gee, this is a great time to cancel our expansion plans"?

That $150 million ASML EUV scanner order that got cancelled was from SMIC, not TSMC:

https://www.electronicsweekly.com/news/business/asml-delays-...

Again, China not Taiwan.

> https://semiwiki.com/forum/index.php?threads/asml-reports-20...

I'm not going to read the whole thing, but the word "Huawei" appears only once on that webpage, as part of an anonymous blog commenter's assertion. That is not a credible source.


The source is ASML in their latest conference call. It is a bit much to read through, so I linked to that forum thread.

The forum thread links to the conference call transcript at the very top.

SMIC got their EUV scanner blocked. TSMC cancelled an order. Both due to the tech sanctions.

https://www.fool.com/earnings/call-transcripts/2021/01/20/as...


> Also, TSMC cancelled EUV equipment orders due to the Huawei ban:

> https://www.fool.com/earnings/call-transcripts/2021/01/20/as...

The word Huawei doesn't appear on that page either.

Please stop spreading misinformation.

The sanctions are on Chinese companies, not Taiwanese companies. China's chip exports are somewhere between zero and insignificant.


You need to read the transcript more carefully.


Maybe you could quote the parts of the transcript that you think support your assertions.


Cancellation of production facilities at TSMC due to Huawei/HiSilicon ban:

"... as clearly our key foundry customer came back and said, listen, our key customer for N3 is now blacklisted. So we cannot ship. So we need to adjust our 2021 outlook for EUV systems."

Could sell more had it not been for the US sanctions:

"... but clearly see potential upside to these numbers where we can disregard any further impact of export control regulations resulting from the current geopolitical situation."

The CEO is a business person, and for him to do any kind of politics, or worse, geopolitics, would hurt his business. So he clearly cannot just say: "Listen, let us sell a bunch of these bad boys to the Chinese and the chip shortage will be a thing of the past in 6 months." Even though that just might be the case.


> Taiwan is not part of China -- at least as far as the sanctions are concerned.

Also as far as anything else is concerned (except spineless organizations like the UN).


Thanks for sharing. Key quotes from this:

"it's supply chain limitations by design, because the customers told us, we don't need them, and then coming back and said, oops! we might have been wrong."

According to the CEO, it's due to customers thinking it was going to be bad (for various reasons), and then it wasn't. The processes have long lead times, and when you pause them you can't just expedite them; they have a fixed lead time.


The article says it is wafer-limited, which wouldn't be related to ASML, would it?


That's what happens when you centralize silicon production among a few players. Integrated circuits are a requirement for modern life. Every country should have its own fab, even if it's not competitive, to cover its own needs during shortages like these. What if the shortage were longer and more severe? Are people living in smaller countries willing to go without electronics?

I hope people will learn from this.


Fabs already run 24/7. Decentralizing them across more countries wouldn't automatically give you any spare capacity.

As far as I see it, we need more capacity, and for a situation like this it wouldn't matter if that extra capacity was in 50 different countries or if it was all in a 50 km radius in Taiwan.


It would be nice if the silicon required to run critical infrastructure were manufactured locally, or at least not by an opponent in a (cold or economic) war. iPhones are pretty low on my list of concerns, but I would like to have working power plants, power distribution and communications.


Power plants, power distribution and other industrial tasks outside of IT don't require latest-generation parts. There are dozens of fabs in many countries that make ICs on older nodes for these applications: cheap, proven and quite ubiquitous. Anyway, anything safety-critical would likely require re-certification when ported to newer hardware.

Edit: just realized that the supply of these ICs has stalled as well. Right now it's hard to see whether these supply shortages are just due to temporarily increased demand or not. If the latter is the case, it is still easier and cheaper to set up fabs for older nodes.


I agree, but I worry that some of this capacity will go away because it's not profitable enough. I also worry that although the core functionality of, say, a water treatment plant may run on quite old and easily sourced hardware (most likely Siemens PLCs where I live), some of the monitoring and coordination stuff depends on latest-gen hardware.


At least Taiwan isn't a country in any kind of war (cold or hot) with the west. They are, however, in a cold war with China.

The current technical situation does make peaceful cross-strait relations critical to global technology, however. A hot war between China and Taiwan would send the silicon state-of-the-art backwards by quite a bit, as would any trade embargo post-annexation if China somehow accomplished that peacefully.

Taiwan's silicon fab prowess is a pretty potent strategic weapon against a much larger adversary!


Natural disasters, wars, sanctions, might be a few differences.


> Every country should have its own X...to cover its own needs during shortages like these

This line of thinking gets very impractical very quickly. It's also a recipe for war because interdependence makes countries play well together.


While your idea is good, the cause and effect are backwards.

Countries don't go to war because they are independent. Countries do have an incentive to maintain peace when they are dependent on each other.

Two very different things.


> ...interdependence makes countries play well together.

This line of thinking led directly to WWI.


Not really. People thought that countries were too interdependent to wage war, but they were wrong, and that belief had nothing to do with the myriad reasons for the war (rivalries, historic and current; Russian development, which put Germany's waging war on them on a timeline; Austria-Hungary falling apart and trying to compensate; the UK being skeptical of Germany; France wanting revenge; etc.)


This is not what led to WW1. It merely failed to prevent WW1.


Fabs are capital-intensive, yet the returns are not high: something that many VCs/countries quite abhor.

Also, last year's mass sanctions didn't help.


Funny, because VC was invented to raise money for semiconductor fabs.


Yeah, but 20 or so years ago they realized that luring new grads with foosball tables and free soda to work on getting people to click on ads is a faster and easier way to make money than making chips, which is a slow, capital-intensive business where you need to think long term and respect graybeards. So there you go.


I wonder which kinds of chips would result from minimizing capex instead of transistor size. There are some people targeting DIY garage fabs, but from what I know they basically target old processes and old chips. I wonder whether, once that picks up speed, current knowledge, techniques and material availability that differ from the 70s will show some proper DIY innovation...


The article says it is wafer- and substrate-limited; how would fabs in more countries help?


> Are people living in smaller countries willing to go without electronics?

Are people living in smaller countries able to afford redundant fabs?


Maybe not, but simply having workable fabs producing reasonable quality on all continents from different governments / companies would go a long way.


Smaller fabs do exist, but they don't enjoy economies of scale unless they can project very strong demand down the road. Nobody invests in such a venture if it's guaranteed to lose a lot of money.

Another issue, more pertinent to automotive manufacturing, is certification. For example, car makers once had to wait almost a year for Renesas to rebuild their facilities in Naka following the 2011 earthquake, despite the company operating fabs in multiple countries. The latter were perfectly capable of producing the product in demand, but the cost of certification is so prohibitive that moving production elsewhere was never brought up as an option.


I bet it's still cheaper to roll the dice at a 1 in 50 chance capacity is down 20% for a year than overbuild fabs. They're semiconductors, not staple foods.


And people thought Intel's fab problems were going to doom it. Maybe in 5 years, but right now they've got a product 80% as good as competitors' during a shortage. It's a compromise a lot of people are willing to make right now.


The really scary thing is that we're down to TSMC and Samsung as the only ones who can hit the latest node; that seems to be the problem. The cost of building fabs is insane; it looks like we really need to do research into cost.


We are doing research into cost. This stuff gets massively cheaper every year. It's cheaper and easier than ever to build a 16nm fab.

Profit-oriented companies take those cost savings and choose to drive further down the technology stack (to more complex and expensive nodes), where there are fewer competitors with enough profits to build competing production lines.

Moore's law is almost our friend here. Being at 3nm instead of 16nm is decreasingly important.


> It's cheaper and easier than ever to build a 16nm fab.

Is anyone doing ~16nm microprocessors besides Samsung/TSMC/Intel/GloFo? I wouldn't call it "cheap and easy" if only they can do it.

(After a quick Google, it seems that China's state-owned SMIC is also starting to do ~16nm, but that's all I could find, unless you mean memory and not processors).


UMC and IBM; not sure if IBM has a full fab or just the IP.


IBM no longer owns any fabs, it sold its chip business to GloFo in 2014. You're right about UMC, I missed it before.


I wouldn't call it "cheap" if it's not cheap for a new player to enter the market, but only marginally cheaper for established players.


I've noticed that used video card prices have skyrocketed. I put up an eBay auction for a GTX 1060 6GB and it ended up selling for twice what I paid for it, used, two years ago. Also lots of people are DMing me asking if I have more to sell. Is there some mining craze going on with old cards?


> Is there some mining craze going on with old cards?

A combination of the mining craze and massive demand for entertainment electronics and GPUs, since everybody has been stuck at home due to the pandemic and decided to build/upgrade their computers.

This has also led to scalping outfits extending their services from covering mostly designer apparel to now also include electronics like GPUs and consoles. The popularity of this also attracts a lot more people to use these kinds of services as a form of investment, which most certainly does not help an already supply-constrained market.


On eBay I bought from a seller unloading five RX 580s previously used for mining, to power an external GPU to game on a Mac.

I think a lot of people are getting into gaming again, and a lot are looking for cards that can play their game just well enough.


And mining. Ethereum is still often mined on video cards. And in the wake of Bitcoin skyrocketing, ETH prices are way up too. So it pays more (in dollars) to mine now.


Seemingly another side effect of quarantine


A car doesn't need a computer; it's more needless complexity and more points of failure. A classic car can't be hacked, locked or broken by software. The crisis is a direct consequence of adding useless bloatware to cars, which only need simple microcontrollers, not the latest 7nm CPUs.


I worked in automotive 10 years ago, and I completely agree. It has become insanely bloated since.


No car is using anything even near 7nm. The automotive sector is very cost sensitive and if anything, there’s too much complexity because they insist on using parts that are too old.


In that case the car companies would have no problems with CPU supply https://www.wsj.com/articles/ford-other-auto-makers-cut-outp...


Where do you see that the shortages only affect 7nm parts? Because they don't, a wide range is impacted.


The market is dictated by 7nm, as it's the current-gen lithography tech (with some 5nm fabs operating now). You don't see them rushing to build fresh 40/65nm fabs just because cheap processors are in deficit, because it's not profitable long-term.


And still, parts on larger nodes than 7nm are currently in shortage. 7nm is obviously extreme due to heavy competition for limited fab capacity, but it's not the only area with problems.


Current 2020 automotive microcontrollers are nowhere near 7nm. 65nm and 40nm is where it's at right now.


No, most of automotive is on 8-bit MCUs, which are made on anything between 90nm and 180nm.


It's already stalled the onboarding of HFC customers in Australia due to a lack of modems. Apparently the supplier of the modems (Arris) is unable to source enough SoCs from Broadcom.

Due to this, people needing replacements, and any in-flight new orders, apparently face wait times of up to 9-10 weeks...

Not helping is that some people try to sell them on the secondary market despite them being premises- (or rather segment-) locked.

[1] https://www.nbnco.com.au/utility/temporarily-limiting-the-nu...


For others who didn't know (like myself), HFC in this case refers neither to Henry Ford College nor to hydrofluorocarbons, but to hybrid fiber-coaxial networks:

https://en.wikipedia.org/wiki/Hybrid_fiber-coaxial


I am waving my money at 3 companies: AMD, Nvidia, and Sony, for new computer parts and a PS5. Alas, I haven't even been close to getting any of them.

Come take my money.


According to the efficient market hypothesis, if there's profit to be made and those companies aren't taking it, someone else will. Have you considered eBay?


We're obviously not in an efficient market, because supply is limited and market entry is nigh-impossible for truly new suppliers, and also because scalpers are viewed as evil and explicitly acted against.


It’s a product with limited elasticity. You can pay a bunch of money for a PS5, but the product is such that few people are willing to do so.


I would rather the mentioned companies ran freaking Dutch auctions on lots than what we currently have.


Efficient market hand-waving magic goes out the window when scalpers wildly and artificially distort availability.


What are you talking about? Aren't scalpers doing exactly what efficient market hypothesis would suggest by buying products that are underpriced (graphics cards, consoles) relative to demand and selling them at the highest price people are willing to pay?


They only become that valuable when you completely restrict the supply by scalping them. If I went to the shop, bought all the food, and resold it at a higher price, that's not "the market at work"; I'm just a dickhead.


This! Particularly when we've reached a point where "scalping as a service" has become a thing.

There are outfits out there that charge a monthly subscription for access to prioritized acquisition channels in combination with bot networks.

This is streamlined to such a degree that people don't even need to hold physical inventory themselves: the scalping outfit handles all of that for them. All they need to do is supply the capital for the bots and set the parameters for buying/selling; everything else is handled by the service.


What you are talking about is market manipulation, not scalping. How many scalpers do you think truly have $30 million to buy 100,000 GPUs?

The situation is far simpler. Consumer demand exceeds supply which drives prices up. Some people figured out where the current equilibrium price is and it turns out that it is really high.

Those high prices encourage consumers with old GPUs to sell them on the market, and thus provide more supply, instead of throwing their old GPUs away every time they buy a new one. If you were to artificially restrict the price, fewer people would sell their GPUs, and you would end up with fewer GPUs for everyone.

If a manufacturer sells a product that is worth X at market rate for a lower price Y, then it has basically gifted the difference X-Y to the buyer. If the manufacturer sells the product at market-rate prices, they will have higher profit margins, which encourages them to build a slightly bigger fab next time. It would also ensure that existing manufacturers make enough money to continue running their business despite the paper-thin margins outside of a shortage.


How long until the TSMC in Arizona gets built? Or other similar things.

I feel like we need to find peaceful ways to resolve our severe differences with China. Either that or stop depending on China's neighbors for critical technology.

I just hope that there can be some kind of progress with the new administration in Washington.


[flagged]


TSMC is Taiwanese. Taiwan isn't committing genocide.


Taiwan isn't, but the OP was suggesting they could be next: "stop depending on China's neighbors for critical technology"


I was not trying to get into the genocide question. I just know that Taiwan and South Korea are very close to China, and so even before China builds up its military much further, the proximity of those places to China is a significant military issue for the US and its allies. It's hard to see how it does not provide military advantages.

More generally, it seems like we want to hold out hope that in the long run serious military confrontation can somehow be avoided down the line. But if not, chip production is a fairly key item that you don't really want to be happening on your enemy's doorstep.


I just need a GT 1030 for an old PC, and the prices seem outrageous for such a low-end card. It's more expensive now than it was at release almost 4 years ago!


I would think the lack of "right to repair" has a lot to do with this. When you have to replace everything after only a few years, a lot more chips are needed.


Maybe this will somehow stop the insanity of premature obsolescence. A six-year-old Android tablet can be in great condition hardware-wise; the games that manage to install on it run great.

However, the last software update was about 4 years ago.

The tablet has storage expansion, but due to artificial limitations in the Android system, only a fraction of software will use the SD card.

These are only minor examples.

I could rant for hours about the "who cares, it'll be garbage in a year" attitude that has invaded much of the tech world, but others have done a better job. I myself am sometimes party to this attitude.

It all results in a lot of waste of all kinds and also bad user experience.

To look at the silver lining: maybe this scarcity and some other trends will produce small corrective trends in engineering and management.

In the meanwhile I’ll keep dreaming of a new gaming PC for a year or two.


> In the meanwhile I’ll keep dreaming of a new gaming PC for a year or two.

I was finally looking at replacing the 8-ish-year-old parts of my gaming PC, which has otherwise been running like a champ (Intel 2500K), and now... well, I guess I'll just focus on pixel-art games :P


I certainly stress a lot less, since I decided to get off the hype and upgrade train. I've got 600+ games on my Steam account that run perfectly on my PC, even though most of the hardware is 2011 vintage and the only "new" hardware in it is an SSD and an RX560.

Life is much calmer in the slow lane.


Yeah, although I'm getting a little nervous about failure-rates now that some core parts of my system are in the 8-10 years old range. (Basically everything but the video card and SSD.)

Particularly the factory-sealed liquid-loop, which has both moving parts in a pump and gradual liquid chemical degradation to consider. On the flip-side, that particular component seems like it would be less sensitive to silicon-fabbing issues.


My view is that if a component fails, I can either find a second-hand replacement or buy a refurbished PC and move the necessary components over. I can get by with just my laptop for a while; obviously, if I needed my PC for work, that would change the equation.


It's small, and on the grand scale probably negligible, but e.g. Fairphone is making phones which are designed to be repaired and maintained. They are growing fast, year over year.

If anything, such companies show there is demand and money to be made there.


I was going to post much the same. To add to this: everything is deprecated so quickly now. I like to use my phones until they break or I am almost forced to replace them; I am just not inclined to buy a new model every year. Same with my laptops, TVs, etc. I have an old Sony Ericsson which I can tell they no longer want to support - the updates come much more slowly, and I swear each one slows the device down (but perhaps that is paranoia). I have an old ThinkPad which runs a rolling-release Linux distribution, so I don't need to upgrade all the time, and my TV is 720p and non-smart, which looks fine to me. I am honestly happy with this lot, but the manufacturers most likely see me as a bad customer.


I think you can go too far in that direction. A 720p TV doesn't look nearly as good as Full HD or 4K. I remember watching a shot of a sky filled with stars in a Blu-ray movie and having to pause it, just because it was that breathtaking.

There is nothing wrong with a solid thinkpad if you can replace the battery so that it still works well, but my days of jumping from outlet to outlet because I got 20min on a full charge are over. So are the days of carrying a heavy laptop with a bulky charger.

I bought a OnePlus phone some years ago and paid too much for it because I wanted a "great phone". I don't regret the ultra-fast charging and day-plus battery though, and my next phone is probably also going to be a OnePlus, just a much cheaper model.

So if there is anything to sum up this rambling comment, it is probably this: know what you are missing in your current tech, and upgrade when it makes sense for your personal use case.


That is why, when IBM built the IBM PC around the Intel 8088, they made Intel license the chip design to AMD and NEC, in case Intel could not make chips quickly enough to meet demand.


Are there specific chips, or products from particular companies or geographic regions that are in short supply, or is this just a general lack of fab capacity?


I asked this question the other day. From what I heard, it varies, but for some items like microcontrollers the lead times are out to 52 weeks.

There's a lot of noise about the shortage of high end parts from TSMC, but I'm much more concerned about shortages that can impact more common consumer goods. Having issues like this when the economy is trying to recover from a pandemic could have far reaching consequences now that so many products contain semiconductors.


Certain manufacturers have been hit hardest - NXP and Infineon, for example - at least partly due to increased demand for automotive components. Motor control and 'connected home'/IoT devices also seem to have contributed to the shortages.

Whether due to maxed-out fabs or shifts in uC/uP usage, the shortages are widespread.


From what I've been able to read, it's not fab capacity but an ABF substrate shortage that's screwing up most chipmakers.


Source? Ajinomoto Build-up Film is used only for high-end flip-chip packaging. Very few automotive ICs use such advanced packaging. Most non-EV cars probably don't have any at all.

https://www.ajinomoto.com/innovation/action/buildupfilm


From what I gather from my sources on the procurement side, there is a general shortage that is affecting certain chips more than others. Say you are using a chip with only a handful of customers: companies aren't going to make your part when they have higher-volume chips they can run instead.


Both. An example impacting my company: the factory that makes 80% of the world's TCXOs (temperature-compensated crystal oscillators) burned down in Japan a few months ago.


Some versions of the ARM-based STM32 seem hard to acquire, particularly the STM32F103, which is quite popular for cheap hardware projects.


Yes, prices for low-end AMD processors are crazy and stock is non-existent. Just before Christmas I decided to build a spare computer and wanted an AMD Ryzen 3100; prices were hovering around the 99 GBP mark. Then I had to put it off until the new year. Now 3100s are as rare as hen's teeth, and where there is any stock, prices are around 180 GBP. I'm not that desperate right now; I just hope prices will drop, but the question is when?


The pricing and availability is crazy because the suppliers are afraid to ramp production.

In many categories - cheap laptops, webcams, small printers, etc. - demand exploded. More Chromebook orders came in during 2020 Q3 than normally come in a whole year, and laptop LCDs are super-constrained as well. The dilemma is that expanding manufacturing capacity now will reduce margins tomorrow, especially since order books will empty as COVID fades and schools reopen.

My guess is you’ll see prices in free fall as the edu backlog clears and COVID restrictions fade. Probably this time next year.


My experience as well. I bought an Athlon 3000G last year for £42; it's now over £100... I'll have to postpone building that PC for now.


So this is what Purism was talking about. Seems like the wait will be longer yet again. They don't have much luck, those guys, do they?


What?



Sounds like the beginning of "Accelerando" as we're slowly starting to turn our vicinity into a Matrioshka brain.


This is really hurting me: I've noticed there are fewer of the inexpensive Chinese Arduino knockoffs with better specs.


Is this related to why Sony has been having a hard time catching up with demand for the PS5?


Yes, they use the same type of new AMD GPUs.


The PS5 uses a custom Zen 2 processor. It uses the same 7nm node as Zen 3 though, which is quite enough to make TSMC slow in fulfilling both orders.


I'm in the market to replace our 11 year old car and the supply has just dried up with what we'd been looking at. Best case seems like maybe the summer, but at that point, I'd almost rather wait for the next model year.


Switched to Apple chips for desktop and phone. Looks like their supply chain is still intact.

Sold my old system for around what I bought it for, so if you want to game, these times are whack.


Ha it's bothering me the way you worded that since you can't get those chips without the attached hardware and software! But it does get me thinking. Obviously we think of Apple systems as premium, and so we can expect premium computer chips to get prioritized - Apple likely paid TSMC handsomely to get (most/all?) of the 5nm allocation for the A14 Bionic and M1 chips. But are other high-end chips "premium"? Because the AMD Ryzen 5, 7 and 9 (5600X, 5800X, 5900X, 5950X) are all in very short supply (at least on my U.S. online stores).

BUT... those are just for enthusiast builders. You can find OEM systems with those chips.

So I guess the only lesson is that being a computer manufacturer will give you priority on high-end chips, and in times of short supply, that's where they'll go first, drying up the supply for small buyers.


This is a perfect moment for China to invade Taiwan and disrupt TSMC. That would leave only China and Korea as two main sources of modern semiconductors.



I thought this was going to be about chips made of potato. What a relief.


How do we profit off this?


Apparently there is a big shortage of ABF (Ajinomoto build-up film) substrate, which is a big part of chip-package manufacturing (see for example https://www.hardwaretimes.com/ps5-supply-to-be-affected-by-a..., https://www.digitimes.com/news/a20201214PD202.html).

Big suppliers of ABF substrate include multiple companies from Taiwan - Unimicron, Kinsus Interconnect, and Nan Ya PCB - all of which trade on the Taiwan stock market.

I considered investing in those companies... but it seems like only Nan Ya has been showing an upward trend. Perhaps we are still early?


Do second-order manufacturers ever profit from situations like this? All the margin seems to go to the final product makers, not to the whole chain.


I can't answer broadly, or about profit specifically, but the stocks of all the big wafer producers have risen sharply since fall 2020 - something to the tune of 30% or more in a couple of months.


Well, this is really a long shot, but the chip shortage is causing production cuts across the car industry, and roughly half of all palladium is used in catalytic converters. The cutback in new vehicle production should have a pretty noticeable effect on the price of palladium. If you can time the end of the chip shortage, you can make some money in the palladium futures market. How far-fetched do you think this is?
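
For scale, a back-of-envelope in Python. The only input taken from above is the rough "half of palladium goes into catalytic converters" figure; every other number is an invented assumption:

    # Back-of-envelope for the thesis above. Every figure here is an
    # invented assumption, not sourced data.
    annual_pd_supply_oz = 10_000_000  # assumed global palladium supply
    auto_share = 0.5                  # ~half goes into catalytic converters
    car_production_cut = 0.10         # assume chips cut car output by 10%

    demand_drop_oz = annual_pd_supply_oz * auto_share * car_production_cut
    print(f"{demand_drop_oz:,.0f} oz, i.e. "
          f"{demand_drop_oz / annual_pd_supply_oz:.0%} of assumed supply")
    # -> 500,000 oz, i.e. 5% of assumed supply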


Depends. Who is "we"?

Presumably this will be good for the foundries and makers of industrial semiconductor machinery, and bad for fabless companies. Hard to know the time horizon, however.


[flagged]


Oligopoly so


Sure, if you want to; it would at least be more correct than implying there is a monopoly, which is categorically false.

Though I'm not sure how many cutting-edge semiconductor fab players we should actually expect - the investment required is significant.


Nationalise them so


China will do it for you. Let's see how that goes, fair?

Why anyone thinks a state monopoly is somehow better is beyond my reasoning.


Works well for China!


Nope, they are still significantly behind the cutting edge in terms of performance. Get back to me when they are leading.


Depends who you ask.


The GPU improvement rate is stalling (or, more likely, pixel-count growth is slowing), so there is less visual improvement to be had easily? And given that cryptocurrencies created a new use for GPUs, we have price inflation even on "old" cards.

My 2¢


GPU improvement rate does not look to be stalling. Nvidia is still introducing impressive new features (support for HW accelerated ray-tracing, NN acceleration for stuff like DLSS, etc.) while steadily improving performance. It's just getting harder to make games look nicer, because they already look really nice. Artists/Graphics devs have learned to do really impressive things using just rasterization. As someone who is interested in GPU programming, ML and rendering, I personally find it really impressive what Nvidia has been doing over the last couple of years.

Though I have to say, for all its faults, CP2077 really blew me away graphically (using the RTX support) like no other game did so far. Also, VR would also really benefit from better GPUs. And 4K gaming is also not quite there yet. So actually, I maybe wouldn't even say that. Performance improvements are still welcome.

The problem is more that there is unprecedented demand for compute. ML, crypto mining, and gaming being more popular than ever during the pandemic all mean demand is higher than ever. Combine that with the new console launch and exhausted semiconductor manufacturing capacity, and you can easily see why customers are biting the bullet and going for older, less pricey cards.


4K gaming is fine unless you brute-force lighting with ray tracing - even Cyberpunk, which is very unoptimized, manages. VR is a separate story: it needs both high resolution and high frame rates, and 8K is not handled by current GPUs.

DLSS is a cute hack. We have better scalers than that in the video market; its main advantage is speed. A fixed-function scaler would do better than an outdated 5-tap filter. Remind me again when it matches the basic scalers put in smart TVs.

Gaming GPUs are hilariously underpowered for compute, even something like the 3090, versus an MI100 or A100. It's just that the latter cost $10k per unit.


I'm just on 1080p with a GTX 1070, so I haven't experienced it myself, but from Digital Foundry's breakdown of DLSS 2.0 in Control, it sure looks to be more than a "cute hack".

https://www.eurogamer.net/articles/digitalfoundry-2020-contr...


DLSS 2.0 has been shown, in certain cases, to increase the detail of some textures in a scene compared to native rendering. [0]

What TV upscaling does that?

Also ray tracing produces significantly more realistic lighting. It's not comparable to traditional lighting methods in accuracy.

[0] https://youtu.be/IMi3JpNBQeM


GPU improvements are definitely happening. But yeah, pixel count is stagnant and will be for a bit.

All the work is going into memory, caching, and fine-tuning the architecture. Ray tracing and the like have also shifted the latency tightrope you have to balance.

For an end user, the difference in visual fidelity will probably be minor. As a GPU researcher myself, I find this an extremely exciting period.



