You can tell that it originates from within the semiconductor industry by the generally chaotic presentation (e.g. the "stuff everything in and add arrows" layout of the slides and picking a bunch of arbitrary, jarring, and unnecessary colors)—on top of the already-dubious decision to use PowerPoint to explain these ideas in the first place (out of a reflex—read: crutch).
Semiconductor industry is almost all B2B. People don't invest much time into trying to make technical content look aesthetically pleasing. It only gets read by other technical experts who don't value aesthetics as long as the information is clear.
This is another half-ass attempt to preempt someone saying something pointed and true by subtly tweaking their intent until it's possible to try and interpret it—or rather: deliberately misinterpret it—as saying something that it isn't. Join your friend. <https://news.ycombinator.com/item?id=30265930> This kind of defense mechanism gets old.
Beauty has nothing to do with the criticism here. (You seem to have missed, for example, the "and unnecessary" part of the comment; people put in work to make things look this bad.) And "as long as the information is clear" is not a bar that most of the kinds of slides that are under attack can pass. That's in fact the entire point of the criticism.
Somehow other industries outside semi (involving really successful companies filled with technical people who also don't prioritize aesthetics) are able to put together something for people to read by simply... writing shit down (you know: something that actually does involve reading), instead of relying on dodgy PowerPoint slides (that of course don't involve reading so much—but do often involve suffering).
Besides that, the statement that "Semiconductor industry is almost all B2B" is both nonsense (i.e., wrong) and irrelevant. The majority of the semiconductor industry where this kind of material plagues the average worker's day is employee2employer.
I am not defending anything. I am not saying it's the right way to do it. I am just reporting my own observation of how the industry currently operates, from the perspective of someone working in the industry.
The way your previous comment was worded implied an attempt to justify the presentation style, as opposed to being worded like: "their argument is [It's B2B...]." And while it's admittedly such a minor nitpick on an internet comment made by somebody I probably don't know, I bring it up because I also work in hardware, and your argument (or quote thereof) echoes that of managers throughout the industry who justify half-baked EDA tools, arcane software, disjointed workflows, and visual design akin to drawing with crayons. These things make me want to pull my hair out. We're trying to make chips on bleeding edge lithography processes, yet we're banging rocks together like cavemen when it comes to software, and for no good reason either!
And to be clear, I am not trying to justify the response you got, I am simply trying to point out the level of frustration that some engineers in the industry (like myself, and a lot of people on HN) have with our workflows. Especially when we see all of these exciting new tools on HN!
This is exactly the kind of just-so, cargo cult explanation backed by appeal to authority with a tinge of strawman-y, macho bullshit[1] that I would expect when it comes to the semiconductor industry. Having to "read" something "difficult" is not the issue.
Don't misunderstand me. If we grade on a curve by taking into consideration the typical fare that comes out of this sector, these slides are definitely A+ quality.
And it's telling that fabless is placed below the foundries. Situated between actual fabs and their equipment vendors? And beside IDMs? Total category error, but correcting this, of course, would mean that foundries don't get to be the top piece.
[1] "Pain is gain" is a phrase that I've used in the past (and will continue to use) to describe the resistance to improvement and as an explanation of the day-to-day suffocation of working at a chipmaker.
That's because you have to pay an extra licensing fee for clear and concise visual design. Besides, that would add complexity - imagine if there was something that needed to be changed after the article was published! /s
It seems like the SC industry is a bunch of Brents from The Phoenix Project: miracles happen when they enter a trance and flip the right bit in the right place, and their PowerPoint presentations are a snapshot of the neural network inside their brains.
Does this seem a little outdated to anyone else? Like, ImgTec haven't really existed as a top-tier player for almost a decade, and AFAIK Apple doesn't license IP cores from ARM, but the ISA itself.
Also, I feel like this diagram is just weird. I always think of it in terms of physical and logical: on the physical side you have the materials, WFE, and fabs, and on the logical side the IP cores and fabless semis, with the EDA tools somewhat providing the integration. The pyramid's organisation just makes no sense to me.
It is a little bit dated. But the core concepts are all still valid. I posted it because - as someone working in the industry - I find it is somewhat rare to find a public article that attempts to explain the "big picture" landscape of the semiconductor ecosystem.
Some of it is a tiny bit outdated or paints a partial picture (STM does 22nm too for instance, although I'm not sure if it did in 2012).
But overall I think it's pretty current and accurate.
One thing it doesn't do is point out that "x nm" doesn't have anything to do with gate length or physical dimensions anymore. It's commercial speech for transistor density (vertical FinFETs are denser than planar transistors; gate length is roughly the same).
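To make that concrete, here's a rough back-of-the-envelope comparison in Python. The pitch and density figures are ballpark numbers from public reporting (e.g. WikiChip estimates), not official vendor data, so treat the exact values as approximate:

    # Ballpark comparison; pitch/density figures are approximate, from public
    # reporting (e.g. WikiChip estimates), not official vendor numbers.
    nodes = {
        # name: (contacted gate pitch nm, min metal pitch nm, ~MTr/mm^2 for HD logic)
        "TSMC 16nm": (90, 64, 29),
        "TSMC 7nm":  (57, 40, 91),
        "TSMC 5nm":  (51, 30, 171),
    }
    for name, (cpp, mmp, mtr) in nodes.items():
        print(f"{name}: gate pitch ~{cpp} nm, metal pitch ~{mmp} nm, ~{mtr} MTr/mm^2")

    # If the names were literal feature sizes, "7nm" -> "5nm" would imply an
    # area shrink of (7/5)^2 ~= 1.96x; the reported density gain is ~1.8x,
    # and nothing on a "7nm" chip is anywhere near 7 nm wide.
    print("implied by the names:", round((7/5)**2, 2), "reported:", round(171/91, 2))

The point being: the node name has drifted into pure branding, while the quantities that actually matter (pitches, MTr/mm²) are tens of nanometers and tens-to-hundreds of millions of transistors per square millimeter.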
I hope it's not too outdated, but if it is I'd be really interested in a newer article that's written like this. I really appreciated the simple breakdown of the supply chain. As a total outsider, I've always really wanted to understand why TSMC and ASML (for instance) seemed to be so important politically.
Not disagreeing with your point - the article does seem outdated - but Apple still does license some Arm core designs I think - e.g. in the U series SoCs.
Missing from this article, and rather important in the current supply chain issues situation, is packaging (testing, dicing, packaging into the chip package e.g. BGA, QFN, SOIC etc).
Packaging is generally done at separate factories from the fabs. A vast proportion of packaging factories are in China, something to keep in mind in the context of "trade wars": https://www.marketresearchreports.com/blog/2019/04/24/top-10...
Interesting information, but I'm not sure about the accuracy of their chart depicting the leading edge node manufacturers. Until the last few years, Intel was ahead of TSMC and Samsung (Intel's 14nm process was equivalent to TSMC's 10-7nm from my understanding). So if Intel is excluded from the current leading edge manufacturers, shouldn't others be removed from the graph for the early 2010s?
Yeah, this is one thing I don't get about the "Intel is doomed" crowd. The semiconductor manufacturing lead has changed hands a bunch of times in the past, with Intel being in the lead for probably most of the past 2-3 decades. TSMC has had notable problems with nodes and has, for periods, been about as far behind Intel as Intel is behind now.
Why wasn't TSMC doomed then? And why couldn't Intel turn around now like TSMC did?
Intel looks nothing like the other vertically integrated CPU businesses that divested from manufacturing in the past. It's still making a lot of money and gross margins are still healthy (and still higher than AMD's and TSMC's at the moment!), although they've clearly taken a hit in the past 3 years, from the mid/low 60s where they typically sat down to the mid/low 50s. Manufacturing silicon is still a profitable business for them.
Intel also has, by some important metrics, the best-performing CPUs available despite competitors being on TSMC and Samsung. I don't argue the others aren't better in other equally valid metrics. It was actually much more difficult to make that same statement about competitors back when Intel had a clear silicon tech advantage.
So Intel is definitely still a leading edge node manufacturer, if the others counted as such in the past.
One of the things to understand about Intel's case is that they were never a foundry. Instead, their process leadership was wielded as a weapon to ensure that their devices maintained performance leadership.
In that sense it was almost irrelevant to the rest of the market what node Intel had achieved, since nobody else had access to it. Similarly, because it was core to their business model, the manufacturing business itself was not very cost sensitive; as long as the process tech was exclusive and enabled them to make the world's best processors, they could extract princely sums for the final products.
This has fundamentally changed in a way that will not revert. First, we have reached the "end of Moore's Law": process improvements will still continue, but no longer at a pace such that process leadership guarantees performance leadership. Second, without this guarantee of performance leadership, Intel's entire manufacturing operations are having a crisis of purpose. Their manufacturing operations are much less cost-effective than TSMC because that efficiency was never really a major driving force in the past (and to some extent was at odds with the goal of innovation and leadership).
Without the serious prospect of regaining a performance advantage, Intel will probably not be investing the kind of money needed to displace TSMC in process leadership. Instead the focus is going to be on making sure their existing fabs are not stranded assets and figuring out how to effectively compete on price/performance without a persistent process advantage.
> In that sense it was almost irrelevant to the rest of the market what node Intel had achieved, since nobody else had access to it.
That doesn't follow. Intel has been massively relevant to the semiconductor manufacturing market for 50 years. Arguably the biggest single influence and the one which gave rise to (high performance) foundries in the first place by driving vertically integrated competitors out of business.
> This has fundamentally changed in a way that will not revert. First, we have reached the "end of Moore's Law": process improvements will still continue, but no longer at a pace such that process leadership guarantees performance leadership. Second, without this guarantee of performance leadership, Intel's entire manufacturing operations are having a crisis of purpose. Their manufacturing operations are much less cost-effective than TSMC because that efficiency was never really a major driving force in the past (and to some extent was at odds with the goal of innovation and leadership).
I don't follow this either. What fundamentally changed? Foundries have been around for almost as long as the semiconductor industry (UMC founded in 1980), and have been huge players in the market since around 2000.
> Without the serious prospect of regaining a performance advantage, Intel will probably not be investing the kind of money needed to displace TSMC in process leadership. Instead the focus is going to be on making sure their existing fabs are not stranded assets and figuring out how to effectively compete on price/performance without a persistent process advantage.
Intel has publicly stated their plans to build new fabs, invest in new nodes and close the gap with competitors in the next few years so that doesn't appear to be their focus or assessment of their prospects. Whether they actually achieve that or not remains to be seen, but if you are going to say they can't, then you're left explaining how TSMC could -- back as recently as 2015 or so you can see articles wondering whether TSMC would be able to close the gap with Intel. Nothing suddenly changed between 2015 and now; Moore's law wasn't okay then and suddenly just stopped in 2018 when Intel started falling behind, so I don't think you can just handwave about that to explain it away.
> That doesn't follow. Intel has been massively relevant to the semiconductor manufacturing market for 50 years.
Well, sure: Intel pioneered FinFET, cobalt, even 300mm. TSMC and the rest of the foundry world basically followed their lead. But for any foundry customer, it didn't matter what Intel's state of the art was, because you couldn't pay any amount of money to obtain anything other than a final Intel product.
> I don't follow this either. What fundamentally changed?
Limiting physics. It is very hard to continue to make devices smaller at historical rates. Nobody is keeping up with the historical 2-year node cadence any longer. And the "nodes" now are marketing smoke and mirrors, anyway; they don't actually mean that device density has doubled.
> Intel has publicly stated their plans to build new fabs, invest in new nodes and close the gap with competitors in the next few years so that doesn't appear to be their focus or assessment of their prospects.
They cannot publicly pull the plug for two major reasons:
1) They need to attract customers for their move into the foundry space (again, their fabs becoming stranded assets is a nightmare scenario)
2) They're hopeful of getting US government subsidies in order to continue competing on a basis other than pure cost/performance
Moreover, "closing the gap" could make them competitive in the foundry space, but it would leave them in a different position than they occupied historically. It was their technology _leadership_ that provided a competitive advantage, and it's why they are still so famously paranoid about their intellectual property.
> Moore's law wasn't okay then and suddenly just stopped in 2018 when Intel started falling behind
> Well, sure: Intel pioneered FinFET, cobalt, even 300mm. TSMC and the rest of the foundry world basically followed their lead. But for any foundry customer, it didn't matter what Intel's state of the art was, because you couldn't pay any amount of money to obtain anything other than a final Intel product.
Well that's a walk back. You said they were irrelevant to the rest of the market. Now it's that a foundry customer couldn't pay any amount of money. Of course. (Well, not strictly true: Intel did try to get into the foundry business and did get a customer or two; they just didn't do well https://semiwiki.com/semiconductor-manufacturers/intel/7912-...)
> Limiting physics. It is very hard to continue to make devices smaller at historical rates. Nobody is keeping up with the historical 2-year node cadence any longer. And the "nodes" now are marketing smoke and mirrors, anyway; they don't actually mean that device density has doubled.
I don't see how the argument has been made. Limiting physics was limiting in 2015 as well. Progress has still been made, and progress will almost certainly continue to be made for at least several more nodes, just at a slowing rate.
There is no before/after I've seen that makes a rational case for them previously being able to compete and now not able to. "Moore's law slowed therefore Intel can't get ahead of TSMC" isn't really arguing anything.
A crude approximation is money. Investment in R&D and fabs. So that's something you could argue, but when you look at Intel and TSMC, it's not obvious that TSMC has vastly more money to spend there.
> They cannot publicly pull the plug for two major reasons:
[...]
This is going into conspiracy theory territory. You think they're lying to investors and customers and the US government to fraudulently get subsidies, investment, and customers?
Look, you said they were not going to invest in regaining an advantage, and that they would focus on getting the most out of their trailing nodes. That's just not what's happening at the moment. And with the amount of money Intel is still making, I think they would be mad to give up now. I don't know what writing on the wall everybody thinks they can see here.
> Well that's a walk back. You said they were irrelevant to the rest of the market. Now it's that a foundry customer couldn't pay any amount of money. Of course. (Well, not strictly true: Intel did try to get into the foundry business and did get a customer or two; they just didn't do well https://semiwiki.com/semiconductor-manufacturers/intel/7912-...)
I was maybe not clear: Intel was irrelevant from the perspective of a foundry customer because their service was not for sale; it didn't matter what node they were at. Of course for foundries themselves, Intel was (and to some extent still is) relevant as a benchmark and as a technology leader.
> I don't see how the argument has been made. Limiting physics was limiting in 2015 as well. Progress has still been made, and progress will almost certainly continue to be made for at least several more nodes, just at a slowing rate.
I agree with this statement and it's compatible with what I said above.
> This is going into conspiracy theory territory.
Uh.
Do you think Google is being completely transparent with the public about things that are being said about Google Cloud in board meetings?
I didn't say they're going to just fall over dead. Obviously landing huge subsidies or government support would change the equation. But while Intel management is certainly aware of the state of things, they're not going to ensure a self-fulfilling prophecy by broadcasting pessimism.
What they're not going to try to do is regain the process crown because it's a fool's errand. That's what I said. It might bear emphasis that that is different from competing at the leading edge.
> Do you think Google is being completely transparent with the public about things that are being said about Google Cloud in board meetings?
That doesn't follow. There is a big, big difference between not being completely transparent and making false or misleading statements to investors.
You said they've got no prospect of performance leadership and are going to focus on just getting the most out of their existing fabs. This is not a fact and not supported by their statements and actions so far. It's a wild guess and pretty unlikely to happen given their investment pipeline. Some time in the future, possibly. As a result of the circumstances right now when they are making money hand over fist with their leading edge nodes and building new fabs? No.
> Limiting physics. It is very hard to continue to make devices smaller at historical rates. Nobody is keeping up with the historical 2-year node cadence any longer. And the "nodes" now are marketing smoke and mirrors, anyway; they don't actually mean that device density has doubled.
There are plenty of process improvements to be made at the trailing edge of development, where the majority of manufactured chips actually are. The leading, most expensive device nodes, which are most clearly bumping into physical constraints, will gradually become less and less representative of the industry as a whole.
Transistors actually keep getting smaller. 7nm->5nm devices got smaller. The transistors actually got significantly faster too. A lot of the problems are in wiring and powering the things, leakage, etc. so that's where more innovation is needed.
But the leading edge is far from standing still. If GAA and such pan out well, that could be a very significant bump for the next few nodes. People act like everything stopped because "Moore's law is dead", but when you look at the rates of improvement in most other manufacturing sectors, semiconductors are still advancing at an enviable pace.
Dennard scaling is long dead[1]. Density increases thanks to "3D" integration, but transistors do not shrink much. Thus more innovation is needed in the areas you highlighted to make a "smaller" equivalent process node.
FinFET and GAA do achieve slightly better electrostatic control, which results in lower leakage and slightly faster switching speeds, but gate length doesn't decrease much anymore, which puts a ceiling on switching speed improvements (and thus frequency, GHz) and on further reductions in gate capacitance (linked to power consumption).
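For reference, the idealized textbook Dennard relations looked roughly like this (scale every dimension by 1/k); the piece that broke in practice is the voltage line, since supply voltages stopped scaling around the ~1 V mark, which is why power density now climbs when you shrink:

    \begin{aligned}
    \text{dimensions:} \quad & L,\ W,\ t_{ox} \;\to\; L/k,\ W/k,\ t_{ox}/k \\
    \text{supply voltage:} \quad & V_{dd} \;\to\; V_{dd}/k \\
    \text{gate capacitance:} \quad & C \;\to\; C/k \\
    \text{delay:} \quad & \tau \propto C V_{dd}/I \;\to\; \tau/k \quad (\text{so } f \times k) \\
    \text{power per device:} \quad & P \propto C V_{dd}^{2} f \;\to\; P/k^{2} \\
    \text{power density:} \quad & P/A \;\to\; \text{constant, since } A \to A/k^{2}
    \end{aligned}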
Increasing densities mean more avenues to explore specialized circuitry and dark silicon. Basically, more transistors to play with.
This runs very contrary to the previous paradigm under Moore's Law, in which specialized circuitry and accelerators were not investigated too deeply, because software implementations would gain so much performance (without needing to be redesigned for each newer node) that they would easily surpass them.
I'm now hoping for (and expecting) a revolution in maskless lithography at some point. That would democratize ASIC design (at 14nm, current multi-project wafer prototyping costs are around 10k€/mm², largely due to mask manufacturing), and make it much easier to explore the third dimension (thus making everyone more competitive with last gen).
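To put that 10k€/mm² figure in perspective, a quick sketch, taking the number at face value and assuming a hypothetical 10 mm² test chip (shuttle prices vary a lot by node and vendor):

    # Back-of-the-envelope using the ~10k EUR/mm^2 MPW figure above.
    cost_per_mm2 = 10_000      # EUR per mm^2, multi-project wafer shuttle at ~14nm
    die_area_mm2 = 10          # a modest test chip (hypothetical size)
    print(f"~{cost_per_mm2 * die_area_mm2:,} EUR just for the shuttle slot")
    # -> ~100,000 EUR before IP, EDA licenses, packaging or any respin,
    #    and most of that is amortized mask cost.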
Right (I'm also providing links for other readers). However, when mentioning the "nanometers" process node as evidence for scaling (it was unclear to me whether you said that it implied shrinking, or there had been some shrinking in between the two), I feel like I have to mention that transistors haven't been meaningfully shrunk for years, although there have been plenty of innovations regarding their layout, and photolithography processes have gained in precision.
Transistors are still at the heart of ICs, so every bit of efficiency gained there has tremendous impacts on system-level performance and efficiency.
Some of the biggest possible gains on the horizon might come from circuit designs, notably adiabatic computing.
> However, when mentioning the "nanometers" process node as evidence for scaling (it was unclear to me whether you said that it implied shrinking, or there had been some shrinking in between the two),
No, my intention was actually the converse of that: saying that transistors still shrink, even in leading edge nodes today, despite popular perception.
> I feel like I have to mention that transistors haven't been meaningfully shrunk for years, although there have been plenty of innovations regarding their layout, and photolithography processes have gained in precision.
TSMC 7nm->5nm increased transistor density by 80%. There were other changes to minimum cell size too, but both are 6T cells in this case, so the overall device that can operate as a transistor in a VLSI absolutely does shrink.
Dennard scaling ending isn't about not shrinking, it's about power density going up. Even then, things are not standing still: practical designs absolutely get denser and do more work, even in the logic parts of the design. It just doesn't double every couple of years anymore.
> People act like everything stopped because "Moore's law is dead"
This is really not what I said anywhere. What I said is that the pace of advancement is slower now, and that process supremacy is no longer sufficient to guarantee performance supremacy.
That's a problem for Intel, since it was a core assumption of their business model.
I think the chart is meant to be an approximated summary, not a precise timeline down to sub-year intervals. Even though Intel has fallen somewhat behind compared to norm, I think they should still be considered "leading edge". The "Intel 7" process (formerly known as Intel 10nm process) is comparable in transistors per mm^2 with TSMC 7nm. Really, we should all drop the "nm" naming conventions and simply go with transistors per mm^2 for a given design pattern.
This is nicely done, despite some nitpicks in the comments; it should be interpreted as giving a qualitative picture. The recent development not mentioned is that China is launching a massive effort to bootstrap its own EDA industry, and they've poached a number of stars from the big three, with very generous signing bonuses. This is to get around sanctions, which are starting to bite.
With how critical semiconductors are, I'm really surprised having nationally owned fabs, even a few nodes behind, isn't a priority for big governments.
For China it is. The US seems to have been under the impression for the last couple decades that geopolitics don’t exist anymore.
On that note, if TSMC’s technical supremacy isn’t a core part of Taiwan’s national security strategy I’d be shocked - whatever else the US may think about Taiwan & China, the fact that our entire electronic infrastructure relies on TSMC in some form or fashion ensures we have a deep and serious ongoing commitment to their independence from our chief geopolitical rival.
Not mentioned in the article: almost all semiconductors used outside of computers and cell phones are produced on old processes - 40nm and up. Fabs for these are cheap - maybe in the $1 billion range - but no one builds them because of low profit margins. Instead these mostly get built in fabs that used to be cutting-edge, long ago.
Maybe the automakers could pay a little bit more for these chips, and it would become profitable to build fabs for them again. But everyone knows that the moment people forget the semiconductor shortage, automakers will try to drive the profit on those chips back to zero again, and whoever built one of those fabs will be left holding the bag. Basically the automakers and/or Wall Street would rather forgo the sale of an entire car than pay a buck or two in profit to someone else for the chips that go into that car.
The best term I've heard for something like this is "end-stage capitalism".
There is another problem: all of the equipment makers have stopped producing the "old" tools that are used for >40nm processes on 8-inch wafers. People in these "old" fabs seriously go on eBay to buy used spares to replace broken parts inside the tools in their fab, because the original tool maker stopped making spare parts years ago.
The very long form is an EE degree with an emphasis in VLSI.
In seriousness, there just isn't as much information about semiconductors publicly available like there is in software. If you can make your question a little more specific I will try to help.
You're absolutely right, I just sort of assumed it would be easily available.
Nothing specific, I wanted an overview of the whole thing. I guess the question I'm asking myself is "What would I need to know if I wanted to set up a fab from the ground up?" Which materials are used and where they come from, the different types of companies that would need to be involved and why (e.g. ASML), and so on.
But honestly, I don't really know much. So an intro to semiconductor manufacturing that could be read by a plain old software engineer would also be great.
There is a video called "Silicon Run" (1996) which I feel gives a good overview of semiconductor manufacturing. This video is now quite old, but the major principles remain the same. Seems to be available on YouTube: https://youtu.be/3XTWXRj24GM
Infographic is outdated and borderline propaganda.
The war with China has started. SMIC is shut to outsiders. Hong Kong fell to China. Taiwan is expected to fall next.
So what's going down? Everyone getting into the chip game because they must.
Europe’s Chip Act, Infineon is building chips in Germany
Chips for USA Act hasn't quite passed yet. Biden issued an executive order that is unchallenged by anyone. US sanctioned SMIC.
Samsung will be building a fab in Taylor Texas near to Austin. Intel will be building 2 fabs in Arizona(RISCV!)
Samsung obviously also building much more in Korea.
TSMC is building a fab in Japan with Sony. Everyone hoping the fabs in taiwan don't fall. Really good chance those fabs will be destroyed if China moves on them.
India is blowing up big time with fabs.
The number of secret fabs is surprising as well. Tons of them popping up in the Midwest, primarily sourcing for the automotive industry. Makes sense to keep them quiet because it seems there's lots of fabs catching fire or being hacked and shut down. Oh the nature of war...
> Infographic is outdated and borderline propaganda.
> Taiwan is expected to fall next.
You'll do everyone a favor by not exaggerating that much. You make it seem like there is consensus around the expectation that Taiwan will "fall next" while there is no such thing. You're also making claims about fabs being set on fire or being hacked by foreign entities without any sort of evidence.
If you believe this to be exaggeration... this is happening right now. Out of curiosity, which country are you in? How do you not know about this? 10 links provided and can be easily added to.
>You make it seem like there is consensus around the expectation that Taiwan will "fall next" while there is no such thing.
Is your claim that Taiwan won't be next and it's someone else who is next? Taiwan is second? Or do you believe all these governments are spreading propaganda?
>You're also making claims about fabs being set on fire or being hacked by foreign entities without any sort of evidence.
I suspect you're not asking for me to prove APT groups are Chinese?
> If you believe this to be exaggeration... this is happening right now. Out of curiosity, which country are you in? How do you not know about this? 10 links provided and can be easily added to.
Yes, the media is crying wolf and also saying it won't happen, similar situation as around Ukraine/Russia, and similar to any other type of conflict in the world. Media organizations thrive on stirring the pot, and will always have scary headlines, especially when there is even a hint of anything approaching a conflict. I'm currently in Europe but live in different places in the world during the year, what about you?
> Is your claim that Taiwan won't be next and it's someone else who is next? Taiwan is second?
I believe that things are uncertain right now, and claiming that you know, or that everyone "expects" something to happen, is an exaggeration. Linking a bunch of online newspapers with names like "warblog" or "militarytimes", or articles about how Australia now has a deal with the US about nuclear arms, doesn't exactly improve your case. Most newspapers just want views like everyone else, and they'll write news that makes you click on the headline. It takes careful consideration and weighing different sides of a story to see what's going on.
> Or do you believe all these governments are spreading propaganda?
Yes, every government on the planet does this. Some are more obvious, some are less. Some do it for betterment, some do it for personal gain. But what's clear in every single government on Earth, is that they participate in propaganda.
> I suspect you're not asking for me to prove APT groups are Chinese?
I'm asking you to provide evidence for these two claims:
> Taiwan is expected to fall next.
> lots of fabs catching fire or being hacked and shut down
You're saying fabs are being set on fire and/or hacked by foreign entities, while still not providing any evidence of that happening in practice, essentially spreading conspiracy theories. The statement from AKM says "there is no clear prospect for identification of the cause of the fire or for restoration of operation of the plant" and the Reuters article makes no claims regarding the cause of the fire.
> Multiple fires hitting non-chinese chip fabs? Just a coincidence... nothing to do with the ongoing war...
It could be related, it could not. Without any sort of evidence-based approach, you have about a 0% chance of arriving at the correct conclusion. And while you don't have a clue what's going on, it's better not to spread theories that are not based on facts but instead boil down to "So you think this is just a coincidence huh?!".
>Yes, the media is crying wolf and also saying it won't happen, similar situation as around Ukraine/Russia, and similar to any other type of conflict in the world. Media organizations thrive on stirring the pot, and will always have scary headlines, especially when there is even a hint of anything approaching a conflict.
I'm right there with you in understanding the media's fake news and the somewhat recent move toward yellow journalism. That's not what is happening here. These are top officials, actual agreements, and very real reactions to the overthrowing of the government of Hong Kong.
It's challenging, but not everyone is fake news, and throwing this out by labelling it as exaggeration is wrong.
>I'm currently in Europe but live in different places in the world during the year, what about you?
Canadian.
>I believe that things are uncertain right now, and claiming that you know, or that everyone "expects" something to happen, is an exaggeration. Linking a bunch of online newspapers with names like "warblog" or "militarytimes", or articles about how Australia now has a deal with the US about nuclear arms, doesn't exactly improve your case. Most newspapers just want views like everyone else, and they'll write news that makes you click on the headline. It takes careful consideration and weighing different sides of a story to see what's going on.
I have provided many differing sources and you excluded all of them on the grounds they are fake news. This is fallacious and won't convince anyone of anything.
>I'm asking you to provide evidence for these two claims:
Why bother? You will instantly claim the evidence is fake news.
>You're saying fabs are being set on fire and/or hacked by foreign entities, while still not providing any evidence of that happening in practice, essentially spreading conspiracy theories. The statement from AKM says "there is no clear prospect for identification of the cause of the fire or for restoration of operation of the plant" and the Reuters article makes no claims regarding the cause of the fire.
Now I am a conspiracy theorist. Okay.
>It could be related, it could not. Without any sort of evidence-based approach, you have about a 0% chance of arriving at the correct conclusion. And while you don't have a clue what's going on, it's better not to spread theories that are not based on facts but instead boil down to "So you think this is just a coincidence huh?!".
By extension it competes with Intel and AMD [and NVidia?] - as phones swallow the world, both Apple and MS have ARM desktop OSes, and the big boys like Adobe are rewriting for ARM.
But I don't see how RISC has any hope, as ARM is already everywhere, and so much software has been written, and is being converted to being ARM compatible.
In fact, I don't see how x64 has any real future!
What am I missing with both AMD and RISC? How does RISC get such good press?
And - OT - how did anyone choose a name which projects the concept of being risky ;)
> But I don't see how RISC has any hope, as ARM is already everywhere
You are ignoring the very low end.
ARM requires a royalty. It's not much, but it is what it is. RISC-V requires no such royalty (that I know of).
For the lowest-end kinds of chips, a fraction-of-a-penny royalty is a big deal.
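A rough sketch of the scale, with made-up but plausible numbers (none of these figures come from ARM's actual pricing):

    # Hypothetical numbers, purely to illustrate the scale.
    royalty_per_chip = 0.005        # USD: the "fraction of a penny"
    unit_price       = 0.10         # USD: a 10-cent commodity microcontroller
    annual_volume    = 200_000_000  # units/year for a high-volume part

    print(f"royalty = {royalty_per_chip / unit_price:.0%} of the selling price")
    print(f"total   = ${royalty_per_chip * annual_volume:,.0f} per year")
    # -> 5% of the selling price and $1,000,000 per year, which is real money
    #    when margins on commodity parts are already razor thin.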
Currently, those chips all use proprietary designs. All the tools supporting those chips are absolute trash, as expected. Converging to a single architecture that can use all the same design tools and software ecosystem would be a big win.
I view RISC-V like I view autoformatters. It actually kinda sucks, nobody likes it very much, but converging to a single point brings more benefits than holding to your own unique opinions.
Aside: If I had my druthers, I'd converge backward to something like the DEC Alpha EV-4/EV-5 (21064/21164). EV-4 was good enough to provide the basis for the StrongARM architecture. The software tooling for those chips still exists, even if a bit dusty. MIPS's dumb-ass byte access patents have long expired, so we can have real byte access instructions. And in-order execution is often a plus in the embedded world rather than a minus.
I'm no expert, but since there are no sibling comments yet, let me offer my amateur view:
The name RISC-V (which I'm supposing you're referring to) comes from "reduced instruction set computer (RISC)" (from around the 1980s), with an added "five". It's an instruction set architecture (ISA), which is basically an abstract specification of what a CPU's instructions could be. RISC-V is not a company, but an "open standard" for this ISA. While ARM, for example, maintains an ISA, runs a company, and designs chips, RISC-V is maintained by a foundation which many companies are a part of.
RISC-V is exciting because it's one of the few open-standard ISAs with a lot of public references and available material for curious hackers, compared to what we've seen before. So far, the performance and the power consumption (as far as I know) haven't been as optimized as other architectures, so it's still early to tell whether it'll have some larger impact on the industry at large or not.
Intel and AMD are focused on the x86(-64) ISA, with a strong presence in the desktop and server space [1].
ARM is everywhere in mobile; they really cornered the "low-power" (i.e. energy) space way ahead of Intel and AMD. They provide a great ecosystem of IPs and tools around them. Think of them as the "batteries included" DIY provider for building chips.
But both of the above generally target general-purpose CPUs [2], the ones you usually think of as being the core of your computer. But there are others! DSPs have dedicated instructions for saturated math (no overflow/rollover) and multiply-accumulate, GPUs are DSPs on steroids, and there's otherwise an entire spectrum of complexity, from your 8-bit integer microcontroller that doesn't even have a divide instruction all the way up to massively-parallel floating-point processors.
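(If "saturated math" is unfamiliar: instead of wrapping around on overflow like plain integer arithmetic, the result clamps at the minimum/maximum value, which is what you want when the bits represent audio samples or pixels. A tiny illustrative sketch, with throwaway helper functions rather than any real DSP intrinsic:)

    # Wrapping vs. saturating addition for 16-bit signed samples.
    def add_wrap(a, b):
        return ((a + b + 0x8000) & 0xFFFF) - 0x8000  # plain 16-bit overflow behaviour

    def add_sat(a, b):
        return max(-0x8000, min(0x7FFF, a + b))      # clamp, like a DSP's saturating ALU

    print(add_wrap(30000, 10000))  # -25536: wrapped around into garbage
    print(add_sat(30000, 10000))   #  32767: clipped, but still a sane "loud" value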
Problem is, that spectrum was divided between many different proprietary ISAs, each with their various levels of tool sophistication and support.
RISC-V is based on the idea that we can do a better job unifying the ISAs across this spectrum of processor complexity. It's an extensible ISA that starts with basic integer-only operations and direct addressing, and can be extended with instructions for more sophisticated operations: 32-bit floating point, 64-bit floating point, supervisor mode, hypervisor mode, SIMD... All of which can share a common ecosystem of compilers, formal verification, debuggers...
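As a concrete (and simplified) example of how that extensibility is spelled out: a RISC-V target is named by a base plus a string of extension letters - rv32i for a bare integer-only microcontroller, rv64gc for the usual Linux-class core. A small sketch of decoding those strings (the decode() helper is just for illustration, not any real library; it ignores the Z* extensions and version suffixes):

    # Simplified: decode a RISC-V ISA string into the standard extensions it names.
    EXTENSIONS = {
        "i": "base integer instructions",
        "m": "integer multiply/divide",
        "a": "atomics",
        "f": "32-bit floating point",
        "d": "64-bit floating point",
        "c": "compressed (16-bit) instructions",
        "v": "vector/SIMD",
    }

    def decode(isa: str):
        isa = isa.lower()
        base, exts = isa[:4], isa[4:]          # e.g. "rv64" + "gc"
        exts = exts.replace("g", "imafd")      # "g" is shorthand for IMAFD (+ Zicsr/Zifencei)
        return base, [EXTENSIONS.get(e, "other") for e in exts]

    print(decode("rv32i"))   # a bare MCU profile: integers only
    print(decode("rv64gc"))  # the usual application/Linux-class profile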
But the real popularity of RISC-V is that you can use its ISA, and design a chip around it, freely without having to pay an (expensive!) license to ARM. This does present an existential threat to ARM, although they have a strong head-start in offering a ready-made ecosystem of tools. Do you want to spend $10M in engineering manpower and experimental silicon tape-out designing your new chip? Or just pay $2M to ARM to license their known-good CPU?
It takes a lot of effort, expertise (and iteration!) to design a CPU core that's as performant and feature-full as what ARM spent years (decades!) perfecting. Do companies choose RISC-V because it allows them to re-use their tools on novel chip designs, or because it's just cheaper and, new as it is, sufficient for their purposes? It's exciting to watch the development.
(Also I can't help but notice that many of the RISC-V chip announcements are coming from China)
[1] Fun fact: in the 80s and 90s, x86 was the "cheap, consumer" chip that was looked down upon by workstation-class RISC-type chips like the Sun SPARC or DEC Alpha (and much of Apple's chip design team sprang from a core of ex-DEC designers!). But Linux came along, Intel kept improving their chips, and took over the workstation/big-iron space from the incumbents.
[2] cue the corrections about ARM's (and Intel's!) GPU design, co-processor architecture...