
This isn't too surprising. They need money to continue the transition to their new Royal CPU baseline, and frankly everyone and their dog is now in the ARM market; if they want to see growth, it's time to move on to something else.



Needing the money is rarely sound strategic reasoning.


It's just a bit over $100M. An insignificant amount in the grand scheme of things.

Also, ARM's valuation is not exactly rational (their business model is pretty poor), so it seems like a reasonable time to sell.


It is when your products are trailing competitors on both fronts (AMD and Nvidia) and you're facing potential class-action lawsuits over existing products (13th- and 14th-gen CPUs).


The thing is that this is a tiny amount in the grand scheme of things. It's been a while since I looked at the financials, but IIRC it was something like $33b per year of revenue, and $26b per year for R&D. Selling off ARM is something like a day's revenue, so it won't have been done just to raise a bit of cash, more just a recognition that it has nothing to do with their core business right now or their future plans.

From those approximate figures above (which I probably misremembered), personally I wouldn't worry too much about Intel short-term. They can get back to positive cash flow quite quickly by cutting R&D expenditure, as long as they get past the PR headache of the current-gen chips, and seemingly the new microcode is reducing peak voltage spikes on single- and two-thread loads. There's a lot of bad press right now, but the majority of the business comes from data centres, which probably weren't cranking all the overclocking levers, and I'd guess any recalls will come almost entirely from the much smaller ultra-keen gamer segment.

Longer term, there's a risk if they cut the R&D spending too much and then find it hard to re-hire those engineers, but there's a good chance they'll get competitive again when they finally get their new process working, and that might be what they need to finally get their TDP levels back competitive with AMD again. It's worth watching the Asianometry video about backside power delivery - if they get this right, there's a very good chance of future Intel chips being more power efficient than AMD's.


[flagged]


>x86 needs to die.

Because?


It's an ISA designed last century. It's time to stop tacking on extensions and build something better.


Every architecture that survives gets filled with extensions. That's not the issue.

Even the time period is not the issue per se, though it plays a part. The real dividing line is more like 1985 than 2000, though.

CISC is the issue (at least, at the ISA level). Processors of that era were made to be "friendly" to assembly programmers. There are redundant instructions with slightly different semantics, there are many addressing modes, there are not many registers, and the instruction encoding is an illogical variable-length mess hidden away by the assembler.

RISC, on the other hand, starts from the premise that humans don't write much assembly anymore; they write high-level languages (yes, even C counts) which get translated into machine code by a compiler. An ISA is therefore a compiler target first and foremost, so it's better to spend silicon on more registers and more concurrent/parallel work than on addressing modes and convenience instructions.

Obviously this battle has already been fought and won on a technical level by RISC; that's what the R in Arm originally stood for, and no modern CPU design even considers CISC. Heck, even x86 uses RISC-like micro-ops under the hood and the 64-bit version of it added a lot more registers. But the CISC instruction decoder is still there taking up some silicon.

The real question is: does that extra silicon actually matter that much in practice anymore? It's dwarfed by the other things a modern processor has, especially the cache. The answer seems to be contextual even today. An x86 server, desktop, or gaming console running at or near full load on CPU-intensive tasks still outperforms Arm (though by less and less each year), even though Arm handily takes the crown at lower power levels. Is that due to ISA alone or other factors too?


Arm was also designed last century.


AArch64 was designed in 2011. It's not just A32 with everything widened to 64 bits; it's substantially different.


I'm not criticizing ARM, I'm criticizing x86.


I thought we were in the context of the competition between x86 and Arm but sure, what is a non-last century ISA then? RISC-V, a very conservative rehash of an architecture from the 1980s? Realistically there are no other commercially viable contenders, sadly.


If you want to criticize x86 then you may want to support the assertion. Maybe something along the lines of bloated legacy features and horrible power requirements. However, in my opinion it's perfectly fine for a desktop computer. Just not so great for phones, tablets and portable computers.


> horrible power requirements

ISA makes very little difference these days when it comes to power requirements.[1]

[1] https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-...


I'm pretty sure that x86 could be more power efficient than it is now. But I have rarely run into an implementation of x86 that wasn't pretty dang power hungry. Probably for the explicit reason that it isn't a huge priority for the stuff they run it on.

I think the lowest power ones are Atom and Celeron. All the other low wattage CPUs are RISC based. I think AMD is also coming up with something similar.


Nearly all x86 CPU architectures originate from desktop/server CPUs which had much higher thermal and power limits, but nearly all ARM CPU architectures originate from embedded CPUs where there are strict thermal and power limits.

It's not as if there's something fundamental about x86 that makes it use more power. It used to be the case that x86 CPUs had many more transistors (and thus more power usage and heat) due to the need to implement more instructions, but I don't believe that's true anymore. I think the situation right now is that it's easier to scale an architecture up than to scale it down.

If you look at something like Project Denver, it started as an x86-compatible CPU that would have had ARM-like power usage.


> I think the lowest power ones are Atom and Celeron. All the other low wattage CPUs are RISC based. I think AMD is also coming up with something similar.

AMD had Bobcat and later Puma; IIRC there were some <5W parts in both of those families. I still wonder -why- they never did a dual-channel version of those parts for desktop; my guess is it would have made Bulldozer look extra bad somehow.


>...horrible power requirements. ... perfectly fine for a desktop computer. Just not so great for phones, tablets and portable computers.

Except no one buys desktop computers any more: everyone's using phones, tablets, and laptop computers. There's also servers, but even here power efficiency is important; reduced power requirements for a datacenter would save a lot of money, not just in direct electricity consumption by the servers, but also from reduced cooling needs.


Ah, that is why Sony and Microsoft are now eyeing PC versions of console games: because no one ever buys gaming rigs, aka desktop computers, any longer.

Also, what do you think most contributors to LLVM, GCC, CUDA, VFX, etc. use as a daily driver?


https://www.finaria.it/trading/laptop-sales-to-generate-almo...

It's not clear that the 'contributors to LLVM' market is going to make up the shortfall.


As I understand it, PC sales have been pretty flat. It's just that their market share has been diluted by the monstrous number of tablets and phones.

https://www.statista.com/statistics/273495/global-shipments-...


I don't think those numbers are correct at all: they appear to be grouping laptops in with "personal computers". We're talking about desktop computers here.


>Except no one buys desktop computers any more

Wtf?

>Power efficiency

ISA has a small impact on it.


Consumer sales of desktops have been falling for years.

A lot of regular office workplaces moved to laptops+docking stations as well.

People still buy desktops, but it's a dying breed outside of gamers, artists, and those with heavy compute requirements.


Yeah, and not only that, many of those "desktops" are probably "workstations". We had heavy compute requirements at my last job so we were assigned workstations with dual Xeon CPUs, not standard Intel desktop chips.

Outside of a couple unusual places like that, laptops have been the standard for workplaces over the last couple decades for me.


At my employer that started during covid.

Before that, laptops were not quite limited to managers, but they did signify higher status (if not necessarily pay). Today it's iPads that signify status.


You're not, though. Or at least you didn't provide a single argument.


Name the traits a new ISA would need to have that would bring significant improvements.


You'd need to port existing programs to your dream arch. It was tried before, with the Itanic. /s




