Intel plans thousands of job cuts in face of PC slowdown (bloomberg.com)
304 points by oumua_don17 on Oct 12, 2022 | 454 comments




I think this is the first major layoff announcement at a big, established tech firm I've heard this year that is anywhere near this size. I mean, there have been loads of "couple hundred" employee layoffs, and certainly a lot of hiring slowdown/freezes, but again, haven't seen anything else near this big, and the layoffs previously seem to have been concentrated in unprofitable VC-funded companies. I know Tesla laid off a couple thousand people this summer, that's the closest thing that comes to mind.

Intel certainly has its own unique challenges, but even, for example, AMD had been doing pretty great up until this year, and they also just announced a big shortfall for their 3rd quarter results. Just wondering if this is really the tip of the iceberg for a true, broad retrenchment in tech, after the past 9 months of "Is a recession coming? Is a recession coming?"


To be clear, AMD third quarter revenue is expected to be approximately $5.6 billion, an increase of 29% year-over-year. Previous guidance was 55% YoY.


Yeah, but isn't their one year forecast much lower? Not to mention they have a bigger percentage of business in areas other than PC CPUs, right?


Client/consumer CPUs (Ryzen) are just one business for AMD. The others are graphics (Radeon), datacenter (EPYC), and embedded (Xilinx, PS & Xbox consoles). The one-year forecast will probably have to be adjusted based on current unexpected economic conditions, but they will keep growing.

What folks don't appreciate is that AMD is on a roll: a clear roadmap and processes in place to execute like clockwork in tandem with TSMC. Intel can easily bounce back in about 3-4 years if they fix their foundry issues.


If they hadn't cut fab R&D years ago to juice profit margins, they wouldn't be in this situation now. I have no reason to believe they will suddenly start acting intelligently or with any real long-term thinking. Most likely it's just about managing the decline at this point (please Pat, prove me wrong)


It's still not clear that Intel shouldn't just go fabless like their competition. They're going for a hybrid approach right now where some of their own stuff is produced elsewhere. They're trying to get some manufacturers to use their fabs but they're nowhere near the state of the art. They lost that battle to TSMC a long time ago.


Will not happen for national security reasons. DoD standard policy is that high-criticality components must be able to be made domestically. Or at least it was at some point.


Yes, but: the CHIPS act plans to increase domestic semiconductor manufacturing significantly. In the future, Intel might be able to go fabless with domestic manufacturing partners to satisfy DoD requirements.

“America invented the semiconductor, but today produces about 10 percent of the world’s supply—and none of the most advanced chips. Instead, we rely on East Asia for 75 percent of global production. The CHIPS and Science Act will unlock hundreds of billions more in private sector semiconductor investment across the country, including production essential to national defense and critical sectors.”

https://www.whitehouse.gov/briefing-room/statements-releases...


Just because Intel would go fabless doesn't mean the fabs themselves would disappear. They could spin them out just like AMD did when they were in trouble.


This, AMD's CPUs are better than ever. Their upcoming GPUs are looking very promising, especially considering how team green turned into a luxury brand. And their support for Linux has been very welcome.


Also, their CPUs with integrated Radeon graphics perform a lot better than the Intel equivalents. You can grab a 5600G or 5700G alone, no external video card needed, and expect to play most recent games, albeit not at maximum detail. For those not obsessed with gaming at the highest possible detail, this translates into saving some serious money. A search for "5700g gaming" on YouTube returns some examples.


This, GTA 5 on integrated graphics blows my mind.


Yeah, I think they do significantly more graphics and embedded than Intel. But I could be wrong.


The graphics business should fare worse than the CPU business now that the crypto bubble is gone and Ethereum has moved to proof of stake. Gaming PCs aren't a very big market for CPUs, but they are a big one for GPUs.


Steam Deck as well.


Oracle is/was laying off thousands as of August.

Within the past 12 months:

Peloton laid off over 4000. Snap 1280 (which is 20% of the company), Shopify 1000 (10%). Groupon laid off 15%. Salesforce, 1000. Microsoft, 1800. Carvana (while not "big established", it's still a lot of people) laid off 2500. Tencent is laying off 5500. Alibaba: 9500, ByteDance: 1000, Zillow 2300.

This definitely isn't really the first major layoff announcement.

Outside of big tech: Credit Suisse is laying off 5000, Ford 8000, Telefonica 2700, Societe Generale 3700.

A bunch more, but these are the ones measured in thousands or otherwise a significant percentage of a company's workforce.

(By the way, not arguing with you; my point is that this isn't surprising, the writing has been on the wall for the past year, so an Intel layoff isn't a bellwether for things getting bad -- it's a lagging indicator of things already being bad.)


Yeah, understood it's a lagging indicator, but most of those US companies you mention (Peloton, Snap, Shopify, Carvana) are in the "unprofitable VC-funded camp". Salesforce and Microsoft obviously aren't, but their numbers are also much, much smaller and the amounts are a really teeny percentage of their overall workforce. Zillow is a bit of a special case because of their complete f'up with flipping houses. Oracle feels like the only one really comparable to me, but perhaps I'm just showing my personal bias that I'd really wish Oracle would lay off everyone and go under, but I digress...

I guess my main point was that, even with recent layoffs, feels like most of those folks wouldn't have had much difficulty getting snapped up by other companies, especially in engineering (not saying it wasn't disruptive to those involved). But once you start laying off 20,000 here and 10,000 there, you get to the musical chairs point where some folks are going to be left without a chair for some time.


None of the tech companies you mentioned besides Microsoft and Alibaba are BigTech. Most of them were bad businesses that VCs were able to pawn off on the public markets during the bubble. They were more or less a Ponzi scheme.


> My point is that this isn't surprising, the writing has been on the wall for the past year, so an Intel layoff isn't a bellwether for things getting bad -- it's a lagging indicator of things already being bad.

Yeah. It certainly was not a surprise and the market euphoria was going to all end in tears. Saw that a mile away several months before it happened. [0]

[0] https://news.ycombinator.com/item?id=29508238


Where did you see Microsoft / Alibaba?


Here's 200, plus the article references the earlier announcement of about 1800.

https://www.theverge.com/2022/8/10/23299499/microsoft-layoff...


I don't know a lot of the details, but everything I've heard over the last couple of years indicated that AMD was absolutely crushing Intel.

A recent laptop I purchased, as well as the last desktop I put together (~2 years ago) each have Ryzen chips. I forget the details but in addition to performance issues, didn't Intel CPUs also have some major security vulnerabilities? And was it that they were related to instruction-level performance optimizations that, when disabled to address the security vulnerabilities, led to even worse performance?

So if AMD isn't doing great at the moment either, I can't imagine how hard Intel has been hit. I don't know anyone who is buying or recommending Intel CPUs at the moment.


I don't have any data, but from word of mouth and variously seeing posts and videos online, 12th gen Intel CPUs seem pretty popular for gaming builds. They're winning in benchmarks against 50 series AMD (as they should, being newer), but are also cheaper. I'll be curious to see how 13th gen Intel vs. 70 series AMD plays out. There are always complicating factors, such as motherboards for 70 series AMD being quite pricey for now.


Intel repeated their Prescott (Pentium 4/Pentium D) strategy of completely removing power limitations on their chips to beat AMD.

As a result, the Intel processors have TDPs and real world power usages >2.5x that of a comparable Ryzen. Sure, they're winning, but at what cost? The 12900K at 240 watts pulls almost the same power as a 280W 64-core Threadripper.

AMD is responding in kind, with new top-end processors pulling 170W, or higher with their built-in overclocking that pushes the chip to even higher power draws as long as cooling permits. This looks to put them back into the lead, but it's just not a sustainable strategy.


I'm not sure if the actual differences in energy usage are so clear cut. These charts [0], which account for the 12900K spending less time to accomplish tasks than the 5950X, seem to indicate the disparity isn't so terrible. You can always under-volt too; the 13th gen press release includes charts [1] showing that the 13900k at 65W matches the performance of the 12900K at 241W.

[0]: (Search for "cumulative amount of energy required to perform Blender and x264 and x265 HandBrake workloads") https://www.tomshardware.com/reviews/intel-core-i9-12900k-an...

[1]: https://arstechnica.com/gadgets/2022/09/intels-first-13th-ge...
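To make the energy-vs-power distinction concrete, here's a minimal sketch (with made-up wattage and runtime figures, not the numbers from those reviews): a chip that draws more watts but finishes the task sooner can land near parity on total energy.

    # Illustrative figures only: (average package power in W, task runtime in s)
    tasks = {
        "higher-draw, faster": (240, 600),
        "lower-draw, slower": (140, 980),
    }
    for name, (watts, seconds) in tasks.items():
        wh = watts * seconds / 3600  # energy used for the task, in watt-hours
        print(f"{name}: {wh:.0f} Wh")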


Most people don't turn their computer on to run HandBrake and turn it off immediately afterward.

Hence power draw is meaningful for desktop computers, not just energy usage for a task.


Idle power draw is not the same as peak power draw either; these aren't 'nineties chips.


Who cares what a desktop processor plugged into a wall outlet consumes? Seriously, if you care about power consumption, don't buy a K processor.


When a 5800X3D offers equal or better gaming performance at 1/2 to 1/3rd the power consumption (e.g. 220w vs 80w), and you pay $0.65 per kWh during peak times, I can only imagine: "Quite a few".


If electricity was $0.65/kWh for me, I'd move out of the country lol. Assuming a more realistic $0.40/kWh (considered very expensive in the US, where most enthusiasts live), 8 hours of gaming a day, and a 200W power limit, you're paying $19/mo. Not bad.

Don't get me wrong, the 5800x3d is a phenomenal CPU. However like many owners of that CPU, I'm also in the market for a 4090 and intend to use the full 600W power limit with my water loop and outdoor radiator. CPU power consumption is just not an issue for enthusiasts.
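For anyone who wants to check that arithmetic, here's a minimal sketch using the same assumed figures from above ($0.40/kWh, 8 hours of gaming a day, a 200 W limit -- quoted numbers, not measurements):

    rate_per_kwh = 0.40      # $/kWh (assumed, per the comment above)
    hours_per_day = 8
    watts = 200
    kwh_per_month = watts / 1000 * hours_per_day * 30
    print(f"${kwh_per_month * rate_per_kwh:.2f}/month")  # ~$19.20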


> (considered very expensive in the US, where most enthusiasts live)

That's very dismissive of the entire EU, which has lots of enthusiasts, and these aforementioned expensive electricity prices.

Some electricity providers in Germany even raised their prices to 1.35€/kWh (most famously DB Netz Bahnstrom plans)


Nice, ha ha. I was going to say, look into Undervolting. I saw 95% perf at 60% of the power come up, ie 270w; that's actually going to be superb (very cool, quiet, still extremely performant).

So maybe configure that as an optional profile for titles not needing maximum juice?

I'm quite keen myself.

Power $ mentioned was peak times in Norway; I'm in Australia where it's nowhere near that bad yet (0.25 AUD/kWh for me).


> I saw 95% perf at 60% of the power come up, ie 270w

That's neat, but 5% is a lot when you're spending $1600. Oh and all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals. Specifically, you're limited by hotspots you can't see because they're between the on-die temperature sensors.

> maybe configure that as an optional profile for titles not needing maximum juice?

4K gsync monitors are typically limited to 120Hz. If my GPU is capable of running a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit. So tuning this is pointless. I'd still get 1/10th the rendering latency compared to a GPU that's only capable of 120fps. Same logic applies to power limiting your GPU which will increase latency and kill frame pacing in exchange for a tiny reduction in electricity bills.


> Oh and all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals. Specifically, you're limited by hotspots you can't see because they're between the on-die temperature sensors.

This completely misunderstands heat transfer. If the hotspot temp is 75 deg even with overclocking you're not limited by thermals: https://www.youtube.com/watch?v=zc-zwQMV8-s

>4K gsync monitors are typically limited to 120Hz. If my GPU is capable of running a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit. So tuning this is pointless. I'd still get 1/10th the rendering latency compared to a GPU that's only capable of 120fps. Same logic applies to power limiting your GPU which will increase latency and kill frame pacing in exchange for a tiny reduction in electricity bills.

This completely misunderstands power consumption and the nonlinear relationship between power and clockspeed. Performing the same work in the same time in short bursts of high clocks and high voltage uses more power than constant low clocks and voltage.
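A minimal sketch of the voltage-squared part of that argument, assuming dynamic energy per cycle scales roughly with C*V^2 and that higher clocks require higher voltage (the constants below are made up for illustration, and leakage/static power is ignored):

    def dynamic_energy(cycles, capacitance, volts):
        # Rough CMOS model: energy ~= switched capacitance * V^2 per cycle
        return cycles * capacitance * volts ** 2

    cycles_per_frame = 1e9   # same amount of work either way (illustrative)
    c_eff = 1e-9             # effective switched capacitance (made up)
    print(dynamic_energy(cycles_per_frame, c_eff, 1.30))  # bursty high clock/voltage
    print(dynamic_energy(cycles_per_frame, c_eff, 0.90))  # steady low clock/voltage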


$50/year vs $18/year for a PC that's always on, not exactly life changing.


The difference is closer to $1250/year vs $450/year. I suppose you forgot to multiply by 24 hours/day.


That rather depends on how many hours a day you use your CPU at max performance, which is likely closer to 1 hour for the average person.
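For what it's worth, the figures being thrown around are consistent once you account for duty cycle. A minimal sketch using the $0.65/kWh peak rate and the quoted 220 W vs 80 W draws (assumed figures, and idle draw during the remaining hours is ignored):

    rate_per_kwh = 0.65                  # $/kWh, the quoted peak rate
    for hours_at_load in (24, 1):        # always-on vs ~1 h/day at full load
        for watts in (220, 80):
            kwh_per_year = watts / 1000 * hours_at_load * 365
            print(hours_at_load, watts, f"${kwh_per_year * rate_per_kwh:.0f}/yr")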


Aw crap


It matters for SFF builds where people often want the highest performance possible but are severely limited by cooling. Efficiency becomes extremely important.


People with good insulation or who like quiet rooms?


> Who cares what a desktop processor plugged to a wall outlet consumes?

Anyone without solar panels?


Those 5 people running off-grid with solar panels only certainly do care about power consumption and wouldn't buy a K processor.


Traditionally nobody, but the massive power consumption of the RTX 4090 combined with a power-hungry CPU might make people take notice. When your desktop needs a dedicated circuit you have a problem.


My microwave uses 1200 or 1500 watts and does not need a dedicated circuit. Sure, I don’t run it continuously for an hour, but that has nothing to do with whether or not it needs a dedicated circuit.


A 16A circuit allows for 3.8kW or something like that @ 240v.

With these ludicrous power requirements, you may in fact need to rethink how you use your power. E.g. if it's on the same circuit as a non-heat-pump dryer, you'd have to make a choice between gaming and drying clothes.

Having said that I saw a reference to a 4090 offering 95% of the perf at 60% of the power usage if undervolted, so that becomes an attractive option now.

I absolutely love my 5800X3D's insanely low power usage (insane = performance per watt for gaming in simulator titles where it runs circles around Intel).
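A minimal sketch of that circuit budget, assuming the 16 A breaker at 240 V mentioned above and illustrative appliance draws (the PC and dryer wattages are made up; US 120 V circuits have far less headroom):

    circuit_watts = 16 * 240                                  # ~3840 W available
    loads = {"gaming PC": 1000, "non-heat-pump dryer": 2500}  # illustrative draws
    print(circuit_watts - sum(loads.values()))                # remaining headroom, W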


Of the last gen stuff, the 5800x3D is arguably among the best bang for buck, including for use in gaming builds. The applications where Intel's 12900K/12700K hold a significant advantage against the other chip with its 96MB of L3 cache typically aren't applications desperately in need of cpu power. IME it's poorly-optimized or hard-to-optimize software with bad cache coherency that most demands speed, and it's in those cases that the X3D delivers.


Gamers prefer anything that has better performance for cheaper price.

Though overall, having a more stable PC that consumes less electricity is better in the long term, and people really do see that.

It shows in the sales numbers.


The PC gaming market is tiny in the grand scheme of things.


I was looking for a laptop for work and decent gaming, and I went with an Intel NUC. Everything pointed to Ryzen being far better in most respects - power, weight, height, and importantly battery life - but for the price point I was looking at, the faux-no-brand Intel with the 3060 was better, even if the chips do seem to lag behind for similar generations.


AMD also had a lot of those major bugs, but they weren't reported on as much, since the original papers all targeted Intel CPUs (they're the "standard"), and then a couple of months later someone would do it for AMD, but by then it was old news.


AMD's CPUs, including the latest Ryzen and Epyc processors, are immune to:

    Meltdown (Spectre v3)
    Spectre v3a
    LazyFPU
    TLBleed
    Spectre v1.2
    L1TF/Foreshadow
    SPOILER
    SpectreRSB
    MDS attacks (ZombieLoad, Fallout, RIDL)
    SWAPGS
https://www.tomshardware.com/features/intel-amd-most-secure-...


Are any of these attacks relevant unless you run a cloud service?


All of the attacks are relevant in any environment where you execute code that you would rather not have access to any other information on your system (i.e. where you actually care about privileges).

For example, there have been demos of Spectre in-browser, but the same applies to multi-user environments or simply apps that you didn't grant admin privs...

If you treat your computer security in a DOS/Win95 fashion, then none of them are relevant.


AMD CPUs were vulnerable to fewer of the speculative execution bugs than Intel, and the performance impact of the mitigations on AMD CPUs was far less than on Intel.


> I don't know anyone who is buying or recommending Intel CPUs at the moment.

Aren’t the major cloud providers still mainly running on and buying Intel?


AWS developed two generations of an in-house ARM chip (Graviton), which I think is cheaper than an x86 instance of equivalent power. Cloudflare has also begun switching to ARM, for 57% more performance per watt:

https://blog.cloudflare.com/designing-edge-servers-with-arm-...


IIRC GCP is using AMD EPYC.

Not sure about Azure.


GCP offers instances on many processors, including AMD EPYC: https://cloud.google.com/blog/products/compute/amd-epyc-proc.... Here's another one: https://cloud.google.com/intel/


Azure uses Intel, AMD and Ampere. You implicitly select your CPU when you pick the virtual machine family: https://learn.microsoft.com/en-us/azure/virtual-machines/vm-...


The cloud providers have been buying Intel for years and can't afford to just throw those chips away. AMD is capturing a much larger proportion of data center CPU sales, but you should still expect to see them buying Intel as well, since AMD can't produce enough chips to satisfy all demand. Also, qualifying new platforms is very expensive, so cloud providers will continue to buy already-qualified platforms for as long as the perf/W makes sense.


The past 2 companies I've been at have or had huge efforts to migrate large parts of infrastructure to AWS Graviton which are 64-bit ARM iirc.


AWS certainly offer AMD boxes as well (M5a, M6a, T3a etc), but nothing like the broad range of Intels, and not even the same kind of offerings as Graviton.


ARM is increasingly being a factor there afaik.


AMD chips had the same branch prediction side channel vulnerabilities that intel did. At this point their offerings are pretty similar with intel being a bit cheaper but using more electricity.


With the new releases, it seems that the 5800X3D takes the crown for single-threaded tasks and video games even in Intel's 13th-gen benchmarks. The 7950X is closely trailed by the 5800X3D.


Similar to response to maldev higher up - AMD has some of the same issues, but far fewer. And the mitigations had less of an impact.


AMD revenue 2021 -> $16.4 billion. Intel revenue 2021 -> $79.024 billion.

But to be fair, AMD is/was growing while Intel is stagnant and losing market share right now; with massive investments and getting back on track with its fab process, I would assume Intel will soon start to grow again (of course, after the end of the recession).

We have to remember AMD is making money off TSMC's node advantage over Intel, which I assume won't last forever, and if TSMC blunders even one node it could be catastrophic for AMD. Considering the USA's shift and focus on the tech war with China, Intel's fabs can only grow faster, or everything can crash.


We just ordered 4 new dual Xeon workstations for the office.


What kind of jobs do you run on them?


> everything I've heard over the last couple of years indicated that AMD was absolutely crushing Intel.

AMD market cap: $93B

Intel market cap: $104B

And now you will no longer be able to say that everything you've heard indicates that AMD is "absolutely crushing Intel".


Given their respective revenue, AMD's market cap does indicate they're crushing it.

Their multiplier means that the market thinks highly of their ability to continue to grow.


Given that AMD shares are down more than 61% YTD (vs 52% for INTC), I'd say neither company is setting positive expectations for the future.

*edit added YTD for clarity


I've seen another comment saying their revenue is way up YoY. What does this mean when compared with share price changes?


AMD market cap in 2015: $2B

Intel market cap in 2015: $150B

Kinda looks like crushing.


I am not sure if Intel counts though. They have had challenges for the last several years during the bull market, which they failed to address; now they are taking advantage of the bear market to cull numbers. If we see someone who did not have these very visible weaknesses cutting numbers, that would be a sign, not Intel. Though I agree Intel is huge and them laying off people still is a big deal.


I’ve seen a whole lot of stories like this at larger and larger companies. There was one like it about Facebook yesterday. They’re shorting their own stock [predictions] and adding employment insecurity to reinforce that. Big tech companies would love to benefit from a less competitive hiring landscape and the “recession?” narrative fits that well.

Too bad, and so sad, that they’ll be just as eager for talent when talent inevitably moves on and finds they’re happier where they landed when they were being used as stock price pawns.


Anecdata here but: literally every single dev I talk to on a regular basis has told me their company has laid off a sizable amount of staff in the last 3 months. Every single one.


That’s very much anecdata. Were the companies profitable or were they living off of VC funding?


Yes, that’s… why I was upfront about that.

The companies range from startups to enterprise, at practically all stages of funding, but they each told me their funding never materialized or investors gave management ultimatums.

A few were in crypto, two are in fintech, a few more in various b2b tech companies. But they all gave almost identical explanations: management sees a rough economy ahead and tightened their belts accordingly.

I’m not trying to be doom and gloom, I still have a job, but even my partner, who is also a software engineer, just survived a round of lay-offs at their fintech that happened yesterday. They laid off 25% of their company across the board.


That seemed to be the feel in 2008 as well. Lots of companies that were fine choose to have layoffs just because they “saw” a rough economy ahead. Self fulfilling prophecy, or just trimming the fat…?


In 2008, spending was in the toilet after the housing crash. That’s not the case now.


Do you think inflation and supply chain issues could have a similar effect as a decrease in spending?


Why would “supply chain issues “ reduce spending? If a company is having supply chain issues, that by definition means that there is more demand than they can supply. Meaning people are willing to spend money. What are companies going to do when they get supply if they lay off workers?

Besides, if you work for any tech company, you should be able to throw your resume in the air and get another job. Even if you are a blue collar worker, there are plenty of companies looking. I have a friend who works in finance for a major manufacturer. He said the company had to be a lot more lax about firing factory workers because they already had a shortage.


Because they can't get the supplies to manufacture, and therefore sell, their own products, which reduces their revenues.


And by definition none of them had a profitable business model. The investors were hoping to pawn off their investments to an acquirer or the public markets - ie “the greater fool”. The investors knew they couldn’t find a bigger fool and are left holding the bag.

They were “tightening their belts”; a profitable company doesn't have to worry about that.


Did you read what I wrote at all? Why are you being so combative, lol. My partner’s company is not seeking funding and is profitable. Still had layoffs. Another friend’s company is profitable but did happen to be seeking investment. Still saw lay offs. Why do you insist on looking at this through the lens of funding and refuse to acknowledge that companies see a rough economy ahead, regardless of their funding status?


Even the profitable ones are under pressure to show YoY and QoQ growth. To make that happen, belt tightening has to happen. So the takeaway is that no one is growing or expected to grow in the coming year.


The startup I worked for before my current company sold services to health care systems. It was hit hard by Covid. Hospitals were losing money and not paying for new products or even old ones.

My CTO said specifically “we need everyone we have and we aren’t going to be successful by laying off people. We have a vision and we need you all to help execute it”. They were bought out less than 9 months later for 10x revenue. I had moved on to $BigTech by then after leading their “cloud modernization” efforts.

The profitable tech companies are using it as an excuse to get rid of dead weight. No one is going to come after Cook, Jassy or Nadella for short term revenue misses.

Facebook and Google still have founders who own more than 50% of the voting shares. No one can come after them.

You can say a lot about the big 5. But none of them can be accused of short sightedness.


DocuSign laid off ~over 7000~ 7% of 7000, as well: https://www.cnbc.com/2022/09/28/docusign-to-cut-workforce-by...


It looks like DocuSign has just over 7,000 employees and it's laying off 7% of them (so somewhere around 500-600)?


Oh, you're right, I stand corrected, thank you!


How in the world does DocuSign have 7k employees? I'm genuinely baffled. 700 employees would seem like a stretch.


Here is an attempt to crowdsource the answer on hacker news:

https://news.ycombinator.com/item?id=33012137

Spoiler alert: no really good answers


If you look at Intel's history, they do this consistently and at predictable cadence regardless of the state of the broader economy.


Anecdata; a friend at Dresden has been interviewing people from Xilinx all week.


Intel has the legend of a corporate turnaround after a huge layoff in its history.

I'm not sure if I can take this as a harbinger of things to come, and I'm saying this while generally being pessimistic about the economy. I think Intel is a deeply troubled business and some deep layoffs were a long time coming, because Intel was spread real thin across so many businesses.


The layoffs are happening in tech. Not as much elsewhere. This is .com bubble 2.0.


> The layoffs are happening in tech. Not as much elsewhere. This is .com bubble 2.0.

That's complete nonsense.

"The Automotive sector leads all industries in job cuts this year"

"Retailers led in job cut announcements in September"

"Technology companies followed"

Source: https://www.challengergray.com/blog/job-cuts-surge-46-in-sep...


I've read that currently the semi industry is in the negative part of a cycle.


What positive side of a cycle will cause a resurgence in personal computers in the age of mobile? Servers are increasingly using custom purpose built chips. Intel would be better off being a fab.


Where?


Hopefully for them that includes executives and managers; they are the ones responsible for the death of that company, not the engineers.

Congrats to Apple and AMD for having done better

Microsoft ignoring AMD in their latest Surface lineup is a hint for the people looking for answers


> Microsoft ignoring AMD in their latest Surface lineup is a hint for the people looking for answers

I think the entire new surface line-up is a giant conspiracy.

The Surface Pro 9 5G has a Qualcomm 8cx Gen 3 with some tweaks, called the SQ3. It does not match the performance of 12th Gen Intel in a 15W power envelope, which is important because the Surface Pro 9 non-5G (Intel) uses an i5-1235U, a 15W chip. But the last gen, the Pro 8, used a 28W 11th-gen part. This means that the y/y performance is actually about the same (but better power/heat). If they had used a matching 28W CPU it would have been a much bigger jump compared to the ARM SQ3 5G.

But then, it gets WAY weirder.

The new (FOUR THOUSAND DOLLAR) Surface Studio 2+? An 11th-gen laptop CPU (11370H)...which is as fast as the SQ3 will be.

Surface Laptop 5? Exact same specs as Surface Pro 8.

Casual reminder that the now two year old Apple M1 still smokes every single chip I just mentioned, all in a fanless iPad Air. The would-be Surface Laptop AMD Ryzen 6800U would be right on par with M1...which would make that SQ3 look like a bad deal.

Microsoft is sandbagging the lineup for ARM.


People need to remember that desktop/laptop ARMs outside of Apple have kind of royally sucked...

E.g. when the first Surface Pro ARM was released (the X, 2020ish), it was ridiculously more expensive, slower, and had less battery life (!) than the cheaper Intel options https://www.notebookcheck.net/Microsoft-Surface-Pro-X-Review...

The previous Surface non-Pro ARM attempts (RT, 2012ish) were even crappier (Tegra procs) but at least they were cheaper than the Pro/Intel variants...


> People need to remember that desktop/laptop ARMs outside of Apple have kind of royally sucked...

I've got an SQ1 and Apple M2 and they're both truly excellent.

(disclaimer: I worked years in engineering in both Qualcomm and Apple)


Got some benchmarks?


Good point: they're excellent for MY benchmarks, to your point (I think).

Essentially I run them all day with Visual Studio Code and vim sessions, Firefox scanning news and documentation (and of course normal email etc), and ssh to cloud hosts.

Your mileage and evaluation could of course vary.


> the now two year old Apple M1 still smokes every single chip I just mentioned, all in a fanless iPad Air

Well yes, but then you have to use macOS and the locked-down, closed ecosystem that is Apple. That's not really something they're directly competing against.


Tell me again how MacOS is more locked down than Windows and why most people should care?


The iPad runs iOS, which I suppose is what they meant. Windows ARM is less locked down than Windows S. iOS is still a walled-garden with a few holes here and there.

The second part of your question is key though "why most people should care." They obviously don't. It "just works" and generally keeps them from doing insecure shit. Want to buy hardware from someone else? Tough shit! Why would you anyway? You've got money to burn and no desire to write code that runs on your box without a second "real" computer. Buy some more lightning cables while you're at it. Don't forget to mention green bubbles next time you message an Android peasant.

I said that last part pretty snarky, but you're not wrong about most people not caring. That's their audience and they've nailed it.


> The iPad runs iOS, which I suppose is what they meant. Windows ARM is less locked down than Windows S. iOS is still a walled-garden with a few holes here and there.

The conversation never mentioned “iOS” or phones and the submission is about Intel who has nothing to do with mobile or Android. He specifically said MacOS:

>> Well yes, but then you have to use macOS and the locked down close ecosystem that is Apple.

What does any of your response have to with Intel, Windows or MacOS?


It is mentioned a long way up this thread:

>> Casual reminder that the now two year old Apple M1 still smokes every single chip I just mentioned, all in a fanless iPad Air.


That's a stretch when my question was specifically about the Mac vs Windows and the person I replied to specifically cited macOS.


[flagged]


Or you couldn’t truthfully answer the question. How is the Mac more locked down than Windows?

What class of software can you run on Windows that you can’t run on Macs because of Apple’s operating system?

Edit: I see downvotes. But no one can answer the question.. .


Most of the time it's the other way around, so programs which are used on MacOS but aren't available on Windows. Moving to MacOS is pretty doable I think, but leaving the Apple-sphere is a bit more complicated.

And this is not strictly MacOS but iOS: when I had an Android phone I could just connect it to a Linux or Windows laptop and copy files to and from it; the same probably also works on a Mac. Now that I have an iPhone I can't even transfer files between my PC and my phone, unless I install iTunes and/or iCloud or download pictures through the iCloud web-interface.

I thought at this point it was a well known fact that Apple devices work perfectly with each other, but that it gets complicated when you want to use a different OS in combination.

Can list more examples if you want...


MacOS actually doesn't support MTP out of the box (which most non-ancient Android phones use). You need to get something like Android File Transfer to do it. They just don't want to support anything other than their own garbage.


So you can’t transfer files to your Apple device unless you use the Apple provided software to put files on your Apple device?

I fail to see the issue.

And it still didn’t answer the question about how Macs are more locked down than Windows.


You can transfer files to a Google device using a built-in feature of your Microsoft operating system on a Dell laptop.

And in the past you could transfer files from your Windows phone to your Linux workstation with zero extra software other than your DE. Ditto BB10.

God forbid you need to get a file to your iPhone from a Linux computer sans net connection.


> Edit: I see downvotes. But no one can answer the question.. .

You likely would have had fewer downvotes if you had just answered in earnest the question asked: "Are you asking a question or being dismissive and flippant?" Then maybe you would have gotten an answer or two.

Instead, choosing to reply with inflammatory language like "you couldn't truthfully answer the question" seems to indicate you are probably not interested in any rational discussion, but are instead going to dig in and dismiss or attack anyone who disagrees with you. Certainly doesn't make me want to spend any time explaining the bleeding obvious to you, but perhaps someone else will.


And yet, you also are unable to answer the question…


> And yet, you also are unable to answer the question…

Correction: Unwilling. Not unable. This is an important distinction. I'm simply unwilling to engage in a lengthy discussion with you, on any topic really, based on your uncivil behavior here.


Yes because the original question I replied to was

> Are you asking a question or being dismissive and flippant? You'd don't really need to think very hard to figure it out now ... do you.

Was the model of civility and “assumed positive intent”

And you still couldn’t answer the question…nor has anyone else including the original poster who went silent…


> Please don't comment about the voting on comments. It never does any good, and it makes boring reading.

https://news.ycombinator.com/newsguidelines.html


Since we are quoting rules…

> Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.


> What class of software can you run on Windows that you can’t run on Macs because of Apple’s operating system?

First-class containers.


What are “first class containers”? Are you claiming that Macs can’t run Docker?


They can run it, but not very well, and with tons of edge cases (especially concerning networking and FS mounts). Both Windows and Linux have container support built directly into the kernel.


You don’t think the Surface Pro and Surface Laptop are competing with the Mac and iPad lineup? They seem like direct competitors to me.


Microsoft has an exclusivity agreement with Qualcomm that isn't doing them any favors.


It will do them great favors when the Qualcomm Hamoa chip comes out, which is supposed to be as good as, if not better than, the Apple M2.


The time-tested strategy of releasing many iterations of a terrible product and hoping people will still care once it's good.


Just in time to compete with the Apple M4.


So, it seems they're shooting themselves in the foot

Surface pads do not seem to be too popular and might become even less so now


That's always the problem. An R&D group I worked for laid off developers but no managers, which led to more layoffs because overhead went up, which meant rates went up, which meant more consultants came in instead.

Really at least a fifth of your layoff should be managers. If you lay off about six reports, that’s one less manager you need. So 1/7. If you lay off six managers, that’s one skip level you don’t need. 1 + 6 + 36 = 43. Go to three levels and that’s 1 + 6 + 36 + 216 = 259, or 43 / 259 or about 1/6.

But if you want to keep production capacity, you should move to slightly more reports per manager, not maintain steady state. Just to keep the math simple, if you had 216 developers reporting to an organization of 43 managers, if you went to 7 reports per manager you need 31 line managers, not 36, which is a 2% RIF without losing a single producer. Or if you do both you get about 1/5th of laid off people as managers if you lay off less than half your developers.

Edit: fractions and ratios are not interchangeable, bad math.
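A minimal sketch of that span-of-control counting, using the same illustrative headcounts as above (216 developers, spans of 6 and 7); managers are summed layer by layer until there is a single person at the top:

    def managers_needed(developers, span):
        # Count managers layer by layer: each layer manages the one below it.
        layers, below = [], developers
        while below > 1:
            below = -(-below // span)   # ceiling division
            layers.append(below)
        return sum(layers)

    print(managers_needed(216, 6))  # 36 + 6 + 1 = 43 managers
    print(managers_needed(216, 7))  # 31 + 5 + 1 = 37 managers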


Why not both? Even much smaller organizations tend to have low-impact teams that could be absorbed by others and their work deprioritized. If you're going to fire a dozen managers then the implication (or at least possibility) exists that maybe even the bottom 20% of that group is assigned to low-priority projects and perhaps even under-performing.

Then 216 => 43 becomes 203 => 29

Now you've reduced the workforce by 10% and put some miserable projects entirely out of their misery at the same time!


And suddenly reduce the velocity of all other teams because whoops it turns out those low priority projects were dependencies of everyone else.


I worry about any company that doesn’t track priority and severity separately. Priority is political, and often bullshit (everything is top priority!) but some tasks, if not completed, are company killers, and some just embarrass a loud-mouthed salesperson in front of a customer.


Just to check my math, let’s look at an org that has a multiple of 6²x7²=1764 staff.

We had 294 line managers, 49 managers, and 8+ program managers. 351 managers, 2115 employees total. I need to lay off at least 15% or 318 people.

We go from 294 line managers to 252, plus reductions up the chain to 36 + 5 = 293. 58 down, 260 to go. I lay off 5 managers, 35 line managers, and 245 staff, that’s 58 + 285 = 343 people total, so I can argue for keeping a couple extra line managers and all their reports. Call it 327 layoffs, 96 of them managers, 28.4%.


>Microsoft ignoring AMD in their latest Surface lineup is a hint for the people looking for answers

For those of us who are bad at subtext, could you elaborate further? What are the questions that this is the answer to?


my guess: "will Microsoft continue to take AMD as seriously as it does Intel?"


Pfft, as bad as the managers are at Intel, don't try to pretend that they have top tier talent (in SW) when they are paying on average about half of what a FAANG pays.

When I was there, they called Intel "Great place to leetcode" (a corruption of Intel claiming they are a "great place to work" everywhere)

Intel is known for good WLB and quiet quitting because the non technical managers didn't have a clue about what was actually being done on their own team. It's easy for engineers to BS these kinds of people, and this was the norm since any passion for SW was killed by working there long term...


I really hope that Intel will be a role model example to showcase what happens if you don't cut the cancer known as terrible management.

For far too long, managers have not taken responsibility for their bad decisions - not only business decisions but also simple disregard for feedback.

„Look at what happened to Intel thats what happens if you let your managers loose”


But is it not the management that makes such decisions?


> Hopefully for them that includes executives and managers

Yeah, it's always them that get cut \s


It’s usually the fault of the previous generation of managers, hard to tie that together.


I’ve been through two layoffs at tech companies - one of 14,000 and one of 15,000.

One touched a lot of teams including my own. The second was mostly smoke and mirrors - they took multiple factories and supply chain employees and sold them or outsourced them to a company that picked up the payroll for those employees. Only a handful of roles were truly culled.

The announcement of the layoffs was theatre for Wall Street.


It's important to note that Wall Street is not just some people but the owners of the company. So it's a signal to the owners that their money will be better spent.

Just saying Wall Street sounds very dismissive of the obligations that management has to the owners.


Sorry, care to explain why did you mention Wall Street in your post? What does it signify?


“Wall Street” refers to stock investors and analysts who put pressure on the company to increase their profit and share prices above all else. They are motivated primarily by making a profit on their investment rather than long term health of the company.

This is due to the New York Stock Exchange being located on Wall Street in New York.


> The announcement of the layoffs was theatre for Wall Street.

I'm interpreting this as "the (publicly-traded) company used a layoff announcement to appease investors and show them that executive management is taking steps to ensure the company operates more efficiently, thereby keeping it within the good graces of these investors/hedge-fund managers/etc".


They laid people off only on paper, essentially moving those people to outsource roles. This made their ongoing expenses look better to investors, when presented in a dishonest manner.


Wall Street is metonymy for the stock market.


I interpreted that as "for investors". Basically reassuring them that the org is not bloated, or improving the raw numbers so anyone using heuristics based on those numbers is made happy.


If a company is experiencing a dipping stock price (poor growth or whatever other reason) they can announce a huge layoff to attempt to bolster the stock price


Isn't this just a culmination of nearly a decade of unforced errors and mismanaged strategy? I'm a complete outsider but I see a couple of fundamental huge misses by Intel over the past 10 or so years:

1) Completely missed the boat on mobile. Their ARM-competitive chips (Atom, etc.)... weren't. Missed on every measure from power to performance. Let Qualcomm and Samsung eat their lunch.

2) Missed too many ticks and replaced them with tocks. Fell far behind because of ridiculous tilt-at-windmill folly around Itanium and other architectural decisions that absolutely didn't pay off. Ended up taking way too long to get further down on process size.

3) Lost their competitive edge so much that Apple finally realized they can do it better and suffer the transitional costs of building a middleware layer to translate off of x86. Apple's fully committed to that, so Intel doesn't just lose the "no one is using a PC anymore" market, they're also losing the "those people who DO use home computers which happen to be Apples" market.

It's absolutely incredible to watch these formerly phenomenally innovative companies falter at that very core competency of staying cutting edge but we see it happen over and over again.


> It's absolutely incredible to watch these formerly phenomenally innovative companies falter at that very core competency of staying cutting edge but we see it happen over and over again.

To be frank, I think that's a great thing. I don't want companies to stay giant in perpetuity, I think it's great that they are essentially "recycled" when new, better companies come along and eat their lunch.

If anything I think it's quite bad that over the past 25ish years that a lot of big companies have really learned the lessons of "disruption" and have responded by just buying up the smaller but up-and-coming competition before they can overtake (looking at you Adobe and Figma).


IF a company goes bad, it's great that new companies eat their lunch.

It's bad for a company to go bad.

It's bad for a company to lock out competition.


> It's bad for a company to go bad.

Hard disagree, so I wonder what your basis for this bullet is.

Agree on the other two though.


Not the op, but:

1. Tautological: from the company's perspective, it's bad for the company (itself) to go bad.

2. More broadly, it's reasonably self-evident that it's bad for employees, and bad for shareholders of a company, for that company to go bad

3. Where we can have discussions, and probably for a long and fun if not necessarily fruitful time, is whether it's "good" or "bad" from market's, or consumer's perspective for companies to go bad. I personally think not.

I think we'll likely mostly agree that it's good for the market to filter out and punish bad companies or companies that go bad, once they go bad, for whatever reason; but that doesn't necessarily imply or follow that it's good for companies to go bad.

In other words, what's your perspective/bias - why would it be GOOD for a company to go bad? Why would it not be absolutely fantastically wonderful if all companies perpetually stayed good and we lived in utopia of rainbows and unicorns? :>

(all this without defining what "company going bad" means, left as an exercise for the student :)


> Where we can have discussions, and probably for a long and fun if not necessarily fruitful time, is whether it's "good" or "bad" from market's, or consumer's perspective for companies to go bad. I personally think not.

Generally where my head was at. The other two points were better scoped. I personally think it's good that companies go bad, as it moves control over productive assets from one group to another. This is good for society for a variety of reasons, not the least of which is enabling new ideas to be tried. Can a long-lived company try new ideas? Sure, but the degree to which they avoid risks (if they live a long time they necessarily avoid undue risk) puts an upper bound on how ambitious they are with new things.


I can think of another aspect of the badness: it's bad for geographical regions that have become dependent on the company. Intel is Oregon's largest private employer, for example, and most of this employment is in the Portland metro area. If Intel were to go away next year this area would be hit very hard economically. That's not going to happen, of course, but one can imagine a long slow decline. Tektronix used to be a very large employer here - employed about as many at its peak in the late 80s as Intel employs here now. Tek went into decline in the 90s; fortunately Intel's star was rising then. As Intel's star declines, it's tough to see a new replacement for the area.


It's an example of the general problem, it would be better if Intel (and some "competitors" to keep prices low) could always perfectly adapt to new technologies etc, and retrain people.

When they can't, it's good that companies come and go, or Kodak would still be blocking digital cameras (https://www.reuters.com/article/us-kodak-bankruptcy-idUSTRE8...), but the price we pay is things like wasted effort re-implementing everything that didn't need to change at the new place, good people not always being able to move to it as you mention, etc.

Different countries at different times adopt different positions on the scale between inefficient (?) state-linked monopolies with jobs for life and letting the market do its thing. I'm not sure what factor means that sometimes we end up with Samsung and other times British Leyland.

That itself is an example of capitalism as the least worst option... similarly how much effort goes into trading currencies or commodities or whatever just so people get a fair price and aren't screwed over by whoever is the only person selling at the moment they need something. But we're self-interested, biased, and this is the workaround (or the system that emerges from our nature - self-interest is turned into the energy behind it all and "greed is good").


Things going bad is bad. Dying is bad. Etc etc.

Sometimes on net things are good because of other effects like "ah, promising new competitor gets a chance to shine!" but if you could have _both_, you absolutely _would_. A company going bad is a _cost paid_ for new blood, not an _added benefit_.

It's not good that you have 100 fewer hours when you spend 100 hours creating something awesome. It's bad that you have 100 fewer hours. If you could create the same awesome thing and still have those 100 hours to do something else, that'd be fantastic.

All other things being equal, I would far rather Intel be awesome than not. If it turns out the world is such that all other things can't be equal, it might be worth having Intel not be awesome in favor of other benefits, but I'm never going to be _happy_ about Intel not being awesome.


> I think it's great that they are essentially "recycled" when new, better companies come along and eat their lunch.

Don't you have on the order of tens of thousands of dollars invested in Intel through retirement savings, though? When S&P 500 companies fail, your retirement savings become worthless. (Apple is really the one you want to watch out for, though!)


>Don't you have on the order of tens of thousands of dollars invested in Intel through retirement savings, though?

Why is my fund in Intel but not in Apple? The whole point of choosing the index is so that you are diversified. You would only lose if Intel lost out to a foreign company, otherwise the demise of Intel is made palatable by the rise of Apple and AMD.


Even if Intel loses to a foreign company, odds are that foreign company has been invested in by a domestic company I own, so I'm still okay.


If you're invested broadly in the S&P, so long as companies are "properly" valued, it is okay if INTC falls and AMD rises.

If a major player is massively overvalued, though, then index funds, too, will feel some pain as the exuberance in a major player dissipates.

I hold some INTC directly after having been extremely impressed with the company's dedication to rigorous process in engineering interviews in the mid-2010s. I'm sad to see such a great company go through hard times.


Not a US citizen, but aren't the retirement funds managed by someone who should be held responsible for not diversifying?


Retirement funds do pretty OK here. Early in your career you'll mostly hold something like an S&P 500 index fund or a total stock market fund, as you get closer to retirement they shift the allocation towards safer investments.

I don't love this; the total stock market is a lot of tech companies, and I already have plenty of exposure to tech by working in the field. It's a good heuristic for most people, though. (And no, I don't do anything about this underlying fear of tech sector exposure. I just buy the Target Date 20XX funds like everyone else.)

My only point is that the grandparent comment generally expects companies to die when they get into the S&P 500. If that's true, we're all screwed. If one S&P 500 is just stealing business from some other S&P 500 company, though, it's probably a net gain. But if it's some privately owned startup, then it's not as concrete a win, and let's be honest, startups are driving a lot of the innovation in tech.


Which retirement funds are doing ok this time around? They typically switch to bonds which thanks to inflation are down something like 20 percent in the last year (some more like 30), despite yielding far less than stocks in the best of times.


It's the right strategy in general, not a strategy guaranteed to never lose money. Your savings account is down 20% year over year in terms of purchasing power because what $1 is worth changed dramatically. Rare event, and sometimes you're just screwed.

If I had a time machine, I would definitely use it to time the market!


People who are retired, and have absolutely no way to bounce back if the market is unkind, should absolutely invest in places that never lose money. I'm not talking about inflation. I can go out and buy actual guaranteed bonds that pay me an actual guaranteed positive interest rate. Yeah, they don't keep up with inflation, but that's only a small trickle of lost earning power at that point, not a loss of 1/5 of the number of years I can stay retired, which is what the funds are delivering.

My savings account is not what I'm talking about. Inflation or not, it is still a number I can describe as "positive".

I'm referring to the bonds that retirement and wealth-management funds add, supposedly to counter volatility. These are investments that at best will only pay single digits. They have suffered the worst year since 1931 or something. And we're just getting going. The frustrating part is, as I said, the underlying bonds themselves still pay that 4% or so, which would be much better than the funds are achieving. But for some reason they have to sell the bonds at a loss.


> the retirement funds managed by someone who should be held responsible for not diversifying?

fund managers don't have "personal" or "corporate" assets that are anywhere near the scale of the (retirement) funds they manage; being held responsible for mismanagement would not lead to a source of capital to replace that which had been lost through mismanagement.

All investment has risk.


In the US the employee usually has a choice of several funds in their 401(k) retirement account, including international funds.

It seems the person upthread owns a target date fund and this means they probably are diversified. Their post doesn't make a lick of sense.


If this is referring to me, then perhaps you're confused because I was responding to a question about rewards and punishments not whether labels fit. And I never claimed the the question and my response were somehow opposite ends of a dichotomy. Indeed the connection there is the problem. A fund can be completely diverse and still high-risk.


I was not referring to you. I was referring to the person who wrote this over the top, inaccurate comment:

"When S&P 500 companies fail, your retirement savings become worthless. (Apple is really the one you want to watch out for, though!)"


Whether or not they're ``held responsible'' to the degree anyone in such a position in the US is (i.e. lolno), grocery stores and retirement homes do not accept epicaricacy as payment.


Nope! They're rewarded for getting slightly higher returns than other funds during the bull market times, despite the disproportionate increase in risk they take on to do it. Everyone is looking at recent trends and almost no one looks at fundamentals. It's a moral hazard.


Intel doesn't have to go up in flames; not sure why you view it as all or nothing. They can keep milking legacy stuff and even make money like IBM (what Paul Graham calls irrelevant), while someone else keeps inventing better chips.


Isn't this only a concern if you plan to retire in the next 5-ish years? Even 5 years seems a while to correct itself unless there's a larger market implosion. The last 5 years have seen the tail end of a recession and a pandemic.


Over time, those funds are re-weighted. Unless you bought Intel directly, the reweighting should handle it. Unless it fails quickly and spectacularly.


S&P 500 trackers are not re-weighted.


Standard and Poors periodically updates the S&P 500 with the 500 largest companies. Intel will one day likely be removed, just like one day apple will be. It may be a while but it’ll happen.

Companies come and go from it all the time:

https://www.inc.com/ilan-mochari/innosight-sp-500-new-compan...


In the context of your comment

"Unless you bought inte directly, the rewighting should handle it. Unless it fails quickly and spectacularly"

I would find little comfort in the fact that if Intel goes down 85% (slowly but somewhat spectacularly) from its current level, it will eventually be excluded from the index.
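
For a sense of scale (purely illustrative arithmetic, not a prediction): the gain needed to recover from a drawdown grows much faster than the drawdown itself.

    # Gain required to get back to even after a given loss.
    def recovery_gain(loss_fraction):
        return 1 / (1 - loss_fraction) - 1

    for loss in (0.20, 0.50, 0.85):
        print(f"a {loss:.0%} drop needs a {recovery_gain(loss):.0%} gain to break even")
    # 20% -> 25%, 50% -> 100%, 85% -> ~567%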


Cisco used to have a massive market cap. Greatest in the country. Now it’s absolutely unremarkable compared to FAANG et al. It went from $80 to $12.

The fact that I had to google for the details of its fall gives me comfort that when Intel's time comes it won't cause harm on its slow exit.


The only harm done is the money lost on that particular investment, of course.


> I think that's a great thing. I don't want companies to stay giant in perpetuity...

Except that semiconductors are a national security issue at the moment.


100%. It's worth considering that some of these people laid off may consider working for authoritarian governments if they were well compensated (or are ideologically aligned), and that's a huge national security risk too. Hope they aren't dumb enough to lay off engineers with important knowledge. Please tell me that US representatives have already met with Intel on this.


I work in an authoritarian country (Hong Kong) and it's not what you think. You can't transplant a well-paid foreigner expecting a breakthrough: his entire team will spend their time enforcing the red tape to teach him to adapt, and by the time the money dries up they'll either have an obedient clone of the rest or a frustrated quitter.

Plus, dictatorships aren't against the US per se; they're against the US convincing the populace they too could act like Americans, and are extremely dependent on that Schrodinger state where you're both the factory for the US and a public political opponent. A symbiotic parasite pretending to be the host's alternative.

Plus the US might change heads at the top, but it's hardly a safe ally to have. Ukraine nearly went to complete disaster thanks to Trump, and having morons elected there is a security risk for many too...


We have temporarily made peace with dictatorships before. The bigger concern is that China wants the #1 spot in world power, and we will oppose any nation that tries to become more powerful than us.


> Except that semiconductors are a national security issue at the moment.

Maybe too big is a national security issue, too.

Too big to fail is a real problem for any part of our economy.


Computer chips are a product that is only possible to produce at the largest scales. Everything, from the supply chain to design to manufacturing, is a globalized effort. You’ll never have small mom and pop fabs.


If everyone realizes it's a national security issue, that might change. Sure, US, EU and China have players in the industry and can attract fabs with a bit of incentive, but what if Russia or India or Pakistan or Brazil or Iran want to ensure a steady supply even under sanctions? Some of them might open up smaller fabs that might not be economically viable but are of strategic importance.


Doesn’t Malaysia have fabs? Singapore?


You could fab chips in your basement. However, your cost per chip would be in the tens of thousands of dollars, even for basic chips. Large scale allows making those same chips for less than a dime each.
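
A crude way to see why (every number below is invented for illustration, not any real fab's economics): fab cost is overwhelmingly fixed, so cost per chip is roughly the fixed cost divided by volume, plus a small marginal cost.

    # Toy amortization sketch; all figures are assumptions.
    def cost_per_chip(fixed_cost, volume, marginal_cost):
        return fixed_cost / volume + marginal_cost

    basement = cost_per_chip(fixed_cost=5_000_000, volume=200, marginal_cost=50)
    big_fab = cost_per_chip(fixed_cost=10_000_000_000, volume=200_000_000_000, marginal_cost=0.01)

    print(f"basement lab: ~${basement:,.0f} per chip")   # tens of thousands each
    print(f"high-volume fab: ~${big_fab:.2f} per chip")  # pennies each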


You'd still need chemicals, a pure substrate, a chip design that's been converted into a manufacturing process, a UV aligner, spin coater, etc... can't do everything in your basement. You have to trade with others.


> Computer chips are a product that is only possible to produce at the largest scales

Perhaps, or perhaps we're dealing with monopolization.


> Except that semiconductors are a national security issue at the moment.

Perhaps this is too much of a radical idea, but if a company is so critically important that it can't be allowed to falter without government support, then it should be nationalized.

At the very least the government support should be made explicit, and large, critically important companies should be forced to pay much higher insurance, i.e. the equivalent of what banks need to pay for FDIC insurance. Sick of the whole "privatize the profits and socialize the losses" mindset.


Eh, I think they always were; we're just finally being forced to pay them better attention.


Luckily there are many, many countries in the world with no semiconductor industry of their own who benefit from no single company or country staying as the perpetual, unchallenged giant.


Yeah, but it's not a good reason to prop up or keep bloating non-competitive corporations. Bloating Intel is not likely to result in another TSMC.


There's no guarantee semiconductor fabs will be recycled. If TSMC shuts down due to political turmoil and Intel decides to halt all R&D and milk its new monopoly, then hardware progress could simply stall with no new competitor rising.

An investor would need to front many billions of dollars for an uncertain payoff over a decade or two, and Intel could at any moment become competent again. And if no investor is willing to take that risk, there would be no new competitor.

Adobe and Figma are software companies, so they don't require heavy capital investment. If one shuts down, it's relatively easy to make a new UI modeling tool (or for an existing one to take over its customers).


That's all very nice and yeah, we've all read the essay about "creative destruction", but what happens to the Intellectual Capital that gets annihilated, sold for pennies on the dollar, and left to wither in useless portfolio trusts?

It's a disastrous process; I passionately hate it.


I've typed this before but I'll type it again here. About 10 years ago I ran the local Python Meetup group here in Phoenix. Intel had/has a big presence here, and one of the guys who worked on the Atom team (doing compiler optimizations for that chip family) told us at one of the Meetups that Intel corporate didn't like how good a processor Atom was shaping up to be. They were worried it would start to cannibalize their desktop and server processor lines -- so they did everything possible to cripple its development.

He and his whole team were furious.


If someone is going to eat your current business, it's better if it is you. Fear of cannibalization is basically, "our customers will find out we have an even better product for them than the one they are currently buying from us." It's a good thing for your customers, and it's good for the long-term competitiveness of your company.


I've always told my business partners when discussing future products that if you aren't afraid to cannibalize your own revenue streams, your competitor (or another internal team) will.


SOP.

IBM crippled the AS/400 when it looked like it could eat into their mainframe business, for example.

It's the innovator's dilemma.

1. https://en.wikipedia.org/wiki/The_Innovator's_Dilemma


The Atom core is starting to find its place now though, albeit years later: the new hybrid architecture in the core platforms, in the guise of the efficiency core, and for IoT/MilTech platforms there's a massive growing market, with their Elkhart Lake/Tiger Lake processors taking a large share of it.


All that to end up with giant cloud providers selling arm servers...


This is called the innovator's dilemma.


> 2) Missed too many ticks and replaced them with tocks. Fell far behind because of ridiculous tilt-at-windmill folly around Itanium and other architectural decisions that absolutely didn't pay off. Ended up taking way too long to get further down on process size.

I applied to Intel's architecture group in 2015, and when I asked one interviewer "does Intel have a strategy if TSMC catches up on process tech?" I was left with a blank stare, as though nobody in the group had contemplated the possibility. I didn't end up working there...

That said, Intel got exceptionally unlucky on which technologies they pursued and which they didn't, so I hope they can turn things around. They missed on through-silicon vias and chiplets, EUV, Cobalt wires, and several low-power transistor technologies.


The fab business is very sensitive to scale, with enough money and discipline you can mostly make your own luck. TSMC does way more volume than Intel and they have a very aggressive customer (Apple) that can help work out early quirks and guarantee orders on new nodes before they're economical for anyone else. I don't know what a turn around might look like but it's a tough problem.


> TSMC does way more volume than Intel

Do you have a source for this? The only one I found says Intel is 2x TSMC:

https://www.statista.com/statistics/883715/microprocessor-ma...


That is revenue based, and not based on who makes the chips. In the chart Nvidia, Broadcom and Qualcomm are all fabless (outsourced production) and their chips are made by TSMC, Samsung, or other foundries.

By wafers produced (including memory), TSMC is 2nd and Intel is 6th. TSMC is 2x the 5th placed company on the table, so Intel is even smaller than that. https://www.eetimes.com/chipmakers-increase-share-of-global-...


By-wafer doesn't sound like the correct comparison, as we mostly care about top nodes for this. Revenue actually sounds better to correct for that. Your link also shows that Intel sells their old nodes to others, further skewing the per-wafer comparison.

Wafers at 14nm or better would probably be a good metric for who has the volume behind top nodes to then be able to stay in front. If that's TSMC by a large margin losing the mobile and GPU markets has really done a bigger number on Intel than I had assumed before.


people buy a lot of smartphones, often


Itanium was a bet that seemed the way forward at the time. Making an x86 instruction set processor that did on the fly analysis and conversion to a superscalar computation core sounded really hard, so VLIW was the safe "we'll do the superscalar decomposition at compilation time and run that". Turned out to be a bad bet because a miracle¹ occurred on the other side.

I think of it as a peer to the 2005 proposition that electric cars didn't need to be golf carts and the technology was coming into place to make high performance, real cars that were purely electric. Could have been true. As they say "very dangerous, you go first." That one paid off.

I wonder if there is a list of large dollar, novel, engineering expeditions and how they turned out.

¹ Where "miracle" means some really smart people worked really hard at it for a long time.


Itanium was a gigabuck and drove everybody out of the high-end processor market except IBM leaving Xeon for Intel to monopolize for a decade+. I'd qualify that as a business success even if Itanium was a technical failure.

And, it may be paying off after all. Graphics cards are effectively giant VLIW machines. And while Intel's new cards get slaughtered on DirectX 11 or older games, they are quite competitive when used on the DirectX 12+ stuff which basically wants a giant multi-core parallel processor.


I’m not sure the timeline really works for this argument. Pentium was the first x86 superscalar and that was 1993 so it predates Itanium.

Rather Intel wanted to leave the baggage of x86 behind, segment the market for higher profits and hoped to get better performance than superscalar x86. But they never achieved the latter and most people just wanted 64 bit and software compatibility.


> Making an x86 instruction set processor that did on the fly analysis and conversion to a superscalar computation core sounded really hard, so VLIW was the safe "we'll do the superscalar decomposition at compilation time and run that".

The first superscalar x86 processor was the Pentium, which came out in checks notes 1993 (Wikipedia claims design work started in 1989, hit working simulation in 1990 and taped out in 1992). Intel didn't start work on Itanium until 1994, after Pentiums were already shipping to customers, and Itanium itself wouldn't ship until 2001.


> 1) Completely missed the boat on mobile. Their ARM-competitive chips (Atom, etc.)... weren't. Missed on every measure from power to performance. Let Qualcomm and Samsung eat their lunch.

I'm gonna add the caveat that the failure didn't happen out of the gate. Xscale was leagues ahead of anything else two decades ago. Atom stumbled at the start but Cherry Trail had a better price to performance ratio than anything that followed it. Intel's failure was nothing but pure avarice.


In this market what counts is performance/watt ratio. Battery power is a big selling point for cell phones, and so you can charge a bit more for less watts. Though of course price does matter too. There are also embedded markets where you have to be passively cooled, again watts matter.

My impression is Intel hit the price for embedded just fine, but at higher watts, and thus they can't compete.


> In this market what counts is performance/watt ratio.

That was also excellent for Cherry Trail, with an SDP of 2 watts. When you have some time sit down and compare the benchmarks from a Cherry Trail tablet like the Surface 3 and its successor chips like the 4415Y. The 4415Y like the one in the Surface Go is three years newer, has a 60% higher TDP, a Recommended Customer Price over four times higher, and while its 3D chops are better it actually benchmarks lower than the Cherry Trail chip in PCMark.


> Completely missed the boat on mobile

I think their main error was short-termism. They were addicted to high margin (workstation/server) devices. They had no interest in the low margins of mobile devices. They could not imagine how low-power devices would take over the world.


Actually that is the "Innovator's Dilemma" in action. Managers will always prefer to safeguard and enhance the high margin aspects of the business and discourage/under-invest in low margin disruptor aspects of the business due to cannibalization of profits and thus missing profit guidance for the quarter (thus losing compensation personally). They knew the tide was changing but organisation dynamics prevents a course correction.

Only a few companies can buck the Innovator's Dilemma, usually only founder led enterprises with significant control.


> missing profit guidance for the quarter

The Innovator's Dilemma isn't about missing quarterly guidance. The dilemma is that it's perfectly rational and profit maximizing to continue focusing on your cash cow even when obsolescence is a foregone conclusion. It wouldn't be a true dilemma, otherwise.

The incumbent is the only player that can maximally squeeze the very considerable remaining profits from old technology, and they should do so with gusto. Moreover, switching to new technologies comes with more risk, even when it seems obvious what the new market will look like because the old market has almost zero risk--it's completely proven.

All the "solutions" to avoid the dilemma, like selling the old technology to take future profits and then pivoting to the new market, are just corporate branding shell games. They might even be in fact sub-optimal, but in any event the fundamental dynamics remain the same.


Except it didn't actually work out. It worked out at the time, and thus was "perfectly rational" then, but rationality should not be confined to a limited time horizon, because then it is only locally rational. This is the problem with short-term thinking: it eventually leads to loss.


What value is there in corporations being immortal? That is, why forego substantial and easy profits merely to exist as the same corporation further down the road?

There are transaction costs to creating and building a corporation, but do those offset the clear costs of leaving money on the table, especially in the modern world of highly liquid capital, and particularly in markets with clear technological breaks? The lesson of the Innovator's Dilemma is that very often the perfectly, unqualifiedly rational decision is to press your advantage to the very end. And importantly it not only maximizes short-term profits, but implicitly it maximizes long-term profits globally by most efficiently allocating resources. Why waste energy swimming upstream when there are endless fish spawning and starting their journey upstream already, along with ample resources of their own?


There isn't anything particular about a company being immortal. The issue is now. No one wants to be there when a company is dying. There's nothing heroic about "going down with the ship" in the corporate world.


It's reported that Steve Jobs did ask Intel to provide the CPU for the first-gen iPhone, but the price offered was so low that Intel would lose money. If you were the CEO of Intel, how would you explain to the board that "we should make this deal with Apple even at a loss, because I think the iPhone will be the next big thing"?


The proposal was unstable on both sides in fact. Jobs didn't fully appreciate the importance of power efficiency at the time - an internal team scrambled to demonstrate why Intel would have been a non-starter due to power budget. There were also ecosystem issues, since gearing up for an embedded device at that time mostly meant choosing ARM architecture for that complexity tier they were engineering.

So a deal based on any price wasn't a realistic avenue for iPhone. The fact that Intel was actually considered was itself a radical move on behalf of Steve (absent the technical obstacles that emerged later).


Did Intel counter-offer? I mean, if someone offers you a bad deal, your options are not restricted to "yes" and "no".


Likely they did but Apple didn't want to pay more. Apple generally is not known to be flexible when it comes to business deals with suppliers.

This is reported in a profile of Otellini:

> "We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," Otellini told me in a two-hour conversation during his last month at Intel. "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."

https://www.theatlantic.com/technology/archive/2013/05/paul-...


If they blew their cost estimate that badly, then yes, it is a massive error on Intel's part.


Which is ironic because it was their low margin desktop products that eventually took over the high margin workstation/server market in the 90's.


They missed the boat twice. Once with the sale of their ARM XScale business in 2006 for $600MM (such bad timing), and later with their blunders with Atom on mobile.


Apple gets incredibly high margins on their products probably leaving more play room.

Is there any evidence that mobile chip makers who do not also create/own the device and are in a race to the bottom on price actually get decent returns?


Intel also doubled down on x86. Apple, in contrast, is creating the ARM future by designing their own chips, x86-to-Arm emulation layer, and compiler toolchains


Also they are losing share in the datacenter space as well. AWS is building their own chips for Graviton instances as well as some other special types. GCP has been doing this for a while with their ML chips, and I am sure Azure is moving in that direction too. It looks like they will take the path of IBM, where they are still massive due to legacy users not moving off of them, but it will be hard to capture new market share without some massive change happening.


To add to the list, their 5G modem chip also failed in recent years. They sold their smartphone modem business to Apple, which then indirectly laid off a ton of staff by making them reapply and interview for their existing positions.


Itanium dates back to 2001. It was "discontinued" in 2020, but as we've learned it was only being continued because HP was footing the bill. It really doesn't have anything to do with Intel's current problems. Intel was basically past the Itanium debacle by 2010.


> 1) Completely missed the boat on mobile. Their ARM-competitive chips (Atom, etc.)... weren't. Missed on every measure from power to performance. Let Qualcomm and Samsung eat their lunch.

As someone who worked on projects related to those (at PowerVR, which supplied graphics IP for some of those devices; I was on the Android side, so other OSs may have a different view), I am not at all surprised they didn't go anywhere.

This was a good few years ago - I no longer work for PowerVR, and Intel seems to have completely given up on the market for some time. My memory likely isn't perfect, but I can give broad strokes.

It always felt like "having" to go to PowerVR was an embarrassment for the teams: they kept trying to replace us with their internal GPU architecture, presumably failed completely to hit any power/performance targets, then at the last second called us again and tried to rush everything through. Then, it feels like most of the time they dropped the entire project before it made release anyway.

The teams that we worked with never felt high status - all the engineers we spoke to were either on that team because they couldn't move internally to something more prestigious due to some internal politics, or were actively in the process of changing teams. There were times where our internal engineering contacts changed monthly, if we had any engineering-level communication at all.

There's also some weird public claims about things around this - like PowerVR not providing driver source or similar, but that's simply not true. As the person packaging up the releases, they were only source releases, no obfuscation, with full documentation and build instructions.

The PowerVR driver model generally meant releasing a "reference" driver as source, with hooks for customers to hook up to their specific SoC implementation (think stuff like setting clocks, power management, bus endpoints etc.). The supplied package only had a couple of example backends. While one of those may have been an Intel chip, it was only minimally setup in order to make the graphics work for PowerVR testing. It was simply not possible to release a binary-only driver. But I've seen this claimed on a number of tech forums - often from people claiming to have knowledge from the Intel side, and if that's true and not exaggeration, I can only assume it's due to lack of communication between teams internal to Intel. I can assure you, someone in Intel had the full driver source.

Though WRT the internal communications, I sometimes saw this from the other side - we dealt with 2 Intel teams with different SoCs they intended to build, and it felt like 2 different companies. They never shared anything between them, almost as if they never spoke at all, and seemed resistant to changing that. I have no idea why, I guess it's just how their management structure is setup?

If you go through all that and end up with an actually good product, I feel it'll be almost luck rather than anything else.


> unforced errors and mismanaged

No. Managing innovation at the scale of Intel is a challenge. And their customer base (PCs and servers) is shrinking. Also, this is only like 15% of the workforce. Which is only 5% more than a typical culling.


Never understood why they did not just get a new ARM license and make a mobile chip; it was not like they were going to impact their existing business.


This blog post[0] explains it pretty well: "Fewer and fewer computer users think their computer is too slow." And therefore they only buy new computer if the current one breaks. "This happens less and less often. Even most disk drives, which are about the most mechanical part of a computer system, come with at least a 3 year warranty."

"It means that computer purchasing decisions are no longer made based on price/performance or just performance, like in the dark ages. Now, when somebody decides to buy a new computer it will be price alone, or maybe price and service, that determines which computer to buy."

[0] https://jlforrest.wordpress.com/2015/12/18/the-forrest-curve...


Can't read the article, but for context Intel's share price never recovered from the "dot com" bubble of 2000:

https://www.tradingview.com/chart/?symbol=INTC

It's one of a few large companies with that distinction.

Shares now offer a (relatively) regal 5.3% dividend yield and a price/earnings ratio just above 5. Of course, as profits dwindle, both metrics will be recalibrated downward.
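
A quick sketch of the arithmetic behind those two numbers (the dividend, EPS, and price below are ballpark assumptions, not Intel's exact financials): both metrics have the share price in them, so a falling price inflates the yield and deflates the P/E right up until the dividend or earnings are themselves cut.

    # Illustrative only; figures are assumptions.
    def dividend_yield(annual_dividend, price):
        return annual_dividend / price

    def pe_ratio(price, eps):
        return price / eps

    price, dividend, eps = 27.0, 1.46, 5.0
    print(f"yield {dividend_yield(dividend, price):.1%}, P/E {pe_ratio(price, eps):.1f}")  # ~5.4%, ~5.4

    # If profits dwindle and the dividend is cut, the same share price implies a lower yield:
    print(f"after a cut to $1.00/yr: yield {dividend_yield(1.00, price):.1%}")  # ~3.7%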

This seems relevant because during the late 1990s Intel was allegedly the company whose shares one bought and never sold. I could name a couple of those whose positions today are argued to be equally ironclad.


> Intel's share price never recovered from the "dot com" bubble of 2000

As you observe, this is true only if one ignores dividends, which are a material component of total returns for a mature company like Intel.

> during the late 1990s Intel was allegedly the company whose shares one bought and never sold

This has never been true for any public company. And it’s driven by investor style more than company fundamentals.


> This has never been true for any public company.

Not really true. Most billionaires utilize the "buy, borrow, die" strategy of borrowing against their holdings instead of selling, to get around taxes. As long as the equity appreciates by more than the loan interest you come out ahead, plus you save the ~20% in capital gains taxes. Then when they die their cost basis is stepped up, so the taxes are never paid.
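
A toy one-year comparison of the two paths (the tax rate, loan rate, and growth rate here are all assumptions for illustration): selling triggers capital-gains tax now, while borrowing defers it, and the step-up in basis at death can remove it entirely.

    # Toy sketch of "sell vs. borrow" to raise $5M of cash; all rates are assumptions.
    holding, basis, cash_need = 100_000_000, 10_000_000, 5_000_000
    cap_gains_tax, loan_rate, growth = 0.20, 0.04, 0.07

    # Path 1: sell enough shares to net the cash (the gain portion is taxed).
    gain_fraction = (holding - basis) / holding
    shares_sold = cash_need / (1 - cap_gains_tax * gain_fraction)
    after_sell = (holding - shares_sold) * (1 + growth)

    # Path 2: borrow the cash against the shares and repay with interest a year later.
    after_borrow = holding * (1 + growth) - cash_need * (1 + loan_rate)

    print(f"net worth after selling:   ${after_sell:,.0f}")    # ~$100.5M
    print(f"net worth after borrowing: ${after_borrow:,.0f}")  # ~$101.8M

As long as the portfolio's growth outruns the loan rate, the borrower ends the year ahead, and the deferred gain can then disappear at death via the stepped-up basis, which is the "die" part.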


Even with dividends reinvested, $INTC gave zero real returns from Dec 1998 to now, and a net loss if invested after July 2014. Also $AMD now has a larger market cap, and $NVDA is 3 times larger! An amazing decline.

https://totalrealreturns.com/s/VFINX,VBMFX,USDOLLAR,INTC?sta...


I believe this shows total returns for Intel (i.e. including dividends):

https://stockcharts.com/freecharts/perf.php?INTC

The picture is better, but not great. Folks who bought at the peak of the bubble would have needed to hold for about 18 years to break even.


> It's one of a few large companies with that distinction

Because the rest went bankrupt.


Remember, if you're buying someone's selling. (Which might be the company itself.)

And your dividend % is before tax, so compare to tax-free bonds.


> Shares now offer a (relatively) regal 5.3% dividend yield

US treasuries are nearing 4% and are risk free.


Reports elsewhere ( https://arstechnica.com/information-technology/2022/10/repor... ) suggest that the layoffs might be a bit concentrated in the Sales & Marketing departments.

Is it just me, or do some other folks also smell "corporate bloat, which better management would not have allowed to happen"?


Might make sense: If you're selling mostly to mobile OEMs and data centers you need a completely different sales/marketing strategy that's more focused on whale hunting than building "brand awareness" to a broad consumer market.


"Intel Inside" worked..


I wonder if it did? Who bought a laptop because it had an Intel CPU? I know I always bought Intel laptops because Intel laptops were the only things available, not because I have some high view of Intel as a brand. I imagine the vast majority of laptop buyers wouldn't even know what an Intel is.


> Who bought a laptop because it had an Intel CPU?

If you're a Linux user, Intel integrated graphics have always been the most compatible and most well-supported option.


That was a long time ago. AMD has shipped iGPUs/integrated platforms and dGPUs with fully mainlined drivers for ages now. Only recently has Intel gained credibility with Iris iGPUs beating AMD's, whereas for the longest time you had to pair Intel platforms with Nvidia GPUs to get any kind of graphics performance, spelling trouble due to Nvidia's insistence on closed drivers and binary blobs.


Back when I was not on Mac, I always preferred laptops with Intel integrated graphics, regardless of performance. It was just better drivers, less battery drain and boards didn't fail either. My first laptop had NVIDIA graphics, and it was whole combination of bad drivers and motherboard frying because of overheating. Things have probably changed, but I'd just get one with Intel graphics -- just because I trust Intel to keep pushing better drivers based on their history.

Speaking of AMD, 10 years ago at least there were very few premium AMD laptops, and they used to overheat quite a bit; has that changed?


Yes, AMD's APUs have filled the market vertical with good integrated graphics and CPUs starting about a decade ago with the AMD A4 through A10 lineup, and continuing today with the Ryzen processors with Radeon Graphics.

GPUs of the era you're thinking of had high failure rates from issues with lead-free solder, though the Nvidia GPUs in Macs and laptops would outright fail from other issues, requiring a full chip replacement.

https://eclecticlight.co/2015/12/20/lead-free-graphics-cards...


> graphics performance

For the Linux users I was referring to, the most graphically intensive thing many of them run is a desktop compositor.

But yes, today, AMD iGPUs are a great choice for mobile, and AMD dGPUs are a great choice for desktop or for i-don't-care-about-battery gaming laptops.


I've seen this come up a few different ways over the past year, and I think I've also read that Intel contributes more to kernel development than any other company. One thing I'm wondering is: how dependent on Intel have desktop Linux distros become?


> If you're a Linux user, Intel integrated graphics have always been the most compatible and most well-supported option.

You forgot to add, "worst performing by a mile" to the feature list.


Nothing is much slower than when an nvidia dkms process fails silently and you're left without video options at next boot -- something that generally can't happen with intel-video/linux.

I get your point, but Intel video options perform on par or better with regard to the most common consumer video rendering demands at this point. Video acceleration, high resolutions, and multiple displays are well covered -- not everyone needs to process GPGPU workloads and play the newest games at 90FPS.


A slow functioning video card is still better than a fast video card that doesn't work.


In my experience nvidia with nouveau is slower than Intel with its free driver.


I recently unplugged an nvidia card from a machine because of kernel panics with the nouveau driver.

Of course I can install the proprietary one, but that means I can no longer just upgrade the system and expect it to work after reboot.


Compared to other integrated graphics?


Was this marketing to Linux perverts worth all the money that Intel poured in? For the vast majority of normies, Intel Inside was useless noise.


I don't know, for a good while before Ryzen came out Intel CPUs were widely regarded as the best choice, so the marketing may very well have helped move machines.

On an unrelated note, "perverts" doesn't seem to me like a particularly kind appellation.


It absolutely worked.

The Pentiums were a huge leap from the AMD 386/486 and Cyrix options. 'Intel Inside' was basically a premium-product differentiator for those who could afford it, and that was well understood by consumers at the time.

Ferrari/Gucci/Armani/Rolex/Louis Roederer labels also help to push product. Same phenomenon, people didn't buy Intel strictly because it was needed for specific workloads, they bought it because of the fancy sticker that differentiated their product from cheaper alternatives; even if the person didn't know thing-one about computers or CPUs.

It sounds corny, but having lived through it I can vouch that things are really that stupid.


I always bought Intel CPU laptops. Could never figure out the AMD naming scheme (and still can't but that goes for Intel now too). Also back then it seemed the AMD laptops were of poorer quality than the Intel ones.


It was one of the most successful branding campaigns. It allowed Intel to become the primary brand over the laptop/PC vendors. You can buy Acer/HP/Dell/... because they are all Intel. Even non-tech people understood that, and this gave great bargaining power to Intel.


> It was one of the most successful branding campaigns. It allowed Intel to become the primary brand over the laptop/pc vendors.

I mean, it coincided with that. It's not clear how much was caused by branding/marketing and how much was caused by Intel being better (at least in laptops) from the Core Duo days until now[1] plus-or-minus a few years?

---

1. Honestly, I haven't kept up with laptop hardware performance during the AMD chiplet era.


Before Ryzen, I strongly preferred Intel CPUs and Intel-compatible motherboards for building PCs.


For what it's worth, the laptop I currently use for work I requested a few years ago specifically because it had an Intel CPU – not necessarily because of performance (though that was a factor) but also I know how to get important performance counters out of it for diagnosing performance issues. I don't know if I'd make the same judgment today, though.


Intel marketing is NOT aimed at customers. It’s aimed at Dell and HP et al.


Then why did they (used to) run ads during prime time TV shows? Was that the only chance to get product buys from Michael Dell?


Because that’s part of the deal. You sell the campaign to Dell and promise to do co-marketing and advertising - how effective it is when 90% of the prebuilt market is already Intel is left to the reader.


>Who bought a laptop because it had an Intel CPU?

I always buy Intel. I simply do not care for AMD's software jank, Intel has consistently proven more stable and reliable and that's not something I'm willing to trade for marginal differences in performance caps.


Among sophisticated consumers you would think people buy a laptop based on thermal envelope and long-lasting performance on battery. The marketing around Intel CPUs and the Nazi-concentration-camp-style model numbers take away from an informed choice.


Should have pitched their mobile strategy as "Intel Outside"


By the time mobile got big, Intel was no longer an ideas company. There is no way they could have come up with a tagline this good.


Intel, if you're listening... I hope you're able to find this tagline and offer me a 7-figure contract for more of these ideas.


It was the bunny suit dancers...

https://www.youtube.com/watch?v=YMCNILzZsWk


"Intel Inside" worked to build awareness of the Intel brand. The campaign validated the idea that, for a certain amount of ad spending, one can build brand awareness, even for a CPU chip. But, compared to Qualcomm, for example, do people need to be aware of a brand in order for that brand to be dominant?

With the rise of mobile gaming, the fact that people don't know what chip is in their phone casts even more doubt on the value of brand awareness. Intel got dominant by always having a design and/or fab dominance over rivals. A "one-two punch." Who among PC buyers understood that?

On top of that, if Gelsinger is serious about building a contract fab business, that has sales and marketing needs way outside of anything Intel does today.


Did it? Or was there no other option for a long time?


(This comment was originally posted to https://news.ycombinator.com/item?id=33180313, but we merged the thread hither since that article was a misleadingly titled ripoff.)


Seems normal that the number of employees in Sales would follow the rate of product sales.


Perhaps. Though I'd be a bit slower to react to the market here. Depending how sales are handled, removing staff can remove the relationships that they have developed. It's a pretty bad sign to be doing this.

Might be different for Intel though, everyone has to drink at the same oasis.


Yes, sales are at the front of the pipeline. If there aren’t prospects and customers to talk to, salespeople sit idle.

If you’re at a company and you look at your engineer to sales ratio you might think to yourself “gee, we have so many sales people!”

That’s because there’s not much multiplying factor to sales roles. There isn’t a lot of potential to automate like engineers can do. A salesperson only has so many hours in the day and they’re fitting maximum one or two customers into a one hour slot of their time.


There is indeed a lot you can do to avoid overcrowded sales departments, e.g. define minimum order volumes your prospects need to hit before your salespeople interact with them (and refer them to an external reseller otherwise), or invest in online sales tools like product configurators to let your customers do the sales job, etc.


Ads companies like Google/Meta are good examples. Any small account gets a webpage and that’s it. Still, it must be harder to sell semi-commodities like CPUs.


Their GPU group seems to have some minimal success. The CPUs are still great but obviously process issues dominate.

The rest of intel is a shambles. The networking group in particular is an industry joke.


FWIW, Ars are citing the same Bloomberg piece the submitted blogspam is, so I'd not take that as an independent validation or clarification on the magnitude of the sackings.


Tangential, but what are the consequences of upcoming layoffs cutting across the industry and impacting mostly these "soft" non-technical roles? Will this create a disgruntled segment large enough to start an upheaval like the French or October revolutions?


That kind of doomer prediction would be more interesting if we did not live in time period with historically high employment rates.


>historically high employment rates

People will believe any statistic that aligns with their mindset.

Unemployment rates are at historical lows within a small margin: it's been 50 years since unemployment was lower (i.e. something like 60% of Americans have never seen lower unemployment in their lives), and not since the WWII war economy has it been significantly lower (i.e. within almost nobody's lifetime).


What's the difference between historically high employment rates and historically low unemployment rates?


(employment rate) = (employed people) / (working-age population)

(unemployment rate) = (people not working but actively looking) / (labor force, i.e. employed + actively looking)

The first counts the whole working-age population in its denominator; the second leaves out anyone who is neither working nor looking, so discouraged workers disappear from it entirely.
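
A toy example with made-up headcounts shows how the two measures can move differently: when discouraged job seekers stop looking, the unemployment rate "improves" even though the employment rate doesn't budge.

    # Made-up headcounts (in millions), just to show the two rates diverging.
    def rates(employed, looking, not_looking):
        population = employed + looking + not_looking
        labor_force = employed + looking
        return employed / population, looking / labor_force

    emp, unemp = rates(100, 5, 60)
    print(f"employment {emp:.1%}, unemployment {unemp:.1%}")  # 60.6%, 4.8%

    # 2M job seekers give up looking entirely:
    emp, unemp = rates(100, 3, 62)
    print(f"employment {emp:.1%}, unemployment {unemp:.1%}")  # 60.6%, 2.9%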


In Australia at least "unemployed" according to parliament: https://www.aph.gov.au/About_Parliament/Parliamentary_Depart...

"The Australian Bureau of Statistics (ABS) defines a person who is unemployed as one who, during a specified reference period, is not employed for one hour or more, is actively seeking work, and is currently available for work"

So if you are desperately looking for more work BUT you have a casual job providing 1 hour of work THEN you are NOT unemployed. I mean, it's right there that you have a job.

Given the high rates of casual/part time work in our workforce, unemployment rates mean nothing to me anymore...


This is also true to a similar degree to how the US defines it. It's an important point, since a lot of people are arguably underemployed, but are still included in employed statistics. I just didn't want to get too far into the weeds of the definition and distract away from the point of how unemployment excludes people while employment does not.


Surely there's a Full Time Equivalent (FTE) employment definition somewhere to capture exactly this issue. But if there is, it never seems to come up in public discourse, possibly because it is difficult to understand or for people to directly relate to.


And, obviously, the employment rate is harder to game than the unemployment rate. There are long term trends that affect it (women entering the workforce, etc.), but it's an overall more stable and valuable metric to look at.


Or just look at employment, not unemployment data.

https://tradingeconomics.com/euro-area/labor-force-participa...


Historically high employment rates == historically low unemployment rates, right?


Not necessarily: https://www.investopedia.com/terms/d/discouraged_worker.asp

In the UK people sometimes make the claim that the unemployment rate the government is quoting is so low because people have given up even looking


Haven't the calculations changed?


The cited source states that "Some divisions, including Intel’s sales and marketing group, could see cuts affecting about 20% of staff".

This article extrapolates that out to "All divisions will cut 20% of staff".

It's unfortunate for a lot of people regardless, but the report seems (for now) to be exaggerated.


I love how the PC market slowdown is being headlined as causal when Intel has clearly done this to themselves.

Maybe if Apple's products were still "Intel Inside" Intel wouldn't be in this position.


I asked an Intel chip designer more than 10 years ago why they were not prioritizing mobile chips. He didn't have a good answer. It has been obvious for so long that everything was moving to phones, tablets, etc. The fact that Intel has no foot at all in that door is astonishing to me.


It happens. Data center computing is profitable, and Intel mastered the sales motion long ago. IBM never really got off mainframes for similar reasons. It's a variation of the Innovator's Dilemma.


I think IBMs case is a bit different -- after all, other companies may have innovated in the space, but they are the ones who actually came up with the PC standard we still use today (and then quickly lost control of it). By comparison, Intel has nothing to show on the mobile space, just a few half hearted attempts at mobile CPUs that fizzled out.


> IBM never really got off mainframes for similar reasons.

You could kinda say they made the PC? Also involved in PowerPC processors.

Not getting off mainframes was not their problem, I think.


IBM made the PC but others monetized it, most notably Microsoft followed by Compaq and Dell. [0] The PC architecture was not proprietary, and IBM didn't control the operating system.

By contrast IBM made boatloads of money off mainframes and their advantage was durable. That business line is still quite profitable even today. (Just not very big in comparison to the IT market.)

[0] https://www.forbes.com/sites/timbajarin/2021/08/25/attack-of...


10 years ago, phones, tablets were the big thing. But smart management realized that no one was making money without a full vertical stack. The choices were iOS or Android. Besides Apple and Samsung no one was making money from phone and tablet. On the other hand - high performance computing and data center was a growing market. So smart money put the bet on high performance chips where you could have >50% margins. That's why AMD cancelled all their low power tablet chips (both X86 and ARM) and focused on Zen and the datacenter.


It's almost funny that Intel did have a foot in that market in the 2000s. They sold off XScale right before the original iPhone was released. A lot of PDAs used XScale CPUs.

I think Intel just bought into their own Wintel uber alles bullshit and couldn't even conceptualize devices with TDPs under a watt. No one could possibly do anything worthwhile without Wintel so shed everything that's not Wintel!


Sad thing is this has nothing to do with engineering and I'll bet a ton of those 20K employees will be engineers.

Piss poor management strikes again.


Neither does Microsoft; great isn't it?


And ‘people not at home’? What does that even mean? People take their laptops to coffee shops? But so what?


I'm under the impression that for a large chunk of the population, spending most of their time on their PC during the pandemic was actually an undesirable lifestyle change. I suspect they might be changing back to doing whatever it was they did beforehand.


This seems like a reference to the situation during the height of Covid lockdowns, where people were stuck at home buying tech products.

In that case, the deeper read of the situation is that the Covid crisis helped them kick the can down the road by 3 years on responding to the trends of 2019.


There was a huge surge of people buying new gear to outfit home offices for WFH. That's over now. Not everyone is back in the office fulltime (obviously), but we're not rushing out to buy new stuff when we just did all that not too long ago.


Ouch. Very sorry for those affected. I worked at Intel for >8 years, and I still have a lot of friends there.

I have little faith in Intel's management. They apparently plan to continue increasing their dividend payouts, while cutting back on investment in fabs according to https://semianalysis.substack.com/p/intel-cuts-fab-buildout-...

... and in the context of the recently-passed CHIPS Act...

I wish they'd be more of an engineering-driven bottom-up company. IMO there's a huge company culture to contend with that is primarily top-down. So much scar tissue in the company prevents changing things for the better. So many good people capable of making these sorts of changes get burned out and leave the company. Intel's compensation compared to FAANG certainly helps to push people out of the company.


Intel invested a lot into EUV, thus enabling ASML to do it, but did not take advantage of the technology because they chose not to take "risks", even though they could have easily set up different teams invested in different technologies and picked the best one. Business people running the company was clearly a failed experiment.

Good thing Pat is back and now going with EUV to remain competitive.


Intel just received money from the US government and they swiftly started laying off American workers. The money is supposed to be for the new Ohio plant and 3000 jobs, but now we’re losing thousands of jobs up front?


Uh, yeah, you didn't realise it was a scam when all the politicians trying to get that bill passed had investments in the companies getting the money?


They are using that money to build a chip plant not to subsidize 3000 workers.


That chip plant will be the factory of the future. However:

“The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment” – Warren G. Bennis.


The US Govt regularly robs the tax paying people to give kickbacks to their corporate allies. It's just business as usual


The US needs a chip fabrication factory. The money is being used to build it. This has no relevance to the 3000 workers who were laid off.


Then they can build one for the people by the people. Nothing says they need to give money to private orgs to secure fundamental national technology.

With all the things we book under the "National Security" banner a much needed next gen chip factory should be one of them.


This is a terrible idea. Imagine if the same people who bring you the DMV try to make a chip plant


20% is in sales and marketing. How would you feel if taxpayer money was going to marketing? Hopefully the money is going toward equipment and research. Maybe they don't need the people.


It is not that 20% of the job cuts are in Sales and Marketing, which is what your sentence could imply, but that up to 20% of the Sales and Marketing department is being cut, which represents the majority of the thousands of job cuts.

I am pretty sure there is a lot of bloat at Intel that needs trimming down.


what about the other 80%? presumably the comment holds for that?


20% of sales and marketing is getting cut. Some other departments may also see losses.


Same as always. I'm pretty sure Ford did something similar several years ago. They got money for jobs, then closed the plant like a year later or something.


What should they have done if they aren’t manufacturing enough cars to keep the factory at capacity?


I seem to remember them moving the production out of the country. So I'm not sure if the numbers were down or not.


Probably a sweet stock buyback as a side effect.


Looking at the chart, would it not be a good time - especially with USD losing 8% of its value on a yearly basis? After all, you can only invest so much into capital projects and who knows what will be left of the US after Biden's tenure, so you want to be cautious.

Also Intel never had a hyper-inflated share price like most other semis. It's actually quite comical just how much the share price has dropped because its previously meagre dividend yield is now 10% due to denominator effects.

The buy back would similarly help ex-employees with stock or options, and I'm sure the package on the way out from Intel will be quite good.

See: https://www.tradingview.com/chart/?symbol=INTC 50% down YTD.


Intel suspended stock buybacks in Q2 2021 and hasn't done any since: https://www.intc.com/stock-info/dividends-and-buybacks


They had also suspended dividends, and after getting the money they immediately reinstated them.


huh… its been so long since I’ve found this kind of thing controversial

Just had a little flashback to 2008 me


PCs?

Gamers (and non-Mac high-end desktop performance users) would seem to be a pretty small slice of the pie these days. I have a high-end current M1 laptop but I do a lot of my work on a couple of 7 year old Intel-based desktop/laptop machines which are perfectly fine although they're about to go out of OS support. (Though I'll probably use them for a few more years anyway.)

I remember when you wanted a new PC with every tick of the processor cycle. These days, for most people, who cares?


Can't recall where I heard this recently, a podcast I think, there was a CS professor talking about undergrad introductory programming classes. He said that about 5 years back his students stopped coming in with experience using a PC because they'd only been using phones/tablets. So the prof said it was kind of like going back ~25 years when not a lot of incoming students had PC experience either.


Was talking to someone at work and their kid didn't want a laptop because they were fine with doing papers on their phone. Blows my mind (and that of others) but I guess that's where we are.

Interesting situation. I never touched a PC until after college. But, yeah, there was at least a period where basic "computer literacy" was expected. Maybe things have shifted again.


I, and most of us on here, are likely power users of computers, and beyond my job, they are a big portion of my hobby as well.

I acknowledge I'm in a unique situation where I run a home lab and so I have a local cloud, but because of that, I can effectively do 100% of my job off an iPad Pro (I don't want to for quite a few reasons, but I could). If I didn't have my home lab, I could get close by deploying my OS on AWS/Digital Ocean/etc. and have the same effect (though latency would drive me nuts eventually + internet connectivity issues - though this would break my job anyway).

If I weren't coding, I'd probably entirely switch to the iPad. I do all my document editing on it. I do my code reviews on it, architecture planning, etc.

I could realistically see the average consumer never needing anything more powerful than something like an iPad Air. I completely believe we will head to the era where the only hardware the average consumer needs includes:

* Great resolution screen

* Great internet connection

* Great input controls

* Minimal CPU/RAM to run a robust, hardened portal to cloud services

iPads are basically beefier versions of even that.


>iPads are basically beefier versions of even that.

I'm happy using a good Chromebook for most things but I've never found an iPad--even with external keypad--to be an adequate substitute for anything involving text editing.


Yeah - it's a specific implementation of the general idea. Chromebooks essentially fit the same bill. I just happen to have an iPad (and I do most arch design with a stylus so the Apple Pencil is fairly crucial to my workflow).

I'm fairly happy with the magic keyboard with it, but understand different preferences!


Yeah, I don't really draw much and I'm used to a mouse/touchpad for editing so it's probably a matter of what I'm used to. Also, at events I often want a laptop on my, well, lap and tablets+keyboard don't work as well for that.


I rarely see people who can touch type quickly on keyboards these days. Even members of my generation can't seem to do it anymore.


I never learned touch typing--I basically never used a typewriter until college and a word processor afterwards. (Nor any formal shorthand.) But I still am a pretty fast typist so long as I can look at the keyboard and get frustrated in a hurry even doing searches--much less editing--on a tablet. I know people who even write a lot who can handle doing it on a tablet with an external keyboard but I'll travel with a laptop rather than trying it.


It's worth noting that Intel has been stuck at a single tick in the processor cycle for... 7-8 years now?

Interestingly, Mac sales are up 40% recently. So I guess a lot of people actually do care -- they just want in on the best laptop processors out there, which are decidedly not offered by Intel.

With no good Windows support for ARM, I'm curious what trends we'll see over the next few years for PCs. Will they languish in their current state, using Intel's hot n' heavy processors? Will some manufacturers switch to ARM + Linux solutions to offer battery life and heat generation competitive with the Mac?


The Mac number is still relatively small though and it sort of makes the point that the people who really care about their laptops are disproportionately going with Macs (relatively speaking). While the basic corporate box is mostly don't care.

>Will some manufacturers switch to ARM + Linux solutions to offer battery life and heat generation competitive with the Mac?

I don't know. You already have Chromebooks but those aren't really mainstream outside of education even though they're all a lot of people need. And Google exited as basically one of the most high-end hardware makers there.


But outside of Apple, ARM processors aren’t really that great


I know that the Qualcomm's 8 Gen 1+ is within spitting distance of the A-series phone chips -- inferior for single core work, but superior for multicore loads.

Both Qualcomm and Samsung produce tablet chips that perform well enough. I wonder why they haven't expanded into the laptop space? I imagine Apple's M-series would be hard to compete with, but many consumers would be fine with something competitive with the M1, which seems doable.


Not disagreeing with you, this is just the best citation I could find with benchmarks.

https://www.notebookcheck.net/Qualcomm-Snapdragon-8-Gen-1-Pr...


You can edit 4k video on an iPad mini and run CAD software. The things that needed workstations now run on tablets.


Can we develop software on tablets though? It seems like this is the one missing component. On a PC we can easily get access to the tools to program the machines. Phones and tablets require a real computer to program them. Corporations own the machines so you need to go through them to reach the users.

It's like we've all gone back to the time before free software and GCC. It's a bit worrying that the PC market is shrinking. The PC is the only freely programmable computer on the market today...


While Intel has had its own shortcomings, the PC market is cratering [1] and Intel among others is very exposed. I was very curious about the very steep discounts on Apple MacBooks recently and thought they were due to the latest versions coming out in October, but it looks like there is a massive reset in shipments after the massive boom during the pandemic.

[1] https://www.gartner.com/en/newsroom/press-releases/2022-10-1...


I would be surprised if anyone actually thought the surge in PC demand seen in 2020/21 would be anything but short-lived. Computers don't just become obsolete in the blink of an eye anymore; most of those new computers will still be perfectly serviceable well into the 2030s.


I remember 10 years ago, when I got interested in building a computer, the general consensus was that AMD's Bulldozer debacle had them on life support.

Intel's dominance was practically guaranteed for five years, I knew Zen was going to be successful but I was surprised to see that they've successfully flipped the script and Intel is in a bind.


This is going to become the norm for a while - many of these massive tech companies have a lot of dead weight, and these current conditions allow for the shedding of such weight. It will be painful, and not great for many - but at the end of the day we should emerge a more agile, stable, and innovative society. At least that is the hope. The age of exuberance is gone, time to put our heads together, and work hard again.


Too bad mass layoffs don't target folks based on performance at Intel. They just lay off entire buildings.


> After the quarterly report on October 27, nothing better can be expected.

How to tell an article is written by a hedge fund shorting the company.


Sell rumors, buy facts :)


I worked at Intel and it kicked off my career. I believe putting a CFO in charge of a company will lead to this. I wonder if Apple will ever end up in the same situation.


> I worked at Intel and it kicked off my career. I believe putting a CFO in charge of a company will lead to this. I wonder if Apple will ever end up in the same situation.

Are you referring to past leadership? Because Pat Gelsinger has never been a CFO...


He's just mopping up though?


Pat Gelsinger was CEO of VMware prior to Intel, COO at EMC prior to that, and CTO at Intel before that.

He was never a CFO.


I think they are talking about the previous captain that pointed them at the iceberg.


Brian Krzanich was cancelled/fired for sleeping with a coworker, so the then-CFO Bob Swan was made CEO after Intel tried to hire "externally" for a year or so. After MBA'fying Intel for a couple of years, he left to a16z, a perfect place for people like him. Pat Gelsinger then joined, probably too late, to try to steer the ship back in the right direction. Intel as an organization is finished. (I worked at Intel for 5 years through these changes)


> Intel as an organization is finished

this will age poorly.

Intel is already fighting back strongly and really only had 1-2 iterations where they weren't in some ways the performance kings. They'll be back, fabbing their own chips, while AMD and others pay TSMC to make them, and Intel will make massive profits.

Intel 12th and 13th gen competes with AMD on performance and price, and it's still using yet another 10nm FinFET process while AMD is using TSMC's latest whatever.

none of this is indicative of a 'finished' organization. It's just not quick, because nothing in basic research and chip design is quick.


This will age poorly. Performance king at unconstrained power is utterly irrelevant from a financial point of view, and requires very little engineering effort to boot: shoving amps into a package until it breaks is the job of a junior engineer frankly. The rapidly dying PC/DIY market will not save Intel.

The metrics that matter financially are performance/power and performance/area, and Intel is worst-in-class in both metrics in both CPU and GPU right now.

I worked in chip design at Intel for over a decade. In the 2016 culling, I noticed they laid off a ton of smart people, but all the terrible management and fake-it-til-you-make-it engineers survived. I left very soon afterwards. I suspect the 2022 massacre will be along the same lines. Intel as an organization is not just finished, it is terminally toxic and incapable of being fixed.


It's a little sad to hear that from multiple former Intel employees at multiple levels. They're all tremendously down on Intel. Maybe there's a bit of "refugee bias"? I really do hope Pat Gelsinger pulls through and defies our pessimism.


They have US fabs, they are too big to fail from the perspective of the US government.


Btw, they carefully never said the gender of the person who BK was porking. I'm convinced it was a man!


I don't think we should really care to probe into the details of people's lives like that, but also the WSJ clearly stated they were a woman: https://www.wsj.com/articles/intel-ex-chiefs-affair-with-emp...


> putting a Cfo in charge of a company will lead to this

The why is more interesting to me and seems pretty self-evident: a CFO's job is to keep bad things from happening.

That's not the sole person you want running a company, because that's just dying more slowly, with good numbers.


>I believe putting a CFO in charge of a company will lead to this.

That sounds like what happened to Boeing.


And GE, and HP...


Intel is always planning thousands of job cuts. As soon as you get a job at Intel, you'd better start looking for your next job.


And this isn't even counting the number of contractors that are being let go.


I used to work at an Intel campus in a previous job with another company.

No one really worked, and people would go home early.

It was really weird.


I have a desktop that had decent specs 10 years ago. I still use it today. I did add a graphics card, more ram, and more storage over the years. Guess what? It's still decent specs and able to play many new games. That's a long time compared to when I was growing up.

I also have a 14 year old laptop. That one's days are probably numbered. Maybe I'll get one in the next two years.


Intel is laying off so they can increase their profit margin from 45% back to 60% so investors are happy? Am I reading this correctly?


Is Intel meant to be a jobs program? Should they just hire and retain people they don’t need…just to keep them employed? Should they hire me to twiddle my thumbs for $100k a year just because they can afford to?

Increasing productivity is a good thing actually.


a company that has low profit and employees a lot of people as a lot more beneficial for society than the opposite


Assuming output is the same in both cases, this is wrong. If a company is making $10 million in profits, is it better for society if they spend $9 million hiring people to dig ditches and then fill them in? No, because those people could be doing something else that is actually useful.


Why so? I think it would better if everyone were in a productivity maximizing role, not busywork.


is *


Geez I wrote everything wrong

a company that has low profit and many employees is a lot more beneficial for society than the opposite


It's good locally, yes, but our current economic system does not have adequate systems to support people who are laid off into other work. Layoffs will result in significant hardship. Also, it's good for a company to have a degree of slack.


Those are people. Each one could have a family with kids. Don't be so callous.


Intel lived off the climb of Moore's law, enforcing its standards; now that this has plateaued, others have become the new Intel, setting standards in the performance-relevant fields (ARM for low-energy compute in cellphones, Nvidia for parallelization, etc.).

He who steals fire from the gods can force his standards upon men for as long as the flames last.


The cause is the international economic downturn / high inflation, combined with a collapse in cloud computing orders.

Around 30%-40% of PC sales were going to cloud compute. This is slowing.


I wonder how much profit Intel really makes on the data center. The tech press never questions it, but Intel does a lot of gaslighting and notoriously has many sock puppets in the industry press. We are told all the time that the data center is subsidizing the consumer but I wonder if the truth is the other way around... Certainly Intel wouldn't have gotten within 100 miles of the data center if it hadn't been for the volume of client parts being able to pay for technology that pulled ahead of SPARC, MIPS and all the legacy chips.

On one hand, the data center gets better utilization, maybe gets more value, and maybe turns over hardware faster. (e.g. why do I want to buy a new computer when the iGPU is just going to make it crash faster?)

On the other hand, there is more competition for the data center, particularly cloud providers who could amortize rewriting simple but large-scale applications like Amazon S3 for ARM or RISC-V over a large fleet of machines. If the data center is able to drive a hard bargain, it may well be that the client is still subsidizing the data center, but we just get told it's the other way around so that we won't ask for more and complain about the e-waste Intel tries to pass off on us. (I say "tries" because their sales are collapsing.)

---

A good example of the gaslighting is this article

https://www.tomshardware.com/news/linux-kernel-update-kills-...

which should have the headline "Intel iGPU kills laptop displays".


I imagine tomshardware has to "play nice" so they get free demo units to do benchmarks, articles, overclocks etc with.


Hire Jim Keller back and build the next thing instead of milking the same customers.

The company developing a competent alternative to the M1/M2 or nvidia GPU is going to have a good business.

Seems to me that Intel is always in this cycle of greatness and then crash while resting on its laurels. It seems they need to have a fire under their butts to make the right decisions.


I love comments like this. I don't think any other industry has this kind of sports-team mythos in the mind of the layman. If Ford or Toyota has a bad year, people think "Hmm, they need to make better cars". If Intel or AMD has a bad year, people say "Clearly they need to hire that guy who was in charge when they made all those good chips". It's incredibly lazy thinking, and I love it.


Intel has had a bad 10 years. The market has more than doubled in size and yet they have grown 25%. 10 years ago Intel made more revenue than AMD, Nvidia, and Qualcomm combined. I bet Qualcomm passes Intel in revenue in two years. All those companies have more than doubled their revenue. AMD went from $6b to $22b. NVidia from $4b to $30b. Qualcomm $19B to $54B. Intel went from $53B to $68B with the last 2 years being flat and this year actually declining.

Intel can't execute. They are failing to shrink transistor size. Failed at a 5G modem. Failed to deliver a competitive mobile processor.


This is irrelevant. If Ford or Toyota have 10 bad years, people will not suddenly start talking about all of the superstar, legendary car designers they should pull out of book signings back onto the field.


If Ford or Toyota had a Steve Jobs or an Elon Musk, they wouldn't be in the same place.

Listen to interviews with Jim Keller: yes, there is a whole team doing the work, but you need a technical/management person with the credibility and willingness to make the hard choices and take risks.

99% of people hired for CEO roles will be happy to coast, take their bonus on short-term gains, and move to the next opportunity. Who is willing to take risks for long-term success that will not appear on the quarterly report?

You need someone who can get that job and at the same time doesn't give a damn about what people think or about the money. Those people are rare. That's why Apple doesn't have 'another' Steve Jobs; Apple will coast on the previous trajectory for a while, but won't disrupt anything else major, they will play it safe.

The same is true for Google: they have their own money printer going nonstop, so who would be crazy enough to take risks with that?

The last disruption from Google was Gmail/GSuite, and it started as a side project. How many top engineers do they have, and where is all their output?

For Intel, they need to disrupt their own products to move forward; that's why it's hard to do, you don't want to kill the golden goose. But it's either you now or the competition in a few years.

Man, Intel was really sleeping on this one to be kicked that hard by Apple on the first release of their desktop chip.

So in a nutshell: talented engineers are everywhere in those companies; that's not the differentiating factor, good leadership is, and the incentives reward those who don't take risks.

BTW: your comment is mostly an ad hominem with a condescending tone: "mind of the layman", "incredibly lazy thinking", "I love comments like this!"

Try with arguments instead.


> Try with arguments instead

Your original post had no argument, either, so I don't see the necessity.

Now that you've provided an argument, we can have a discussion, starting with:

> Intel was really sleeping on this one to be kicked that hard by Apple on the first release of their desktop chip

This isn't true. The M1 in a desktop isn't competitive against Intel chips.


Intel's newest chips have efficiency cores alongside performance cores. They are actually somewhat competitive with the M1 now.


It would be more accurate to say Intel CPUs will have performance cores alongside efficiency cores from 13th gen onwards.

That is to say, the i9-13900K will have 8 performance cores and 16 efficiency cores, if I recall correctly. Many of the mobile CPUs in 12th gen also have more efficiency than performance cores.


For the desktop, they mostly have more P cores than E cores. For mobile chips, it's reversed.
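For anyone who wants to see the split on their own machine, here's a minimal sketch, assuming a recent Linux kernel that exposes the hybrid topology under /sys/devices/cpu_core and /sys/devices/cpu_atom (older kernels and non-hybrid CPUs won't have these paths):

    # Minimal sketch: count P-core vs. E-core logical CPUs on a hybrid Intel CPU.
    # Assumes Linux exposes /sys/devices/cpu_core/cpus and /sys/devices/cpu_atom/cpus;
    # P-cores with Hyper-Threading show up twice in these logical-CPU counts.
    from pathlib import Path

    def count_cpulist(path):
        # Parse a cpulist string such as "0-15" or "0-7,16-23" into a count.
        total = 0
        for part in Path(path).read_text().strip().split(","):
            if "-" in part:
                lo, hi = part.split("-")
                total += int(hi) - int(lo) + 1
            elif part:
                total += 1
        return total

    for label, path in [("P-cores (cpu_core)", "/sys/devices/cpu_core/cpus"),
                        ("E-cores (cpu_atom)", "/sys/devices/cpu_atom/cpus")]:
        if Path(path).exists():
            print(label, "->", count_cpulist(path), "logical CPUs")
        else:
            print(label, "-> not reported on this system")

If the sysfs layout is as assumed, a 13900K should report 16 logical CPUs for cpu_core (8 P-cores with Hyper-Threading) and 16 for cpu_atom.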


How does this work with the recent passing of the CHIPS Act? Aren't they supposed to be expanding their workforce?


Much larger than the Meta layoffs, if true, given that this layoff is 20% of Intel's staff getting the cut.


PC slowdown, or Apple taking food off your plate? Apple caused a few ripples in Meta's world as well.


With laptop OEMs transitioning to ARM, I don't see where they have a future other than competing with AMD.

I don’t think a company like Intel was ever designed around innovation past what made them dominant in Windows PCs. And like Microsoft, they’re going to need to diversify.



Yikes, this is pretty scary. Hope everyone impacted at Intel comes through this. FWIW, I have also heard of freezes at many other large tech companies, offers being rescinded, and over-hired teams having to reallocate people to other teams.


It's insane to me that Intel just can't figure out what to do vs AMD. They continually position themselves as the premium choice and price their products accordingly even though that isn't true any more.


Seems like Pat Gelsinger has at least a couple of ideas what to do. https://www.theverge.com/2022/10/4/23385652/pat-gelsinger-in...


TLDR.

There's something I don't understand: car companies and everyone else can't seem to get enough chips, and yet Intel is laying off employees?


PC shipments are way down YoY, according to Gartner, and people are worrying about inflation and potentially even worse disruptions, so they aren't spending money on frivolous things like unnecessary computer upgrades and purchases.

Car companies need specific chips for their cars. Intel makes its money selling different chips. Car companies don't want to redesign for different chips all the time.

https://www.gartner.com/en/newsroom/press-releases/2022-10-1...


If everyone's holding off on designing new products due to lack of availability of cutting-edge stuff, and selling more lower-end products than usual anyway (because buyers are more cost-conscious), then the organization may need to lay off people who are focused on high-end products so that it can spend the money elsewhere.

The people you need to grow fab and manufacturing capacity are different than the people you need to design cutting-edge new products. And the sales team you need to offer fab services to other companies is different from the one you need to sell CPUs to consumers.


They don't really need Intel's general purpose CPUs.


It's kind of mind-boggling that they employ 114,000 people to begin with.


This is insane. Surely they can't ALL be sales and marketing people.


So many questions behind this decision...

1. Is desktop computing still relevant?

2. Is server CPU market not making enough return?

3. I thought Intel needs talent to bootstrap their US-based fabs.

4. Does this mark the beginning of the end for the PC?


> “Thousands” of Intel layoffs planned as PC demand slows and revenues fall

They got what they asked for: EFI, ME, expensive processors, feature locking, core-as-a-service, and the bullshit that USB is.


Is the 15% sales "tumble" because people and companies are holding on to their PCs for a longer time? This could be good news for the environment.


Companies maybe not, but more likely, people just don't need PCs. My wife and I share one computer at home for things like doing taxes. If we want another, I'll probably get a Chromebook for cheap. I do everything on my phone.


A PC/laptop is a luxury good for middle-income families. The current uncertain times are definitely not helping sales. And if they are forced to buy a new one, budget options are more likely (the cheaper chip option: AMD).


Wouldn't be surprised if data center is already overtaking the PC chip market.

But it doesn't look good for Intel on that front either.


It only took 2 years after Apple stopped using their chips in all the MacBooks that are shipped around the world.


Damn, that's a lot of people. Wow.


And the effect gets multiplied - all of them have families, buy products, hire services, etc.


I made a mistake here. I no longer stand fully by that statement. For details, consider my later reply.

Original text:

It is. The article does not say where this will occur.

They cannot lay off people from the EU like that. It's not that easy. Unlike the US, EU countries are welfare states.


Harder to fire, harder to hire, without fail.


If you have a strong welfare state, it should make it easier to fire.

This is the case in Denmark. Some countries offer incentives to lay staff off temporarily, but generally it isn't the case (which is why you have unbelievable levels of unemployment in Europe).


I might also have used the wrong terminology here. (English isn't my native language.)

However, I know for sure that in Germany, it isn't that easy to hire and fire people on a whim. In the US this seems to be the case.

Just in case, let me also clarify this:

I don't care if you think the American system is better than the German one. I simply made the statement that even if not all EU countries have rules like Germany's, it is surely harder there to fire people on a whim.

And also the article didn't mention where this will occur.


Yes, Germany has such a rule; Kurzarbeit basically does this through a state subsidy.

And one particular aspect of Germany is that capacity is controlled on the way up. The reason the US has these huge swings is that capacity isn't controlled. It isn't possible to have the upside without the downside. Germany's political economy is totally different because the US has a far higher level of competition and innovation; this isn't possible with Germany's labour market.


My mistake was to assume such rules in other EU countries.

In Germany, there is a thing called "Kündigungsschutz"[1].

I am not sure about Ireland, but I assumed there are similar rulings.

So at least in Germany, it is not that easy, but possible.

1) http://www.rechtslexikon.net/d/kuendigungsschutz/kuendigungs...


Germany has strong works councils because it has very strong companies that, essentially, are given a licence to print money by the state. Ireland is an example of a country with a totally different political economy (more similar to the UK) with high levels of competition/innovation.

Either way, the point is that if you have high levels of protection, that should mean it is easier to fire someone...because they have something to fall back on. That is why the UK and the US have universal welfare states. Germany does not have a universal welfare state, and the only way that is possible is by having high employment security/strong labour laws and huge companies that are profitable due to low competition. This, obviously, comes with downsides (this is why Germany has struggled with high unemployment in the past, and has things like Kurzarbeit to subsidise companies even further against firing staff).

An exception to this is Denmark which has almost total union participation and very weak labour laws, they achieve this through a much more expensive system of social insurance. They have, for Europe, relatively high levels of innovation so this all offsets (you get high security and relatively high levels of innovation).


Is big tech intentionally trying to increase unemployment so as to appease the Fed ?


Maybe? Or maybe to give the Right some wins, so that Big Corp can beg for some taxpayer-funded bailout cash when the looming recession hits.


Failing companies always blame things on the macro environment.


Wait. Intel and AMD are both headquartered in Santa Clara, CA?


Yup. Nvidia is also based in Santa Clara.

AMD for a long time had been headquartered in Sunnyvale, CA, but they moved just a few years ago. They are now building homes at the former AMD site.


I just got laid off by my employer last week.

Just remember, we're not in a recession.


That was sarcasm, downvoter.


I didn't downvote, but reading your comment reminded me of the denials some have made about a 'recession', finding more excuses to deny it and wait for the official 'figures'. By the time those are released, it is too late.

I still stand by what I said months ago about us already being in a recession since November, in this classic comment [0], but then one person said:

   "No. That is completely false. We are most certainly not in a recession. Nor did one start in November. You can check the Fed data here and verify you claim is false"

   "The economy is considered to have entered into a recession if it experiences a decline in GDP for two consecutive quarters. And that has not happened."
Now that US GDP fell again in the 2nd quarter [1], the top comment there (and many others) are mostly all talking about a recession.

So are those in [1] still waiting for the official figures from the Fed, NBER, etc., which are lagging indicators, to tell them it's too late? I don't think businesses would like the sound of that.

Really unsurprising that all of this is happening. The time to prepare was November.

[0] https://news.ycombinator.com/item?id=31441710

[1] https://news.ycombinator.com/item?id=32263444


Right after CHIPS?


[flagged]


How many people have they hired in the last few years?


If LinkedIn data is trustworthy, headcount growth has been 9% over the last year and 16% over the last two years. https://imgur.com/a/tTjb2ZM


Whoa, that's a lot of people. The financialization of the global economy has been a disaster of boom-bust cycles that destroy the mental health of the population.


>financialization of the global economy has been a disaster of boom-bust cycles

??? Haven't there been boom and bust cycles in America since the founding?



