Apple, ARM, and Intel (stratechery.com)
456 points by jonbaer on June 16, 2020 | 474 comments



The most important paragraph in this piece is the last one, which plugs a $22.8 billion bill to support domestic semiconductor manufacturing in the USA and suggests the need for new entrants.

Semiconductor manufacturing is truly one of the few remaining manufacturing industries in which the USA has a competitive advantage (albeit quite eroded by TSMC over the past few years).

Especially with US aviation circling the drain, this is an industry we can't afford to lose. Remove semiconductors from US manufacturing statistics and the picture goes from "stagnating" to "extraordinary decline".

And it's not just the manufacturing sector. Semiconductors power some of America's most valuable companies (including but not limited to the FAANGs).

US economic and actual security absolutely depends on a robust supply of world-class semiconductors manufactured on-shore.

Congress should take the advice in this piece and avoid making the same mistake with Intel that it made with Boeing (piling the bulk of all federal and state support for an entire sector into a single, clearly dysfunctional company).


US manufacturing has been growing steadily for decades. The US concentrates this in sectors that are high tech, expensive, and profitable. Low margin, low profit manufacturing has been offshored, but manufacturing hasn't been stagnant at all.

As a percentage of GDP, manufacturing has been on a downward trend, but that's because of the growth of the service industry.

https://www.macrotrends.net/countries/USA/united-states/manu...


It's true, US manufacturing output is not in decline. But it is stagnating, in the sense that growth rates are much slower than they used to be. Look at the average from 2000 to 2010 or from 2010 to 2020 and compare to any decade in the 20th century.

But that's not really the point. The point is, look at those numbers sans semiconductors and you see a pretty scary decline in manufacturing output.

It is totally fair to point out that this has been a strategic choice to focus on profitable industries, and that that's even a good thing. That's sort of the point here: the choice has been made, semiconductors are now a key industry for the US manufacturing sector, and AFAICT we don't have any plans for what would replace semiconductors.

IDK if we have a healthy amount of manufacturing output, but it's pretty clear that losing leadership in semiconductor manufacturing would be quite bad for the USA.


> The point is, look at those numbers sans semiconductors and you see a pretty scary decline in manufacturing output.

Not at all: https://www.nam.org/state-manufacturing-data/2019-united-sta...

Semiconductor manufacturing is a component of the second-largest US manufacturing bucket in 2017, and was only #8 of the top 10 manufacturing buckets for job growth in 2018. It is certainly an important sector for economic and strategic reasons, but you are significantly overstating its importance.


> manufacturing buckets for job growth in 2018... It is certainly an important sector for economic and strategic reasons, but you are significantly overstating its importance.

But I'm not talking about jobs, at all. The US needs a robust manufacturing sector so that it remains economically competitive. Figuring out how to translate that economic competitiveness into material benefits for individuals is a separate problem. Confusing the two is itself a risk to our economic competitiveness. (To make this point in a somewhat hyperbolic way: if the goal is jobs, build a canal through the center of the country. Using spoons.)

Also, I think the manufacturing output numbers alone miss something important from my original post. Losing semiconductor manufacturing as a core competency also threatens the most profitable and important components of the US services sector.


> But I'm not talking about jobs, at all. The US needs a robust manufacturing sector so that it remains economically competitive. Figuring out how to translate that economic competitiveness into material benefits for individuals is a separate problem. Confusing the two is itself a risk to our economic competitiveness. (To make this point in a somewhat hyperbolic way: if the goal is jobs, build a canal through the center of the country. Using spoons.)

Why, then, did you ignore the other chart I pointed out that specifically dealt with the economic value?

> Also, I think the manufacturing output numbers alone miss something important from my original post. Losing semiconductor manufacturing as a core competency also threatens the most profitable and important components of the US services sector.

I agree that losing this competency would be damaging. I disagree that it represents a threat. The threat is the continued consolidation of certain types of semiconductor manufacturing into an ever-shrinking pool of companies. While it would be slightly better if that pool were partly concentrated in the US, the size and diversity of that pool is a much bigger issue than where its members are located.


> Why, then, did you ignore the other chart I pointed out that specifically dealt with the economic value?

The only other chart lists "Computer and electronic products" as the second-largest manufacturing sector, which seems to confirm my assertion that semiconductors have been an important source of growth. It's the second-largest category and has had annualized real growth rates 2x+ the rest of US manufacturing for decades.

I don't see any breakdown of semiconductors/not-semiconductors for that category, and this is a 2017 snapshot as opposed to longitudinal data, so it's hard to say anything more specific using the data you linked to.

> The threat is the continued consolidation of certain types of semiconductor manufacturing into an ever-shrinking pool of companies. While it would be slightly better if that pool were partly concentrated in the US, the size and diversity of that pool is a much bigger issue than where its members are located.

Sure, I agree. (Caveat of course that geopolitical diversity matters too. N separate SOEs would increase diversity in a way that wouldn't be particularly helpful to the USA, for example.)

In any case, this is actually something I more-or-less agree with, and hence the last sentence of my original post.


The US doesn’t need competency in commodity manufacturing that’s not going to increase our standard of living. Things get offshored when we have better investments to make with our people and capital here.


The US will not forever be in the dominant position where it can have such incredibly advantageous trade relationships that it can increase its standard of living using only services.

It also creates a huge vulnerability.


Free trade has only worked for 240 years, let me know when managed trade works once.


It worked because the cards were heavily stacked toward the US and other first-world countries.

As the developing world improves its standard of living, the standard of living of people in the developed world decreases correspondingly, because these developing nations are no longer dependent on the developed ones.

It was always just a question of time with globalization in effect.

Personally, I think this is a good thing though. We in the developed world have been living unsustainable lives for generations, living off of the developing nations.

Though that's most definitely not the only reason why everything has been getting harder for the working population. Automation, higher efficiency, extreme concentration of wealth, and more do their part in the current situation as well.


Our standard of living continues to increase along with the developing world, and free trade deserves much of the credit.


And eventually many countries in the developing world will have more power and a higher standard of living than us, our service industries will not have much of a comparative advantage or really much added value, and power relations will change, which will completely upend the favourability of trade relations.


Hasn't worked for most third-world countries, actually. Most that succeeded did so with massive state involvement, including trade management.


Somehow, China has managed to create a huge expansion of their middle class with outsourced manufacturing.

>According to a study by consulting firm McKinsey & Company, 76 percent of China’s urban population will be considered middle class by 2022. In 2000, just 4 percent of the urban population was considered middle class.

https://www.businessinsider.com/chinas-middle-class-is-explo...

Meanwhile, in America:

>Fewer millennials in 2019 are middle class compared to baby boomers when they were in their 20s

https://www.marketwatch.com/story/why-the-middle-class-is-sh...


Meh. Middle class Chinese means something very different from middle class American. There’s something true here, but the exact comparison you’re making isn’t quite it.

Also, valuearb is kind of right. Except that the right solution is more efficient onshore manufacturing, not offshoring.


>Middle class Chinese means something very different from middle class American

Of course it does. They do have that extra billion people to bring the average down.

However, moving the manufacturing jobs from here to there has certainly raised their average standard of living while reducing our own.


No, it’s raised both of our standards of living. Google the law of comparative advantage.
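(For anyone who hasn't looked it up, here's a minimal toy sketch in Python with made-up production numbers, purely for illustration and not drawn from any real trade data: total output of both goods rises once each country specializes where its opportunity cost is lower.)

  # Toy Ricardian comparative-advantage example; all numbers are made up.
  # Country A: 10 widgets/hour or 5 bushels/hour.
  # Country B:  2 widgets/hour or 4 bushels/hour.
  # A is better at both in absolute terms, yet both gain from specialization,
  # because their opportunity costs differ.

  HOURS = 10  # labor hours available to each country

  def output(widget_hours, widget_rate, bushel_rate):
      """Widgets and bushels produced for a given split of HOURS."""
      bushel_hours = HOURS - widget_hours
      return widget_hours * widget_rate, bushel_hours * bushel_rate

  # No trade: each country splits its time 50/50.
  a_w, a_b = output(5, 10, 5)   # A: 50 widgets, 25 bushels
  b_w, b_b = output(5, 2, 4)    # B: 10 widgets, 20 bushels
  print("autarky totals:", a_w + b_w, "widgets,", a_b + b_b, "bushels")  # 60, 45

  # With trade: B fully specializes in bushels (its comparative advantage),
  # while A mostly specializes in widgets.
  a_w, a_b = output(8, 10, 5)   # A: 80 widgets, 10 bushels
  b_w, b_b = output(0, 2, 4)    # B:  0 widgets, 40 bushels
  print("trade totals:", a_w + b_w, "widgets,", a_b + b_b, "bushels")    # 80, 50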


No, it's raised profit margins for manufacturers who no longer have to pay a living wage, healthcare, benefits, and obey stricter environmental and worker safety regulations.

The unemployed here are not better off.


What unemployed? Unemployment was at record low levels before the pandemic.


Ever been to the rust belt? There is plenty of unemployment in our nation's former manufacturing region.

In addition, a Walmart cashier, Starbucks barista or Uber driver does not have the same standard of living that a UAW member once did.

People are, on average, worse off than the preceding generations, all in the name of higher profit margins for the few.

>Across Western Europe and North America, the middle class is shrinking, according to a new report by the OECD. With each new generation, a smaller share of the population find themselves earning middle incomes.

>When the Baby Boomers, who were born between 1943 and 1964, were in their twenties, some 68% were in middle-income households. Only 60% of Millennials, who were born between 1983 and 2002, could say the same at a similar time in their lives.

https://qz.com/1592826/the-middle-class-is-shrinking-generat...


Unemployment statistics and median incomes prove your anecdotes and cherry picked statistics wrong.


> US manufacturing has been growing steadily for decades.

You need to research a bit more what is behind these figures.

A lot of funny accounting, where companies that haven't manufactured literally anything on US soil for decades somehow manage to accrue billions in "manufacturing output".


> The US concentrates this in sectors that are high tech, expensive, and profitable. Low margin, low profit manufacturing has been offshored, but manufacturing hasn't been stagnant at all.

Slight refinement: it’s a barbell. Production of low-value goods that are not worth shipping has remained, e.g. paper. But that’s just damning.


Genuinely curious then: where do we actually excel?

The only examples I can think of right now where we are genuinely number one are high-priced, low-volume boutique aerospace (SpaceX and other new space companies, satellites, experimental military aircraft, electric planes, etc.) and, of course, weapons.

We can manufacture decent cars, but at least half of this is under the direction and management of overseas companies like Toyota and Nissan. US car companies have been on the ropes for decades.

We seem to be losing the bulk of aerospace, generation turbines, electronics, chips, medical devices, power equipment, materials, ... ?

Seeing graphs like this makes me wonder if I am living in the late days of the USSR when I'm sure many optimistic graphs were published by Pravda.


What? Turbines? Materials? Only the US and Europe can actually build functional jet engines and most of the bleeding-edge materials. Even semi-functional jet engines are still a distant dream for China. The same goes for chips, where only Taiwan and the US can manufacture current-gen nodes. If anything, the US offloaded commodity manufacturing that is easily reproducible to build a very, very high moat around key technologies instead of racing to the bottom for scraps of old industries.

As for medical devices and tech, the US probably creates more new drugs and treatments than the rest of the world combined. When's the last time China, India, or Russia made a breakthrough in medicine? When's the last time the US did? Last year?


Jet engines will not be a distant dream for China for long, and neither will chips. China has functional 14nm now and has been building its own EUV machines in expectation of the recent US moves.

It will be the same story for jet engines; Chinese improvement is very rapid. It was the same story for every rising power so far. There is no real reason that China will never be able to build a jet engine, and no reason that they will never be able to run a fab (and indeed, they are accelerating quite fast).


Chinese engine improvement has not been rapid by any measure. China still has to import the majority of high performance fighter engines from Russia (primarily the AL-31). China has been trying to develop the WS-15 as a replacement for almost 30 years, and still relies upon the AL-31 for their premier stealth fighter (J-20).

They have started to deploy the WS-10 Taihang engine for their J-10 fighter, but the performance of this engine is markedly worse than comparable Western engines. Military grade engines are hard!


Jet engines are indeed hard. But with the upcoming tech transfer from Ukraine as well as the slow but steady improvement of the WS-15, I would be incredibly surprised if it took more than 6 years for the WS-15 program to yield a serviceable engine.

Even the US relied on foreign jet engines for a time, although you won't hear about it a lot - for example, an absolutely huge number of engineers were hired from AVRO in Canada after the development of the superior Iroquois engine (which had, by quite a margin, the highest nominal thrust) as well as other Orenda engines, and their work was absolutely instrumental to the US becoming superior to the USSR in jet engine technology. I don't know if this is publicly available information, so you might have to trust me on it, but I guarantee you even the US had a 20+ year period where it had to steal engineers and tech in order to jumpstart its lead, which is exactly what we are seeing with China.


If the WS-15 ends up being serviceable in 6 more years, it will have taken SARI and Xi'an almost 35 years to re-create something marginally better than the current Russian engines, and nowhere near what PW and GE routinely deliver to the US. Chinese engines can't approach the thrust of Western engines, nor the reliability.

So say they iron out the kinks in the WS-15 (though odds are they'll settle for the WS-10). They then have a roughly 1990s era engine, when the US will be deploying Adaptive Cycle Engines etc.


Sure, but this is always how it works. The first high-quality engine will take 35 years, the next 15. The important thing here is to develop the capability to manufacture high-quality single-crystal components.

As far as adaptive cycle engines, China is also already researching them. They also have a lead in supersonic combustion as of today.


Discounting the competition is a sure path to defeat. I am not in the semiconductor industry, but while everyone is chasing node sizes down to zero, I think the real breakthroughs will be in reducing production costs to zero.


> Same thing goes for chips, where only Taiwan and the US can manufacture current gen nodes.

This view is rampant throughout this thread and I keep wondering why South Korea is omitted. I thought SK semiconductor tech was comparable to the US and Taiwan?


Samsung's transistors are only slightly behind TSMC's; SS8nm probably trades blows with Intel 10nm and solidly beats 14+++. And their DRAM and NAND flash is probably the best, so it is at least comparable.

But beyond SK I cannot think of any other countries with competitive fabs.

Also, only a tangent, but I really believe that deal where TSMC is going to put an N7 fab in the USA is either going to get restructured into something way smaller or is never going to happen.


Sony dominates imaging sensors.


Sorry, yes, South Korea has an amazing industry. Japan too, even if it's not the leader it used to be. But even then, latest-gen chip design is overwhelmingly coming from the US. Apple, AMD, Qualcomm, and Intel have the most performant designs by far, so the moat is still definitely present. Taiwan and SK are arguably better at manufacturing the chips right now though, I agree.


I remember that Fujitsu fabricated the SPARC processor, but what other semiconductors is Japan known for?


Pezy SC was kicking around the top500 for a while, Sony dominates imaging sensors, and likewise for niche medical and electrical sensing equipment.


Kioxia (formerly Toshiba Memory) is a large semiconductor manufacturer of NAND chips; its fabs are in Japan.


> when [was] the last time china, india or russia made a break through in [medicine]?

2015.

https://www.nobelprize.org/prizes/medicine/2015/tu/facts/

(PS. This took less than 30 seconds to find and confirm).


Exactly my point. The USA has breakthroughs every year, and very bleeding-edge meds come out every few months. 2015 is not recent at all, relatively speaking.


The prize was awarded for work done in the 1970s, fwiw.

(PS. This took me most of a minute of reading the link you provided).


Sure, and to answer the more general question in the OP:

Russia invented and built and then sold to the USA the RD-180 rocket engines that are used for the first stage of the American Atlas V launch vehicle.

Which the USA will no doubt "reverse engineer" soon :)

The OP needs to get out more often.


I feel weird playing the jingoism game, because it doesn't even vaguely reflect my feelings on the subject, but here we go:

SpaceX, a privately-held American company, is the only source of innovation in rocketry at the present time. Their reusable launch platform plays in its own league; everything else is steam power, Falcon is internal combustion.

That's just a fact. Unlike insisting that China, Russia, India are incapable of innovation more generally, which is risible.

My vague impression, in fact, is that jet turbines are quite a bit more difficult to manufacture than rocket engines, even if the latter are thought of as 'higher tech'.

I'm fairly certain we'll discover, in the next ten years, that China is quite capable of making cutting-edge computer chips, and just hadn't gotten around to it yet.

I've never shared my countrymen's obsession with insisting on being 'the greatest'. This country is great in its own right, along many dimensions; it is, in fact, right at the cutting edge for medicine and CPUs; denigrating the work of other nations only detracts from this.


> My vague impression, in fact, is that jet turbines are quite a bit more difficult to manufacture than rocket engines, even if the latter are thought of as 'higher tech'.

My impression is that “real” rocket engines include a jet engine just to pump fuel. They have many other pieces as well.

Not that a modern high-bypass turbofan is simple, but don’t discount the complexity of rocket science.


+1


Music, movies, software, high speed pizza delivery


Computer hacker, pizza deliverator, and greatest swordsman in the world, at your service.


Underrated response.

As true today as when it was written!


There's Tesla as an American car company, which is arguably the most fearsome in the world right now. Kind of interesting how US manufacturing lead in several industries is tent-poled by one immigrant, Elon Musk. SpaceX doesn't just manufacture rockets, they are also now the largest satellite manufacturer in the world.

We're also really good at manufacturing houses and food. Besides cars, houses, and food actually representing a good chunk of physical things we buy, more of our spending is moving towards software/information in which we have a crazy lead.

I see it as manufacturing was displaced by an even more value adding industry. Though certainly it isn't good for our national security to be relying so much on imports for physical goods.


Actually, we're really not that great at manufacturing houses. We have too few where they are needed, the quality standards in terms of energy efficiency, design, and construction tend to be low, and consumer satisfaction is reportedly dropping.


Too few where needed is a product of laws. Most people have no idea what makes a quality house and so pick the wrong things to focus on. Codes don't allow junk houses, and energy efficiency is pretty high - what counts as a passive house in some areas is too poorly insulated for the northern US code.


I think Tesla is a tech company and not a car company. The amount of technology that goes into a Tesla is unreal, and Tesla's ability to use the sensors and chips in the car and update it via software is rather fearsome.


I’m not really seeing that - a Tesla Model S seems to me like a much simpler device than a Mercedes S Class. Talking about overall complexity, not just software.


Why are Mercedes electric cars not Tesla quality then? I mean, the drive and the interior are better in a Mercedes.


Why not both?

Tesla does, after all, manufacture cars — it doesn’t just write software that goes into cars made by others.


You excel in cultural exports still.


Does that data set factor in inflation? I couldn't tell from the write-up.

If not, the 1997 number would be ~$2,200B after adjusting. That would still only be a slight decline.
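(Back-of-the-envelope, assuming a 1997 nominal figure of roughly $1.38T and a CPI ratio of about 1.6 between 1997 and 2020; both numbers are illustrative assumptions, not taken from the linked data.)

  # Rough CPI adjustment; both inputs are illustrative assumptions.
  nominal_1997 = 1.38e12   # assumed 1997 US manufacturing output, in USD
  cpi_ratio = 1.6          # assumed CPI(2020) / CPI(1997)

  adjusted_1997 = nominal_1997 * cpi_ratio
  print(f"${adjusted_1997 / 1e9:,.0f}B in 2020 dollars")  # ~$2,208B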


"Data are in current U.S. dollars."


Interesting manufacturing output comparison list, but why is China not included in it?


>As a percentage of GDP, manufacturing has been on a downward trend, but that's because of the growth of service industry.

Or is it the other way around? Manufacturing is declining, and services are growing because they're the only jobs left for former manufacturing employees?


IMO your last statement is kind of ridiculous. Intel is literally the only company that does high-volume wafer manufacturing in the US at anything approaching leading-edge geometries. Are there any other American companies the government could even realistically support? (The only other US company that has any modern wafer fabs in the US is Micron, and they only really have R&D and legacy long-life products being manufactured in the US.) Maybe if this program had occurred 20 years ago you might have had another option. (And you would have basically looked like an idiot, as for the next 10 years Intel pretty much made all the right decisions in manufacturing while everyone else in the US went down dead ends.)


Yes, I agree with you. It's a huge problem for the US right now, both with Intel and with Boeing.

My last statement is ridiculous, but also TFA is correct in its diagnosis of Intel.

So... where does that leave us?

I don't pretend to have good answers to your question. There are some smaller shops nowhere near the scale of Intel. Some foreign-owned companies have fabs in the USA (including TSMC). Also, Intel is made of people who might be interested in striking out on their own.

But, yeah. Building things is hard and expensive and risky.


There are many other aerospace companies that are in principle capable of designing high-performance airplanes in the US. There are literally no other companies in the US with modern semiconductor manufacturing expertise (in logic, not memory; GF doesn't count). The level of capital expenditure, technological expertise, and IP required to manufacture leading-edge semiconductors is staggering. It's unlikely you could even start a new company that could manufacture leading-edge semiconductors for the entire $23 billion that is in that bill. (They'd also probably get sued into oblivion by existing manufacturers, with basically zero patent protection.) Just look at what happened to GlobalFoundries. It was given effectively free capital for close to a decade and a starting point that wasn't totally from scratch (AMD's old fabs), and in the end it gave up.

The "diagnosis" of Intel is that it threw away the chance of higher volume for lower margin which gave TSMC a bunch of money to catch up to them. (I think that's a very hindsight is 20/20 kind of view too...I don't think anyone thought that the iPhone would become larger than the entire PC market when it came out...and at those volumes Apple would eventually realize anyway that it wouldn't make sense to keep paying Intel to make chips for them.) I don't see how a start up is going to capture that volume without some sort of pseudo-captive market to grow with (which is basically how the various companies in China are progressing...).


GlobalFoundries may be stuck at 14nm, but they'd be the most natural successor for next-generation manufacturing if their financials could be improved a bit. It's a shame that the New York-based semiconductor fab has fallen behind.


They've spun down so much of their R&D that it'd be a real uphill battle to get going again. There's been so much brain drain.

And the wall at 14nm is them not investing in EUV, so they'd have even more work versus, say, the 14nm/28nm transition.


Global Foundries is hardly an "American" company...it's owned by the Emirate of Abu Dhabi.


Doesn't Samsung have a semiconductor fab in Austin? I guess they don't count as a US company ...


I think they dropped the whole semiconductor team a few months ago iirc.


That was their custom CPU design team, which has nothing whatsoever to do with this (and which wasn't very good in practice). Samsung is committed to their semiconductor business.


> IMO your last statement is kind of ridiculous. Intel is literally the only company that does high volume wafer manufacturing in the US at anything approaching leading edge geometries. Are there any other American companies the government could even realistically support?

Well yes, but... you could hire a bunch of foreigners to run things, as that's where the current experience mostly is, as you point out. But that experience includes use of semi fab equipment, essentially all of which is built in the USA.

That said, I'm not a fan of industrial policy; not sure what results were generated by the megabucks spent on Sematech (and I worked for Sematech sibling/neighbor MCC in its day too).


What about IBM? They have a few foundries in the US (and Canada), and used to produce the processors for Xbox 360 and PS3, which were both pretty advanced. And these were far from their most advanced chips, the stuff they built for their Z-series is truly impressive.

I haven't followed them much since, but 10 years ago, they were doing good stuff when I worked there.


IBM's fabs were acquired by GlobalFoundries, and GF has since retired from developing leading-edge processes.


Thus continuing IBM's very sad tendency to sell every part of itself that's interesting.

I didn't know, but I'm not surprised.


Does it have to be high volume? There are lots of semiconductor companies in the US, some of them flying under the radar. The US stimulus will help these companies invest in more capital and talent. Hell, I will say any country needs to be manufacturing and designing its own silicon. And like Ben said, not all silicon has to be for PCs.


None of them are on cutting edge processes.


> Are there any other American companies the government could even realistically support?

Micron is perhaps a candidate, though not sure it meets all your criteria.


Unlike Intel, Micron has practically outsourced most of their manufacturing to Singapore, Taiwan, and Japan. Their only production fabs in the US are in Utah and Virginia, both of which are mostly focused on niche or older products. More importantly, they only produce memory. Logic and memory processes are very different.


Semiconductor manufacturing is not what the average Joe thinks of when they talk about manufacturing. It’s highly localized and requires not that many workers, but very highly educated ones.

While I highly support the US investing there more, it’s not something that gives the US blue-collar jobs in rural America. And the push for more manufacturing in the USA is about that.


For better or for worse, the American model of working in the same paper mill/steel factory/automobile plant/coal mine that your father and grandfather worked at is over. Automation, commodification, and globalization mean that the profit margins in these fields are in free fall, and the sorts of wages these professions used to offer, where workers could be paid enough to support their families on a single salary, are long gone and they are not coming back.

The US should be focusing on decoupling the need to be able to survive (food, housing, healthcare) from holding a job instead of pushing for bringing back these jobs that are not likely to ever come back.


I wouldn't conclude too much about the steady-state price of manufacturing labor based on the present global market.

At the moment regional differences in the price of manufacturing labor are so large that corporations are basically labor market arbitrageurs. But arbitrage opportunities are by their nature kind of ephemeral, ending when the relevant markets stop mis-pricing things.

Your POV seems to be that labor is correctly priced in the current cheapest labor market (the "Race to the Bottom" hypothesis) but I don't know that this is automatically true. It's at least possible that one can only arbitrage the price of labor in developing markets for so long until they develop and drive up the local price of labor, and until they are all developed.

"The Jobs Ain't Comin' Back" is an appealing position for professional automators like us, because it lays the credit for manufacturing labor market dynamics at our own self-important feet, when in actuality it's shaped largely by brutally exploitative and probably transient (on the scale of decades, though) labor-price arbitrage. In particular the current manufacturing labor market artificially rewards centralization because the profit made by labor-price arbitrage is so large compared to shipping costs.

I don't think it's a foregone conclusion that manufacturing is either Sweatshops or Robots in the long run.


> It's at least possible that one can only arbitrage the price of labor in developing markets for so long until they develop and drive up the local price of labor, and until they are all developed.

This is a fairly optimistic point of view, IMO. What I think is much more likely to happen is the following cycle:

1. Manufacturing comes to a new place due to low wages (relative to the cost of labor in the previous place), lax environmental regulations, low tax rate for corporate profits, and very low worker safety regulations.

2. This goes well for 20-40 years, but as things typically do, environmental regulations are put into place by the local government, the tax rate rises to pay for the externalities of the low wages, and workers organize, agitate, and win higher wages, more safety regulations, and worker protections. All of this raises the cost of labor.

3. Manufacturers move to a new location because it is much cheaper. This causes people in the old location to lose their jobs, become impoverished, and craters their economy to the point where the government, implored by all of these laid off and angry workers, need to do something to help the situation. This results in a lowering of wages, rolling back of environmental regulations, lowering the corporate tax rate, and repeal of the safety regulations and worker protections.

4. Twenty years later the manufacturers come back to the country because people have become so desperate for jobs that moving back to the country is relatively cheaper compared to the country that manufacturing just moved to.

Just look at the policy decisions with respect to manufacturing that the leaders of the United States and Mexico have proposed to "bring back" these blue-collar jobs. It all follows this playbook.


If things are better than before after step 4 you still have damped oscillation. I think the distillation of your hypothesis is this: labor (and environmental externality) markets change slowly enough relative to the amortization schedule of a factory that perpetual arbitrage between them is possible.

To the extent that this is true, one cause is that governments (with the notable exception of China) handicap themselves by honoring agreements, which they absolutely do not have to do absent a bigger, badder government to make them. A savvy, Machiavellian US government policy would:

1. Promise corporations the world

2. When factories are built, expensive machinery shipped in, and personnel trained, renege and hike taxes, but not so much as to force a write-off

3. When foreign governments make it difficult for US corporations to operate factories abroad and they ask for diplomatic support, shrug and tell them play stupid games, win stupid prizes

4. Do not apply the above policy for US corporations doing natural resource extraction abroad.

(i.e. act more like China)


The "Race to the bottom" will be ongoing for a long time. The nice thing about manufacturing for a middle-income economy is precisely its role as a stepping point to far higher productivity in services and high-skill sectors - this means that many developing economies will naturally tend to become manufacturing-intensive over time, even beyond the East- and South-East Asian countries that most observers focus on at the moment. The U.S. is supposed to be well beyond that point already.


Problem is, most people are past that. China and India do have a few more poor people to take cheap low-skill jobs, but overall the cost of labor is going up in both countries and manufacturers are starting to look elsewhere. Problem is that elsewhere means countries with much smaller populations, so they will run out.

It probably won't be in my lifetime, but my kids need to plan for the day when there isn't a cheap labor source waiting to tap. It will be a good problem to have (think about all the engineers making life better), but it will be a problem.


I think it's a bit like a game of hot potato, where the final manufacturing location will basically end up being a global powerhouse for the next couple of centuries.

Which at the moment looks to be China. Despite textiles trying to move to lower-cost countries, the magnetic force of ongoing factory automation in China might pull them back. The US deindustrialized too early for any of these to really come back without some seriously focused "billionaire"-class investors putting their money in first (aka Musk).


How much can we learn from each other? I work at a factory making things; we have locations around the world because our final products don't fit in a shipping container, so local factories outcompete global ones. That means that we have manufacturing experts scattered all over the world. We buy parts from all over and share knowledge (except China; we are careful what gets shared with China).


Spending on external training is $83bln [1]. Internal training program spend is an order of magnitude higher. Let alone salaries.

That is a lot of spending that could be transitioned to robotics if humans become too costly.

1: https://trainingmag.com/trgmag-article/2019-training-industr...


That's true, though it's interesting that many companies still talk and act like employees should be "loyal". They don't seem to understand how that worked.


I cannot fathom being loyal to a company where they expect two weeks notice from me (at a bare minimum) if I'm leaving but where they don't have to give me two minutes notice before terminating me.


No, it’s not. The push for maintaining manufacturing capacity is not about jobs, though that’s been one way to try to frame it for a myopic populace. I thought more people would figure that out given PPE shortages and the inability to produce enough medications for our citizens, but people need to simplify and straw-man the issue. It’s more importantly a national security and strategic consideration. But many cannot see that far for some reason.


Yes, this is the real reason. What if a foreign power controls something you desperately need? It's not going to work out too well for your country. In that case, even as there is comparative advantage in global trade, one must still control certain industries domestically.


The US followed the advice of the business schools long ago and outsourced all its core competencies. The US' loss of technical and manufacturing expertise is going to be extremely hard to reverse, and reversing it is going to require a rejection of financial capitalist economic ideology and things like "maximizing shareholder value" as the prime directive.

Unfortunately this election gives us a choice between an utterly unacceptable racist crackpot and a bland candidate who belongs to the old guard who still believes that you can have an economy based on nothing. That doesn't make me optimistic.


I don't know about the rest of your comment, but it starts with a false statement.

> the advice of the business schools long ago and outsourced all its core competencies.

Business schools recommend the opposite. Outsource everything OTHER THAN core competencies. Core competencies are your competitive advantage; keep them in house.


> Outsource everything OTHER THAN core competencies.

The core competency of the US today is finance. We've been actively outsourcing everything else we can.


I've looked and it seems that our actual core competency (the thing we do better than others) is raise soybeans, beef, and chicken. https://money.cnn.com/2018/03/07/news/economy/top-us-exports... for example.


> The core competency of the US today is finance.

Since when? It didn't look very competent in 2008...


We're not even competent anymore at our core competencies…


Don't know why this is getting downvoted. Tone aside, it's a fairly accurate assessment.


I'd wager it's because he criticizes Biden (too), but come on, people...


I can't say for certain, but I believe that statement is being downvoted because the first paragraph is factually inaccurate in each sentence:

> The US followed the advice of the business schools long ago and outsourced all its core competencies.

Bashing metal and low-tech widget-forming has never really been a core competency of the US. It was useful for keeping people employed for 50 years or so, but for the past century the US seems to have aimed for technological synthesis and increasingly high-tech production as a core strength.

> The US' loss of technical and manufacturing expertise is going to be extremely hard to reverse, and reversing it is going to require a rejection of financial capitalist economic ideology and things like "maximizing shareholder value" as the prime directive.

It is unnecessary to reverse something which was not lost in the first place. As others have repeatedly pointed out all over the comments on this article the US has continued to climb up the manufacturing food chain and dominates the top-end. It may employ fewer people, but the productivity and economic benefit of the US manufacturing sector has simply not declined at all and therefore no 'rejection of financial capitalist economic ideology' will be required.


> It may employ fewer people, but the productivity and economic benefit of the US manufacturing sector has simply not declined at all...

Unfortunately wanting something to be true, and it actually being true are not the same thing.

Even if a whole bunch of your peers say the same thing, it still doesn’t make it true.

I’m not really sure what to say... please go and do some actual research on this topic.


> It is unnecessary to reverse something which was not lost in the first place. As others have repeatedly pointed out all over the comments on this article the US has continued to climb up the manufacturing food chain and dominates the top-end. It may employ fewer people, but the productivity and economic benefit of the US manufacturing sector has simply not declined at all and therefore no 'rejection of financial capitalist economic ideology' will be required.

For many years, I have observed that folks in the Bay Area (if I might use that as the shorthand for those at the "top-end" of the technical food chain) tend to dismiss and deride the work further down the food chain, whether that work is "bashing metal" or piecing together business applications.

There are two truths I believe about this work that some do not acknowledge:

1. People who bash metal or piece together business applications have expertise and knowledge that folks higher up the food chain do not have.

Most people who make their living high up the food chain did it by proving they could rise above those further down the food chain, so it makes sense that they would believe this work is beneath them. But the domains are different, and the choice to live high up the food chain necessarily means that one chooses to miss out on experiences and knowledge that those further down the food chain will have.

2. The folks higher up the technical food chain are dependent on those lower on the food chain.

If I might pick up an imperfect analogy: Sharks are higher up the food chain than fish. Fish get bossed around by sharks. But sharks are dependent on fish to survive, not the other way around.


> Bashing metal and low-tech widget-forming has never really been a core competency of the US.

If you think the far east only dominates low-tech widget forming, you should look into where the iPhone and Macbook are manufactured :)


I know where they are assembled. I also know where the displays are actually manufactured, where the RAM comes from, and where the CPU is fabbed, and where the various radio chips are sourced from. I also know where the software is written that makes it all work.

I also know that the iPhone and Macbook are not particularly difficult pieces of engineering and that the tech required to make the shiny bit of tech you see in the store is the bit of the iceberg beneath the waterline that represents a lot of the value in what we are talking about.


> I also know that the iPhone and Macbook are not particularly difficult pieces of engineering

https://en.wikipedia.org/wiki/No_true_Scotsman


It is not a No True Scotsman fallacy, it is simply that the tech and machines/processes used to create the components are far more impressive and represent a more important tech base than the actual output from those processes.

If we want to trade vaguely related aphorisms: “Amateurs talk tactics, professionals talk logistics.” The tip of the spear is less impressive than the logistical tail that makes it possible.


This. I've always been most fascinated by industrial machinery and process engineering, more so than the end result of it. That piece represents the golden egg of industry; and like it or not, the capability to keep the people of your country employed making widget X, receiving paychecks, etc. is valuable. Incredibly so. Possibly even more valuable than the money it makes, as the know-how it takes to stand that process up is nearly impossible to codify, and closer to a luck-based emergent outcome than anything you can plan for. Similar, if you'll excuse the long-jump analogy, to how all the enzymatic processes our cells must undergo to live and thrive are based on a fundamentally chaotic process: diffusion.


movies, music, and microcode.

and let's not forget disruptive pizza delivery services.


Government picking winners is corrupting and bad for our economy. It speaks to the intellectual bankruptcy of the supporters of these handouts that they can only look backwards and say we need to fund dinosaurs to steal engineers from new industries of the future and put them to work on do overs for Intel and Boeing.


I don't disagree with this sentiment - we do tend to overly fund existing industry such as air, space, energy.

As far as new investments though, China arguably picks their winners and has learned valuable lessons in boosting manufacturing from garage to factory. Their lead in solar manufacturing was astronomical.

The US has also done this; look at how Boeing, HP, Intel, the auto industry, and others had their start. Lots and lots of loans and grants.

If anything our policies are overly conservative and careful with new investment, investing heavily in the "idea factory" models surrounding higher education and subsidiaries. Fortunately that's also led to a large and mostly competent investment community that is willing to bet the farm on new companies.

Comparing the industrial and investment growth to industry in Europe and the ME has informed my view that we do a pretty decent job. Corruption is there, but it isn't the norm that I've seen in other parts of the world.


This only works when you are playing a fair game.

When China plays clear favorites with its companies, the US is forced to compete in this unfair market.

This is especially relevant for industries that require such an extreme amount of capital to be bootstrapped that it is essentially impossible for a competitor to arise in the market fair and square.


When China picks winners, it benefits the US because those subsidies lower our costs of goods.


It's only picking winners in the sense that we need to empower players in our market to be able to build here. It's creating the opportunity/playing field for a winner to exist.

Because "winners" in this sense right now just means "who can build it cheaper" -- and there's no intellectual bankruptcy there. The answer is pretty much always "K, build it cheaper? Build it somewhere else then."


The market is saying the US is better doing other things.

It would be as if, in the late 70s, we had decided we had to subsidize supercomputer and mainframe computer development in order to hire all these talented engineers who were wasting time building and programming kit computers and “PCs” that could never be as important to our economy and our “technological leadership”.

We can’t industrial plan the future.


Right but the market doesn’t have any foresight to say “an adversary is going to take advantage of this dependence the market has created here and strangle us with it.” The “market” like water just flows toward “it’s cheaper to do it this way.”

The market doesn’t care if US democracy triumphs over (or even survives) Chinese state managed capitalism. But I think a lot of us have skin in the game on this one.

China has long recognized the strategic importance of this kind of thinking, see the massive subsidies they provide many of their industries to create the very dependencies you’re crediting as a natural market efficiency.


They aren’t strangling us, they are enriching us. There are no dependencies, just subsidies to give us better deals.


It sure does feel like a good deal for now, comrade.


On a related note, TSMC is based in Taiwan. Ben Thompson (who wrote this article) chose to live there. It's cheap/friendly/modern/safe/connected to all asia/awesome healthcare system/amazing food/surrounded by nature.

Taiwan now puts effort into trying to attract new talent. The "gold card visa"[1] is quite easy to get: just earn more than USD 5.5k/month at your last/current job, or work in a "trendy" field (AI/big data/energy/biotech...), and you get an "open visa" for 3 years without restrictions for ~300 USD. If you're looking for a change, I highly recommend Taiwan! Happy to reply to any questions.

[1] https://taiwangoldcard.com


What does the security situation look like in Taiwan, and how might it change within the next decade or so?

Besides that I <3 Taiwan!! As a tourist I loved pineapple cakes, beef noodle soup, night markets, high-speed trains, and plenty of fresh mangoes and coconuts!


Taiwan is safe, except the threat of Chinese invasion.


Yeah, that was what I was worried about. Given how the past four years prove you can purchase a major U.S. political party for clearance prices, the bilateral security agreement underpinning Taiwan’s freedom may be gone when it’s needed most. Beijing is much richer than Moscow is, and smarter too. I sense the opportunity is just too good to pass up.

I love Taiwan and its people, and I hope everything stays peaceful and will do my part to keep it that way like voting and writing policymakers, but if I’m an expat I wouldn’t be emotionally invested in a total war for survival.


Sorry, pineapples, not coconuts. I don't remember having coconuts but the pineapples had no bitterness and the mangoes melted in your mouth!


"The Gold Card does not provide for a work permit for dependents of the card holder. Your spouse should get their own work permit."

Just like the TN visa. That's a deal breaker right there.


Same problem in a lot of countries. I looked into this for Germany as a backup option; my technical expertise and low level of German proficiency would get me and my wife easy residency, but she wouldn’t be allowed to work.

Then again, given the right mix of COL and wages, that might be tolerable, especially in a situation where something makes me willing to follow through on that plan.


In Germany, if your partner doesn't work, your income tax is essentially divided by 2 for taxation purposes, which is huge given that ~45% of your income will be paid in taxes otherwise, a big part of which is income tax.

Also, charitable donations, which include working for charities for free as well as voluntary work, are tax deductible for the "tax entity", which, if you are married, you can choose to be your marriage.

So depending on what your partner does and likes to do, if you are in a very high-paying job, it might make more economic sense for your partner to "work for free", donating their work to a charity.

This essentially would mean that you get 100% of your income, free of taxes.

Example A: your salary 200k, your partner's salary 50k. Combined taxes: 250k * 0.45 = ~112k. Earnings: 250k - taxes = ~138k.

Example B: your salary 200k, your partner does a 50k job for a charity for free. Taxes: 200k * 0.45 / 2 - 50k = -5k, so 0k. Earnings: 200k > 138k.

I'm not suggesting that your partner shouldn't work, only mentioning that if you are moving to Germany for a high-paying job, that is just one part of the equation. IANAL, nor a tax advisor, so I'd recommend you explore the options and evaluate the package "as a whole".

For a lot of people, being able to work for a charity for free for a couple of years can be much more rewarding than a 8-5 office job at a FAANG.
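(A minimal sketch of the simplified model in the examples above: a flat 45% rate, income splitting modeled as simply halving the tax, and charity work treated as fully deductible. These are simplifying assumptions for illustration only, not actual German tax rules.)

  FLAT_RATE = 0.45  # assumed flat overall tax rate (simplification)

  def take_home_both_work(your_salary, partner_salary):
      # Example A: both salaries taxed at the flat rate.
      income = your_salary + partner_salary
      return income - income * FLAT_RATE

  def take_home_charity(your_salary, charity_work_value):
      # Example B: splitting modeled as halving the tax; the value of the
      # charity work is deducted, and tax can't go below zero.
      taxes = max(your_salary * FLAT_RATE / 2 - charity_work_value, 0)
      return your_salary - taxes

  print(take_home_both_work(200_000, 50_000))  # 137500.0 (Example A)
  print(take_home_charity(200_000, 50_000))    # 200000.0 (Example B)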


I worked in software in Germany, almost all of the technical people are on Blue Card, which gives a spouse a work permit from day one.


Also, depending on your level, you may be eligible for an immediate Permanent Residence permit. This gives you a lot more rights in Germany than the Blue Card, but ironically fewer rights in the rest of the EU.


Interesting that this is not the case in Canada. My partner is the main visa-holder here, and is tied to her employer in the same way a H1B would in the US, but it also provides a full 'open' visa for her spouse, which allows me to work anywhere without restrictions. It seems like a smart move, as it definitely tipped the scales in Canada's favor for us.


When I moved to Canada we both got Canadian "Green Cards" as soon as we landed. Technically possible in the US but show me an employer that will sponsor an EB-2 unless you work for them already.


Also, don't be Indian in the US if you ever expect an EB-2/3 "Green Card".


A French friend of mine was recently denied a visa to Canada. His wife got a job there working in public policy, and he was going to work there in software (film industry). I figured it would be easy for him, but it's a mess!


low COL usually means you don't both need to work.


Is this controversial? To me that's one of the benefits of LCOL areas: one of the spouses doesn't need to be in the rat race.


Software engineer living on and off in Taipei for the past 20 years here. If you like outdoor activities, especially road cycling, this is a mountain climber's heaven: riding in any direction for 30 minutes will hit a gorgeous, cool mountain with spectacular views. On top of that, most days, including in fall and winter, it's sunny and suitable for a ride.

Drop me a line if you're ever here and longing for a joy ride ;)


How hard is it to live in Taiwan speaking only English? Is their official language Mandarin or is it some different Chinese dialect? How safe are Taiwan's streets?


By far the most English-speaking country in Asia (after Singapore). You wouldn't have issues communicating with people or having a normal daily life. You might have issues communicating with the older generation, but most people will go out of their way to help you. Mandarin is the official language, but they are now making a huge push to become a bilingual nation[1].

For safety, from 0 to 10, I'd give it a 9.5. I like to joke that the biggest daily national news is a grandma who lost her cat, since not many bad things happen here. Most of the things I've lost were returned; for example, I thought I'd lost my bag with my iPad a few years ago in a bar, but someone returned it the day after x)

https://www.taipeitimes.com/News/front/archives/2020/06/13/2...


> By far the most english-speaking country in Asia (after singapore)

I would've thought that was India.


At least in the USA it's my experience that when people say "Asia" they mean maybe what some other people would describe as "East Asia." This has been true my whole life, despite there usually being someone there to say "but what about India?"


In my exp, an average Indian person without a university education does not really speak or understand English.


Not by a long shot. You read that in India English is sort of an official language. But in truth, a big chunk of the population still doesn't speak it, and the ones that do often have a thick accent and not a very rich vocabulary. Source: been to India ~25 times.


It would probably be a little difficult. Outside of big cities / tourist attractions, I found few people that spoke decent English (thankfully I speak a passable amount of Chinese). The most widespread dialect is Mandarin, but you'll hear a lot of Hokkien and Hakka (for example, transit system announcements).

I found safety to be a non-issue the last time I was there. Taipei feels like any other big Asian city and even on smaller side streets I didn't feel unsafe. In smaller towns and such, I also wasn't nervous.


The main problem, when I was there, as with many places throughout Asia, is the poor air quality. You'll definitely notice it; my eyes were burning through the adjustment period.


Really? I found air quality even within Taipei to be comparable to San Francisco or even better, at least according to my nose, sinuses, and tear ducts.


Perhaps you were there during a weather pattern that minimized the effects. Unless you are saying you had a long-term low pollution experience there, which I think is impossible.


> Unless you are saying you had a long-term low pollution experience there, which I think is impossible.

Over a year of anecdotal data using my own senses, which I consider extremely sensitive to allergens.

Maybe some pollution data will indicate otherwise? It's certainly no rural Kansas, and I bet industrial hot spots in Taiwan suffer (like near TPE), but yea I felt great in Taipei. Far better than most Asian cities, perhaps comparable to Tokyo.


https://en.wikipedia.org/wiki/Air_pollution_in_Taiwan

> based on reports by the World Health Organization, that the air quality in Taiwan is generally the worst of all of the Four Asian Tigers, in particularly drawing attention to the annual mean PM10 level of Taiwan (54 micrograms per cubic meter). The annual mean PM10 level of the country's capital Taipei (47.1 micrograms per cubic meter) made the city rank only 1,089 among 1,600 included cities around the world. Based on 2004 data by Taiwan's Department of Environmental Protection, the annual mean PM10 level for Taiwan had, for the last decade, been worse than the European Union limit value (40 micrograms per cubic meter) every year.


Taipei is actually pretty OK. Other cities in Taiwan, not so much.


It happened to my throat in Kaohsiung. Otherwise I can confirm the qualities listed in the grandparent post. I didn't know about the Gold Card but I'll definitely look into it.

I always enjoy taking a break from Japan and going there. People are more spontaneous, food is more varied, it’s refreshing.


Yikes. That's a hard no from me.


Well, you can always ride a bicycle for 10 minutes and find yourself in a thick jungle on top of a mountain, far removed from any hint of smog.


Can I just live in the jungle instead?


Absolutely! Apartments (luxury and otherwise) line the roads up into the mountains.


Taiwanese people are weirdly cautious for a country with absurdly low crime - there are massive, ugly (imo) bars on all the windows to keep out burglars (and to hang your clothes off of).

Despite that, I've never felt safer at night than I have in Taiwan.


I spent a month there in 1998 working with a software development partner. Language was a complete non-issue in the office, but I definitely would have liked to have some conversational Chinese for dining, shopping, taxis, etc. I'd assume the generational shift I saw then has continued over the last 20 years, too.


It's not that hard to survive on English in Taipei. The official language is Mandarin (though the typical local accent is different from the Beijing-esque prestige dialect). The streets are safe from crime, if that's what you mean, though the competency of the average vehicle driver leaves something to be desired.


> competency of the average vehicle driver

As long as you give a WIDE berth to blue trucks you should be fine.


Some comparative info for Canadians: Taiwan is slightly larger than Vancouver Island but has roughly 2/3 the population of all of Canada.


Also some disadvantages: rain and clouds almost 24/7, street noise, overpriced real estate.


You can rent for pretty cheap compared to the purchase price; the buy-vs-rent calculation there is so heavily skewed toward renting that it's insane that people still buy property.


These are pretty inaccurate or exaggerated statements. Real estate in Taipei is expensive to buy, but reasonable to rent. To buy, the price per square foot is comparable to a high-priced US city (SF, NY). It rains a lot in winter and there's a typhoon season, but it can be nice otherwise.


The 24/7 rain is exaggerated. It does get quite rainy during the rainy season (I admit I don't like that season much either), but we have quite a few beautiful sunny days, especially as you head south.


I've lived in places where people hide in their homes with the furnace for four months out of the year. I've lived in places where people hide in their homes with the A/C for three months out of the year. I've lived in places where it rains 24x7 for two months out of the year.

Each of these places has a stereotype. Each is amazing the vast majority of the year. Pick your poison for the rest of the year.


No Taiwan, the future is in America.

Taiwan needs to open branches of all production in the US and plan for mass evacuation of its population, so it doesn't wind up wasting its time like Hong Kong must do.


I feel this is representative of a lack of understanding of how a people can feel pride and attachment to a location, that is, the territory itself, and also of how insanely fortified the nation is.

A land invasion of Taiwan would quite possibly be the most painful land invasion imaginable.


Where would you suggest I read more about it?


Unsure why you got downvoted, it's a fair question

https://warontherocks.com/2018/10/hope-on-the-horizon-taiwan...

> The first phase is the decisive battle in the littoral, extending up to 100 kilometers from the island. Key capabilities at this phase will include sea mines, and large surface vessels equipped with Taiwan’s capable, domestically manufactured anti-ship cruise missiles, the Hsiung Feng 2 and 3. Taiwan’s surface fleet includes larger vessels from the legacy force, such as French-built Lafayette class frigates, U.S.-built Kidd-class destroyers, and U.S.-designed Perry class frigates armed with both Hsiung Feng and Harpoon missiles, as well as a new class of fast attack Tuojiang class catamarans that carry 16 missiles. Taiwan also fields anti-ship Hsiung Feng missiles mounted on trucks that will disperse in order to survive initial strikes. While evading detection in Taiwan’s urban and mountainous terrain, they will launch strikes at surface ships throughout an invasion.

> The second phase seeks to annihilate the enemy at the beach area, which extends approximately 40 kilometers outwards from anticipated invasion beaches. This phase calls for Taiwan’s navy to lay mines in deep and shallow waters off suspected landing beaches. A new fleet of automated, fast minelaying ships are being built for that mission. In the interim, mine-launching rails can be installed on several classes of surface vessels. While invading ships are slowed by mine fields, swarms of small fast attack boats and truck-launched anti-ship cruise missiles will target key ships in the invasion force, particularly amphibious landing ships carrying the initial wave of PLA assault troops as well as roll-on-roll-off vessels carrying follow-on vehicles and armor.

On top of that, hike along the west coast at any time. As you hike along mountain paths you'll stumble upon bunker after bunker.


Thank you! Very educational.


How about the east coast?


The entire island is a mountain chain, and the navy is posted all over. If you're suggesting Admiral Shen Jinlong could outsmart the entire Taiwanese defense strategy by landing on the east coast, well, lol...


What cities do you recommend to visit to get a sense of living there?


I'd say Taipei for a "big" city. Hualien or Taidung for smaller ones


I’d actually recommend Taichung as well. Much more affordable than Taipei, and more spacious.

Air quality has regressed though, because they shut down the nuclear power plant after Fukushima.


My partner and I are planning a move there and somehow were totally unaware of the gold card visa. This is super helpful, thank you!


Taiwan is happy to welcome you :)


Does anything happen to a current citizenship (I'm US) if I do this? Seems awesome!


No, getting a visa is just something that allows you to stay and work (in this case). You’d only need to worry about citizenship if you were planning on becoming a citizen of Taiwan, which is a much longer and harder process.

You will, however, need to continue to file US taxes, albeit at a lower rate. America is one of the few countries that expects its expats to do that. Retain an accountant who specializes in this.


Looks like this could lead to PR and eventually citizenship although Taiwanese law requires you renounce any existing citizenships. Weirdly enough though once you have it you can resume your previous citizenships.


> Notably, one thing Apple does not need to give up is Windows support: Windows has run on ARM for the last decade, and I expect Boot Camp to continue.

I brought up the possibility of Bootcamp supporting ARM Windows on HN yesterday, and others responded with some rightful criticisms.

From mrkstu:

> The problem is that Apple has moved to a home-grown stack for it graphics and deprecated OpenGL and moved wholly to Metal. [...] There is no way Apple would create a Windows driver for that custom silicon—it has always relied on AMD or Nvidia for those drivers in the past, it has no in house code base or expertise to write the requisite code in Windows and none of the silicon has been optimized for it.

And from pavlov:

> I don't think there is a standard BIOS for ARM like there is for x86, so multi-boot would probably require Microsoft to support Macs explicitly in Windows on ARM.

https://news.ycombinator.com/item?id=23528066


High-end ARM servers support standards called "SBBR" and "SBSA" which means they support UEFI and ACPI. The experience of using them is not very different from using x86 machines: boot an arbitrary OS and almost everything works.

https://en.wikipedia.org/wiki/Server_Base_System_Architectur...

There's an effort to make even the Raspberry Pi support this standard: https://rpi4-uefi.dev/
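
If you're curious whether a given ARM Linux box was brought up that way, here's a minimal sketch (Python, Linux-only, assuming you run it on the machine in question) that looks for the firmware interfaces in sysfs:

    # Rough sketch: an SBBR-style boot (UEFI + ACPI) leaves traces in sysfs,
    # while a device-tree boot generally won't.
    import os

    print("UEFI firmware interface:", os.path.isdir("/sys/firmware/efi"))

    acpi_dir = "/sys/firmware/acpi/tables"
    if os.path.isdir(acpi_dir):
        print("ACPI tables:", ", ".join(sorted(os.listdir(acpi_dir))))
    else:
        print("No ACPI tables exposed (likely a device-tree boot)")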


Yes. UEFI. Because your bootloader should totally be able to access a network stack for future remote attestation...I mean, trusted computing purposes!

I'm reading my way through the specification at the moment, and I'm failing to see how UEFI does anything user-friendly at all. It seems to be useful only to entrenched business interests (read: Big *) and, to a lesser extent, maybe OEM implementers. Meanwhile, the industry's use of the cryptographic subsystems seems to be aimed against users, in the interest of fortifying market position.

I wish I'd added my voice to the din when I was first reading about it back in the aughts. I didn't realize how much of a pain in the rear it was about to make my computing life.

Too late to turn back the clock now. I just wish I understood what was so wrong with the old BIOS POST->handoff process.

I probably need to sit down with an old HP or Dell XP box and try to figure that out. Aside from the rootkit issue (which, frankly, feels like it has less intrusive and less abuse-enabling solutions), I don't buy that the end user gained a single blessed thing from UEFI; they only lost.


It's worth noting here that Apple was actually among the first to adopt UEFI—it was used by the first Intel Macs in 2006.

They didn't really follow the specifications so there were a lot of differences, but it was definitely UEFI. Actually, one of the reasons Hackintosh bootloaders have gotten so much better recently is they don't need to deal with BIOS ==> EFI translation.

So clearly Apple saw something in it.


Intel donated their original EFI standard to the Unified EFI Forum for further development in summer 2005.

>The Unified Extensible Firmware Interface Forum or UEFI Forum is an alliance between several leading technology companies to modernize the booting process. The board of directors includes representatives from thirteen "Promoter" companies: AMD, American Megatrends, ARM, Apple, Dell, Hewlett Packard Enterprise, HP Inc., IBM, Insyde Software, Intel, Lenovo, Microsoft, and Phoenix Technologies.

https://en.wikipedia.org/wiki/Unified_EFI_Forum

Apple started their adoption of EFI before the UEFI was a thing.


Sorry, yes, I should have said "EFI" rather than "UEFI".


Irrelevant story: Apple adopted OpenCL and holds its trademark, but they finally dropped it.

https://www.khronos.org/legal/trademarks/


Fun fact: Apple wrote OpenCL, open sourced it, and then handed it over to Khronos.


I assume even Windows on ARM relies on that UEFI support. It's not just server-side stuff.


Yes, it does.


I doubt many would find bootcamp on ARM very interesting, especially if that means you need to run Windows-ARM.

The great thing about x86 macs is that you can run x86/x64 versions of Windows, at decent speeds, through the likes of Virtualbox or VMWare Fusion, and have access to a vast amount of legacy win32/win64 applications.

Nobody needs to run legacy arm windows binaries, there simply aren't any around.

It would be a shame to lose access to decent windows VM performance on the Mac.



I think a Windows desktop-as-a-service offering similar to Amazon WorkSpaces is more likely.


Seems possible. We use VDI at my employer for access to windows apps for developers on Macs. The performance is at least adequate although there are inevitable glitches in the experience, but the same is true of running in Parallels or VMWare. But we do rely on the ability to run the same docker containers locally that we deploy to our servers.


Indeed and I think it's where the future of the Win32 desktop is. To be clear, not Windows as a whole, but consider how Microsoft suggested not too long ago that they planned to deprecate Win32 eventually (via Windows 10 X) and render those apps via RDP.


How would games work? I understand that the Xbox One runs a build of Windows 10, and I would expect that to continue on the Series X. Are Xbox One games Win32 binaries?


> Are Xbox One games Win32 binaries?

Nope. You can run UWP on Xbox, but most games don't (because functionality is limited).



I think it's totally possible there will be external versions of these, connected through USB 3. Around the size of a hard drive, with an Atom or i3 and its own RAM/storage. Figure out the right interface and maybe it could be used from an iPad Pro as well.


I expect someone will come up with an Intel emulator, I have a vague recollection that one existed in the pre-Intel Mac days.


It was called VirtualPC and it was REALLY slow - 1/8th native speed as I recall. I found it pretty much unusable even on a good G4.


Apple would have to invest significantly to continue with Bootcamp, only to offer people Windows on ARM. Windows on ARM is actually a great product, but it's not what people use Bootcamp for. There's no value add here for Apple, and I simply can't see them doing it. Not even if they continue using AMD graphics in (some of) their machines. It's not worth any effort to support a platform that is still as niche as Windows on ARM.

The only way I see this happening is if someone else does the work. And that someone else would have to be Microsoft. Why? They're the only ones with an incentive to do it. ARM Macs would be a highly available, highly performant development platform for Windows on ARM. Apple could accidentally be solving a supply problem here for Microsoft - a supply of quality, high-performance ARM desktop hardware. Apple might even be inclined to work with Microsoft (or it might not), but however unlikely, the only way I see this playing out is without Apple paying for any of the work.


I wouldn't call Windows on ARM a niche product, since it supports x86 emulation, enabling, arguably, most of the use cases.


I don't mean to speak against Windows on ARM's viability - it's a necessary product and it's well executed. But Windows on ARM is still building a market, and can't run significant high-performance software yet.

Even for gaming, it can't run anything released in the last 15 years on the current Snapdragons, which by itself excludes a really big portion of Bootcamp use cases.


Part of them. If you just want to use one-off old software then emulation is adequate. If you actually care about performance, then no, not really.


I used bootcamp for years to run real games at real speeds (windows+steam)

it was pretty much: mac workhorse by day, windows game machine by night

all that's fading away.


>> > The problem is that Apple has moved to a home-grown stack for it graphics and deprecated OpenGL and moved wholly to Metal. [...] There is no way Apple would create a Windows driver for that custom silicon—it has always relied on AMD or Nvidia for those drivers in the past, it has no in house code base or expertise to write the requisite code in Windows and none of the silicon has been optimized for it.

That's assuming there will only be ARM-based macs that use Apple GPU's, which doesn't make sense. For iMac/Mac Pro/Macbook Pro I fully expect those would still be equipped with AMD GPU's. Good as the Apple GPU's are for mobile hardware, they are nowhere near the performance and feature set of desktop-class hardware. This would also free up a lot of silicon on the CPU die to allow for higher core counts, bigger caches etc. I think ~50% of the A13 SoC die is just for graphics...

On top of that, instead of Bootcamp, Apple could still support virtualisation to run Windows on ARM on Apple ARM hardware and write a paravirtualized GPU driver that just translates guest DirectX/OpenGL/Vulkan to host Metal, similar to what pretty much all other hypervisors are already doing.


> That's assuming there will only be ARM-based macs that use Apple GPU's, which doesn't make sense. For iMac/Mac Pro/Macbook Pro I fully expect those would still be equipped with AMD GPU's. Good as the Apple GPU's are for mobile hardware, they are nowhere near the performance and feature set of desktop-class hardware. This would also free up a lot of silicon on the CPU die to allow for higher core counts, bigger caches etc. I think ~50% of the A13 SoC die is just for graphics...

While I agree that the higher-end machines will still have discrete graphics, that doesn't mean they won't also use on-die graphics—after all, the MacBook Pro I'm typing this on uses Intel HD graphics when it doesn't have to spin up the Radeon.

And I wouldn't really want to bet that the Intel onboard graphics are/will be less powerful than Apple's.


> While I agree that the higher-end machines will still have discrete graphics, that doesn't mean they won't also use on-die graphics

But there's no reason those have to work in Windows. Apple isn't going to care if Windows gets crappy battery life.

The bigger question is, if Bootcamp would only work with a small portion of Apple's machines anyway, and you'd have to run a version of Windows which no one really wants, and Windows just generally isn't as necessary in 2020 as it was in 2006... is Apple going to bother?


I'm sure that Apple with its total control of hardware on Macs could get Windows running. If that's too much of a technical effort, Apple should easily be able to get Microsoft itself to lend a hand.

After all, Microsoft would like to encourage software companies to make their products available for Windows on ARM as well, and being on ARM Macs would make WoA more desirable. Besides, allowing Windows to run on ARM Macs must be preferable to MS over Apple investing in some Windows-compat thingy, while Apple would like to avoid investing in Windows compat at all. It's an obvious win-win if Apple wants to do it that way. Apple could, on the other hand, decide to close the platform further, or manage to do this itself.


A build of Windows 10 for ARM already exists.

https://docs.microsoft.com/en-us/windows/arm/


> > The problem is that Apple has moved to a home-grown stack for it graphics and deprecated OpenGL and moved wholly to Metal. [...] There is no way Apple would create a Windows driver for that custom silicon—it has always relied on AMD or Nvidia for those drivers in the past, it has no in house code base or expertise to write the requisite code in Windows and none of the silicon has been optimized for it.

They're derived from PowerVR cores which do have WDDM (AKA windows) drivers. They're not great, but usable. Paying a third party to write new drivers isn't a crazy idea either.

Remember, they don't have to be perfect, just usable.

> > I don't think there is a standard BIOS for ARM like there is for x86, so multi-boot would probably require Microsoft to support Macs explicitly in Windows on ARM.

Microsoft ARM machines use UEFI. It should work fine if Apple implements a UEFI loader on top of iBoot or whatever.


PowerVR graphics is quite terrible, no doubt about it. It's the reason why some Intel Atom-based or similarly low-end x86 devices are stuck on an unsupported version of Windows 10, with nothing to upgrade them to. Even Linux won't run properly on that hardware, it's that bad.


Yeah, those particular drivers were a special kind of terrible. Didn't even run the vertex shaders on the GPU.

They do have regular Linux drivers; Imagination just needs to get its head out of its ass and recompile them for x86.


Is Apple's GPU core really based on the PowerVR architecture? I suspect that adopting the PowerVR architecture without a license could invite a lawsuit.



Because the license is from 2020. I don't expect Apple copied the full architecture without a license. The license covers patents but not the architecture, doesn't it?


>> One thing Apple does not need to give up is Windows support: Windows has run on ARM for the last decade

Supporting legacy x86 apps is 'windows support', not 'running windows'


> I don't think there is a standard BIOS for ARM like there is for x86, so multi-boot would probably require Microsoft to support Macs explicitly in Windows on ARM.

Apple "just" needs to properly* support UEFI for Windows on ARM to boot

*properly == supporting the minimum requirements for Windows


The other issue is that Windows for ARM is not a product you can just buy off the shelf right now.


Microsoft would probably make them available if this were to happen, since they'd have to do work to get Windows running on the A-series chips anyway. If Apple wants it (and is willing to help, because some way of breaking their secure boot will be needed), it's beneficial for Microsoft. They get to sell more licenses, and corporate environments that buy Apple hardware to run Windows on (which is apparently not uncommon, according to r/sysadmin) will keep doing it.


I thought the Surface Pro X runs Windows on ARM? Or do you mean you can't get a copy of the OS without the product?


Yep, WoA is only sold preinstalled in devices, you can't buy a copy (but you can install it yourself: https://github.com/WOA-Project)


You can get ISOs/WIM files if you know where to look...


Official ones are behind two-factor authentication, only downloadable using Internet Explorer. Yeah.


The MSDN Subscriber Downloads site (now “My Visual Studio” or similar) hasn’t needed IE since... 2011?

Microsoft gives away the ISOs for non-LTSB builds for free - it’s the product-keys that they care about.


Windows 10 on Arm shares the same product keys/licenses as on x86.


I know the Intel ones are available from Microsoft’s site and you can find it if you just search for “Windows 10 ISO”. Are the ARM ones different?


> I don't think there is a standard BIOS for ARM like there is for x86, so multi-boot would probably require Microsoft to support Macs explicitly in Windows on ARM.

Windows on ARM requires UEFI, so that's not a problem.


As for standard firmware interfaces, UEFI is as much a thing on ARM as it is on AMD64. It's ugly, but it is at least standard.


The OpenGL bit is even worse than it sounds. The driver is half-broken and the version is ancient, already a decade old.


I don't see how this would change. I haven't seen anything that suggests that Apple would be moving away from using third-party GPUs in this process. The ARM move would only affect the CPU which would have no bearing on the graphics stack on the Windows side.


Also, Apple has gotten to a point where it doesn't really need to support Bootcamp anymore unlike in 2006ish when it released the first intel Macs.


Also they probably have to consider how much money they're willing to throw at this.

Switching to ARM on macOS will save them money as they don't have to pay Intel and can reuse existing designs and relationships with TSMC.

But actively investing money and personnel to support Windows, NVIDIA and AMD GPUs, virtualization, etc. on a platform that sells 20-ish million units a year, compared to the hundreds of millions of units that iPhones/iPads/Apple TVs, etc. sell - probably doesn't make financial sense.

Especially with how they've treated macOS lately: enabling easy transitioning via Catalyst and SwiftUI support, making it easy to reuse iOS code, and shifting toward getting more revenue from services, which makes those a bigger priority.

And there's the argument of doing all of that instead of investing more time into macOS with the whole Vistalina thing going on.

Again no idea whether it makes financial sense, they've more than likely done the math and it'll be interesting to see how it plays out.


I saw some stats a couple of days ago that said Bootcamp usage had fallen from 15% of users to just 2%.


Interesting. Any source?

Personally I only used bootcamp for gaming a couple of years ago but then I just ended up building a desktop PC. It's just the better option as it will probably outlive all my Macs and I can upgrade parts as needed.


I would assume part of that is the improvement in virtualization. Those with adequate RAM can run Windows apps in a VM performantly.

You really only need Bootcamp for gaming, and with Nvidia support waning on the Mac, you're going to have a second rate gaming experience even in Bootcamp.


And I assume that's because Windows hardware is getting better while Macs are not, or are getting worse.

Around 2010-2014, MacBooks were great hardware even for Windows. But after that, some good Windows devices were released (like the XPS 13) and Apple started failing (the Touch Bar, the keyboard, USB-C only, ...).


Windows on Arm might not be able to make use of all the secret sauce and custom silicon these chips are bound to have. Who's going to put in that work?


The unseating of x86 as the de facto standard for laptops, workstations, and servers would be a net benefit for both performance and security. Unfortunately, though Apple will likely find very great success with their ARM Macs, it does not necessarily follow that the architecture will gain significant ground outside the Apple-sphere. I would love for my next laptop to be ARM-based, but I can't see Apple happily flogging chips to Dell or Lenovo, and I'm not sure if anyone else's ARM processors are up to the necessary standard.


Not sure if that's what you have in mind, but Microsoft now has an ARM-based product line: https://techcrunch.com/2019/10/02/microsoft-launches-the-arm...

They even implemented an x86 emulator in order to run your usual native software on it.


> x86 emulator

What they need is an x64 emulator. 32-bit apps are few and far between now.


I think the restriction to 32-bit apps is based on the fact that only the patents covering the early iterations of the x86 ISA have expired. Microsoft wouldn't want to be sued by Intel for infringing on later tech (like x86-64), so emulation is still restricted to this subset.


I mean, ARM is 'there' in most cases, the issue tends to be perceived latency of RISC vs CISC CPU's on desktop, not overall performance.

There are decent quality laptops shipping with ARM already, like the Thinkpad Yoga: https://www.theverge.com/2020/1/6/21050758/lenovo-yoga-5g-sn...

There's also the obvious: Pinebook pro.


The RISC vs CISC discussion was important in the nineties but not anymore. There's absolutely nothing 'reduced' about modern ARM chips, and every Intel processor since the late 1990s has used significant RISC design elements.

In modern contexts, the RISC vs CISC discussion still gets brought up a surprising amount. But it is simply not relevant anymore.

See also previous discussions on Apple SoCs: https://news.ycombinator.com/item?id=19327276


I think the Pinebook is an awesome piece of kit and if I could justify doing so I would absolutely buy one but from everything I've read it's not really something you'd want to do serious work on unless you also have the luxury of a beefy workstation to shell into. I suspect the same applies to the Yoga - perfectly alright for most of the day but just wouldn't have the firepower for big compilation jobs or serious multitasking.


Your original assertion

> Unfortunately, though Apple will likely find very great success with their ARM Macs, it does not necessarily follow that the architecture will gain significant ground outside the Apple-sphere.

Does not imply heavy workloads; besides, offloading compilation to the cloud is already being done.

I suspect that ARM will cannibalise most workloads, since it has already shown very strong early competence in photo and video editing on the iPad Pro, and I highly suspect advancements in that direction will come to Qualcomm CPUs in the future.


My original comment discussed the prospect of ARM becoming the leading architecture for the entire range of machines that at the moment tend to be x86 - that's laptops, desktops, workstations, and servers. For this to happen, ARM needs to be capable of dealing with intensive, CPU-heavy tasks like compilation. As the Torvalds quote in the linked article explains, ARM servers cannot become the norm without ARM desktops becoming the norm and vice-versa.


Sure, but it's not all or nothing.

ARM can eat the earth but it will not be a shift in a single year, I was suggesting a likely way forward and that is bottom up, coming from low end laptops to higher end laptops and eventually workstations and servers.

Just because it's not for -you- right now because you need local compilation does not mean a JS coder or a normal office worker would be unable to use an ARM based machine.

Nothing prohibits ARM from ever being workstation grade, it's just not the focus right now- but you seem to imply that because there's no workstation capable CPU that there never can be one, which I find disingenuous.


I think you’re right, but I also think that he’s right in that there’s a real chicken and egg problem here.

> Just because it's not for -you- right now because you need local compilation does not mean a JS coder or a normal office worker would be unable to use an ARM based machine.

I don’t know what workflow other JS coders have, but mine starts with local development in Node. Does V8 produce code that’s as optimal for ARM as x86_64? That’s a pretty big deal to me, and can in fact be very enabling. The local development experience really needs to be good to build a platform, otherwise development will be halting and piecemeal the way embedded development usually is.
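
A quick way to sanity-check that locally (just a sketch; it assumes node is on your PATH) is to ask the runtime which architecture it was built for:

    # Sketch: report the architecture of the Python interpreter and of the
    # local Node/V8 build. 'arm64' means V8 is generating native ARM code;
    # 'x64' reported on an ARM machine would mean you're running under
    # emulation.
    import platform, subprocess

    print("this machine:", platform.machine())
    node = subprocess.run(["node", "-p", "process.arch"],
                          capture_output=True, text=True)
    print("node build:", node.stdout.strip() or node.stderr.strip())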


> JS coder

> never tested on android

just as planned


> JS coder
> never tested on android
> just as planned

???

I’m back-end, not front-end. I think I mentioned Node, right? Why would I test backend code on a phone?


I agree it'll happen in time, but my point is that Apple adopting ARM will not necessarily cause an instant chain reaction across the industry. My concern is whether my next laptop (perhaps 4 years away) will be ARM - and I'm not sure that there will be a suitable machine on the market (I need Linux support) even if there is a perfectly capable range of MacBooks.


Is compilation a problem though? It's an unattended process, so it doesn't matter how long it takes, 1 hour or 2 hours, both equal infinity for all practical purposes. I thought ARM has problems only with FPS games, which are not parallelizable and sensitive to single core performance.


What do you mean by latency here exactly?


I'm not exactly sure what I mean, and that's part of the issue: when using an ARM CPU with desktop software that has not been optimised to hide UI "lag", there's something that feels like it just... lags.

You can feel it on an RPi if you use it for a day. It's not that it's underpowered; you can grab a 15-year-old single-core Celeron that is much slower, and it will have less perceptible latency. The Celeron will "feel" as if it's working; the ARM chip will "feel" as if it's laggy.

I can't quantify this behaviour, but it's consistent across all ARM CPUs I've ever used with traditional desktop software.

Notably these issues don't seem to exist on Android or iOS, someone once told me this is due to the UI elements having a higher priority than anything else on the system.


If your experience with "desktop UI on ARM" is based on the RPi and other ARM SBCs, the overwhelmingly likely explanation for this perceived lag is lacking I/O performance. Running a Raspberry Pi (or any system) from any cheap flash storage setup like SD cards or cheap eMMCs degrades desktop responsivity a lot. The ARM desktop hardware most of us have experience with has significant I/O problems, but that's due to them being cheap hardware, and not due to ARM.

I/O is actually the real prevalent bottleneck for desktop interactivity on all modern hardware. Even older x86 systems with spinning platter hard drives are set up for a more responsive desktop experience due to the I/O performance being suited for the desktop. Modern Android phones and iPhones also have much better I/O setups for interactive use (UFS for high end Android, Apple actually uses NVME because responsiveness is so crucial) .

Certainly CISC vs RISC has nothing to do with it as that discussion really has no bearing on the situation anymore on discussions about modern ARM and Intel (see my other post).
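
If you want to check that on your own hardware, here's a rough sketch (plain Python; 'testfile' is a hypothetical file you create beforehand on the storage being probed, ideally larger than RAM so the page cache doesn't hide the result):

    # Time small random reads - the access pattern cheap SD cards handle worst.
    # Assumes 'testfile' already exists on the device under test and is larger
    # than RAM; otherwise the page cache will make everything look fast.
    import os, random, time

    path = "testfile"                 # hypothetical test file
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)

    n = 1000
    start = time.perf_counter()
    for _ in range(n):
        os.pread(fd, 4096, random.randrange(0, size - 4096))
    elapsed = time.perf_counter() - start
    os.close(fd)

    print(f"avg 4 KiB random read: {elapsed / n * 1000:.2f} ms")

A cheap SD card will typically land an order of magnitude or more behind an NVMe SSD on a test like this.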


> Running a Raspberry Pi (or any system) from any cheap flash storage setup like SD cards or cheap eMMCs degrades desktop responsivity a lot.

Not really, if the system has enough free RAM to use as disk cache. For many scenarios, that suffices to run quite acceptably.


RAM as disk cache is only useful for things you have already loaded. It doesn't help your boot times, new program launches, or loading new content in your program.

I'm not saying SBCs are "unusable" for desktop use cases, I'm saying I/O is the likely explanation for desktop responsiveness issues they encounter on SBCs.


Nobody here is talking about boot times; I was talking about responsiveness. It's easy to test your theory: you can boot from USB on the new RPi, and USB 3.0 has quite high bandwidth, definitely comparable to SATA2, nearly SATA3.


There's a good reason that they're making it more possible to boot a Raspberry Pi from USB. It's more reliable storage and has better performance for many use cases :)


Do those ARM systems have graphics chips, or are they rendering the UI on the CPU? Modern x86 processors have integrated graphics for GUI basics, video, etc. Linux desktops can theoretically run off of just the CPU, just... not super well.


Even Raspberry Pi has an integrated GPU. They feel slower because they're lower-frequency in-order chips, and while the chips themselves have multiple cores many applications don't use them properly - more specifically, they don't render the UI in a dedicated thread.
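
A toy illustration of that last point (plain Python; names are made up and nothing here comes from a real toolkit): push the heavy work onto a worker thread and the "UI loop" keeps ticking instead of freezing until the work finishes.

    # Toy sketch of keeping a UI loop responsive by moving heavy work off it.
    # Real toolkits do this with native render threads; CPython's GIL adds a
    # bit of jitter here, but the loop still ticks rather than stalling for
    # the whole computation.
    import threading, time, queue

    done = queue.Queue()

    def heavy_work():
        # stand-in for layout, image decoding, etc.
        done.put(sum(i * i for i in range(5_000_000)))

    threading.Thread(target=heavy_work, daemon=True).start()

    frames = 0
    while done.empty():
        frames += 1              # "draw" a frame
        time.sleep(1 / 60)       # wait for the next 60 Hz tick
    print(f"kept drawing {frames} frames; result: {done.get()}")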


RPI are cheap hardware not really designed for desktop computing I guess.


Yes, I've noticed this as well. ARM Linux desktops feel sluggish. It is not IO, because I have done tests after everything has been cached, loaded once. I have a RPi 4 here, as well as some older Pine64 boards.


I forget the specifics, but risc winds up feeling slower because it’s a more “correct” arch than cisc. Been a while since this came up!


RISC CPUs have historically been clocked lower and been perceivably slower than their CISC competitors. Apple was running on RISC CPUs before going to Intel x86, and they constantly marketed clock speed as mattering less than it does on the x86 architecture.

I hope they're able to devote two cores to UI/UX at a much higher frequency than the general compute cores. That would take some engineering know-how, but it could largely negate the performance complaints.


"RISC" isn't a single architecture, it's a philosophy for designing architectures. The "RISC" Apple was using before x86 was called PowerPC, and it was designed mostly by IBM. It was also a horrible piece of overheating garbage that I'm still honestly surprised wound up in the Xbox 360 and PlayStation 3. The "RISC" Apple is using now, ARM, is a far more practical architecture that Apple has already beat Intel with in single-thread perf... at a lower clock rate, with strict power limits. Letting the A13 stretch it's legs with wall power and a big heatsink would completely crush Intel and x86 in every benchmark except multi-threading.

User-perceived latency has little to do with the clock rate of a processor, though. What you need is:

1. Immediate feedback upon user input
2. Progress animations that move, even if progress is indeterminate
3. High frame rates on your animations
4. No frame stutter or torn frames

Out of the four, low clock rate will mainly impact frame rates and pacing, but that can still be mitigated by using what you have as efficiently as possible.


That's just a matter of the process node Apple's PowerPC chips were on. When RISC chips came out their big selling point was that they could be clocked many times faster than CISC chips at the time could because it was easier to pipeline them and because you could put the entire CPU on a single chip.


Qualcomm's Snapdragon 8cx is a step in the right direction, as it's the first "PC-class" (more like mainstream laptop-class) ARM processor, a processor actually designed with laptops in mind, unlike all the other ARM processors that were put in laptops before that.

But even the 8cx is limited to mainstream "browsing-only" type of laptops for one simple reason - there isn't demand for anything else from OEMs right now. So Qualcomm will not design a more powerful chip until the demand appears.

This is in contrast to Apple, who can "generate the demand" simply by making much more powerful ARM laptops available and convincing people to buy them.

This is also one of the main reasons why Apple tends to have the more powerful processors in the smartphone market. No OEM is asking for a mobile GPU powerful enough to run advanced video editing on the device and whatnot, so the suppliers simply build what's asked of them.


> It is not out of the question that Apple, within a year or two, has by far the best performing laptops and desktop computers on the market, just as they do in mobile.

This quote from the article is the big takeaway. Apple isn't going to make a lateral move here. A processor transition that nets Apple merely comparable performance would destroy the Mac brand. Apple needs a CPU that runs native code significantly faster than Intel on every device and it needs to be able to emulate Intel chips at reasonable speeds.

I suspect Apple knows this and that their chips are going to hit the mark. But we'll see.


My money is not only on the performance, but also on an adjustable thermal envelope, which could mean perhaps even 5 times the battery life of normal Windows laptops.


With some Ryzen Renoir machines hitting 9-10 hours of battery (when paired with a decent 80+ watt-hour battery, that is), would you expect a Mac hitting 50 hours of battery life?

I mean, I could see that happening when pairing a current A13 with a much bigger battery like a notebook has, but as others have said, the performance will have to be superb, so maybe we'll be seeing half the power efficiency? Were you thinking 50+ hours, or more along the lines of 20+ hours?


Considering a laptop battery can be at most 100 Wh for US air travel, I doubt you'll see 50 hours anytime soon. That would mean the display is drawing less than 2 watts while the rest of the system is near 0. Typical displays apparently use 2 watts - https://www.digitaltrends.com/computing/intel-low-power-disp... If Intel released a 1-watt display then we might see the 50-hour level reached, but only on really light workloads. (Light text editing? Playing back a special low-power-codec video?)
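
As a back-of-the-envelope check (the draw figures below are assumptions, not measurements), battery life is just capacity divided by average system draw:

    # hours of battery = capacity (Wh) / average draw (W)
    battery_wh = 100                     # roughly the ceiling for US air travel
    for avg_draw_w in (2, 5, 10):        # ~display-only, light use, sustained load
        print(f"{avg_draw_w:>2} W average draw -> {battery_wh / avg_draw_w:.0f} h")
    # -> 50 h, 20 h and 10 h respectively: hitting 50 h needs the whole system
    #    to sit near display-only power, which is why ~20 h is the more
    #    plausible target.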


Right on point with your considerations, both on maximum battery capacity and on the display being a large factor in notebook battery life. Battery capacity can't be expanded, since it's capped by regulation and we're tied to that. I don't really know what drives display power consumption or how it could be improved to get better battery life.


I was thinking about 20 hours of actual battery life. Seeing as the A13 has the performance of 9900K in many cases (as mentioned in the article), I'm assuming it's pretty easy to achieve around 20 hours of battery life with an A13-based chip.

That being said, you could argue that the sporadic and discrete nature of phone usage has nothing to do with continuous "strain" on the CPU in a Desktop usage which could change the consumption.


There's one thing that I've noticed, and it's the memory subsystem. SPECINT and SPECFP probe the "raw power" of the chip, and it'd seem in that area the A13 would be king, but as with most "micro"-benchmarks, we know from real-world performance that they don't quite tell the whole story.

So I went looking for performance numbers related to memory, and found some AnandTech tests[0], the summary being:

* L1, L2 and L3 are comparable to AMDs 4900 processors (which I'll take as baseline for "best x86 mobile", with a grain of salt).

* LPDDR4 1600 MHz vs AMD's LPDDR4 4266Mhz (dual channel, can't find if A13 is DC too, but I'd expect it to be).

* RAM latencies in excess of 340ns, vs "65+"ns for AMDs. Let's take that as "sub 100ns" since they're not giving straight numbers[1].

* RAM bandwidth to memory in the order of 20-40GB/s depending access patterns. That places it around the x86 "counterpart" which is capable of "around" 25-50GB/s

* Bandwidth in the caches is different too, but I didn't want to risk a bad interpretation on my side. It'd seem the Ryzen hits larger numbers (so better), but I admit I'm not used to reading these charts and feel a bit "out of the water" there.

Seeing all that... I'm not sure the memory system is on par with what x86 has to offer with its "best contender" right now. And let's also remember that Intel is still beating AMD in memory latency (which helps them drive better numbers in some "real world" benchmarks, as we see in certain games, for example, where Intel still leads despite the "raw power" Zen 2 has shown).

My take is that a "slow memory system" could be a big drawback for a CPU in real-world tasks, resulting in a chip that on paper should be great, but that "feels slow" once you sit at a computer to use it.

Of course this is complex to test and try, and far more than I can analyze sitting at home driving searches over the internet.

[0] https://www.anandtech.com/show/14892/the-apple-iphone-11-pro...

[1] https://www.anandtech.com/show/15708/amds-mobile-revival-red...


Depends on what you mean by performance. The same speed with double or triple battery life would be worth it.


For native apps, yes. But these machines will need to run older Intel software as well. They need to be able to run the x86 versions of Microsoft Office or Photoshop at reasonable speeds. The only way that is going to happen is with a blazing fast CPU and a good emulation layer. That is more or less how they went from PowerPC to Intel, and it's likely how they will get from Intel to ARM as well.


It all makes sense on paper but I'm still skeptical.

For the last couple of years every time Apple has tried to do some acrobatics on the Mac platform it has failed miserably.

- Yosemite

- Butterfly keyboard

- Touchbar

- Catalina

I'm not much of a CPU guru but a switch to ARM seems like the most ambitious and risky of all these, by far.


This is a good point. I don't know if it's the absence of Steve Jobs or what, but Apple does seem to be consistently botching important details whenever they try to make a significant change lately.

The biggest win in their 2020 lineup of Macs is that they walked back the keyboard design and are offering a new version of the Air that doesn't have the Touch Bar.


“I’m skeptical of a CPU change because of a bunch of things unrelated to CPU architectures”

Don't forget Apple has done this twice already, and it's the only company that has done this at this scale and succeeded. If there is a company that can pull off a CPU architecture switch, it is Apple.


The question is not whether they can "technically" do it - of course they can, as they've already demonstrated. The question is whether they can make it work from a market perspective.

Last time they transitioned from a niche architecture to the architecture that was already mainstream - and that happened to be faster and more power-efficient as well. That was basically a no-brainer; it could not fail from a market perspective (from a technical perspective, however, it could), because it essentially tore down the previously existing compatibility barriers between whatever software the mainstream of computer users used and what Mac users used. This time, they would transition AWAY from the mainstream laptop and desktop computing architecture to an architecture that is niche in that area, which, while maybe still being more power-efficient and maybe even faster (we'll just have to give them the benefit of the doubt there until we hold actual hardware in our hands), erects new compatibility barriers between the mainstream architecture, all the people using software based on it, and Apple's computer offerings.

This can only work from a market perspective if they either are big enough to drive such a momentum of customer and developer attention to this "new" platform that they essentially manage to make it successful all by themselves, or if they actually happen to be the harbinger of a future movement to ARM in desktop computing scenarios which will at some point in and out of itself be much larger than Apple's share of desktop computing space.

If either one of these turns out to be true, they will be able to make the transition work. If neither turns out to be true, they will fail (measured by falling Mac market- and mind-share), regardless of whether they execute the transition well from a technical standpoint, which I agree they are very likely to do.


> This can only work from a market perspective if they either are big enough to drive such a momentum of customer...

It's Apple. Developers will make software releases in just months if they think the market will be there, and it will be. Knowing that a new, not-so-crowded market is opening up (like the Nintendo Switch) will make developers very, very happy.

Even the Palm Pilot succeeded. You just have to convince people that Mac users are willing to pay for software, which seems to be the case.

The number of developers nowadays is also crazy.


That only really works for small projects though.

AFAIK today Apple is the only big developer making exclusive macOS products. If you think about it the vast majority of big apps are crossplatform projects (Adobe, Autodesk, Microsoft, all browsers, audio, video editing, animation, 3d, etc).


Just like USB type C, headphone jack and adapters.

There are ARM servers, and Apple processors may become server processors. They could run iPhone and iPod applications on a MacBook. A full day on one charge. Linux works fine on ARM. x64 in a VM for compatibility.

It is the best time - Intel is at a record low, with its vulnerabilities and process troubles - to switch to AMD or switch to ARM.


No, I'm skeptical because for years Apple has been distracted from the Mac. It's a very different situation now than when Apple switched to Motorola or Intel.

Edit:

For example, how many times has Apple implemented a new keyboard on the Mac? And how many of those times did they fail spectacularly?


The last two times there were obvious perf increases to smooth over the transition.


You’re forgetting the most underrated piece of hardware to come from Apple in the last couple of years, the 12” MacBook. The weight to capability ratio of that machine is off the charts and it’s the perfect encapsulation of what a portable Mac should be.


It was a nice computer, but it was essentially a netbook really. Browsing basic internet sites is about the limit of what it could manage, critically underpowered.

Even things like the Intercom web dashboard - a use-case you'd think a laptop like this would be ideal for with a customer support staff member - it simply couldn't manage comfortably. Though to be fair a high-spec £400 Chromebook couldn't comfortably run Intercom either...


The idea of it was what a portable Mac should be.

In the end it could very well be the perfect demonstration of why a move to ARM would be worth it.

The main issue/downfall of the Macbook was the slowness and the heat. Those are also the two strengths of the A series of chips- performant w/o excess heat.

You take the exquisite form factor and marry it to Apple's chips, and you potentially have a drool worthy portable powerhouse, though the sticking point may be getting a magic keyboard in that form factor vs the butterfly.


Since everyone else is replying to hate on this machine, I want to throw in that I loved it. My situation maybe wasn't typical, but living on the road it was such a joy to have something that for all purposes disappeared into my pack and was totally capable for my basic needs. This style of body with a better ARM chip is for sure the future of "gen pop" Apple laptops.


I'm glad you like that machine, but I would not put it in the success camp. It was the start of the butterfly keyboard, and Apple ended up discontinuing it after just a few years.


It's not really correct to say they discontinued it. They basically just started selling it under the MacBook Air brand.


The new Macbook Air is a very different computer. Much larger and thicker, even compared to the 11" Air that was discontinued.


If it's so perfect, why is it so underrated?

"What a portable Mac should be" is going to mean different things to different people.

My perfect portable Mac (based on what's available) would be a 16" MBP, because I care about screen real estate first and foremost, followed by a desire to have a fast chip/gpu, battery life be damned. Other people prefer other models like the 13" MBP or the 13" Air for other reasons.


I couldn't disagree more.

IMO it encapsulated everything wrong with Apple's approach to laptops during these past 5-6 years, like the butterfly keyboard.


I have one, and I do a lot of programming on it, but it has serious overheating problems, even on things like video calls or driving a 4K screen (god forbid you try a video call on a 4K screen). It brings my colleagues great hilarity when I have to get up and get an ice pack in the middle of a call.


^ I really think this should be taken seriously. Yes, Apple has gone through architecture transitions before, and they were all mostly smooth. But that was a while ago, and Apple has been on a really bad streak with the Mac recently.

Frankly, if I had to point to one correlation between when the changes worked and when they did not, the most obvious would be that Steve Jobs is no longer around...


> the most obvious would be that Steve Jobs is no longer around...

The other one is the popularity of iOS.

Edit:

Also the departure of Forstall in 2013.


Scott Forstall is definitely another big one. The timing aligns perfectly with when, in my view, the Mac platform officially went to crap, and it also apparently led to an internal restructuring where the Mac and iOS teams were less siloed.


Yes, when Federighi took over both iOS and macOS.

Maybe what happened is that Ive took over the reins and made the life of the engineering teams too difficult. I'm optimistic his departure will bring good things to the Mac. So far we've gotten a new keyboard.


How about APFS on the Mac? That seems to have succeeded.



Well, they upgraded hundreds of millions of iOS devices to APFS seamlessly. Seems like a success to me.


I think you can add using the same 720p camera on the MacBook (Pro) line for the last n years. At the same time the iPhone's camera has seen dramatic updates almost continually. Heck, they could probably just replace the camera in any MacBook with the one out of the lowest priced iPhone and instantly have a better product. But they haven't.


I don't really understand the positive opinion around this. Isn't the break from the default Windows and Linux architecture, and thus from their software, too big here? Take gaming - are Apple users really okay with not having access to a single Steam game?

Okay, the ones using this only for ~~work~~ office work, the browser and Netflix won't mind, but for the rest this presents a huge separation from what is available elsewhere. I must be missing something.


I game on Mac, and I’m not excited about losing access to Steam. But I have a 2018 MBP and I probably wouldn’t upgrade for a few years yet anyway.

Depending on how the announcement and rollout goes I could see myself buying one of the last intel 16” MacBooks, or just being content with my 2018, and either way not upgrading for a while.

I suspect that if the transition is “successful,” the first year or two on ARM will include a few noticeable trade offs, but those will quickly be “washed away” by a substantial performance gap where Apple’s hardware is just a LOT better than everyone else.

And if the transition is “unsuccessful,” well, I may end up finally buying a gaming PC... or just deciding I don’t care about the small handful of PC games I play and sticking to console.

Gaming on Mac has been second class for.... ever, so I would say anyone who really cares already has a windows machine.

But one last thought, keep in mind that while the quality of games is somewhat lower, the quantity on iOS is off the charts, including many games with high end graphics.

In a weird twist of fate, it’s possible that Apple’s decision to merge its desktops and laptops into the same architecture as its mobile phones actually increases the total addressable market so much that game makers are _more_ interested in doing the small amount of extra work it would take to make a game work on Mac as well as iOS. This happened when Firaxis started making Civ for iOS and then ended up updating the Mac version to use Metal, and when they did the performance was worlds better.


Here's a counterpoint to this argument: take Fortnite, for example. It's arguably the most popular game in the world. The current Mac version runs fine but not great. They could modify the iPad version (probably pretty easily) and get better performance on these new ARM Macs than the x86 version was ever going to have, given Apple's refusal to put in great gaming GPUs.


Very few people use Macs for gaming, and many of those that do probably just play Apple Arcade games (which I'm sure will have a smooth upgrade path). And most of the games people play that are on Steam are (these days) built with either Unity or Unreal, which will no doubt have 1-click-rebuilds for the new architecture. In Unity your game logic doesn't even need to compile to native code.

Even beyond games, the thing is that nearly every development platform these days is an abstraction above the hardware - be it the JVM, the web browser, etc - and the rest (of the new ones coming out) are built to be cross-platform from the get go. Just look at how smoothly Rust/Cargo can target different architectures. I'm sure the same is true for Go.


> And most of the games people play that are on Steam are (these days) built with either Unity or Unreal, which will no doubt have 1-click-rebuilds for the new architecture.

I highly doubt that. The architecture difference is too big, and gaming studios are not known to press that button even if it's available. The native Linux gaming support story speaks to that. That they will port existing games is even more unlikely.

Will be funny to re-read that comment in the future, be it right or wrong.


The entire draw of Unity and Unreal is that they target all platforms. With regards to Linux, Epic Games has released their games for Linux in the past. There is no commercial incentive to release games for Linux, though. The game engines will support Mac ARM if game developers require that feature, and game developers will target Mac ARM if there is a commercial case for it.

The existence of gaming on Mac I think is largely due to companies that support it by choice, like Blizzard, and I don't see why they would abandon the platform due to a CPU architecture change.


I disagree with almost everything in this comment.

Unity and Unreal do not only exist for their multiplatform target, but because they are premade engines to make games with in general. Many games made with them support one single platform.

There is absolutely a commercial incentive to release Linux games. Developers would not do it otherwise, or at least they would have stopped after one try. They did not.

I highly doubt that Blizzard supports MacOS out of goodwill, and I doubt even more that if it was based on that, they could easily continue doing so after a cpu architecture change. They'd have to port whole custom engines plus middleware. That would not happen in that case.

One thing I assume is correct: If there is money to be made here engine developers and game developers will make it happen. But I don't see why it would be that easy.


Just giving you some data points: there are many people who play older games on their laptops. One such community is Sims 3 players; they got screwed when 32-bit support was dropped, and some still hope EA will make a 64-bit version soon (IMO it makes no sense for the devs to do that work for free for such an old game).

So the people who play games on Mac are not the same group that plays the latest AAA shooters or arcade games; they are people who play casual or retro games on their laptops, not "hardcore gamers".


The story is admittedly different when we're talking about legacy games that are no longer supported. Even if pushing the button is easy, if there's nobody there to do it, that doesn't help.

But the good news is that game life cycles have gotten much longer; where in the past a game was seen as a fire-and-forget project, these days a game may get active updates for five or more years after launch. And not just AAA, but indies too.


Sometimes we do not need updates; we need existing old games to continue to work. Ubuntu also tried to drop 32-bit support, but the community responded strongly with use cases like games and some other old apps. Canonical/Ubuntu listened and 32-bit support was only partially dropped: the 32-bit libraries needed for games and various apps are still supported.

Meanwhile I see Mac players upset that their 32-bit games no longer work (like the Sims 3 players, and I also see some Mac users in r/winegaming). Again, Apple can do whatever makes them more money, but gamers' voices were heard in Ubuntu/Linux, and Canonical did the work (for free) to continue supporting gamers.


That's not really "data points" in any useful sense of the word. The Sims 3 community could be a few thousand people - still large enough to be a community where this came up as a noticed issue, but not meaningfully large in any real sense as far as Apple is concerned.


Sure, I'm not saying that Apple should do X or Y. I just wanted to present some examples of people who game on Mac, because some people in this HN forum think that gaming means only the latest AAA games or console gaming. There are people who don't have a dedicated gaming device; they are not many as a percentage, but they exist and they are affected by the 32-bit drop and other backward-incompatibility updates.

So it is possible that these people, when they buy a new laptop, would not buy an ARM Mac but a Windows one to continue playing their casual or old games - sure, Apple can ignore them, because as a percentage it's nothing for their giant vault of money.


I'm totally fine without having a single Steam game. Most Mac users didn't have access for many years, and I'm not sure that gaming is a big use case for Macs, at least historically. I would be interested to see if this has changed.

And most software that I depend upon is either 1) open source, or 2) developed for Mac first and foremost with a solid backup app that is open source. The one exception is Microsoft Office, whose Mac versions are afterthoughts and second-rate software, but necessary for collaboration with a world that has locked itself into closed formats. Office would likely get an Arm version for Mac too, I would think.


People always underestimate how important games are to PC sales, and Apple is traditionally the worst about this given their "bicycle for the mind" rhetoric. I don't think it is a coincidence that Apple laptops became very popular after you started being able to boot them into Windows to play a game. You got the benefit of macOS, Apple hardware, and PC game libraries.


I doubt very many people are buying Macs to run Windows games. Mac hardware doesn’t exactly come with the greatest GPUs.


"How many people buy Macs to run Windows games" is the wrong question.

I don't buy computers just to read websites (I can do that more comfortably on an iPad), or just to write documents (ever heard of the FreeWrite[1]?), or just to play games (I'd rather use a game console). Rather, I buy a computer because it's a single device that can do all of those things.

I almost added "and switch between them seamlessly" to the end of the last sentence, but of course on a Mac with Bootcamp, that's the one exception—you have to reboot in order to play a game. Still, on a laptop, I consider the ability quite valuable—I don't want to carry around a second computer just to play a quick game of Spelunky from time to time. If I couldn't do that at all on a Mac, it would be a point in favor of Windows.

---

1: https://getfreewrite.com/


And the casual game market is much larger and those are mostly on mobile. Besides, Apple will probably already have a large number of casual games via Apple Arcade.


I know a lot of people with Macs, but they are all developers or scientists, and I've never known anybody to install Windows via Boot Camp! So I'm clearly not in this demographic. However, I do wonder how large it is compared to the rest of the Mac userbase.


Mac support in games is less important now than ever before. Services such as Shadow are how I game on my Mac now.


I'm not so sure that generalizes. Steam has a huge userbase and, according to https://store.steampowered.com/hwsurvey, 4% of that is using macOS. That's a big number - especially keeping in mind how much worse macOS's gaming support is compared to Linux, where almost everything works. I don't see that going well with the ARM switch.


Gaming on Apple hardware is dead. Apple killed it when they killed 32 bit support.

But I have had a pretty nice experience with GeForce Now. I don’t see why that would have to change with macOS ARM.


Good point, streaming games would continue to work.


1) Let's take the latest number of monthly active Steam players, which is 90MM. That is a number for 2019; I will assume it grew a little at the usual rate (accounting for Covid) to 100MM. 4% of that is 4MM.

2) MacBook Pro & MacBook Air have a total share of 78%. I'm ignoring other models, since we are talking about ARM on notebooks, not towers. That brings the number of players to 3.1MM.

3) We can assume that many people still rock MacBooks from 2012, but let's start counting from 2013. (We don't have unit sales, only the total amount, so I work with the figure most websites quote.) Apple, on average, sells 18MM Macs per fiscal year. To date, that is 135MM Macs in the market.

4) We need to estimate the ratio of MacBooks to desktop Macs, but I cannot find a single stat on this. I'm gonna pull this out of my ass and say that it's around 90%. The desktop Macs have always been supplemental to MacBooks and Apple has been "forgetting" about them a lot. This brings the total to 121.5MM MacBooks.

5) That means about 2.56% of MacBook owners game on them. I would say Apple doesn't care about ~3%. This stat also doesn't account for hardcore vs. casual players, e.g. those who won't care vs. those who will switch platforms.


Assuming this is targeting the lower end of the Apple line up, AAA gaming wasn't really much of a thing to begin with. In general though, iOS/iPadOS has a pretty expansive gaming lineup. I’d expect that library to mostly end up working on any hypothetical ARM-based macOS machines. Especially considering the iPads are already adding keyboard and mouse support.


I have about 35 Mac games on my Steam account. I avoid Steam like the plague because its interface is awful and the experience of using it is awful compared to most other software.

I love Apple Arcade's subscription model, but it's still coming up to speed. The offerings it has aren't great yet, but I suspect they will be with time. If Apple can improve its offerings on Arcade, I'll never look back. I don't need to play the latest FPS and I suspect other Mac users also don't fit that demographic. If Valve doesn't want to port Steam to an ARM-based macOS, I'll simply drop Steam. It's that easy because Steam offers very little of value for me, personally. I've heard from others that they feel the same, though I can't say how prevalent that feeling is among Mac gamers.


Intel has been stagnating and dropping the ball for years. AMD is catching up but has only just caught up to Intel, more or less.

I welcome competition. I want CPUs to get faster again.


https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

Only 4% of people using Steam are using a Mac, gaming isn't that popular on macOS, unfortunately. And it's getting worse with the OpenGL deprecation.


Macs only have ~10% market-share overall, so this just tells us that Mac users are (very roughly) 50% less likely to play games than PC users. That's certainly a drop, but not such a catastrophic one.

And, Apple's asinine decisions around OpenGL don't tell us anything about what users actually want.


That's still about 5 million users. Looks huge to me. Well, maybe that's tiny from some perspective.


Negligible compared to the 738 million iPhone users.


That's not how any of this works. It's about the ecosystem of developers and users. How much VR work is happening on macs? How much AI?

This is a long-term race that Apple will lose because it failed to nurture its roots.


But we are talking about the MacOS users, not iPhone users... They won't be affected by this.


Apple sells about 20-25 million Macs a year. It's negligible to Apple to lose some small percentage of a segment that's only 10% of its revenue.

Think about how many games died when they dropped 32 bit support.


There are many good arguments in this thread and I honestly appreciate to see the many perspectives, also this one.

But I can't get behind that argument. If all you sell is 20-25 million, to possibly lose 5 million of that is huge. Think about how much they would have to save with the processor move (or how much performance they'd have to gain to make up for it in new sales) to make it worth it. Also, the effort will be huge to make the transition work.

All while there is the easy alternative to just use AMD processors if they wanted a performance boost?

I mean, I'm happy about this. It will probably further cement Linux as the only relevant gaming OS next to Windows. But I'm still surprised. It will be interesting to see how this plays out, if it happens.


For reference, they can save $200 per laptop and up to $500 on the high end CPU option.

That ought to put things in perspective.


I guarantee you that Apple isn't paying $500 for a CPU. And developing a good laptop/desktop ARM chip will conservatively cost north of a billion dollars a year.


The top i7s generally open at around $1,000 apiece and can stay there for years. Note this might change a bit with Ryzen entering the competition on the high end.

Even if Apple is getting some bulk discount for buying 10,000 CPUs, I have trouble imagining 50% off. Intel is known for enjoying its margins, not for giving steep discounts.


Except Apple doesn't buy 10,000 CPUs; they buy 10,000,000 CPUs.

If they were paying even 50% of list price, Intel's revenue coming from Apple would be much higher than what it is right now. Also, laptop CPUs are priced very differently from desktop CPUs.


They already have a good laptop ARM chip....


Not all of those 5 million Steam users bought their Mac that year. Assuming you buy a computer every 3 years, the number per year is a lot smaller.

Besides, how many new users might Apple get by selling laptops that are cheaper and have longer battery life?


> That means that the next version of the MacBook Air, for example, could be cheaper even as it has better battery life and far better performance

As someone who does web development work on a MacBook Air and plays games on a Windows Desktop it’s good news for me. I think the fraction of Mac users that play games on Mac is pretty slim.


> [...] are Apple users really okay with not having access to a single Steam game?

The fraction of people using Macs that care about gaming is tiny, and among them those who care about Steam even more so.


I agree. I do care about gaming but I have a dedicated PC for that.

OTOH would Mac users have cared more about gaming if the platform hadn't shipped with obsolete versions of OpenGL and Metal had been introduced much sooner?

Before Metal was introduced in macOS the included OpenGL version was like 4-5 years behind.


I don't care about Steam on Mac - I have a gaming PC.

Companies that still care will compile an ARM version just as easily.


Didn't they already drop most of the games with Catalina, since most of them are 32-bit?


> Overall, in terms of performance, the A13 and the Lightning cores are extremely fast. In the mobile space, there’s really no competition as the A13 posts almost double the performance of the next best non-Apple SoC. This year, the A13 has essentially matched the best that AMD and Intel have to offer.

Wow.


You can thank Intel for that. Single thread performance has stagnated since 2016. Some say 2014.


Skylake was released in 2015.


Hmm. Random question here... what prevents Apple from building an x86-to-uops decoder stage into their desktop ARM CPUs?

That way they could run old code simultaneously with new code - x86 apps get their memory flagged (by the MMU perhaps?) that all code from this memory region gets executed with the x86 decoder stage, and code in normal memory gets executed with the ARM decoder stage.

And to be honest this is entirely what I'm expecting of Apple - yes, they are not concerned with backwards compatibility, but no one is going to buy a multi-ten-thousand-dollar Mac Pro or even a multi-thousand-dollar MacBook Pro if it can't run Photoshop or their favorite sound editing/DJ suite.


ARMs have a more relaxed memory model than x86 so that would be a complication, though I suppose you could just have the decoder put barriers everywhere.

x86 is infamously hard to decode, though. Often you speed it up by tagging instruction boundaries in the L1$ which you wouldn't have in the Apple ARM devices by default. If you're willing to just decode one x86 instruction per cycle I suppose the overhead wouldn't be unreasonable.
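To make the decode pain concrete, here's a toy sketch (my own illustration, nothing from a real emulator) that only works out instruction length for a handful of opcodes; real x86 also needs ModRM, SIB, displacements and multi-byte opcode maps on top of this:

    /* Toy x86 length decoder: handles only NOP, RET and MOV reg,imm,
       plus the 0x66 operand-size and REX prefixes. */
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    static size_t x86_insn_length(const uint8_t *p) {
        size_t len = 0;
        int imm16 = 0, imm64 = 0;

        for (;;) {                                   /* prefixes come first */
            if (*p == 0x66)               { imm16 = 1; p++; len++; }
            else if ((*p & 0xF0) == 0x40) { imm64 = *p & 0x08; p++; len++; } /* REX */
            else break;
        }
        if (*p == 0x90 || *p == 0xC3)                /* NOP, RET */
            return len + 1;
        if (*p >= 0xB8 && *p <= 0xBF)                /* MOV reg, imm */
            return len + 1 + (imm64 ? 8 : imm16 ? 2 : 4);
        return 0;                                    /* anything else: give up */
    }

    int main(void) {
        const uint8_t nop[]  = { 0x90 };
        const uint8_t mov[]  = { 0xB8, 1, 0, 0, 0 };                  /* mov eax, 1 */
        const uint8_t movq[] = { 0x48, 0xB8, 1, 0, 0, 0, 0, 0, 0, 0 };/* mov rax, 1 */
        printf("%zu %zu %zu\n", x86_insn_length(nop),
               x86_insn_length(mov), x86_insn_length(movq));          /* prints 1 5 10 */
        return 0;
    }

Even this tiny subset has to walk prefixes byte by byte before it knows where the next instruction starts, which is exactly what makes a wide hardware decoder expensive.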


You could also just pin each x86 process to a single core, then you don't have to worry about the memory model.

And I've seen some x86 uarchs that tag boundaries in L1, but I've seen some that don't too. I wouldn't consider it a blocker.


Photoshop is already running on ARM—specifically, on Apple's A-series chips.[0]

Unless you're doing something at least moderately esoteric, if you have an actively-developed macOS application today, you'll be able to produce an ARM version of it tomorrow (or, y'know, whenever Apple actually activates this capability) by ticking the right box in Xcode.

There might very well be a compatibility layer available for a transition period (there was for 68k->PPC, Classic->OS X, and PPC->Intel), but that's not at all the same thing as saying Apple will build in hardware-level x86 support in perpetuity.

[0] https://www.adobe.com/products/photoshop/ipad.html


> [0] https://www.adobe.com/products/photoshop/ipad.html

This is Photoshop in name only; in reality it's a rewritten, dumbed-down version that shares a few of the UI tropes.

Can we please stop pretending these cut-down iOS apps are comparable to the real software?


> if you have an actively-developed macOS application today, you'll be able to produce an ARM version of it tomorrow (or, y'know, whenever Apple actually activates this capability) by ticking the right box in Xcode.

Will this also take care of existing low level optimizations? I can easily imagine Adobe and everyone in the sound/3D area to run custom specific x86 Assembler code in hot paths.


I think I covered that with my "doing something esoteric"...


You can look at the PowerPC 615 for an example of this (dual PowerPC and x86 decoders).

There it was implemented as a new jump instruction for both archs in order to mode-switch.


I don't think that you necessarily have to have an x86-to-uops decoder: there is a Chinese MIPS clone which has additional instructions that help a lot with emulation, such as 80-bit FP operations, etc.


This is an interesting approach. I suspect that developing an x86-64-compatible CPU could trigger a lawsuit.


Licensing?



The base x86_64 patents are about to expire.


What are the chances TSMC announced the plans to build a fab in Arizona because Apple wants to say their chips are produced in the USA?

https://www.tsmc.com/tsmcdotcom/PRListingNewsAction.do?actio...


I was wondering the same. Also wondering if a new fab was required in order to scale up for the potential new demand from Apple.


Ha!!, man, the end of that sentence made me laugh so hard...

"Intel, by integrating design and manufacturing, earned very large profit margins on its chips; Apple could leverage TSMC for manufacturing and keep that margin for itself and its customers."


Have you seen the iPhone pricing strategy? Yes, they've moved up ASP for the high end, but they are also now selling a $400 phone with chips that outperform anything else in the Android market. It's very likely some of this margin will be shifted to consumers.


Apple's pricing in the services world: an extraordinarily competitive low end to lure you in, excessively margined high-end halo products, and a high-margin but comparatively attractive middle tier.


Dropping the per-unit cost of CPUs by $200/system would allow Apple to retain their margins while decreasing prices, or (more likely) push performance at the same price.

Take a look at the "low end" iPad and the iPhone SE. They are both excellent price/performance devices with highly competitive processors.


"the i3-1000NG4 Intel processor that is the cheapest option for the MacBook Air is not yet for public sale; it probably costs around $150, with far worse performance than the A13"

I'm so curious to see this. Unless I missed it, we haven't seen consumer high-end ARM chips that are readily available running a general-purpose OS.

Like the Raspberry Pi is cool, but it's still trash in comparison to an average x86 laptop.

Let's see if those really high geekbench scores translate to an excellent computing experience.


I mean, is the iPad Pro not evidence enough? The A13 is much, much better than the chips in RPis.


The iPad Pro uses the A12Z (https://en.wikipedia.org/wiki/IPad_Pro). The A13 is used in the iPhone 11 variants + iPhone SE. Your point stands, these chips have insane geekbench scores.


I'd really like to see an iPad Pro crunch through as many VSTs as my old 2600k did in its 2 millisecond budget


As highlighted in the article, Apple's A13 is on par with Intel's i9-9900K in SPECint2006 and only 15% behind in SPECfp2006.


those are benchmarks, but to restate my comment I'd really like to see that hold up in actual real-world use cases


I think it'd do just fine. The A12X has 10 billion transistors. Sandy Bridge has 1 billion.


But my Sandy Bridge CPU had no trouble being OC'ed to 4.5GHz, while apparently the top frequency of the A13 is 2.7GHz? How many convolution reverbs you can chain on the A12 or A13 would, for instance, be a super relevant metric for comparison.


And I bet your sandy CPU had active cooling. The A13 tops out at 2.7GHz because that's what Apple decided was safe for a passively cooled phone. I don't doubt that it could run at 4.5GHz in a desktop package, with a proper fan and heatsink.


You'd almost certainly need to redesign the hardware for that; the A13 already pushes its efficiency curve during boosts.


I don't understand, isn't this whole conversation about using these ARM chips for the next desktop macs ?

> I don't doubt that it could run at 4.5GHz in a desktop package, with a proper fan and heatsink.

Well, I don't know at all. I'm pretty sure you can't take a, say, 2-point-something GHz U-series Intel i7 CPU and make it go a lot beyond 3GHz.


And the A13 was built on a ULP node with a maximum power budget of less than 6W. There is no reason why an A13 built on an HP node with a power budget of 20W and active cooling can't reach 4GHz+.


How is this a valid comparison?

You have active cooling for the sandy CPU the iPad does not.


I do some light VST development with JUCE and have tested out my synth on my iPad Pro, worked a charm.


> I do some light VST development with JUCE and have tested out my synth on my iPad Pro, worked a charm.

I mean, of course a single VST works like a charm; single VSTs already worked fine on 500 MHz CPUs with Win98-era computers and Cubase VST. The question is how many tracks with a synth plus 20 filters behind it you can layer before having to freeze things.


The irony in the narrative surrounding the ascendancy of ARM is that the key innovation occurred in the Apple Newton. [1] History rhymes, they say. If I remember correctly, it was Digital Equipment Corporation's (DEC) silicon team that transformed ARM into a performance-per-watt powerhouse (no pun intended) to meet the requirements of the Newton.

[1] https://en.wikipedia.org/wiki/Acorn_Computers#ARM_Ltd.


ARM has always been a performance-per-watt powerhouse. The story goes the first processor was tested without a proper power connection and ran on the IO-pins.


The Wikipedia page on the history of the StrongARM [0] makes the following claim:

> According to Allen Baum, the StrongARM traces its history to attempts to make a low-power version of the DEC Alpha, which DEC's engineers quickly concluded was not possible. They then became interested in designs dedicated to low-power applications which led them to the ARM family. One of the only major users of the ARM for performance-related products at that time was Apple, whose Newton device was based on the ARM platform. DEC approached Apple wondering if they might be interested in a high-performance ARM, to which the Apple engineers replied "Phhht, yeah. You can’t do it, but, yeah, if you could we'd use it." [1]

Regardless of the degree of DEC's contribution, the ARM phoenix arose out of the flames of the Apple Newton.

Further irony is that Intel acquired StrongARM from DEC as part of a legal settlement, then sold StrongARM/XScale, then created Atom for this market, while in parallel they developed a low-power 386 for the RIM Leapfrog/Blackberry but later concluded that this market was not large enough (forcing Blackberry to switch to ARM with the introduction of their Java based SDK)[2].

Intel has made some magnificent missteps.

[0] https://en.wikipedia.org/wiki/StrongARM#History

[1] https://archive.computerhistory.org/resources/access/text/20...

[2] Graham Tubbs "Harvesting the Blackberry" https://books.google.ca/books?id=oSaoNQcFTzMC&lpg=PA25&dq=Pi...


The original ARM team was given a power budget of 1 watt, but when the first silicon was made they found they had accidentally made it so efficient that it ran on a tenth of a watt. There is a nice little first-hand history of the early days of ARM here: https://www.theregister.com/Print/2012/05/03/unsung_heroes_o...


> The story goes the first processor was tested without a proper power connection and ran on the IO-pins.

That's real common actually. The current from the diodes on the I/O pins has to go somewhere.


Then it is odd that I have only ever read the story about ARM. Do Intel's/AMD's prototypes run on their I/O pins?


It's not really a fair comparison, because it's not the same TDP niche.

I imagine Intel Quarks will though. Probably any of the SoC versions as well.

I've actually had to design around this in a board once. Had a chip where the reset line didn't hook up to all the logic inside it. Originally just put FETs on the power lines, but that wasn't enough and I had to gate off all the I/O lines too.


The first ARM was designed as a CPU for a desktop computer ( the Archimedes )


Sure, and I bet a 486 (so a comparable processor for the time) can run off of its I/O pins. A Quark is a 486 core.


A while back Linus said something to the effect that x86 was the default for servers because that's what people used on their desktops. I personally find this news very exciting because the x86 architecture is just so messy and filled with cruft from ages ago.

If ARM becomes the standard on the desktop for a significant portion of machines, then I think that we'll probably start to see more ARM processors making inroads in the server and workstation market, which I think will be a huge win.


It seems a bit silly to discuss servers based on Apple's architecture before they reveal something with several cores. Mainstream x86 processors have 64 cores per socket. These are the top contenders in SPEC performance-per-watt benchmarks. Notably, nobody has ever even bothered submitting a SPEC result for an ARM server, in particular not Ampere, whose product "provides industry leading power efficiency/core" even though there is no evidence to back this claim.


It is a lucrative market, and they have money and followers. I would not have believed they could design a smartphone processor, but here we are. Let's check in five years.


To be fair, ARM has a ton of cruft too.


Can you elaborate? x86 boot protocols are ... arcane at best; making 16 -> 32 -> 32 protected -> 64 transitions, etc. Not to mention tons of instructions that are basically never used because they made sense once but don't any more.

What is the ARM cruft? I think there's a processor mode transition, but I don't know the architectural details. The worst thing about ARM (as with any RISC processor) is that the instructions are devoid of higher-level semantics -- there's no "call" that pushes a return address for you; instead there's a branch-with-link that stashes the return address in the link register and leaves the stack handling to software. This is kind of interesting as it allows all sorts of crazy non-tree-like call patterns, but practically speaking the scope of what compilers generate is limited, so implementations of the architecture will tend towards supporting certain usage patterns, which means seemingly equivalent instruction sequences will have different performance characteristics, etc. Is there other specific cruftiness about ARM that you can point to (either here or by referring to other posts or blogs)?


This is directly addressed in the post and Linus is directly quoted. Haven’t you read it?


I completely missed that section in my excitement. I skimmed the "Implications of ARM" section when the article started talking about the fact that Apple can tune the architecture, which I found irrelevant and uninteresting. Saves me the trouble of digging up Linus's statement.


Stuff like this is one area where Debian and other GNU/Linux distros are so valuable. The architecture doesn't matter much, and the flexible distros are poised to adapt well to this heterogeneous world.


> Windows, particularly given the ability to run a full-on Linux environment without virtualization

An effort that Microsoft seems to be heading toward abandoning. (Plus, writing these kinds of compatibility layers is complicated but not super complicated. What you really want is performance, and that’s hard.)


Are you referring to the fact that WSL2 is technically a VM?

You're not wrong, but the point seems a bit pedantic—Microsoft is clearly investing a lot into making WSL2 feel native, and it runs acceptably fast.


Yes, the VM approach won’t work if your underlying architecture changes. And it’s hard to make it fast if you’re emulating, as I know firsthand from working on this for iOS ;)


> Yes, the VM approach won’t work if your underlying architecture changes.

Well, but that's Ben Thompson's point, I think. If it's true that developers prefer to deploy to the same architecture they develop on, then one of two things is possible:

1. Developers largely abandon the Mac.

2. ARM becomes more common on servers.

Apple is presumably betting on #2 (or is just willing to lose the developer audience), but it's a gamble.

Lots of developers don't need to deploy code to servers, of course, but it's easy to see a situation where the momentum is felt by all. macOS's UNIX support has made it, well, I don't quite want to call it the "de-facto" development platform, but something approaching that status.


Apple extends its ARM, gives Intel the finger.


Hoping for a job with El Reg?


And with Apple moving to ARM, any hope of serious gaming on a Mac will be gone.


That ship sailed decades ago. Rightfully so, because everyone that has the money for a Mac can easily buy a console, and much gaming has shifted to mobile platforms. Unless you think rainbow colored keyboard backlighting is a key innovation area for a computer company, the gaming market is a complete waste of time except for driving graphics hardware, which Apple has always lagged at even for their Pro products.


Some of us can afford Macs and consoles, but still prefer a computer for playing games (graphics, kb/mouse controller, load times, etc.)


I'm not sure Apple's goal is to replace their x86 line of products; that would alienate too many users and would require a huge amount of software to be ported to ARM. I think they're probably going to release an iPadOS device in a laptop form factor, either foldable like the Lenovo Yoga or with a detachable keyboard similar to what Brydge Pro has done [0]. They're already heading down that path with the new iPad Pro keyboard [1].

  [0] https://youtu.be/Kkn9CLppLkI
  [1] https://youtu.be/JLQAh6IHPc4


Apple doesn't tend to do things by half measures. If they see Apple on ARM significantly out-performing Intel across their product line, then they will move. There will be frustrations and pain, but it will happen. As Ben hints, this may cause a bigger industry shift as well. With Apple on ARM, it makes Windows and Linux ARM more palatable, in particular this could give ARM servers a significant boost.


You act as if Apple hasn’t already made a processor transition twice and that their two largest third party software vendors - Microsoft and Adobe - didn’t make the transition each time.


Will LLVM IR allow developers to deliver binaries that will run on both Intel and ARM?

I see the Swift compiler can output LLVM IR; can you do the same thing with Objective-C code bases?


Not really. LLVM is not a single interoperable IR; it's more like a family of IRs with many arch-specific details. WASM+WASI could work, though.


No, LLVM IR has platform specific parts baked into it like the ABI, and you can't really jump archs like that.


Adding to what the other comments said, compilation from source to IR discards a lot of information, such as C preprocessor directives (which might take the form of "if x86" or "if ARM"). So, no.
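To make that concrete, here is a tiny made-up example: by the time clang lowers this to IR, the preprocessor has already thrown one branch away, so the resulting IR only knows about a single architecture.

    /* The #if is resolved before LLVM IR is emitted, so the IR for this
       function is welded to whichever architecture you compiled for. */
    #include <stdint.h>

    uint64_t cycle_counter(void) {
    #if defined(__x86_64__)
        uint32_t lo, hi;
        __asm__ volatile("rdtsc" : "=a"(lo), "=d"(hi));    /* x86-only   */
        return ((uint64_t)hi << 32) | lo;
    #elif defined(__aarch64__)
        uint64_t t;
        __asm__ volatile("mrs %0, cntvct_el0" : "=r"(t));  /* ARM64-only */
        return t;
    #else
        return 0;
    #endif
    }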


Not by itself, but of course the LLVM infrastructure has deep support for cross-compilation. (Objective-C can be translated to IR as well; it goes through largely the same pipeline once the frontend is done with it.)


No. That rumor was dispelled by Chris Lattner during an interview on the Accidental Tech Podcast.


I'm not a business guy, but if Apple's going all-in on ARM processors, and then they expand into the server market (which the article speculates on), could we potentially see Apple opening a new product branch devoted to competing in the Cloud space with AWS, Azure, and GCP?

Imagine developing apps on an ARM-powered macbook, deploying onto ARM-powered servers owned by Apple, specifically for applications to be used on MacOS & iOS devices.


Would it be possible for Apple to have an optional dual-processor system, with an ARM main processor and an optional Intel coprocessor? That way, people who need support for "legacy" x86 apps or for development could get it.

Could you have an external Intel coprocessor like we have external GPUs?

(I know nothing about how this would work; obviously the traditional way would be to just have a remote x86 server for running those tasks.)


I expect Apple to do something smarter, e.g. extend the ARM instruction set with operations that speed up x86 emulation.

For example, emulation of x86 on ARM is expensive due to subtle differences in memory model. Why not make an ARM chip that can operate with x86's memory model?
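As a made-up illustration of that gap, treat the two relaxed stores below as a one-for-one translation of two x86 "mov" stores: x86 hardware makes them visible in order, ARM does not, so a faithful translator has to insert a barrier between them (or the chip has to offer an x86-style ordering mode).

    /* Message-passing litmus test; the stores/loads are deliberately
       relaxed to stand in for a literal translation of x86 instructions.
       On x86 hardware the two stores become visible in program order; on
       ARM they may not, so the consumer can see ready==1 while data is
       still 0 unless the translator emits a barrier between the stores. */
    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdio.h>

    static atomic_int data  = 0;
    static atomic_int ready = 0;

    static void *producer(void *arg) {
        atomic_store_explicit(&data, 42, memory_order_relaxed);
        /* an x86-faithful translation needs a store barrier here on ARM */
        atomic_store_explicit(&ready, 1, memory_order_relaxed);
        return NULL;
    }

    static void *consumer(void *arg) {
        while (!atomic_load_explicit(&ready, memory_order_relaxed))
            ;                                       /* spin on the flag */
        printf("data = %d\n",
               atomic_load_explicit(&data, memory_order_relaxed));
        return NULL;
    }

    int main(void) {
        pthread_t p, c;
        pthread_create(&c, NULL, consumer, NULL);
        pthread_create(&p, NULL, producer, NULL);
        pthread_join(p, NULL);
        pthread_join(c, NULL);
        return 0;
    }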


I am beginning to wonder if it makes sense to segment the Mac into two categories.

The Pro models would use x86 and retain compatibility with all macOS apps, some of which may never make the move to ARM and just keep getting patched to run on the latest macOS, along with the Windows Boot Camp option. That would be the Mac Pro, iMac Pro and MacBook Pro.

The non-Pro models would use ARM. You would end up with a 12" MacBook priced at $799, with a possible 14" MacBook at $899. (The BOM cost of an 11" iPad Pro would be about the same as a 12" MacBook: you are essentially swapping the cost of the back camera modules for an additional 128GB of NAND, and the touch screen and glass panel for a trackpad and keyboard.) It would be like the 2015 MacBook, except it wouldn't cost that ridiculous $1,299. And $799 might be the cheapest portable Mac in recent history as far as I can remember; even the 11" MacBook Air was priced at $899.

It would greatly expand macOS market share, which has very much stagnated for the past few years. Out of a 1.4B PC market, Apple has 100M macOS users, a 7% market share. Compare this to the 4.5B smartphone market, where Apple has 1B iOS users, a 22% share. And the most important thing to me would be that Apple also accepts/admits that tablet computing will never take over the desktop/notebook, keyboard/trackpad paradigm. That paradigm serves a large enough market to be worth continuing the investment in the Mac, rather than trying to get the iPad/tablet to kill it.

The only problem with this hypothesis is that the software would be very messy. Would Xcode force all apps by default to compile as fat binaries? Is Apple going to tell its users the difference? (Not that I think it would matter to non-Pro users.)

Anyway, this will be an interesting WWDC. (Or maybe everyone got it wrong and Apple isn't doing anything to the Mac.)


I wonder what an alternate timeline would look like where Apple went for AMD semi-custom x86 chips instead of Arm. The Mac Pro would have 64 cores, MacBooks would have great APUs, and Apple would be able to segment their products at their own discretion–not Intel's.

I also wonder if any chiplet designs will find their way into Apple's Arm-based lineup.


At the time, Intel was pushing on performance-per-watt and AMD was not, so AMD was significantly further away from getting the iPhone contract than Intel was.

And Apple is already extensively using multiple small chips put together to form their A13 APUs, so they are not far from the full "chiplet" already (they just are not using silicon wafers as the interface yet).

Right now I think that the main barrier to the Mac Pro already being on AMD CPUs is the lack of Thunderbolt. Already there is 1 Thunderbolt AMD motherboard, and with Thunderbolt becoming part of USB 4, that barrier will fall next year. So it will be an interesting race: is Apple pushing for AMD Mac Pros, or ARM ones?


> and a company from a territory claimed by China was.

A company from a sovereign country claimed by China.


The big reason to move production offshore was labor cost (straight comparative advantage); now it's path dependency, as the whole supply chain has migrated. This was the US's big advantage from the 1890s to the 1980s, but that time has passed.

Advanced robotics will offer a chance of a fundamental restructuring as labor costs continue to contribute a declining proportion of COGS. The factories and supply chains could be completely distributed and resilient. But the example of the Internet is discouraging: the ultimate end-to-end system ended up highly centralized too.


Oof, fun times at Intel right now. There are pockets of Intel not forgetting what made it successful, but they're usually the most difficult places to grow a career, as they get snuffed out pretty quickly.


Duplicate of https://news.ycombinator.com/item?id=23538826 (which was posted a few minutes earlier).


Rightfully or not, I think this is the version of record at this point, given that it's sitting at the top of HN right now. They will probably get merged later.


Intel has lost its manufacturing lead, but these changes are not permanent. There is a good chance TSMC stumbles as transistors shrink. Intel has a chance to catch up, but it seems like Intel has structural problems that prevent it from winning deals to manufacture third party designs. Relying on complete vertical integration doesn't seem like a winning business model anymore.

Perhaps Intel should buy GlobalFoundries or license Samsung technology to develop competency in building designs for other customers.


The problem, as explained here and as Linus and more recently GKH have mentioned, is that the lack of consumer ARM hardware like laptops and desktops means that kernel and app developers are basically not able to develop and test their code easily before deploying it in the cloud... If Apple were able to sell ARM Macs, and not only that but also provide cloud instances of the same platform, then they might be at the core of an architecture revolution.


I doubt many kernel developers would like to develop on the Mac, considering that as of late it’s been a bit problematic to get Linux to work on.


Apple having a really fast ARM chip is both an advantage and a disadvantage. If no Windows or Linux box runs ARM for the same typical workloads, a lot of stuff never gets ported. It would help Apple if there were powerful ARM Windows or Linux PCs.

People say Qualcomm is catching up but I'm not sure what kind of performance we'll get from the first ARM PC processor from Apple. It will probably run circles around the 8cx.


It is a misstatement to say that developers develop for Macs. They don't; they use the Mac as a convenient environment from which to do Linux/cloud development. If it's MS/cloud development, they're running MS. The third-party app world on the Mac is dying. Apple will be happy with their end-to-end commercially controlled device. That they put up with third-party apps at all was just a convenient anomaly.


The article mentions that while the Apple ARM chips are faster per core, the Intel chips are faster at multi-core operations. How is that possible?


I think a big chunk of that is power and thermals -- Apple's ARM chips are currently only used in passively cooled, handheld, power-restricted chassis (phones & tablets). Intel's chips are generally actively cooled and run on mains power, so they can run hotter and draw more power, and they can make up the difference by stuffing more hot cores onto the chip compared to what Apple has been doing.


Easy answer: Intel chips have more cores than Apple ARM chips.

But if they have the same number of cores it's generally a power tradeoff. A 12W chip can give 12W to one core or 3W to 4 cores. Transistors are optimized for a particular voltage & current, so when they run at a different power, performance scaling is not linear.
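Rough numbers (mine, not from any datasheet): dynamic power goes roughly as C*V^2*f, and f tracks V near nominal, so power scales roughly with f^3. That's the whole multi-core bargain in one loop:

    /* Back-of-the-envelope only: with P ~ f^3, a core on a quarter of the
       power budget still runs at ~63% of the clock, so spreading 12W over
       4 cores wins on throughput while each individual thread gets slower. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double budget = 12.0;                      /* watts for the whole chip */
        for (int cores = 1; cores <= 4; cores *= 2) {
            double per_core = budget / cores;
            double rel_freq = cbrt(per_core / budget);   /* f ~ P^(1/3) */
            printf("%d core(s): %4.1f W each, %.2fx clock, %.2fx total throughput\n",
                   cores, per_core, rel_freq, rel_freq * cores);
        }
        return 0;
    }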


The funny thing is, the main selling point of Intel processors for me was their better single-thread performance over AMD chips, and now Apple comes up with a chip with even faster single-thread performance. I might get an ARM MacBook when they go on sale after all.


Heterogeneous Multi-Core Processing. 2 big cores. 4 small cores. Meanwhile Intel CPUs have 6 big cores.


Imagine two football teams.

One full of excellent prima donnas who don't work together at all.

The other full of good players who work really well as a team, communicate well, and stay out of each other's way.

Who would you bet on winning?


> Third, the PC market is in the midst of its long decline. Is it really worth all of the effort and upheaval to move to a new architecture for a product that is fading in importance? Intel may be expensive and may be slow, but it is surely good enough for a product that represents the past, not the future.

This is one of the key things they got wrong.

Mobile and tablet devices are not real computers. I can't run any software I want on them. They're locked down, managed, console type devices. There are other weaknesses at the OS and hardware level too, but the lack of flexibility and control is the most fundamental and hardest to change.

If you opened up the OS to versatility and user control you could easily scale up an iPad with a bigger battery, more cores, and cooling, and you'd have a fantastic laptop. The UI would need some work too to make it suitable for more complex tasks, but that's an area where Apple excels when they want to.

The locked down nerfed nature of mobile OSes is hard to change though because it exists for good reason. It addresses a primary need in their market niche. These devices are marketed to non-technical users, and in today's constant war zone environment a non-technical user without those controls will be inundated by malware and spyware.

Still, this keeps them from being viable replacements for desktop/laptop systems in the professional market. For that reason desktop is not going away.

We have two markets here that really do seem to demand two solutions.

The desktop market decline wasn't the death of desktop. It was the loss of the low-end of the market, with non-technical and casual users migrating to phones and tablets. That transition is nearly complete, and the desktop/laptop market that remains is likely to remain until and unless something appears that truly does address that market niche.

It's not a growing market but it is probably a stable one. Having a strong presence in that market is important for Apple because it's the market where developers live, and without developers who is going to make software and ecosystems for the rest of their devices? Who will ensure that these devices reach their full potential and thus are maximally appealing to consumers?

No, doing everything in cloud is not that thing. I want to actually own my data, and broadband is still far too slow to make that workable for a wide range of tasks anyway.


> Third, the PC market is in the midst of its long decline. Is it really worth all of the effort and upheaval to move to a new architecture for a product that is fading in importance?

Yeah this is just an Apple journalist bubble belief that doesn't really reflect the real world in any way and is starting to get tiring.

We keep hearing from them how the iPad Pro is a computer and how it's so much more powerful than a real laptop, yet we still never see it do anything more than tweeting, writing articles, extremely basic video editing, and a painting app.

Yet I look around every workplace and it's still just a sea of laptops doing incredibly varied work across multiple applications and systems, most of which still just isn't possible on iPads.


I would be more open to this if the new machines had Intel and ARM chips in them.

If Apple pulls any walled-garden nonsense around what apps I can and cannot install on my Mac today, at the extreme I can install Windows/Linux and run the majority of software that's available for those platforms.

If they move to ARM-only, I lose that choice.


Matching the Spec2k6 scores of a high end desktop does not mean that the A13 has similar real world single-core performance. I've done ARM vs Intel perf comparisons in the past and we mostly ignored Spec2k6 because it looked nothing like real world code.

I'm curious if there are any better benchmarks that will run on iOS.


I shudder to think of the huge number of applications this will break. This has all happened before with PowerPC (ironically a RISC architecture Apple abandoned ~15 years ago). It's probably necessary but nonetheless frustrating.


Two ifs for me:

If they pass on the savings to consumers - looking at the price escalation with iPads, I don't see why we should expect that to happen.

If they don't build that garden wall higher, which could include dropping Boot Camp support.

While TSMC is doing great, they are still in Taiwan, a country which China has been making a lot of dangerous noise about, and considering what is happening in Hong Kong, how long before Taiwan suffers a similar or worse fate? That is a lot of manufacturing tied up in one area.


It's very unlikely that they pass on savings to people by coming up with a cheap ARM laptop; after all, their whole thing is gouging people with unrealistic prices for the hardware and huge margins on everything.

That said, if a laptop rolls around that doubles the 10-hour battery life of existing configurations, they'll scoop up a ton of users regardless of whether it can run Windows or not.

I don't know how doable that would be, but the iPad Pro has a 28 watt-hour battery and the 13" MBP has a 59 watt-hour battery. A theoretical MacBook without a discrete GPU, with the smaller motherboard footprint of an A13 chip leaving space for a bigger battery, and providing about the same performance, could possibly hit 20 hours of battery in about the same space.
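Rough math on that, with the big assumption that an A13-class MacBook would average the same draw as an iPad Pro (screen and board differences could easily eat into this):

    /* Crude estimate only: ~28 Wh lasting ~10 h implies ~2.8 W average
       draw; the same draw from a ~59 Wh MacBook-class battery gives
       roughly 21 h.  Treat it as an upper bound, not a prediction. */
    #include <stdio.h>

    int main(void) {
        double ipad_wh = 28.0, ipad_hours = 10.0;
        double mbp_wh  = 59.0;
        double avg_draw = ipad_wh / ipad_hours;      /* ~2.8 W */
        printf("estimated battery life: %.0f hours\n", mbp_wh / avg_draw);
        return 0;
    }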


How are their hardware prices unrealistic? Let's look at laptops: other manufacturers (Razer, Dell) charge comparable prices for unibody ultrabooks.


Their RAM upgrade and SSD pricing is off the charts though. 8 to 16 GB is $200 - which other manufacturer does that?


It's sad to see a king (Intel) slowly dying. Microsoft got its groove back with Satya Nadella, who turned it into player two in cloud computing and unlocked .NET.

AMD went from Ryzen to EPYC with Lisa Su, and TSMC became the king of pure-play foundries under Morris Chang.

I think there is a pattern here: a good engineer CEO has a vision of what a company can be, while a CFO-turned-CEO only sees the bottom line.

I don't know how long Intel can keep squeezing 14+++++++++++++++ nm.


Is the issue here not the software? Who cares what the hardware is? macOS will be ARM-based, but if all the software out there needs to be rewritten for ARM, then we will lose a lot of great software. A lot of indie Mac software refuses to use the App Store for commercial reasons; will it now be locked out forever? I understand there will be an emulator, but won't we end up with what we had before when we moved from PowerPC to Intel, where the old software was slow and buggy and no one wanted to update it because it wasn't commercially worth it?


Will I be able to have an executable file with all of 68k, PPC, x86 and ARM versions included?


Apple's .app system already has that ability, brought over from the NeXT purchase. Those regularly had PPC and x86 binaries in the same package.


(Speculation) Jim Keller to Apple soon?


I'm both excited and troubled, mostly because I fear the loss of long-term support for x86-64: how long will Xcode, MacPorts, and even Homebrew support both architectures? Apple isn't one for backwards compatibility, so this might be a rough few years.

I shudder to think about how long it will take to debug all of the Python 3 wheels and Node packages macOS devs are accustomed to. I already have issues with versions of both languages' packages failing to install alongside other versions, and that's on the same platform, OS, and CPU architecture.

Still, I'm excited. Intel is way overpriced compared to Arm at similar performance and power.


I should note that Xcode supports ARM perfectly well and MacPorts does the same for PowerPC. I'm sure they'll be fine.


Note that Brew doesn't even support macOS Sierra[1]. It doesn't outright refuse to run, but there's a big warning message that stuff will break and maintainers won't care—and lots of stuff does indeed break.

Macports's support for legacy versions of OS X is unique, and dare-I-say incredible.

1: https://docs.brew.sh/Installation#macos-requirements


Oh damn, that probably also means no more building Docker images on the Mac.


This article has some issues. The author states that Apple's best ARM chip is comparable to the one in a "top of the line" iMac. This is utterly false.

They also state that the A13 trounces every other ARM chip by 2x, yet use a chart that doesn't include the Qualcomm 865, which is in current flagship Android phones.

It doesn't really hurt the main point of the article but the logical appeal falls flat when you don't seem to really understand what you are discussing at a technical level.


Here: https://www.anandtech.com/show/15207/the-snapdragon-865-perf...

A13 still comes out a fair bit ahead in performance, but not in energy usage.


> the logical appeal falls flat when you don't seem to really understand what you are discussing at a technical level.

The Qualcomm 865 doesn't make that much difference. Not to mention the 865 wasn't out when those benchmarks were done.


> To that end, while I am encouraged by and fully support this bill by Congress to appropriate $22.8 billion in aid to semiconductor manufacturers

I fully disagree. It appears to be the exact opposite of what a rational use of market forces would look like.


It's also about national defense, not just economy.

The Republic of China is a big target of the People's Republic of China. Sure, there shouldn't be a war, but in the worst-case scenario, the USA is kinda screwed by having all its chip manufacturing there. Notice that the new TSMC factory comes with a big emphasis on military contracts.


> but in the worst case scenario, USA is kinda screwed by having all their chip manufacturing there.

Nobody should have any illusions: "in the worst-case scenario," the question of where any chips are manufactured will simply be totally irrelevant. It's obviously the wrong model to plan around. The realistic model is the scenario which is not the "worst case." And then the question is just whether the "aid" is reasonable, or whether it is better than other existing or possible market forces.

Why give away money "to spur the construction of chip factories in America" when

1) the U.S. is anyway an immensely big customer and simply deciding on how it buys what it needs can already influence the markets

2) the existing companies already have "chip factories" in America.


It's not just about blowing up factories with missiles. There's a much greater persistent security threat.


Semiconductors are not just about the market for electronics, they are a national defense issue.

As the article talked about, relying on chips that can only be sourced from allies that are not defensible from your primary potential adversary is a major risk.

Why do you think Boeing et al are on the government defense teat? It's not just as a disguised socialist job program and Congressional slush fund. There are some legitimate reasons for government funding.

It's not "aid", it's government spending on national defense infrastructure and manufacturing capability.


> It's not "aid"

I don't see that claim supported by the text:

https://www.reuters.com/article/us-usa-semiconductors/u-s-la...

"A bipartisan group of U.S. lawmakers on Wednesday introduced a bill to provide more than $22.8 billion in aid for semiconductor manufacturers, aiming to spur the construction of chip factories in America"

Do you have some quote to support your claim?


It's a bad choice of words on the part of the journalist, but that's journalists for you. "Aid" implies an economic motive, whereas here the motive is very clearly national security. We don't give "aid" to the Marines or the Air Force. We spend public dollars on things that are important to national defense and security. Semiconductors fall into that category.


> whereas here the motive is very clearly national security.

It doesn't look that way. As far as I can see, it's mostly additional tax breaks for companies that would be awarded taxpayer money through defense contracts anyway, and "security" is a magical excuse for that.

The contracts could already have the desired conditions, but this is a way to avoid showing the real cost of them.


We're not talking about good economic policy here, we're talking about what makes sense from a national security standpoint.



