
OK, I love this, but I am pretty sure my chances of getting a Mac in the future are zero. From which side should I expect the PC response? Intel? AMD? Microsoft? ARM? Samsung? I think Apple, willingly or not, has just obliterated the whole consumer PC market.



The new Ryzen mobile processors should be interesting. Their GPU drivers (while not of the best code quality) are in the mainline Linux kernel. So it all should just “work”


Currently using a Renoir laptop. It's smoking fast, but I had to install a bleeding-edge kernel to get the display driver to work at all. That should be fixed in the next Ubuntu release though.


5.10.x kernels are very stable and feature-complete with AMD Ryzen Renoir - I update almost weekly once a new patch version is out on the Ubuntu Kernel PPA. Here is a nice script which makes the update trivial: https://github.com/pimlie/ubuntu-mainline-kernel.sh


Weird, I had no issues with the built-in display on Ubuntu 20.04, but I had to update the kernel to 5.8 to get display output over USB-C to work. Now that Ubuntu 20.10 is out and uses 5.8, I'm just using that so I don't have to mess with custom, unsigned kernels.


I just installed 20.04.1 on a 4750U Lenovo T14s and everything just works as far as I can tell.


What I want is to take a mobile CPU and put it in a normal form factor for power saving and excellent integrated graphics. Why is this not possible?


The only company making mobile CPUs that are at all competitive in a laptop is Apple. We’ll have to wait to see what Qualcomm et al. manage to build.


The new mobile chips are basically a mix of new and old stuff, with some of them being rebranded Zen 2 parts. Kind of disappointing.


But the slightly tweaked "old stuff" is relatively low-end - up to and including the 5700U. You'll find it in thin 'n light, budget and business laptops that will have more than enough power from a Zen 2 core. If you really absolutely need more power, you'll know it and you'll be shopping for a 5800H (or above).

If you don't know enough about CPUs to even read reviews that compare the CPU to other CPUs, then you either don't need the extra IPC of Zen 3 (and you won't notice when you use your laptop day to day) or you just... don't care.

If you care, get a 5600U/5800U or H line and it will never affect you. The laptops these come in should be priced accordingly.


What exactly do you expect? x86 is quite competitive. M1 might be slightly better, but it's not like it's miles ahead.


I want a return to the status quo where, for 80% (it could be even less, but let's not dwell on the number) of the price of an Apple laptop, I could get a Windows/Linux machine matching or surpassing its specs (including stuff like battery life, energy consumption, screen DPI, noise, etc.). This is not true now, and I am not seeing an option in the short term.


That was never really possible, or certainly hasn't been since laptops went to HiDPI displays. There were just a lot of teenagers showing off their ability to make gamer PC parts lists and claim they were better than a Mac, but the value of the display/trackpad/integration has been good for a long time. It's just that now the quality of the SoC is that good too.


I've had my M1 Air for a few weeks now, and going back to the trackpad on my Dell XPS 15 feels like the Stone Age... it's amazing... and Safari on the Mac even makes Jenkins and Jira fast...


Perhaps it is still true, but Apple is somehow becoming the budget option in that equation?


Fiddling around on an M1 MBA, it felt faster than my 2020 16" MBP. It's half the weight, seems to get double the battery life, and costs less than a third as much.

I just can't even imagine what the gap is going to look like when Apple really refines this down.


>It's half the weight

It's 70% of the weight at 70% of the volume of the 16". What is the point of comparing the weight of a 13" and a 16" laptop?


It is faster despite the lower battery capacity and thermal headroom. In fact, the MacBook Air has no fan.


You should compare it to something other than old Apple laptops. The Ryzen models, such as the ThinkPad T14, are very fast, and if you want to go tenfold from there, there is no comparison with modern Ryzen desktop CPUs. Apple always failed with Intel and its thermals, which is why those machines feel so slow compared to the M1.


A 2020 16" Macbook Pro isn't old by any measure. This argument seems disingenuous.


But they are extremely bad with their thermals and Intel is way slower than whatever AMD is pushing right now. The thing is Apple just made quite mediocre hardware. And now they're back where the competition is.


Intel is not way slower than AMD in the mobile space.


The response from Intel seems to be betting on the Evo platform [1], with third parties announcing laptops like the XPG Xenia XE.

[1] https://www.intel.com/content/www/us/en/products/docs/evo.ht...


You have to read between the lines here. They make no claims about CPU performance - just integrated GPU performance from Xe (which is a big improvement over the previous Iris GPUs). Then they claim battery life (9 hours, FHD, 250 nits, etc.).

What that means is laptop OEMs will have to limit TDP on the CPUs - probably 15W or less. Given how power-hungry current Intel chips are, these are likely NOT going to be great CPU performers.

The only competition for the M1 in CPU terms will be the Ryzen 5000U chips in the 15-25W thermal envelope. They should be ~19% more powerful/efficient than the Ryzen 4000U chips, but I would not expect M1 levels of cool or battery life yet.


> Ryzen 5000U

Am I right to assume we can see benchmarks within the next few weeks?


Some laptops seem to be on sale now, and others were targeting January 26 as the release date. That being said, I do not know if there's an NDA from AMD or if all variations of the chips (i.e. H, U, HX, etc) will be available this month or over time in the next few months.

So I think there's a good chance in the next week or so we'll start seeing benchmarks.


So they're announcing a "platform". Smacks of managerial bottom-covering. Almost like forming a committee to investigate the problem.

- where was this 1/2/10 years ago?

- how would this address the fundamental CPU performance gap?

- Intel has no competitive GPU offering, yet another glaring failure on their part

- why would OEMs go along with this when Ryzen has the better CPU and GPU, aside from Intel bribes and the usual marketplace branding momentum?

- will this actually get ports migrated to laptops faster? It was criminal how long it took for HDMI 2.0 to hit laptops.

I get Intel doesn't own the stack/vertical integration, but Intel could have devoted 1% of its revenue to a kickass Linux OS to keep Microsoft honest a long time ago and demonstrate its full hardware.

Even if only coders/techies used it like Macbooks are standard issue in the Bay, it would have been good insurance, leverage, or demoware.



"I get Intel doesn't own the stack/vertical integration, but Intel could have devoted 1% of its revenue to a kickass Linux OS to keep Microsoft honest a long time ago and demonstrate its full hardware." Interesting point.. makes one wonder why didn't they do it while having a mountain of cash.


They were so in bed with Microsoft.

Microsoft, being a true monopoly, might have struck fear into the timid souls of Intel executives that it would go headlong for AMD.

Or Google had this opportunity for years, and half-assed ChromeOS. Or AMD. Or Dell/HP/IBM who sold enough x86 to have money on the side.

I don't buy that it would have been hard. Look at what Apple did with OSX with such a paltry market share and way before the iPhone money train came. First consumer OSX release was 2001.

Sure, Apple had a massive advantage by buying NeXT's remnants and Jobs's familiarity with it and the people behind it, but remember that Apple's first choice was BeOS.

So anyone looking to push things could have gotten BeOS, or an army of Sun people as Sun killed off Solaris. The talent was out there.

Instead here we sit with Windows in a perpetual two-desktop tiled/old Frankenstein state, Linux DE balkanization and perpetual reinvention/rewrites from scratch, and OSX locked to Apple.


They do have something – Clear Linux [0]. Definitely not too much investment, but they do differentiate by compiling packages for much newer instruction sets compared to other distros.

[0]: https://clearlinux.org/


The real differentiator would have been an army of good driver coders and contributors to KDE/GNOME/X.


Actually Intel has a small army of Linux driver developers. Last time I counted in the git logs there were around 100 developers who one way or another contributed to the Linux graphics stack: kernel drivers, Xorg, Mesa, etc. We can't really know how many people are working behind the scenes.

Yeah, of course it was possible for Intel to do more, but they're clearly the largest contributor to the Linux graphics stack anyway.


They had Intel Clear Linux, a server-oriented distro. Quite good at what it targeted.


Intel Clear Linux still exists and is still developed by Intel.


> I get Intel doesn't own the stack/vertical integration, but Intel could have devoted 1% of its revenue to a kickass Linux OS to keep Microsoft honest a long time ago and demonstrate its full hardware.

Moblin, MeeGo.


Yeah, on top of it all, given that all of the shots of the reference models look vaguely like a MacBook, it really feels to me like Intel dug around in their couch cushions to come up with a response.


That looks more like ultrabooks but not watered down again - I'm not impressed.


Hardly. Apple has nothing to offer for high end gaming and I don't think they care about it.

For me it's Linux with AMD both for CPU and GPU.


I'd be excited to see ARM competition in the desktop space, but I don't believe there are any ARM chips that compete in performance with high-end x86. ARM can do lots of cores, which is very good for servers, but single-threaded performance is still a significant necessity on end-user hardware.

The M1 is great because it's low power with optimized performance, but on a desktop you can draw well over 500W and that's normal.

I don't see anyone else making mobile ARM chips for laptops ending up as anything other than a Windows Chromebook. The software compatibility will be a nightmare.


M1 cores outperform Comet Lake cores and are basically tied with AMD Vermeer despite using a fraction of the power.


Who cares if you have the darn thing plugged in anyway? Does the M1 outperform the Threadripper 3990x?


I think to understand the M1, you have to be a Mac laptop user. For years, Mac laptop performance lagged years behind high-end desktop performance -- they have been stuck on 14nm+ process chips with a mobile power budget, while desktop users have had 7nm chips that can draw 500W with no trouble. As a result, what M1 users tell you is fast is what PC desktop users have had for ages. A Threadripper and 3090 will blow the M1 out of the water in raw performance (but use a kilowatt while doing it, which a laptop obviously can't do).

At my last job, they issued us 2012-era Macbooks. I eventually got so frustrated with the performance that I went out and bought everyone on my team an 8th generation NUC. It was night and day. I couldn't believe how much faster everything was. The M1 is a similar revelation for people that have stayed inside the Mac ecosystem all these years.


Yeah, after years of company-issued MacBook Pros I built myself a Ryzen 3900X dev machine last year, and it was like waking up from one of those dreams where you need to do something urgently but your legs aren't cooperating.

Given the benchmarks I've seen I imagine the M1 would be a somewhat comparable experience, but using a desktop machine for software development for the first time since...2003(!) has really turned me off the laptop-as-default model that I'd been used to, and the slow but steady iOSification of MacOS has turned me off Macs generally. Once people are back to working in offices I'd just pair it with an iPad or Surface or something for meetings.


I've been a high-end desktop Linux user all my life, and ThinkPads were my primary choice for laptops. For the last two years I've been using an X1 Carbon Gen 3 released in 2017, so it's not so far from current-gen Intel laptops.

Yeah, the M1 Air cannot beat a triple 1440p 164Hz monitor setup with high-end desktop hardware, but it's still damn impressive. It has a slightly better 16:10 screen, more performance than any of the current X1 ThinkPads, and the Air I bought is absolutely silent too.

Also, maybe in the US you have comparable prices on ThinkPads or other Linux-friendly laptops, but where I live I bought the MacBook for 2/3 of the comparable ThinkPad price. I used to buy used ones, but now they've become much more expensive due to high demand and much less air travel (less smuggling, I guess).


I would never buy a ThinkPad outside of edu deals. There they cost less and get 3 years of on-site support. It's a good deal, but otherwise they are overpriced for what they've delivered recently...

Right now, I am torn between a *FHD* AMD X13 with 16GB RAM/500GB SSD for about *1000€* and the Air 16/500 at 1400€. I hate my life for that decision right now...


I think it's a moot question, since they're not comparable here. Laptops have inherent thermal limitations (not just power) that don't allow something like the Threadripper to be workable.

If you wanted a fair comparison you'd wait to see what processor Apple puts in their Mac Pro in 2-3 years, and compare that to whatever the Threadripper equivalent is then.


Personally, I like the noise level in my room being 32 dB, and I don't like PCs having to run the fan at full speed just to show the desktop.


Any desktop PC, even with the beefiest Threadripper, won't run the fan at full speed when you're just browsing the web or watching videos.

Heck, if you're not squeezing the absolute maximum performance from a chip by overclocking (I am :P) you can run a 5950X on a regular tower cooler with the fan curve never coming close to 100% and it will still be incredibly fast.


That is fine and I get it. Just realize there are lots of people who need/want more power, variety, and compatibility at a lower cost. Water cooling is plug-and-play these days for desktop rigs. The gaming laptop market has become shockingly good for professional-level CAD/engineering apps, with the industry focused on drastically reducing noise levels (albeit still quite loud in comparison to fanless). Trade-offs... trade-offs as far as you can see...


They are also on a different process, *shrug*


So they are competitive, because they are a node ahead?


When you shrink nodes you can either increase performance, cut power, or go with some mix of the two. TSMC's 5nm is 30% more power efficient OR 15% faster than their 7nm process.

Apple took the power reduction.

https://www.anandtech.com/show/16088/apple-announces-5nm-a14...
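As a rough first-order illustration of why a shrink can be cashed in either way (textbook CMOS scaling, not TSMC's actual engineering numbers): dynamic power scales with voltage squared times frequency, and a better node reaches the same frequency at a lower voltage.

    % Illustrative first-order CMOS scaling only; not TSMC's published figures.
    \[ P_{\mathrm{dyn}} \approx C\,V^{2} f \]
    % A newer node hits the same clock f at a lower voltage V' < V, so a vendor can either
    % keep f and take the power win,
    \[ \frac{P'}{P} \approx \Bigl(\frac{V'}{V}\Bigr)^{2} < 1, \]
    % or spend that headroom on a higher clock within the same power envelope -- hence the
    % "lower power OR higher frequency" framing of a node transition.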


5nm; they are two nodes ahead compared to Intel's 10nm.


Process node improvements don't bring that much performance.


Given the massive drop in power consumption and therefore heat, they seem to bring inordinate amounts of performance in a mobile chip.


The Apple A14 brought no improvements over the last generation despite being on a superior process node.


Even evolution within the same process node can bring noticeable performance improvements. Launch day AMD Zen 2 (TSMC 7FF) chips could barely clock to 4.2GHz, ones manufactured months later can often do 4.4.


> I don't see anyone else making mobile ARM chips for laptops ending up as anything other than a Windows Chromebook. The software compatibility will be a nightmare.

I'd expect Microsoft to make a run at this with their Surface line.

They've been trying to make ARM tablets for years (see Windows RT), and they just recently added the x86-64 compatibility layer so that it could actually be useful instead of "it's an iPad but worse".

https://blogs.windows.com/windows-insider/2020/12/10/introdu...

Will it see any success with 3rd party developers probably not bothering to support ARM? Maybe for some people who spend most of their time in Edge, Mail, or Office. I have a hard time seeing it being as successful as Apple's change, since the messaging from Apple is "This is happening in 2 years, get on board or you'll be left behind" and the messaging from Microsoft is "Look, we made an ARM tablet, and will probably sell at least 10 of them."


I do think that's a relevant point for desktops. The M1 is incredible, but if you don't care about TDP, desktops can catch up fairly quickly, whether from Intel or AMD.

I don't see an obvious player currently working on a "broad market" high performance ARM chip for the commodity desktop.


> but on a desktop you can draw well over 500W and that's normal.

You only see that on gaming rigs and high-end workstations. The typical office machine or ma-and-pa PC is either a laptop or a mini-PC with mobile parts.


The rest of the PC market has a chicken and egg problem. Software won’t be developed for a new CPU until it has market share, but a new CPU has little chance of getting market share without software.


> but I am pretty sure my chances of getting a Mac in the future are zero

Why?


For me:

1. A huge catalog of apps and games, old and new, that are mostly Windows and Linux, but all exclusively x86.

2. I don't want to get locked into an ecosystem with no variety. For example, if Dell's laptop offering has a keyboard I hate or a poor selection of ports, I can switch to Lenovo or HP, or go for a gaming laptop with an Nvidia GPU and use it to train NNs, etc.; the variety is amazing. Whereas if Apple's next machine has a flaw I dislike, then too frikin' bad, I'll be stuck with whatever Apple bestows upon me.


I cannot speak for OP, but software is the reason for me. I cannot run SolidWorks on macOS without Boot Camp or other tricks, for example. Photogrammetry apps? GIS apps? CNC CAM apps? I mean, compare the Mac-compatible apps [1] to the catalog of apps Autodesk has.

The fact of the matter is that GPU support on macOS is just not there in the same way it is on Windows. It is really hard to justify Apple's prices when you compare directly to Windows offerings spec for spec. And when you compare the sheer volume of software available for Windows versus macOS, especially if you need a discrete GPU, there is really no comparison.

[1] https://www.autodesk.com/solutions/mac-compatible-software


I'm always surprised when I see discussions about somebody dumping one operating system for another. Isn't your operating system choice dictated by the applications you want to run?


Presumably these people use cross-platform software.


Price, for one; also, I still consider PCs and Linux (and good heavens, even Windows) more open than the Mac. I also work with a lot of Windows-only automation software.


Sorry, but what's "PC" above, given it's not Windows?


A generic term for hardware able to run Linux and/or Windows. Or any personal computer not built by Apple. You can have any combination of open/closed hardware and OS.


Apple, willingly or not, increased their market share? Seriously, the astroturfing in this thread is off the charts.


They have bitten off an even bigger chunk of the meal they had nearly finished already:

- Developers

- Creatives

- Dutiful citizens of The Ecosystem

What they are no closer to biting off is gamers and hardware enthusiasts. Anyone who actually needs to open their PC for any purpose whatsoever.

I know Apple wants to be the glossy Eve to all the PC market's Wall-E's, but I will continue to shun them forcefully as long as they remain hell-bent on stifling the nerdy half of all of their users.

I assume the developer popularity is a result of the iPhone gold rush, which Apple exploited using exclusivity tactics. Therefore I consider it an abomination to see developers embrace the platform so thoroughly. iPhone development should feel shameful, and when there's no way to avoid it, it should be done on a dusty Mac Mini pulled from a cardboard box that was sitting in the closet ;)


> I assume the developer popularity is a result of the iPhone gold rush, which Apple exploited using exclusivity tactics. Therefore I consider it an abomination to see developers embrace the platform so thoroughly.

That's not why most developers I know use Macs. We use them because we got tired of spending a couple of days a year (on average) unfucking our Windows machines after something goes wrong. When you're a developer, you're often installing, uninstalling and changing things... much more than the average personal or business user. That means there are a lot of opportunities for things to go wrong, and they do. Even most Google employees were using Macs until they themselves started making high-end Chromebooks; tens of thousands of Google employees still use Macs. Some users have moved on to Linux, but most stay on Mac because they want to spend time in the OS and not on the OS. I can appreciate both perspectives; there's no right answer for everyone.

I share your wish that the hardware was more serviceable but everything is a compromise at the end of the day and that's the compromise I'm willing to take in exchange for the other benefits.

Some complain about the price, but the high-end MacBook Pros aren't even more expensive than Windows workstation laptops. Our company actually saved several hundred dollars per machine when we switched from ThinkPads with the same specs. Not to mention, our IT support costs were cut almost in half with Macs.

So, aside from gaming or some specific edge-case requirements, it's hard for me to justify owning a PC. That said, I have one of those edge-case requirements with one of my clients, so I have a ThinkPad just for them. But it stays in a drawer when I'm not working on that specific thing.


On the gaming front, I've been trying GeForce Now recently and while it may not be great for competitive FPS games, it has otherwise destroyed any reason why I'd ever purchase another gaming PC unless they start jacking up the monthly price. It works on basically all platforms (including my iOS devices), it doesn't spin up my MBP's fans and doesn't eat through battery life. I don't have to worry about ARM vs x86, I don't even get locked in to the platform like Stadia, it connects to Steam, Epic and GOG.


Wow, obvious bias much? A great way to engage in reasonable, level-headed conversation is to lead with telling people they should be ashamed for not holding your values and opinions. I honestly can't think of a better way to demonstrate to most people why they _should_ get a Mac than to just show off comments like yours.


Convince me otherwise! :)


> I think Apple, willingly or not, has just obliterated the whole consumer PC market.

They definitely won the "general public use laptop" market (if you conveniently ignore the rest of the laptop and OSX, which are utter crap IMO), but it's important to understand that they really didn't invent anything, they just optimized. And optimizing something like this means you make the stuff that is most commonly used better, while reducing the functionality of the rest.

Compare the Asus ROG Zephyrus G14 laptop with the MacBook Pro. The G14 has the older Ryzen 9 4900HS chip, and while the single-core performance of the MBP is better, the multi-core performance is the same despite the 4900HS being a last-gen chip. The G14 gets about 11 hours of battery for regular use, the MBP about 16, but the G14 also has a discrete GPU that is superior for gaming to the Mac's integrated GPU. Different configurations for different things.

Then, even ignoring the companies' decisions to mix and match components, the reason you can buy any Windows-based laptop and install Linux on it and have 99% of the functionality working right out of the box is the standardization of the CPU architecture. With the Apple M1, even though Rosetta is very well built, its support is not universal across all applications; some run poorly or not at all.

And while modern compilers are smart, they have a long way to go before true cross-operability with all the performance enhancements. Just look at any graphical Linux distro, and the fact that all Android devices out there run Linux, yet there isn't a way to natively run a distro on them without the chroot method, which doesn't take advantage of the hardware.

So in the end, if you are someone who just wants a laptop with good performance and great battery life, and you don't care about any particular software since most of your time will be spent on the web or in Apple ecosystem software, the M1 machines are definitely the right choice. However, if you want something like a dev machine with Linux, where you just want to be able to git clone software and have it work without any issues, you are most likely going to be better off with the "Windows" laptops with traditional AMD/Intel chips for quite some time.


And Apple will court (allow) developers until competition shows up. Then they will shut it down and obfuscate. The fact that Apple refuses to provide FOSS drivers themselves indicates this.


Apple has never provided drivers for other operating systems, including Windows. The drivers used for Boot Camp aren't even developed by them.


Well, they clearly at least commissioned them from someone else, since they are the ones distributing them.


Intel Macs are literally PCs. The Windows drivers that Apple bundled up for Boot Camp were the same drivers that the respective component manufacturers provide for other PC OEMs to redistribute. In the entire history of Intel Macs, there have been very few components used by Apple for which there wasn't already an off-the-shelf Windows driver written by someone other than Apple for the sake of their non-Apple customers.


No, Macs have a lot of custom components, and the ones with a Touch Bar or T2 added even more. Those have Apple-developed Windows drivers.


They do include these too, but I'm mostly talking about stuff like the trackpads (before SPI, the protocol they ran over USB was also custom, not HID)


> Apple, willingly or not, has just obliterated the whole consumer PC market.

Apple probably has the best laptop out there as of today, but I don't think Apple's sales performance is actually impacted that much by their hardware performance: around 2012-2015 or so they had several years with a subpar mobile phone, on both the hardware and the software side, and it still sold very well. A few years later, they had the best phone on the market and… it didn't change their dynamic much: it still sells very well, as before. On the laptop market, they have been selling subpar laptops for a few years without much issue, and I guess it won't change much that they now have the best one.

Apple customers will get a much better deal for their bucks, which is good for their long-term business, but I don't think it will drive that many people out of the Windows world[1] just for that reason (especially with the migration/compatibility issues, which are even worse now than they were when running on Intel).

Also, many people outside of HN just don't listen to Apple's “revolutionary” announcements; they have used that card too much, for no good reason most of the time, so people have just stopped listening (even my friends who are actually Apple customers).

[1]: which is where most people are tbh, and I don't think that many Linux people would switch either.


Agreed - since macOS is even more of an entire ecosystem, moving in and out of it is much more of a long-term commitment for most regular users.

People who are multi-platform in daily life are much more likely to switch - and that's a rather small percentage of computer users (and of course very much over-represented here at HN).

> they have used that card too much

you can never have enough "magic" :-)


Wow it looks like I really pissed off a bunch of Apple fans by saying that they have at some point sold subpar products…


You might have to bite the bullet and get a Mac - I don't see much promise from others in this space.

Intel's CEO change may save them, but they're definitely facing an existential threat they've failed to adapt to for years.

Amazon will move to ARM on servers and probably has some decent designs there, but that won't really reach the consumer hardware market (probably - though I suppose they could do something interesting in this space if they wanted to).

Windows faces issues with third-party integration, OEMs, chip manufacturers, and coordinating all of that. Nadella is smart and is mostly moving to Azure services and O365 as the strategy - I think Windows and the consumer market matter less.

Apple owns their entire stack and is well positioned to continue to expand the delta between their design/integration and everyone else continuing to flounder.

AMD isn't that much better positioned than Intel and doesn't have a solution for the coordination problem either. Nvidia may buy ARM, but that's only one piece of getting things to work well.

I'm long on Apple here, short on Intel and AMD.

We'll see what happens.


I just got my M1 Air. This thing is unbelievably fluid and responsive. It doesn't matter what I do in the background. I can simultaneously run VMs, multiple emulators, compile code, and the UI is always a fluid 60 fps. Apps always open instantly. Webpages always render in a literal blink of an eye. This thing feels like magic. Nothing I do can make this computer skip a beat. Dropped frames are a thing of the past. The user interface of every Intel Mac I've used (yes, even the Mac Pro) feels slow and clunky in comparison.

Oh, and the chassis of this fanless system literally remains cool to the touch while doing all this.

The improvements in raw compute power alone do not account for the incredible fluidity of this thing. macOS on the M1 now feels every bit as snappy and responsive as iPadOS on the iPad. I've never used a PC (or Mac) that has ever felt anywhere near this responsive. I can only chalk that up to software and hardware integration.

Unless Apple's competitors can integrate the software and hardware to the same degree, I don't know how they'll get the same fluidity we see out of the M1. Microsoft really oughta take a look at developing their own PC CPUs, because they're probably the only player in the Windows space suited to integrate software and hardware to such a degree. Indeed, Microsoft is rumoured to be developing their own ARM-based CPUs for the Surface, so it just might happen [0]

[0] https://www.theverge.com/2020/12/18/22189450/microsoft-arm-p...


So much this. M1 mini here. I am absolutely chuffed with it. It’s insanely good.

I’m going to be the first person in the queue to grab their iMac offering.


Are you saying you don't see much promise for AMD, Intel and Nvidia in the GPU space or with computers in general? I had a hard time following your logic.

Apple may own their stack, but there are a TON of use cases where that stack doesn't even form a blip on the radar of the people who purchase computer gear.


My prediction is x86 is dead.

External GPUs will remain and I think Nvidia has an advantage in that niche currently.

The reason stack ownership matters is because it allows tight integration which leads to better chip design (and better performance/efficiency).

Windows has run on ARM for a while for example, but it sucks. The reason it sucks is complicated but largely has to do with bad incentives and coordination problems between multiple groups. Apple doesn't have this problem.

As Apple's RISC design performance improvements (paired with extremely low power requirements) become more and more obvious x86 manufacturers will be left unable to compete. Cloud providers will move to ARM chipsets of their own design (see: https://aws.amazon.com/ec2/graviton/) and AMD/Intel will be on the path to extinction.

I'd argue Apple's M1 machines are already at this level and they're version 0 (if you haven't played with one you should).

This is an e-risk for Intel and AMD; they should have been preparing for this for the last decade. Instead, Intel doubled down on their old designs to maximize profit in the short term at the cost of extinction in the long term.

It's not an argument about individual consumer choice (though that will shift too); the entire market will move.


> My prediction is x86 is dead.

I don't see that. At least in corporate environments with a bazillion legacy apps, x86 will be king for the foreseeable future.

And frankly I don't really see the pull of ARM/M1 anyway. I mean, I can get a laptop with an extremely competitive Ryzen for way cheaper than a MacBook with an M1. The only big advantage I see is the battery, but that's not very relevant for many use cases - most people buying laptops don't actually spend that much time on the go needing battery power. It's also questionable how transferable this is to the rest of the market without Apple's tight vertical integration.

> I'd argue Apple's M1 machines are already at this level and they're version 0

Where is this myth coming from? Apple's chips are now on version 15 or so.


This is the first release targeting macOS; I'm not pretending their chips for phones don't exist - but the M1 is still version 0 for Macs.

> "And frankly I don't really see the pull of ARM/M1 anyway. I mean, I can get a laptop with an extremely competitive Ryzen for way cheaper than a MacBook with an M1..."

Respectfully, I strongly disagree with this - to me it's equivalent to someone defending the keyboards on a Palm Treo. This is a major shift in capability, and we're just seeing the start of the curve where x86 is nearing the end.

“No wireless. Less space than a Nomad. Lame.”


> but the M1 is still version 0 for Macs.

Fair enough; it's just important to keep in mind that the M1 is the result of decade(s)-long progressive enhancement. The M2 is going to be another incremental step in the series.

> to me it's equivalent to someone defending the keyboards on a Palm Treo. This is a major shift in capability ...

That's a completely unjustified comparison. iPhone brought a new way to interact with your phone. M1 brings ... better performance per watt? (something which is happening every year anyway)

What new capabilities does M1 bring? I'm trying to see them, but don't ...


> "That's a completely unjustified comparison. iPhone brought a new way to interact with your phone."

People don't really remember, but a lot of people were really dismissive of the iPhone (and iPod) on launch. For the iPhone, the complaints were about cost, about the lack of a hardware keyboard, about fingerprints on the screen. People complained that it was less usable than existing phones for email, etc.

The M1 brings much better performance at much less power.

I think that's a big deal and is a massive lift for what applications can do. I also think x86 cannot compete now and things will only get a lot worse as Apple's chips get even better.


> People don't really remember, but a lot of people were really dismissive of the iPhone (and iPod) on launch.

I do remember that. The iPhone had its growing pains in the first year, and there was fair criticism back then. But it was also clear that the iPhone brought a completely new vision to the concept of a mobile phone.

The M1 brings nice performance at fairly low power, but that's just a quantitative difference. No new vision. Perf/watt improvements have been happening every single year since the first chips were manufactured.

> I also think x86 cannot compete now and things will only get a lot worse as Apple's chips get even better.

Why? Somehow Apple's chips will get better, but competition will stand still? AMD is currently making great progress, and it finally looks like Intel is waking up from its lethargy as well.


>The M1 brings nice performance at fairly low power, but that's just a quantitative difference. No new vision. Perf/watt improvements have been happening every single year since the first chips were manufactured.

I'd say the M1's improvements are a lot more than performance per watt. It has enabled a level of UI fluidity and general "snappiness" that I just haven't seen out of any Mac or PC before. The Mac Pro is clearly faster than any M1 Mac, but browsing the UI on the Mac Pro just feels slow and clunky in comparison to the M1.

I can only chalk that up to optimization between the silicon and the software, and I'm not sure that Apple's competitors will be able to replicate that.


> "Why? Somehow Apple's chips will get better, but competition will stand still?"

Arguably this has been the case for the last ten years (comparing chips on iPhones to others).

I think x86 can't compete; CISC can't compete with RISC because of problems inherent to CISC (https://debugger.medium.com/why-is-apples-m1-chip-so-fast-32...)

It won't be for lack of trying - x86 will hold them back.

I suppose in theory they could recognize this e-risk, and throw themselves at coming up with a competitive RISC chip design while also somehow overcoming the integration disadvantages they face.

If they were smart enough to do this, they would have done it already.

I'd bet against them (and I am).


RISC vs CISC is not real. Anyone writing articles about it is uninformed and you should ignore them. (However, it's also not true that all ISAs perform the same. x86-64 actually performs pretty well and has good memory density - see Linus's old rants about this.)

ARM64 is a good ISA, but not because it's RISC; some of the good parts are actually moves away from RISC-ness, like complex address operands.


Very much this. Intel is not that stupid. They went into the lab, built and simulated everything, and found that the penalty of the extra decoding has an upper bound of a few percent perf/watt at most once you are out of the embedded space.

OTOH, Apple is doing some interesting things optimizing their software stack for the store reordering (stores without release semantics) that ARM allows. These sorts of things are where long-term advantage lies. Nobody is ever ahead in the CPU wars by some insurmountable margin in strict hardware terms.
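For what it's worth, a minimal C++11 sketch (my illustration, not anything from Apple's stack) of the ordering difference being described: on ARM's weaker memory model ordinary stores may be reordered, so software pays for ordering only where it explicitly asks for it with a release store, whereas x86's TSO effectively gives every store release semantics whether you need it or not.

    #include <atomic>
    #include <cstdio>
    #include <thread>

    std::atomic<int>  data{0};
    std::atomic<bool> ready{false};

    void producer() {
        data.store(42, std::memory_order_relaxed);    // plain store: may be reordered on ARM
        ready.store(true, std::memory_order_release); // ordering is paid for only here
    }

    int consumer() {
        while (!ready.load(std::memory_order_acquire)) { } // acquire pairs with the release above
        return data.load(std::memory_order_relaxed);       // guaranteed to observe 42
    }

    int main() {
        std::thread t(producer);
        int v = consumer();
        t.join();
        std::printf("%d\n", v); // prints 42
    }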

System performance is what counts. Apple has weakish support for games, for example, so any hardware advantage they have in a vacuum is moot in that domain.

Integrated system performance and total cost of ownership are what matters.


I'm confused - I thought the reason it's hard for Intel to add more decoders is that the x86 ISA doesn't have fixed-length instructions. As a result you can't trivially scale things up.

From that linked article:

--

Why can’t Intel and AMD add more instruction decoders?

This is where we finally see the revenge of RISC, and where the fact that the M1 Firestorm core has an ARM RISC architecture begins to matter.

You see, an x86 instruction can be anywhere from 1–15 bytes long. RISC instructions have fixed length. Every ARM instruction is 4 bytes long. Why is that relevant in this case?

Because splitting up a stream of bytes into instructions to feed into eight different decoders in parallel becomes trivial if every instruction has the same length.

However, on an x86 CPU, the decoders have no clue where the next instruction starts. It has to actually analyze each instruction in order to see how long it is.

The brute force way Intel and AMD deal with this is by simply attempting to decode instructions at every possible starting point. That means x86 chips have to deal with lots of wrong guesses and mistakes which has to be discarded. This creates such a convoluted and complicated decoder stage that it is really hard to add more decoders. But for Apple, it is trivial in comparison to keep adding more.

--

Maybe you and astrange don't consider fixed-length instruction guarantees to be necessarily tied to 'RISC' vs. 'CISC', but that's just disputing definitions. It seems to be an important difference that they can't easily address.
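To make the quoted point concrete, here's a toy sketch (my illustration, not a real decoder; insn_length is a hypothetical stand-in for x86's fairly involved length-decode logic): with a fixed 4-byte encoding every decoder can be handed its start offset up front, while with variable-length instructions you can't know where instruction N+1 begins until instruction N has at least been length-decoded.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Fixed 4-byte instructions (ARM64-style): instruction i starts at byte 4*i, so eight
    // decoders can each be handed their own start offset with no coordination at all.
    std::size_t fixed_start(std::size_t i) { return 4 * i; }

    // Variable-length instructions (x86-style): the start of instruction i+1 is only known
    // after instruction i has been (at least length-)decoded, so finding boundaries is a
    // serial scan; real decoders speculate on boundaries and discard the wrong guesses.
    std::vector<std::size_t> variable_starts(const std::uint8_t* code, std::size_t len,
                                             std::size_t (*insn_length)(const std::uint8_t*)) {
        std::vector<std::size_t> starts;
        for (std::size_t off = 0; off < len; off += insn_length(code + off))
            starts.push_back(off);
        return starts;
    }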


People are rehashing the same myths about ISAs that were written 25 years ago.

Variable-length instructions are not a significant impediment in higher-wattage CPUs (>5W?). The first byte of an instruction is enough to indicate how long it is, and hardware can look at the stream in parallel. A minor penalty, with arguably a couple of benefits. The larger issue for CISC is that more instructions access memory in more ways, so decoding requires breaking them down into micro-ops that are more RISC-like, so that the dependencies can be worked out.

RISC already won where ISA matters -- like AVR and ARM Thumb. You have a handful of them in a typical laptop, plus like a hundred throughout your house and car, with some PIC thrown in for good measure. So it won. CISC is inferior. Where ISA matters, it loses. Nobody actually advocates for CISC design, because you're going to have to decode it into smaller ops anyway.

Also, variable-length instructions are not really a RISC vs CISC thing as much as a pre- vs post-1980 thing. Memory was so scarce in the '70s that wasting a few bits for simplicity's sake was anathema and would not be allowed.

System performance is a lot more than ISA, as computers have become very complicated with many, many I/Os. Think about why American automakers lost market share at the end of the last century. Was it because their engineering was that bad? Maybe a bit. But really it was total system performance and cost of ownership that they got killed on, not any particular commitment to a solely inferior technical framework.


I agree that's a real difference and the M1 makes good use of it; it's just that to me RISC ("everything MIPS did") vs CISC ("everything x86 did") implies a lot of other stuff that's just coincidence. Specifically, RISC means all of: simple fixed-length instructions, 3-operand instructions (a=b+c, not a+=b), and few address modes. Some of these are the wrong tradeoff when you have the transistor budget of a modern CPU.

x86 has complicated variable-length instructions, but the advantage is that they're compressed - they fit in less memory. I would've expected this to still be a win because cache size is so important, but ARM64 got rid of its compressed encoding and they know better than me, so apparently not. (Compressed encodings have other problems, like being a security risk because attackers can jump into the middle of an instruction and create new programs…)

One thing you can do is have a cache after the decoder (a uop cache) so you can issue recently decoded instructions again and let the decoder do something else. That helps with loops at least.


Remember, M1 is on the leading edge 5nm fab process. Ryzen APUs are coming and may be competitive in terms of power consumption when they arrive on 5nm.

Apple software is also important here. They do some things very much right. It will be interesting to run real benchmarks with x64 on the same node.

Having said all that, I love fanless quiet computers. In that segment Apple has been winning all along.


> At least in corporate environments with a bazillion legacy apps

They could just run them as virtual desktop apps. Citrix, despite its warts, is quite popular for running old incompatible software in corporate environments.


Ok, I still have questions.

To start... How would a city with tens of thousands of computers transition to ARM in the near future?

The apps that run 911 dispatch systems and critical infrastructure all over the world are all on x86 hardware. Millions if not billions of dollars in investment, training, and configuration. These are bespoke systems. The military-industrial complex is basically custom chips and x86. The federal government runs on x86. You think they are just going to say, "Welp, looks like Apple won, let's quadruple the cost to integrate Apple silicon for our water systems and missile systems! They own the stack!"

Professional-grade engineering apps and manufacturing apps are just going to suddenly be rewritten for Apple hardware, because the M2 or M3 is sooooo fast? Price matters!!!! Choice matters!!!

This is solely about consumer choice right now. The cost is prohibitive for most consumers as well, as evidenced by the low market penetration of Apple computers to this day.


Notice how the only counterexamples you came up with are legacy applications. This is the first sign of a declining market. No, Intel will not go out of business tomorrow. But they are still dead.

The growth markets will drive the price of ARM parts down and performance up. Meanwhile x86 will stagnate and become more and more expensive due to declining volumes. Eventually, yes, this will apply enough pressure even on niche applications like engineering apps to port to ARM. The military will likely be the last holdout.


You make bets on where the puck is going, not on where it currently is.

"How would a city with tens of thousands of HDDs transition to SSDs in the near future?"

It happens over time as products move to compete.

Client-side machines matter less; the server will transition to ARM because performance and power are better on RISC. The military-industrial complex relies on government cloud contracts with providers that will probably move to ARM on the server side.

It's not necessarily rewriting for Apple hardware, but people that care about future performance will have to move to similar RISC hardware to remain competitive.


Wow, I feel like I'm back in 1995 or something. Stupid Intel, doubling down on the Pentium! DEC, Sun, and Motorola RISC will drive them extinct!


I think we'll see a lot of ARM use cases outside of the Apple stack, and x86 is dead (but it will of course take its sweet time getting there). For the longest time everyone believed at a subconscious level that x86 was a prerequisite due to compatibility. Apple provided an existence proof that this is false. There is no longer a real need to hold onto the obsolete x86 design.

The only way for Intel and AMD to thrive in this new world is to throw away their key asset: expertise in x86 arcana. They will not do this (see The Innovator's Dilemma for reasons why). As a result, they will face a slow decline and eventual death.



