World of Warcraft 9.0.2 client runs natively on Apple Silicon (blizzard.com)
202 points by tosh on Nov 18, 2020 | 221 comments



Just installed WoW on my M1 MBP and was blown away by the performance. I was running on battery, graphics set to max, and the game ran at 60fps without dropping a frame. And the fans never came on.

How is this even possible on such a small machine?

Here's a link with screenshots and video (notice how instantly the game closes when exiting):

https://www.notion.so/emads/MBP-M1-Chip-running-WoW-at-60fps...


I'd be curious to see what happens in a raid. I currently get 60 FPS on my quite old Hackintosh in most places, but when you pack in a lot of spell effects from many tens of players at the same time, performance starts to drop fast. Especially now with the pre-expansion event and the world bosses.


I'd be curious too. I don't play WoW anymore so I won't be investing time to test.

It's important to note that I tested WoW on my MBP 15" 2018 (max spec) and the game ran at ~25fps (same zone, same character, doing quests) on lower graphics settings, and my MBP's fans were spinning like a freaking jet engine.


That video is stuttering for me and it looks like you have about 10 fps, even when it's completely loaded (but I believe you, no worries, it's just funny) :P

This is Chrome on Linux and I don't usually have problems with video, e.g. YouTube.


It's 60 FPS 2880x1800 video at like 51 Mbps. YouTube's recommendation for 1440p 60 FPS uploads is 24 Mbps and most likely they serve at a lower bitrate.


Try Firefox on FreeBSD, no stuttering on my side :) And it really looks like 60fps, very smooth and impressive for an ARM machine.


That video was definitely 60 fps, no glitches or stuttering.


Would be nice if you set a higher resolution. I wonder what rendering size you can set before the FPS dives.


Yup, "1440x900" is far from max settings for me


This guy here gets around 43 fps on max: https://www.youtube.com/watch?v=mIbEIhizoXY


>it uses the intel version with rosetta 2 runs smooth.

That's not the native ARM version.


Ah, that was stupid of me.


Small heads up for phone or other bandwidth-constrained users, it's a 200+ MB video.


This is a related tangent, but I haven't seen this addressed anywhere else yet so might as well ask it here. What does Apple silicon mean for the future of gaming on Apple hardware? With the death of bootcamp and the apparent difficulty with emulation, is the only hope for individual developers to update to native support like Blizzard is doing here? And what does the future of GPUs look like? Are there expected to be dedicated GPUs on future Macbook Pros, Mac Pros, and iMacs or are they all expected to be SOCs going forward?


For Unreal and Unity-based games at least, ARM MacOS will be more or less just another compile target. For less-popular licensed engines or in-house ones, MacOS was never all that popular to begin with, so I think Apple Silicon may actually not represent that big of a change for Mac gaming.

As another comment mentions, there is the iOS to MacOS pipeline, but given that a large proportion of these are already Unity-based, the fact that MacOS now shares an architecture with iOS may prove largely incidental.


Gaming on the Mac was basically dead and will be revived with Apple Silicon, I believe. Even the MacBook Air now has enough power to run modern games smoothly, and more powerful models are to follow. A considerable amount of iOS games are suddenly playable on the new Macs, too. And yes, native support for Apple Silicon and Metal is clearly the way forward.


I just ran Kerbal Space Program on the Air; in addition to being smooth and having no discernible lag, htop says it's only using half of one core... and the laptop isn't even heating up.

This is the dream laptop. It's the one product that is 5 years ahead of everyone else, like the iPhone 4.


I don't believe the main problem with gaming on the Mac was the architecture or the anaemic graphics cards, but rather the non-standard Mac-only graphics API (Metal), the deprecation of OpenGL, spotty support for Vulkan (if any), and the removal of 32-bit support (for older games). The first three are not really a problem for big engine makers, as they can afford to implement additional backends, but they are an issue for smaller studios and indie makers. 32-bit support is probably going to be less important (unless Rosetta can also translate x86-only code). It's good that they packed a decent graphics subsystem into their new chips, but that's only part of the solution.

I believe their main concern is to bring iOS games to macOS rather than to bring new games to the platform. Let's be realistic here: iOS games bring in the cash. If they can run them in another form factor with potentially better graphics, then that's even more money in the long run.


Metal doesn't get enough hate for how awful and idiosyncratic it is. If you want to remain cross platform and support Metal you're so severely restricted that the best you can hope for is to just special case a bunch of features for non-Apple machines and not support them at all on Mac. It really feels like Apple wanted to kill gaming on Macs entirely. I understand that they want to maintain control over their own graphics stack, but for this to work the stack has to be better than the competition and that's not the case here at all.


Could you go into more detail about how it's idiosyncratic? I'd love to find more information on what it does wrong vs something like Vulkan.


For some indication, check out a lot of the issue threads for wgpu (the Rust cross-platform graphics library implementing WebGPU). Many, many features need to be restricted or put behind feature flags due to either lack of Metal support, or Metal having bizarre restrictions compared to other platforms. As a random example, border color is limited to 2 or 3 colors on Metal, and has no restrictions at all anywhere else. There are also entire features (like geometry shaders) missing from Metal; for many common use cases you can replicate the functionality with something else, but usually not in a way that easily translates to any of the other APIs (and pretty much nobody on Earth writes games just targeting Macs).

Hence, the easiest thing to do for most games is to just drop Mac support and not worry about it. IMO that's the biggest reason by far why people stopped bothering to port new games to Mac; between not updating OpenGL past 4.1 at best and Metal being so idiosyncratic, it's just not worth it, and since people have learned gaming on Macs sucks (largely because of the outdated and idiosyncratic APIs), most people who want to play games on a Mac have just been installing Windows anyway.

This is all not even to mention that Mac laptops tend to have far too high a resolution for the onboard GPU to perform well for any but the most rudimentary games; hopefully this has been addressed with the M1 batch having a much more powerful GPU. But the issues with Metal remain and will continue to remain until Apple either fixes Metal to make it a more Vulkan-like target, or adds really good Vulkan support.


Unsure, but League plays at like 100 FPS through Rosetta - better than an Intel iGPU.

I would expect they'll have an "X" version of the chip in the future that's even more competitive with discrete laptop GPUs.


League of Legends is not a demanding game. It will run at 120fps on a GTX 960 at max settings at 1080p. (The 960 is about 1/4 as powerful as the current flagships, according to PassMark.)


We're talking about a laptop, and it sounds like you're talking about desktop GPUs, unless you meant GTX 960M.


My point is that a very old discrete GPU is capable of running League at very high settings, and therefore that the new M1 chip's capability to run League well doesn't prove much.

In a more relevant benchmark, it looks like the M1 is about 30% better than the GTX 1050 Ti, which was released in Oct 2016. So it's maybe two to three years behind the state of the art in low-power discrete GPUs? I wouldn't be worried about my discrete GPU business if I were AMD or Nvidia, at least not yet.


But the M1 chip isn't competing with discrete graphics, at least yet. The M1 is replacing integrated graphics in these machines. Integrated graphics also trail discrete graphics by years so it isn't really a knock on the M1 that it does too.

That leaves two primary questions. Does the integrated graphics of the M1 beat integrated graphics of Intel and AMD? What does Apple plan to do to compete with discrete graphics in their higher end hardware and will that be a more powerful SOC, discrete graphics hardware from Apple, or something more traditional from AMD?


> But the M1 chip isn't competing with discrete graphics, at least yet.

I understand that. I was responding to a comment that referenced a near future in which the Apple chips would compete with discrete graphics. Observing the distance they have to go is relevant to that speculation.


I'm just not sure the distance between integrated and discrete graphics performance is a better indicator of Apple's hardware capabilities than the differences between Apple, Intel, and AMD integrated graphics. It is a safe assumption that Apple isn't going to release an identical M1 as a competitor for discrete graphics. So why compare this to something in a different product class rather than judge it on how well it achieves the purpose it is designed to achieve?


> I'm just not sure the distance between integrated and discrete graphics performance is a better indicator of Apple's hardware capabilities

It's not a better indicator in any way, except in terms of being more relevant to the comment I was responding to at the top of the thread, which contemplated the possibility that Apple's chips would become competitive with discrete graphics cards in the foreseeable future.


That's pretty impressive, for mobile hardware. Did you test that yourself?


No, I've been watching YouTube reviews all day. At 2560x1600 it was going 55-70 FPS.

Esports games are not intensive, but the iGPU appears to be better than Intel Xe, so it's a straight upgrade compared to the YoY upgrade they'd have gotten by sticking with Intel CPUs.

The only question mark is how they'll approach dGPU and eGPU performance levels.


Might start to make sense to do "upscaled iOS ports". iOS is lucrative enough that many developers will support it anyway. Adding mouse+keyboard support and gamepad support isn't insane and I believe Apple Arcade games require gamepad support. You won't get the highest end AAA games, but those games usually aren't coming to Mac anyway


>You won't get the highest end AAA games, but those games usually aren't coming to Mac anyway

But even if they didn't come to Mac they were always available through bootcamp or emulation. Is it simply not an option on this new hardware? That seems like an important step back.


I never found Boot Camp or Parallels/emulation sufficient for playing a modern video game. A $3k Mac always underperformed a $500 desktop Windows PC in my experience.


What does that mean? Boot Camp is just running Windows on the machine. You're gaming on an Intel CPU and an AMD GPU. That doesn't change because Apple installed another OS on the system.

Sufficient or not is a weird way to put it.


I mean that mac hardware is terrible for video games in general, and that at some $1.5-$3k avg price tag, "Macs" (in the hardware sense) are almost always outclassed by PCs (in the hardware sense) that are less than half their cost.

The OP commented games were "always available through bootcamp or emulation", which is true of a Windows-only game running on your Mac hardware via Boot Camp/Parallels... my point is, Macs provide a sub-par, no good, very little fun experience for anything but the simplest of games.

And before I sound like a typical PC bro, this line of thinking is coming from someone who has exclusively owned about a dozen Macs starting with the iBook Clamshell, and only recently built my first gaming PC during COVID. I'd never try to play a game on a Mac again lol


The GPUs that Apple typically uses are far outclassed by what's available for Windows.


I suppose that would depend on what you put inside that $3k mac. It's very easy to slap a price tag on something 'wrong' and then complain it doesn't do the right thing ;-)

A PC with $4k worth of PCIe storage but no video output or networking would be very expensive but not very good at games.

A Mac with a reasonable video card (say, an RX 580 or better) and a general i5-level CPU would run Windows and most games just fine. It won't be a 144Hz 4K party, but 1440/1080 for recent games on decent settings and at least 60FPS will work. But that's not much different from any other computer...

Difference would be what you start off with. If you buy any $1000 laptop that only has a CPU-integrated GPU but does have Thunderbolt 3, you can add an eGPU for that GPU power that you need and you'd be good to go.

If you buy a cheaper laptop that happens to have some sort of 1050 in it, it would be cheaper, but it probably won't have Thunderbolt 3 and then you'd be stuck with a GPU that will be a bottleneck faster than your CPU.

The biggest problem I'd see is that most Macs don't come with dedicated GPUs, and when they do it's a midrange AMD or workstation-type GPU, neither of which is great for games (regardless of what computer you stick those in). The higher-tier AMD GPUs did come to later Macs, but at that point (the 5000 series) they weren't really all that good. Too bad with the 6000 series we don't know if Apple will even add any third-party GPU anymore. Also too bad that they got in a fight with Nvidia over their crap QA years ago and never got back together. Could have been great...


If you watch the WWDC keynote that’s literally what they are doing. You can write one set of code that adapts to iPhone/iPad/Mac and use the native sizes and tools for both.


It's gonna be the nVidia GeForce Now train that's currently pulling out of the station.


> What does Apple silicon mean for the future of gaming on Apple hardware?

It is definitely the death of gaming as we know it on PCs.

Blizzard didn't bother porting Overwatch (their latest engine) to macOS, and even that game was released in 2016. They did port it to the PS4. The Mac is just too small of a market.

Nintendo Switch-level games should port fine; many are developed on cross-platform engines like Unity that already support Metal on iOS. However, Switch games sell for $50+, already higher than PC or macOS titles, so it's usually pretty bad economics - you'd rather the user buy on their Switch than their Mac.


Curious why you think it's the death of PC gaming, unless you just mean Mac gaming


He means 'death of PC gaming on a Mac'. Some people forget that all stand-alone computers for a single person's use are "Personal Computers" (as opposed to mainframes, which is what the distinction was for).


> some people forget that all computers that are stand-alone for a person to use are "Personal Computers"

If you call a Mac a PC, you'll only confuse your interlocutor. No-one ever refers to Macs as "PC", except to point out that "Macs are 'Personal Computers'".


Over here, we just call them all 'computers'.


"PC" comes from the "IBM PC" which was the first MS/Intel computer. Then clones arrived, also called "IBM PC Compatible" because they were compatible with the IBM PC. Then we ended up calling them simply PC, and that was in comparison to Macs, but also Atari, Amiga, Amstrad etc. computers.


Some people forget the "I'm a Mac / I'm a PC" ads.


I suspect it was sarcasm. Overwatch is Blizzard's new flagship, and the first game by them to not run on Mac (Hearthstone did). They'll Soon (tm) release Diablo 4 and Overwatch 2, so we'll have to see what they're going to support with that.


Apple should just support Valve with Proton (Wine). This would enable 90% of all Windows games on the Mac.


CrossOver runs on macOS just fine, including Apple Silicon macs.


Wow, I'm surprised! Great job Blizz

Blizzard has always been good at supporting Macs, even going back as far as Warcraft 3.


Thankfully long before that. I recall Diablo 1 was released for Mac OS (9?) only a few weeks/months after the PC version, and Warcraft I and II had full Mac support!

Too bad the GPU race has left Macs in the dust.


Diablo was released for PC in January 1997, and didn't show up on Mac until May 1998. StarCraft had a similar delay (March 1998 to March 1999).

All three WarCraft games were simultaneous releases for Windows and Mac OS, though.


IIRC it wasn't purely a post-release port; they did have parallel development tracks before anything was released, and for some titles they didn't want to risk the upfront cost for the relatively small market share right away.


I remember going over to a friend who owned a Mac in order to play Warcraft 2 network games.


Multi Player Game > TCP/IP ...

Miss those menus


Warcraft 2 was my first experience playing a cross-platform network game. I was on a PC and I dialled in to my friend’s Mac. Understanding how things are architected now it makes perfect sense, but as a kid my mind was blown since I considered these to be wholly incompatible platforms.


Except for Overwatch, RIP. Also, Diablo 3 has always run terribly on macOS. Pretty much only WoW runs decently. Maybe StarCraft 2 used to be okay, but it has since become way worse.


They probably ported WoW because they can't afford to drop support for a platform where they have tons of existing paying customers.

If they truly cared about the Mac, they would release Overwatch for macOS. It runs just fine on modern Macs via bootcamp.


It does certainly feel like they forgot about Mac when they got the console base with Overwatch.


Yep, oddly enough they made the game engine themselves like they did with almost all of their other games; apparently this time they didn't make it cross-platform.

Pretty strange considering the other (arguably more complicated) engines were supported from day 1, even their unreleased FPS (Ghost?) was going to be launched like that.


I'd LOVE to have a native macOS client for Overwatch. Since I upgraded to a 2019 16" MacBook Pro, my eGPU no longer works in Bootcamp due to the dreaded Error 12 and I can't resolve it. My eGPU works flawlessly in macOS. Plus, I hate using Windows and want to avoid it if at all possible.


You can fix Error 12 by using an eGPU loader in UEFI (you can just put it on a USB drive). The problem is that Windows sets the PCIe address space wrong and then it can't properly talk to the GPU anymore. It happened during one of those large Windows feature updates last year. You can find setup and documentation on egpu.io, goalque is the author. (it's called something like automate-eGPU EFI)


Thanks for the suggestion but I've already tried that and many other reported possible fixes that didn't work in my case.


Odd, the EFI loader puts the eGPU in a fixed space that always works (mainly because it no longer looks like an external GPU to the rest of the system), and Windows can't mis-configure it anymore.


What's wrong with Hearthstone on macOS?


I actually happen to still have my Warcraft 1 manual, and just checked to see that it supported Macs starting at System 7.


Going back to WarCraft 1. And StarCraft 1. And Diablo 1. But also Blackthorne, before they made any RTS.


Except for Overwatch.


And Warcraft 2!


That is excellent news for a WoW enthusiast in the market for a new iMac (my current iMac is a late 2013 model that runs WoW just fine at lower settings).

If this would indicate other games coming to macOS as well, I can finally ditch Windows completely.


I haven't seen any benchmarks on how other games perform. Do we know how games like CS:GO would perform on Apple Silicon?


AnandTech did some testing of the GPU as well.

https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...

>The first Apple-built GPU for a Mac is significantly faster than any integrated GPU we’ve been able to get our hands on, and will no doubt set a new high bar for GPU performance in a laptop. Based on Apple’s own die shots, it’s clear that they spent a sizable portion of the M1’s die on the GPU and associated hardware, and the payoff is a GPU that can rival even low-end discrete GPUs. Given that the M1 is merely the baseline of things to come – Apple will need even more powerful GPUs for high-end laptops and their remaining desktops – it’s going to be very interesting to see what Apple and its developer ecosystem can do when the baseline GPU performance for even the cheapest Mac is this high.


I was going to test this soon (I’m finishing some setup stuff). I’m curious because CS:GO is utterly unplayable (<24fps) on my 2015 13” MBP, so I always had to resort to my desktop computer. Will update this when I do try it out!



There's a lot of benchmarks coming up on r/macgaming. CS:GO doesn't currently launch at all


Will it actually work though? I had to switch from Mac to Windows during Tomb of Sargeras because of an issue where ground-targeted AoEs (many of which straight up kill you from full health during a fight in which your group cannot afford any deaths) were invisible on Mac.


Where is the Linux client?


I know that some people are successfully running WoW with Wine. You might want to try it.


The very first versions of WoW were actually faster with Wine.

And I think they still officially-unofficially support playing WoW through Wine. Basically they check that it still works after every release and don't use any super-weird APIs Wine might have issues with.


Can confirm. I played WoW recently, even the 9.0 prepatch, and it worked perfectly with Wine Staging 5.21.


This is why Windows on ARM has largely been doomed while Mac on ARM will immediately succeed: Apple told its partners that this is the future of the Mac and expected them to deliver if they intended to continue to be on Apple platforms.

Microsoft has constantly and aggressively failed to commit. Be it to Mobile or ARM or HoloLens or whatever. They simply haven't made it clear that supporting their new platforms is compulsory/expected, and consequently, most vendors have simply passed on supporting them.


No, this is not the reason. The reason is that Apple didn't half-ass anything, and they executed with competence. The Apple M1 is so good, it outperforms most everything else natively in single-thread, and every previous x86-based MBP when using Rosetta, all the while offering superior battery life, and it's not even that expensive because those are the entry-level models.

Microsoft never had anything remotely like this. Their hardware is ok, but they forgot one important detail, that Apple has taken to heart like nobody else:

"People who are really serious about software should make their own hardware. (Alan Kay)"

Why would anyone want to switch to ARM on Windows if it wasn't a lot better than x86? What if their WOW x86 emulation was really slow?

For Microsoft, "own hardware" means plugging COTS parts together. The best they can achieve is a "premium look & feel", but they can't do anything game-changing. For Apple, it means owning and designing the most important chips themselves. Because they control the whole hardware & software stack, they can do things nobody else can. This is the difference in understanding design as "how it looks" vs "how it works".

Apple has a custom MSR in their M1 implementation to toggle Total Store Ordering, which enables them to run either the native, more relaxed ARM memory model or the stricter x86 one. They probably use other tricks as well, but there is a reason Rosetta is as fast as it is.
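
For the curious, here is roughly what that buys an x86 emulator. The classic message-passing pattern below is safe on x86 hardware because TSO keeps the two stores in order, but ARM's weaker default model lets the hardware reorder them, so a translator would normally have to emit barriers around nearly every memory access; flipping the TSO mode instead lets translated code keep plain loads and stores. A minimal C++ sketch of the hardware-level issue (note: the C++ compiler is itself free to reorder relaxed accesses, so this illustrates the CPU behavior an emulator must reproduce, not portable C++):

    #include <atomic>
    #include <cassert>
    #include <thread>

    std::atomic<int> data{0}, ready{0};

    void writer() {
        // Under x86's TSO these stores become visible in program order;
        // ARM's default memory model allows the hardware to swap them.
        data.store(42, std::memory_order_relaxed);
        ready.store(1, std::memory_order_relaxed);
    }

    void reader() {
        while (ready.load(std::memory_order_relaxed) == 0) {}
        // Always 42 on TSO hardware; may observe 0 under ARM's model,
        // which is why a naive instruction-for-instruction x86 -> ARM
        // translation would be incorrect without extra barriers.
        assert(data.load(std::memory_order_relaxed) == 42);
    }

    int main() {
        std::thread t1(writer), t2(reader);
        t1.join();
        t2.join();
    }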


Never half-ass two things, whole-ass one thing. - Ron Swanson


While funny, the moral I perceived is "Apple whole-asses everything thanks to vast invested resources", which is unfortunately not as actionable as the Ron quote


Apple was routinely criticized for years for not doing more things. Why don’t they have more models? Why doesn’t their software have infrequently-used features X and Y? Why aren’t there a million levers and dials so I can customize my machine?

It’s because they were whole-assing one thing.

I haven’t been keeping track, but I think it’s true that the breadth of their product line is bigger or nearly as big as it has ever been, and certainly much bigger than 10 years ago. They got there organically over the last 20 years, instead of developing shovelware.


spit my coffee out...


Do you happen to have a source on the TSO switch? It sounds fascinating.



Almost every software offering under the sun runs natively on Windows, so it doesn't exactly make sense to give MacOS a pat on the back for that particular achievement with WoW.

I am all for innovation and I welcome any chip that runs cooler while offering better performance with the apps I use. But so far, 90% of the praise I've seen of M1 either comes from benchmarking software that doesn't reflect my real world usage, or directly from Apple's marketing.

It's a bit early to draw conclusions about how much of a smash hit M1 is going to be. We don't know much about its performance, aside from a few benchmarks here and there.


Eight years after Microsoft's first attempt to move to ARM with Windows RT, you still can't run legacy 64 bit Win32 software on a Surface Pro X. There is absolutely no excuse for that.

Just as there was no excuse for the Surface RT not having a touch native version of Office despite them selling it as a tablet, or the Surface Pro X not having ARM native versions of ALL of Microsoft's software.

Where's the ARM native version of Visual Studio?

How can you expect other vendors to take Windows on ARM seriously if Microsoft doesn't?


Are we really sure ARM is where the advantage here is with Apple? Not the 5 nm process?

Let's be real, it's just on par with the Zen 3 in single threaded perf which is on a 7nm+ process. When Zen 4 comes it might be moot.

AMD's entire market cap is practically on par with Apple's cash on hand (so very different companies in terms of size/capability), yet they are equal in single-threaded performance? Not only that, in practical terms the AMD chip is very desirable in many situations because it has twice the cores.


AMD has beaten Intel despite being at a process node disadvantage in the past.

We do have an entire shopping list of ways in which Apple's ARM core implementation leads vs. x86 in their five watt iPhone chip.

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...


Fair point. Without moving the goal posts, I should have clarified that I was referring to Windows on x86, and have never owned a Windows ARM device. What I was trying to say is that, if you want compatibility with the broadest swath of software, you don't generally buy a Mac. This is not as true as it used to be, but it's still true.

I don't have a Surface Pro X and would never buy one, but it's an example of how ARM isn't necessarily the answer to the user's problem, because the user may well use his device for more than running Microsoft Office and browsing the internet.

A glimmer of hope from https://docs.microsoft.com/en-us/surface/surface-pro-arm-app... on Oct 1, 2020:

With 64-bit emulation coming soon in Preview via the Windows Insider program, you'll be able to run 64-bit (x64) apps on Surface Pro X.


64-bit support should be coming soon [1]: "With 64-bit emulation coming soon in Preview via the Windows Insider program, you'll be able to run 64-bit (x64) apps on Surface Pro X."

> Where's the ARM native version of Visual Studio?

Well, there is VSCode... [2]

[1] https://docs.microsoft.com/en-us/surface/surface-pro-arm-app...

[2] https://www.zdnet.com/article/microsoft-new-vs-code-update-i...


> We don't know much about its performance, aside from a few benchmarks here and there.

I don't know what to say to that, except that we must be living in different universes.


I haven't used an M1 device. Frankly, outside of Apple no real-world user (in my universe, can't speak for yours) has long-term experience with it.

Do you always believe everything that a marketing department tells you?

Does all of your real-world experience with consumer products tend to line up with benchmarking software and short-term reviews based on a test drive?


[flagged]


To the contrary, I placed an order for an Intel i7 MBP13 (32GB RAM, 4TB SSD) around the same date as the Apple Silicon presentation took place. I've read every review I can find, seen the benchmarking runs, and -- in light of the fact that my new device won't ship until mid December -- I could still cancel/return my order, buy an M1-powered MBP13, and save some cash.

That is to say, I have skin in the game.

Say what you will about my igneous domicile, but buying a new-to-market processor architecture is not something that I am willing to do just because some guy at The Verge or Tom's Hardware or AnandTech made some bar graphs or felt it necessary to treat hype as fact. I'm not saying they're dishonest or anything, just that the information is thin because the typical user isn't an instance of Geekbench.

My advice to them: Use the device for a couple of weeks, install all of your work software on it, put it through its paces, and then tell me about what you observed. That's the kind of review that would satisfy my curiosity. I do a lot of financial stuff on my machine, and the software (or software-based services) in that world is often proprietary, poorly developed, and/or expensive. Compatibility and performance of those programs are more important to me than battery life while browsing YouTube or render time in Final Cut Pro.


I take your point there - maybe I was a little short for no reason.

I'm in a similar position; I bought a 16" i9 at basically the same time. I'm certainly not adopting the first generation of a new processor architecture on a machine I use for work.

However, the render time in FCP and number of channels in Logic Pro cannot be overstated if they're really this good. It's crazy to see a sudden uplift in what's possible there. A lot of what I've read even indicates that general system use (windows opening/closing, scrolling, system animations) is noticeably faster on the M1.

Plus, I'm happy to take Apple's word, as they were totally right with the A12Z in the iPad Pro. That thing outperforms most Windows laptops in creative workflows.


It’s probably more complicated than that: Microsoft’s customers have traditionally been very different from Apple’s, and they valued perpetual backwards compatibility over Apple’s approach.


Basically Apple is selling a cozy computing environment that some people really like, whereas the operating system division of Microsoft is selling a compatibility layer between a bazillion of long tail niche applications and every single piece of hardware in existence that isn't either significantly smaller or larger than a PC (phones, supercomputers) or made by Apple. If Microsoft had an LTS version of Windows 2000 that was 64 bit, a lot of people would be willing to pay a premium. Microsoft thinks they are selling another cozy computing environment, but generally people don't really want more OS level bells and whistles.

Off-topic ramble: unfortunately, Microsoft seems to have a bit of an acquired blindness regarding what users actually want (LTS Windows; innovate in Office or other installable software if you like) because paid software updates have so long been their culmination of visible success. Had they somehow switched to subscriptions by the time they moved to 32-bit, Windows would be an entirely different beast today.


Microsoft is trying to have it both ways; it turns out that's not what people really want (and when it's all about people, they just fail -- Windows Phone, anyone?), but they get by because business clients (which generally doesn't mean the actual business end-users but some other department) tend to buy whatever looks good on paper.

It all just keeps going on with no real improvements. This disconnect between people and products has many forms, be it an IT project that is over budget and doesn't deliver the right stuff (often comes down to business people making impossible promises and requirements that were never properly gathered and keep changing).

Getting something that is good for the user instead of getting something that is short-term good for the business would be a good start.


>(and when it's all about people they just fail -- Windows Phone anyone?)

Everyone I knew who had a Windows Phone seemed to love it. If MS were taking a run at it today, it would likely succeed. The nail in the coffin was Google refusing to write a YouTube app for Windows Phone and THEN blocking the app that MS wrote themselves.

I'm still shocked the EU didn't tear them apart on that move. It's about as anti-competitive as it gets. Honestly I'm still sad they gave up, we could've used a third player, even if they were going to always be a minority player.


There were hundreds of nails in the Windows Phone coffin. The straight incompatibility between 7 and 8 was a gigantic fuck you to every early adopter, dev or otherwise.


I still consider Windows Phone (8 and 10, not 7) the best phone OS I have ever used, and the Nokia Lumia hardware was great. The problem, as you mentioned, was that third-party app support was dismal, and the few apps that did exist were on the whole of far lower quality than their Android or iOS equivalents.


> The nail in the coffin was Google refusing to write a Youtube app for Windows Phone and THEN blocking the app that MS wrote themselves.

How the hell were they not sued for that?? That is anti-competitive as hell.

Google is still doing shit like that e.g. by actively hindering Picture-in-Picture on iOS for years.


Google was de facto running the Federal Trade Commission during most of the Obama administration. The lead commissioner was a paid shill who wrote "academic" articles about how Google shouldn't face antitrust scrutiny while a law professor.


I had one and I loved it as well.

> The nail in the coffin was Google refusing to write a Youtube app for Windows Phone and THEN blocking the app that MS wrote themselves.

There were at least 2 great third-party YouTube apps, and YouTube worked well in the browser as well, including background play support, which seems not to work easily on other platforms.


It works very well on Android, but it is a feature exclusive to YouTube Premium.


It also works great with NewPipe. They now even have their own F-Droid repository so that you can get the latest updates ASAP when YouTube makes a breaking change and they fix it.


It works very well on Firefox on Android, out of the box, with uBlock to get rid of commercials too. I shudder to think that one day Mozilla will tire of supporting FF on Android and I'll have to start using a YT app again...


Of course there were people that loved it, just like there were people that loved BlackBerry phones, people that loved PocketPC and people that loved the foot-x-ray-machine at the shoe store when that was a thing. I wasn't trying to say that it was bad or universally hated, and even if it was you'd find a niche that likes it ;-)

It's just that it wasn't loved by the masses, just like Zune wasn't loved by the masses, Windows S-mode isn't loved by the masses, and pretty much any Windows-on-ARM effort isn't loved by the masses.

Regarding YouTube: odd that they wouldn't make an app; you'd think that even with Trident support they could at least put a web view in an app or something. You can't force a company to make an app for another company, and with the current media/rights laws you can't make one yourself either, so not much of a lawsuit there.


Indeed, when I had a chance to play around with one I thought it was a pretty decent take for a first try.

I had no idea Google played this dirty trick, if that’s how it went they’d indeed deserve a thorough spanking for it.

I suspect that, Microsoft being Microsoft, they must've made other blunders to murder the platform so badly, though.


I'd say the bigger nail in the coffin was, ironically, Microsoft's level of security/sandboxing.

tl;dr - Snapchat and a lot of other 'killer apps' never were written for the platform, and at least some of that was because the apps on Android/iOS did things Microsoft wouldn't let you (I think tracking when the user took a screenshot was one)


Snapchat refused to make a Windows Phone app. Microsoft even offered developers and resources to help, and they still refused.


I wonder how it would work if they ripped out all the crufty apps and UI configuration tools and replaced them with simple (dumbed-down) controls like GNOME or Apple do.

All the more advanced knobs would be made available via PowerShell, also like GNOME or Apple. Indeed, any professionally managed configuration isn't maintained with point'n'click panels anyway...


It's not as easy as it sounds. It took a whole generation to decouple the printing subsystem from the UI libraries.

Also, power users tweak a lot of little advanced settings via these point'n'click panels anyway. Even tuning a desktop PC needs a lot of tinkering down the road.

Windows is a very powerful platform as is, but it's not well suited to everything due to its architecture and design choices. This is equally true of all operating systems out there.


Maybe it’s time for them to split out NT and a home/office-light edition again


They do, with Server 2019 for corporate servers. But your point stands that no one is running Server 201x as their desktop OS, unlike the NT 4.0 days; NT was then rebranded into Windows 2000, or NT 5.


There's also the LTSC (Long-Term Servicing Channel) build of Windows 10[1], which doesn't have any of the consumer bits and bobs, and other than security fixes isn't designed to have any feature updates post deployment.

---

[1] https://techcommunity.microsoft.com/t5/windows-it-pro-blog/l...


LTSC is fantastic in that it ditches all the cruft. I wish they had a consumer-friendly license for it, though. Microsoft's vision for consumer windows 10 is no longer an unobtrusive platform that doesn't get in the way of the user, much less force unwanted features on them.


I wouldn't say _no one_, but it's definitely not common: https://www.windowsworkstation.com/

I definitely remember running Windows 2008 R2 on my home desktop, because at the time that was all that was available in the "your educational institution neither gives .edu emails nor has arranged anything with Microsoft" tier of DreamSpark.


During my stint at MS many years ago, we were running 2008r2 as a Workstation OS and it was pretty good.


And the fact that Qualcomm seems to be lagging behind Apple's SoCs.


Microsoft should split windows into two distinct versions.

Microsoft Windows Enterprise - focus on long term compatibility etc (essentially what windows is now)

Microsoft Windows X (seeing as they like using that letter) - which is way friendlier to breaking changes and advancing the OS.

I can't see any way for Windows to actually improve long term without doing this.


That "X" would basically be what's running on the ARM surface and arguably the whole line of precursors back to CE could be described like that. They end up wildly unpopular because they have all the bells and whistles people begrudgingly accept to get the one feature they want (continuity) but lack this one that they accept the others for. Continuity isn't just for enterprise. Non-technical users abhor having to re-learn anything or give up the procedures they struggled so hard with adopting (there are probably still some who feel uneasy if they haven't run defrag for more than two months, despite of having switched to NTFS more than a decade ago) and gamers and the like have lots of niche hardware and software solutions that easily puts the unsupportedness of vintage enterprise software to shame. Continuity is a key feature.

What I think could work, and would eventually lead to the same situation as "Enterprise/X" would be the reverse: a "regular" Windows aiming to eventually build a Microsoft equivalent of the Apple walled garden and a "continuity subscription" that's more conservative than what Windows is now (trivial example: opt into Cortana instead of opt out, barely)

PS: I understand that you are talking about low level technological differences and I seem to be hung up with UX, but I believe that separating groups on the high level would be the only viable way to eventually follow different paths on the low level.


The worshiping of Windows compatibility/continuity (which I do consider to be impressive from an engineering standpoint) is astounding to me. It leads to what we have now - which, to me at least, is a frankly unusable operating system. A hodge-podge of 20 years' worth of patterns, menus, and systems. Some menus in Windows feel like they're in a completely different operating system.

This is why they should have 'Enterprise' and 'X', to me - they don't even have to look different, but it frees one Windows kernel team to do whatever they want and not have to worry about supporting loads of legacy systems. They'll naturally deviate.

It's actually kind of ironic to me that the best feature they've released in years (WSL) is essentially jamming another operating system into the mix. Like there's not enough going on there already!

I'm reading so much about how good Proton is that I'm considering dropping windows even for my gaming machine.


I consider the wildly mixed UIs not a symptom of compatibility but a symptom of the succession of hopeless attempts to be more than that. The incomplete new GUI generations are surely not caused by compatibility burdens affecting the kernel team.


No but they're a symptom of the 'all work on the same product' situation - if they were split, the people wanting to update the UI etc would work on the new version alongside the kernel team for the new version.


Microsoft's main problem is marketing - to be more specific, a naming problem.

Windows everywhere

They doomed Windows phone with the name WINDOWS.

Their Windows-on-ARM / Windows RT push was also handicapped and doomed the moment they named it Windows...

Explanation.

Windows means:

1. BSOD, instability, viruses,

2. Expectations of ability to run billions of existing windows software

The first thing that popped into anyone's mind about Windows Phone was 1 above.

For ARM, many frustrated customers returned their store-bought Windows ARM devices after they got home and could not run normal VLC, Chrome, existing games...

Microsoft having realized this, has spent the last 4 years+ working on the ability to run classic windows apps on windows for ARM. They placed their bet on faster chips - Apple got there first.

If Microsoft named their ARM operating system something else, initial marketing costs would be higher. But it would be worth it.

To developers, nothing will change, it's the same UWP, same APIs.

Imagine xbox was called "Windows Gaming". It too would be dead.


> Imagine xbox was called "Windows Gaming".

isn't Windows gaming the richest market for games since a proper market for games has existed?


Yes! Today is a good day to reread Phil Haack’s 2014 essay on the lengths MS goes to for backwards compatibility: https://haacked.com/archive/2014/05/27/backwards-compatibili...

Edit: Haack heavily cites Raymond Chen from 2003: https://devblogs.microsoft.com/oldnewthing/20031015-00/?p=42...


On the flip side, supporting all that backward compatibility has been a massive anchor on Microsoft's ability to move Windows forward from decisions made in the past.


Yeah, Microsoft is like King John signing the Magna Carta to keep the thrown. Apple is like Louis XIV.


I believe you mean throne.

Still, as a developer on the JVM, I find the concept of a "Game of throwns" exceptional.


Ahah, "Game of throwns". Can only imagine the disaster :D


Be sure to catch every episode.

And optionally rethrow.


Oops!



Apple is also delivering some immediate, obvious value to make up for the migration pain, given the huge battery life increase and what look like substantial performance gains over the previous models.


Oh, sure, and Enterprise SKUs are going to be pinned into those holes forever, but Microsoft could've been more aggressively pushing their platform on the consumer side if they wanted to make any of the changes they've tried in the last ten years stick.

The view from the outside on every Windows platform initiative in the past year is dismal failure, even if the platform has incrementally benefitted a bit from each.

Most software is effectively still written on the same platform it would've been for Windows 7. That's a bad look for everyone building everything there since.

Even internally: Microsoft's own platforms largely failed to embrace UWP. If your own internal developers aren't buying in, why would anyone else?


If you liked apple because it ran your Mac apps, you already got off that train when they moved to Intel in the first place.


No, you got off that train during the 68K to PPC transition.


6502 to 68k


I actually did leave the Mac at the 68000 to 68020 transition. Color was interesting but too much money for a high school kid... Cue a shiny 386 for which you could buy Turbo Pascal!!


Now that was a real disaster that made me leave the Apple ecosystem.


I think we'd see a serious boom in Windows on ARM if ARM vendors had a performance benefit to offer and Microsoft implemented a backwards compatibility layer as Apple has in this case. The difference here is in the offering. Microsoft was making a tablet offering where they're just not competitive and there's no ability to use x86 software, while Apple here is offering much more - backwards compatible and all.

Until then there's just no reason to bother with ARM and vendors and users alike can see that.

Only recently has Microsoft announced a backwards compatibility layer, if we can see some hardware vendors step up to Apple levels of performance, there might be a serious shot now: https://www.extremetech.com/computing/315733-64-bit-x86-emul...


> Microsoft implemented a backwards compatibility layer as Apple has in this case

Windows-on-ARM already has 32-bit x86 emulation - and 64-bit x64 emulation is coming soon:

* https://docs.microsoft.com/en-us/windows/uwp/porting/apps-on...

* https://www.extremetech.com/computing/315733-64-bit-x86-emul...

I haven't looked too deeply, but it seems to use the same WoW (Windows-on-Windows) mechanism that's been present in Windows NT going back to at least Windows XP 64-bit Edition (no, not 2005's "Windows XP x64", but the original 2001 Windows XP for Intel Itanium IA-64).

> The WOW64 layer of Windows 10 allows x86 code to run on the ARM64 version of Windows 10. x86 emulation works by compiling blocks of x86 instructions into ARM64 instructions with optimizations to improve performance. A service caches these translated blocks of code to reduce the overhead of instruction translation and allow for optimization when the code runs again. The caches are produced for each module so that other apps can make use of them on first launch.

Looks like Windows dynamically translates code in DLLs/EXEs on a JIT/on-demand basis and caches each translated run-of-code.
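
In other words, something shaped like the sketch below (hypothetical names and types for illustration only, not Microsoft's actual implementation): translated blocks are keyed by their guest address, so each x86 block is translated once and reused on later executions.

    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    using GuestAddr = std::uint64_t;              // address of an x86 basic block
    using HostCode  = std::vector<std::uint32_t>; // translated ARM64 instructions

    class TranslationCache {
    public:
        // Translate a block the first time execution reaches it,
        // then reuse the cached translation on every later visit.
        const HostCode& lookup_or_translate(GuestAddr pc) {
            auto it = blocks_.find(pc);
            if (it == blocks_.end())
                it = blocks_.emplace(pc, translate_block(pc)).first;
            return it->second;
        }

    private:
        // Stand-in for the real x86 -> ARM64 translator.
        HostCode translate_block(GuestAddr /*pc*/) { return {}; }

        std::unordered_map<GuestAddr, HostCode> blocks_;
    };

Per the docs quoted above, Windows additionally persists these caches per module, so a second app using the same DLL can skip translation even on first launch.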


Running x86 code via Rosetta on the MacBook Air is significantly faster than it is running on the previous generation. With the Surface X, that isn't remotely the case. It's a pretty major gap.

This is from the Verge's review of the Surface Pro X:

> The worst part is that even if an app installs, it doesn’t mean you’re going to have a great experience. Photoshop installs and opens just fine on the Surface Pro X, but the usability of it is terrible.

https://www.theverge.com/2019/11/6/20950487/microsoft-surfac...

Maybe it's improved since then? Everything I've heard suggests there is a pretty big performance penalty running x86 code on Windows ARM. Even if there is a significant performance hit on the M1, the M1's performance gains are so good that they erase the difference.

The only way an emulation layer works is if the new platform is fast enough to erase that difference. When Apple moved from PowerPC to x86, it nearly did. With this transition, Apple knocked it out of the park. Microsoft? Not so much.


Rosetta seems to have a similar performance hit to Microsoft's solution (10-30% overhead compared to native, depending on the workload).

The difference is that Microsoft is locked into using Qualcomm chips, which just aren't very close to Apple's performance.


The impression I got was that it was a lot slower than a 30% performance hit, but perhaps the Qualcomm CPU is that much slower? If so, it really begs the question: why bother making a tablet-laptop thing which performs so poorly?


The tablet-laptop form factor is great, and a lot of people simply don't need high performance but do need to run some specific line of business application from 2003, so including x86 emulation makes sense.


It's a bad cycle.

Lackluster performance means low adoption. Low adoption means developers have no incentive to support it. Lack of developer support means little or no software will be written for the platform so performance never improves. Repeat.


> Lackluster performance means low adoption

And yet Electron rules cross-platform desktop development, not Qt, wxWidgets, or OpenStep.

Developers are fine with worse performance - even lackluster performance - if a platform or framework offers some other compelling advantage, especially _developer performance_ (i.e. it saves money).

Electron saves money because creating one-off custom UI widgets can be done easily and cheaply with just HTML+CSS+JS. There are many things that are literally impossible with HTML+CSS+JS at present (such as "splitting" a rendered UI, like how iOS 5 did with its notification banners and home-screen folders), but besides certain flashy effects everything is doable quickly. Whereas building a custom UI on any other platform would take days or weeks just to get a prototype done if you couldn't hack on additions to an existing "UI component".

So your line of reasoning would be improved by saying:

"Lackluster performance with no other benefits means low adoption", the rest then follows.


Because those performance stats are the usual level; Apple's own chips just exceed the general trend by a huge amount. The hardware is just better than Qualcomm's or any other widely available ARM CPUs.


A basic example is how even MS's very own Visual Studio is horribly slow and bloated (won't even mention how ugly) compared to, let's say, the IntelliJ editors on the same Windows machine, which are noticeably faster.


Have you tried running VS in Software-mode without hardware acceleration of the UI/graphics?


Huge difference in performance though.

The M1 seems to run things under Rosetta emulation in some cases faster than they ran natively, because of how much uplift the hardware has.

Microsoft ARM devices have such low performance that any emulation will run worse than native, making them essentially unusable as a serious offering.


Both behaviours make perfect sense for the respective companies.

Apple is selling hardware (and now services); the software at most should break even. Apple gets obvious benefits from its approach - vertical integration, hardware sales...

Microsoft sells software. MS doesn't care if the platform is x86 or ARM. Why would it? MS gets paid either way. As far as it's concerned, it would offer the option (to make Windows future-proof and run on more devices) and let the market decide.


Microsoft cares because if Mac laptops have twice the battery life of windows laptops, nobody will blame Intel


Won't they blame Dell/Toshiba, not Microsoft? In my experience, users think of the vendor as being entirely responsible for battery, even when the software may be using it inefficiently.


Users will see all their friends using Apple products getting much better battery life, and their friends not using Apple products getting much worse battery life.


Agreed. People blamed Apple for Intel’s foibles up until a few days ago.


It still doesn't make sense for MS to push ARM that way - Apple isn't offering its ARM processors to 3rd parties. The best non-Apple processors are obviously x86 right now.


My question is what is the play for Intel and AMD right now? Are they shitting their pants or is this something that was somewhat expected given the progression of Apple's chips and their announced plans? Do Intel/AMD understand how Apple managed to so outmaneuver them? Is the performance gain Apple has delivered an inherent advantage of ARM over x86, and if so, does that imply that we are going to be seeing a migration to ARM in the PC space in the coming decade?

I'm deeply fascinated by these questions. I'd love it if someone in the chip design industry who is knowledgeable about the situation would comment.


AMD is fine. Its processors are competitive (often superior) and it's gaining marketshare.

Intel has been in trouble for some time due to its own failure to downscale, a failure which has little to do with x86. They'd be losing to someone else if Apple/AMD hadn't been there.

Intel still has some time to recover: it has plenty of money, and Apple/AMD are unable right now to take full advantage of their position. Apple/AMD rely on a supplier (TSMC) which does not have the capacity to supply all or even most of the processor demand, so Intel gets quite a bit of marketshare almost by default.


Given that Apple only has about 10% of the world desktop market, and zero server presence, very unlikely.


Apple isn’t selling to third parties, but I bet Google wants to. Google must be putting pressure on Qualcomm, and maybe Nvidia, to make something competitive. If they try, within a few years I’m sure someone can.

Imagine 3 years from now there are a bunch of Chromebooks selling at Chromebook prices, but offering intel performance. That would start to hurt Microsoft.


From the way they have been selling in Europe, there is a very long path to go.

Chromebooks are only relevant in the US school system.


Chromebooks are actually widely used in schools in the Nordics.


Here in Germany they just make timid appearances in some consumer stores and then get discounted in endless sales until someone finally takes them away.

Very few people are willing to pay the same price as a regular laptop for what seems to be just a big browser OS.


Is it, though? Or is it lacking the software investment that Apple made?

There are a good number of chipsets that are now on 5nm processes. But only Apple made a full desktop OS to go with it.


Having used Linux on high-end ARM phone chips, my 4-year-old Intel laptop, also running Linux, is far more performant than they are.


I think one of the other reasons is that Apple designed their own CPUs, and put a lot of effort in. Microsoft just grabbed some off-the-shelf stuff for the Surface. So it's not even close in terms of performance.

Imagine if Microsoft said you could get a Surface with an ARM processor, 16GB of RAM, 20+ hours of battery life, and the same perf as an i5 processor. And it lived up to the hype. People would love it.

My 2c


Surfaces are very, very custom. Their x86 versions even are almost entirely bespoke: There are custom Surface drivers for nearly every component. Their hardware team is top notch, and I guarantee you hardware is not where Microsoft is falling flat.

The Surface Duo is another example: the hardware was ready for years. If it'd launched when it was ready, they'd have scooped the foldable trend super early. But the Windows on ARM platform was still such a joke, they gave up and put Android on it.


> Surfaces are very, very custom. Their x86 versions even are almost entirely bespoke: There are custom Surface drivers for nearly every component.

Yet almost all use Intel CPUs, and the performance of the Surface Go / Surface Go 2 is abysmal compared to the M1.

The Surface Pro X?

Single Core score ~800 vs ~1700 for the M1

Multi Core score ~3000 vs ~7500 for the M1

https://browser.geekbench.com/v5/cpu/compare/4799088?baselin...

It's an ARMv8 CPU... how is it 'very custom'?


> Their hardware team is top notch

That keyboard... it makes the bad Apple ones look top shelf. I know removable keyboards are a different thing but I cringe remembering the clicks and trackpad.


Honestly, the typing experience for me of the Surface keyboard is perfectly fine... it's just that the form factor makes lap-use difficult. I didn't think it'd be a problem for me, but a year or two in, I felt regret there.

The Surface Book seems like a vast improvement in that regard... but it's extremely expensive and relies on an overengineered latching mechanism. Dell made a really neat Book alternative, the Latitude 7285, which was quite compelling, but by the time I was in the market, they'd already abandoned that form factor. :/


I don't know, I think the Surface Pro keyboard is surprisingly good, all things considered. I agree the trackpad doesn't feel great, but it is accurate and usable enough, and the typing experience is certainly no worse than on any other similarly sized laptop I've used, and better than many. For a removable keyboard, I think they've done a great job.


Microsoft can't even let go of x86 and make x86_64 mandatory. It could take them years to recognize ARM as a valid target.


At its most basic, Microsoft makes billions by enabling enterprise customers to remain lazy.

If you don't force them, they will never move an inch. They'd rather pay insane amounts for support for decades-old shitty software than finally modernize.

It's easy money for MS, though I'm sure Windows devs hate it.


Microsoft was committed to supporting Windows on Itanium, but Intel mucked it up in several areas. It might have worked if Intel had brought AMD on board (with a free license to the architecture, which would have made AMD64 moot) and made the first variant faster than x86 and consumer-priced/focused.

I am afraid that x86 will live on forevermore, with just more wacko instruction extensions that nobody will use.


Apple owns the whole vertical of its platform.

Microsoft doesn't: it is in Google's position with Android, in that it has to work with third-party hardware vendors and their value-subtracted software and so on.


>they are in Google's position with Android

Not true: Google just doesn't have to care what CPU is in your phone.

You can have MIPS/ARM/x86/PowerPC, whatever you want.


Microsoft still has a severe UX disadvantage with Windows and has failed to understand that simplicity always beats complexity. Most Microsoft products from the Ballmer era try to do too much. VS Code is the prime example: ship a product with few features and people will use it, and if you avoid the mistakes that poison the UX of Word and Windows, you will succeed. Five years ago, if somebody had told me I would be using a Microsoft product more than Vim for programming and text editing, I would not have believed it.

Microsoft has the ability to innovate and produce amazing products; it's just that they carry too much weight from the past, and that holds back entire product lines.


Why is Windows on ARM doomed? It's there. Microsoft just doesn't have an immediate need or reason to switch to it, while Apple does.


I'd say it's doomed because not even Microsoft has bothered to create ARM-native versions of all their software.

Apple went all-in on ARM-native versions of their stuff on day one.


Well, consider this: when WoW came out it supported PowerPC and Intel, so WoW has effectively lasted longer than Apple on Intel did.


World of Warcraft came out over half a year before Apple announced the Intel transition.


Also, Adobe released a Photoshop beta for both Mac and Windows on ARM.

It's ironic that Windows on ARM shipped long before, but Adobe didn't support it until ARM Macs were on the market.


[flagged]


They have consistently made wildly successful games across many genres: Warcraft, StarCraft, Diablo, WoW, Hearthstone, Overwatch.

I wouldn't be so trigger-happy as to call a company dead just because it has been a few years since the last major title. Same goes for Bethesda.


The most recent of Blizzard's mainline games was released about 5 years ago; the second-most recent, a decade ago. Diablo III, HotS, and StarCraft are all now in maintenance mode, with no new development. Warcraft III was recently "remastered" into oblivion, replacing the original game for all its fans.

Only WoW and Overwatch are currently active, and both are fueled mostly by their microtransactions. Yes, WoW has a subscription, but despite having a fraction of the player count they used to, they’re making more money than ever thanks to those microtransactions.

Reportedly, DI, D4, and Overwatch 1.5 are in development, but how long will that last now that the last of the old-guard leadership (to say nothing of the talent) from Blizzard of old is gone?

Blizzard, the company of old that we loved for its innovative and polished games which were released when they were ready, is dead. There’s nothing left but the name and thoroughly flogged IP.


The Blizzard I knew (from when I was playing a lot more video games) never released many games. Between 2000 and 2010, they released only four: Diablo II, Warcraft III: Reign of Chaos, World of Warcraft, and StarCraft II: Wings of Liberty. I'm sure even today's Blizzard can somehow manage to release four games between 2020 and 2030, although I doubt they will have as strong an impact as those four.


Those four are all legendary, it should be noted. They were big titles.


Hearthstone is still quite active! World of Warcraft and Hearthstone were both top-20 games streamed on Twitch in the past week, with Overwatch coming in at #33. Even though Hearthstone was released in 2014, it's still the premier digital card game. It's just a different model than the old "big release" model - there is no need to release a "Hearthstone 2", the game constantly gets new content and new game formats based on the same IP.


I fear that is all pretty much true. I suspect it's a downside of the mass-market game business becoming so generalised and generic. Business goes where the money is, and apparently it's not in the niches of old (which is what PC gaming was, tbh; we thought it was big, but it was nothing compared to where we are now).


In fairness, the only real moneymakers on that list are Overwatch and Hearthstone. We could maybe add WoW due to the die-hard players still paying a subscription (if that's still a thing?), but Warcraft, Diablo, and StarCraft are basically hibernating IPs at this point. None of those hibernating IPs has seen any major movement or releases in _many_ years. StarCraft is still viable as a mainstay of competitive RTS (and therefore great for Blizzard's branding in general), but I don't think it's earning them buckets of money.

So we're down to two properties making the majority of earnings, both on free-to-play (but not pay-to-win) models. That model has a very distinct shelf life of, I'd say, five years before you need to either release a sequel or change it up with a new IP.


According to ATVI, WoW is making more money than ever, largely thanks to the microtransactions they added. Their subscription numbers haven't maxed out again, but their revenue has.

And isn't that something.


I haven't played WoW in 10 years, but I know a number of people who still love and prefer the latest releases. From what I hear, there is still usually a lag when content patches stop for a given expansion, but most of the real complaints I hear about the game are from people who were on their way out of love for it anyway, whether they realized it or not. I've admittedly been tempted a number of times to play around with it, but I know it's akin to just having a beer after becoming an alcoholic and then quitting. Not a great idea.


The subscription is still a thing. I briefly looked into it early into lockdown when I was completely off work for a month and a half: they wanted 15 bucks a month, and were still charging $50 for an expansion that was about to be irrelevant. I don't know how/if they're drawing in any new or even returning players at that steep a buy-in.


WoW broke a billion dollars in bookings for 2020 according to the most recent earnings call. It's currently Blizzard's highest revenue game, and has a full size team working on making expansions for it.


So that's Blizzard. Activision, on the other hand (the other half of the same parent company)? Straight-up killing it.


All the games you list are bloated and really only fueled by nostalgia players now. When is the last time Blizzard truly innovated and made something we all had to play, like Diablo I and II, StarCraft, and of course WoW? I don't think it can anymore. It sold its soul to Activision, and all the passionate people are gone, judging by their recent games.


I completely agree. Overwatch was a reactionary game, meaning they wanted to cash in on the success of games like it. Heroes of the Storm is also just that: a reactionary game.

Usually it's a very bad sign when a company chases the market instead of leading it. It's a sign of growing too big and getting conservative, and the start of a steady decline that is difficult to get out of.


They took a concept (co-op/role-based FPS such as TF, and MOBA) and tried to make a great franchise and implementation of it. I'd say they did the same with Hearthstone (hello, Magic: The Gathering) and RTS (Warcraft I wasn't the first RTS). I'm not sure Diablo was the first ARPG, and StarCraft obviously wasn't the first RTS either. Arguably, the only original game they ever made was their co-op platform game, The Lost Vikings, back when the company wasn't even called Blizzard.

Overwatch is a completely new franchise, though, so in that regard it's novel.

The cashing in can be seen in different ways: selling out the Diablo IP to NetEase for a mobile game (and having the guts to market it to a PC crowd); microtransactions in WoW while still keeping the subscription model; laying off nearly a thousand people while profits are at an all-time high and the CEO gets a bonus; all the silly DLCs for SC2. At least they didn't bother with another DLC for Diablo 3 after RoS.

Mike Morhaime left for Dreamhaven together with a crowd of entrepreneurs and daredevils. Hopefully he succeeds where the Schaefer brothers failed.


All their games have been reactionary, though? It's not like Diablo was the first hack-and-slash, nor Warcraft the original RTS. Blizzard's strength has always been refinement: take an existing genre and make the very best version of it. I think they succeeded at that with both Hearthstone and Overwatch.


No, that's not what I mean; nothing is truly original in some respect. Reactionary means wanting to get in on a trend. Heroes of the Storm was made because games like Dota and League were making bank. Overwatch was made on the back of the FPS genre's growing success.

Their early games were not made for the sole reason of emulating something popular.


Overwatch is a very popular IP for Blizzard. All my friends "needed" to buy this AUD $70 game, at a time when games like CS:GO ($15), League of Legends, and Dota were very popular and mostly F2P.

That was only 2016...


However much you want to trash them, the fact is that they maintain 3 of the top 20 games viewed on Twitch at this moment.


And McDonald's is the top fast-food chain, but that doesn't make it good. Rather, it caters to the lowest common denominator, something Activision-Blizzard has excelled at. I'm talking about quality.


McDonald's is an amazing company. We are discussing product/financial health here, not whether you like them.


WoW peaked at ~20 million players. Overwatch has 40 million and Hearthstone 100 million. I think by most metrics more people are passionate about the newer games.


Since World of Warcraft they’ve released Starcraft 2, Diablo 3, Hearthstone, Heroes of the Storm, and Overwatch. Those all seem to be quite popular (though admittedly D3 took some time to find its footing).


Of those, one is active, one is in "slowed" development, and the rest have been put into maintenance mode. That's the downside of live services: they get shut down eventually.

Plus, WoW was released 16 years ago, and their most recent title, Overwatch, is almost 5 years old now.


That would be Hearthstone but your point still stands.


Yes, dead company when it comes to creativity, taking risks and building innovative products. Dead company when it comes to making money? No, not at all.


There is an ongoing legal battle between Epic and Apple that may have an impact on any public communications they make regarding the engine and Apple platforms. Other than that, Unreal already builds to ARM, so it doesn't seem like a stretch to expect it, from a technical perspective.


First good news about new Macs.


This shouldn’t be seen as a great accomplishment. Basically you just recompile the source code with the new compiler. Unless the compiler has bugs, you’ll get a new Apple M1 binary build.

Please don’t make this seem bigger than it is.


What about all the routines written in x86 assembly? In a performance-sensitive application such as a game, there are probably a lot of them.
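
To make that concrete, here's a minimal sketch of the kind of arch-gated SIMD code a plain recompile can't handle. This is hypothetical code (WoW's source isn't public, and the function name add_f32 is made up), but it shows the pattern: every hand-tuned x86 path needs a NEON port or a scalar fallback before an arm64 build will even compile.

    #include <stddef.h>

    #if defined(__x86_64__)
      #include <immintrin.h>   /* SSE intrinsics exist only on x86 */
    #elif defined(__aarch64__)
      #include <arm_neon.h>    /* NEON intrinsics exist only on ARM64 */
    #endif

    /* Add two float arrays; n is assumed to be a multiple of 4. */
    void add_f32(const float *a, const float *b, float *out, size_t n) {
    #if defined(__x86_64__)
        /* Hand-tuned SSE path: does not compile on Apple Silicon. */
        for (size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
        }
    #elif defined(__aarch64__)
        /* The NEON port someone has to write for the arm64 build. */
        for (size_t i = 0; i < n; i += 4) {
            float32x4_t va = vld1q_f32(a + i);
            float32x4_t vb = vld1q_f32(b + i);
            vst1q_f32(out + i, vaddq_f32(va, vb));
        }
    #else
        /* Portable scalar fallback for any other target. */
        for (size_t i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    #endif
    }

A trivial add like this would auto-vectorise anyway; the real work is in the non-trivial intrinsics and inline assembly that won't, multiplied across an entire engine.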


Yeah.



