The End of Windows (stratechery.com)
583 points by samsolomon on April 2, 2018 | 597 comments



It isn't really the end of Windows - it's the end of Windows' dominance of Microsoft mindshare.

Windows 10 was the giant step in that direction - an always-updating, maintenance-mode, single OS that MS will need to support. Arguably they won't be entirely there until Win 7 dies out, but once they are, maintaining Windows is a much simpler affair than in years past.

None of this means Windows itself is dying - it only means the OS has matured enough for present needs, and any future needs will be addressed as and when they arise. Instead of inventing the future, they will follow it on their own schedule.

This allows Microsoft to focus on things that truly matter for the future without killing Windows or spending a lot of resources on advancing/supporting it. It's not hard to imagine Windows stays a very dominant client OS for a long time to come - even if the overall PC market share continues to decline, because none of the alternatives are there and nobody is going to invest the resources to get them there.

I will also add that Satya's success lies in speeding up this strategy - which started late under Ballmer - and in executing so well on it. Considering the scale of what MS is doing - Office ports to iOS and Android, Windows 10 releases, Azure, a ton of other enterprise stuff (Exchange, O365, development tools, cross-platform work like .NET Core, VS Code, etc.) - the change of course and the success they are having with it all, you can't argue it's not a phenomenal achievement.


You can see a nice graph showing (Firefox) users steadily upgrading from Windows 7 to 10 from Mozilla's Firefox telemetry:

https://hardware.metrics.mozilla.com/#goto-os-and-architectu...


That hardware report is surprising to me in many ways.

* In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.

* In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

* A surprising number of people still have Flash installed.

I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768. Can you imagine the user experience of today's modern over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload but also because they cause your computer to start swapping memory pages to disk?


> In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.

Most tasks in the real world don't need multiple 4K displays, including low-level and systems development. Most people read stuff, and as a developer and system administrator, if the text looks good, the resolution is enough.

> In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

8 GB is more than enough for most people. My family's Win10 desktop is happy with 4GB, my office desktop is cozy with 8GB. My own desktop has 16GB of RAM, but it runs many, albeit small, virtual machines. The "hardware is cheap, let's waste it" mentality doesn't help anyone and it's wrong. I've written some state-of-the-art algorithms which use 1.5MB of RAM and make the CPU scream for cooling (I develop high-performance computing software as an academic side gig), so like every resource, RAM should be used sparingly.

Edit: I've no comment on Flash. I think it's a forgotten remnant of old systems.


As someone who reads text all day, going up to a 27" 150 PPI monitor was huge. There is a tangible improvement in how that text looks, especially on an accurately colored monitor.

The other footnote should be that display prices have been crashing recently. You can get an IPS 24" FHD monitor for like $90, and a QHD version at 27" for about $150. Those would have been twice as expensive a few years ago.

That being said, all those 768p screens are on crappy plastic laptops with really slow hard drives. That, I guess, is what we ended up with because Intel took what should have just been the natural evolution of notebooks - small SoCs driving a high-PPI display in a metal frame - and turned it into a premium brand-name product, with huge margins on chips that cost peanuts to manufacture, because they didn't have any real competition in the space for a very long time (and even then, their monopolistic behavior lets them keep AMD out of all the major brands' premium notebooks anyway).


> As someone who reads text all day, going up to a 27" 150 PPI monitor was huge.

You're right, however not everyone has the desk space to accommodate a 27" panel. I can barely fit a 24" on my desk, and 1440p monitors start at 25".

> The other footnote should be that display prices have been crashing recently.

When I was in the US at the end of 2014, one of my friends said the same thing about flash drives when I pointed out a $44 PNY 128GB flash drive. Unfortunately, other parts of the world don't work the same way. Because the EUR and other currencies are not fixed against the US$, prices in most parts of the world fluctuate, if they don't outright increase. So no, in some parts of the world technology unfortunately doesn't become cheaper as it matures.

Addendum: BTW, you are right that 768p screens are generally found in entry-level laptops or netbooks. These devices were the most feasible option when they first came out. Now they are bottom-end cash cows which are virtually free to build.


Lenovo still sells 768p screens in their high end X-series laptops including the very recent X280.


Which imo is criminal, if the laptop is starting at >$700 they should "splurge" on the nicer display.


I could swear 1080p was the default on my X270 that I bought a few months ago, but it looks like it's a $150 upgrade now -- on an already expensive, nearly $1200 base model. I paid just over $800 for my X270, including the 1080p screen and upgraded memory.


1440p is a resolution for gaming, not work.

You can get 24" 4k (2160p) 180ppi 60Hz for $300 e.g. LG 24UD58.


Just purchased two 1440p monitors and I love doing work on them. 25" and the PPI is just perfect for me.

Anything higher resolution-wise requires a much larger display to be readable at 100% scaling. I'm adamantly against using scaling.


A 4K screen with 150% scaling is liquid smooth and other-worldly


Don’t you find it a little small at only 150%? What size/model?


> I'm adamantly against using scaling.

Why?


I guess I just don't see the point unless you're gaming, watching 4k content, or doing graphic design. I tested out a 4k 28" and unless I had it at 200% scaling I couldn't use it for longer periods of time. Sure it looks a little smoother, but now you have to render 2.25x (vs. a 2560x1440 monitor) the amount of information for what I think is fairly little gain. I get more screen real estate and still readable text with the 25" 1440p.

Perhaps my experience would have been better on a desktop, but this was for my work where I have a Surface Pro which when going from a docked to undocked state (or vice versa) with monitors that didn't match the same scaling as the surface resulted in graphical issues that can only be resolved by logging out then in again.

Also still come across apps that don't know how to scale so that can be really frustrating.


> I guess I just don't see the point unless you're gaming, watching 4k content, or doing graphic design.

I have to say you have it exactly backwards!

Gaming on 4K is extremely expensive and still basically impossible at refresh rates higher than 60 Hz. In fact, you’ll be lucky to get even that much. 1440p/144Hz is a much better and more realistic target for even the most enthusiastic gamers.

Also a most welcome recent trend has been to ship with novel temporal antialiasing techniques, completely redefining how games can look at lower resolutions.

Temporal artifacts have always been the bane of 1080p, forcing higher resolutions either directly, or indirectly as supersampling. Once you take that out of the equation, the benefit of native 4K is much more modest.

4K movies are nice, but as with games, it’s more of a linear progression. I doubt most people could even tell the difference in a blind test.

Full-range HDR is, in my opinion, a much better investment if you want to improve your TV viewing experience (and lately gaming as well) in a noticeable way.

I don’t know much about graphic design, but I doubt 4K is all that essential. Everyone has been using 96 dpi displays to create content for very high density mediums for a long time. Even the most craptastic ink printer is 300 dpi+. All you need is a zoom function. Color reproduction is, I think, much more important than resolution.

Where HiDPI displays really shine is actually in the most mundane: font rendering.

For anyone that works in the medium of text, programmers, writers, publishers, etc., a 4K display will be a considerable and noticeable quality-of-life improvement.

Even the most Unix-neckbeardy terminal dwellers will appreciate the simply amazing improvement in visual fidelity and clarity of text on screen[1].

> I tested out a 4k 28" and unless I had it at 200% scaling couldn't use it for longer periods of time.

That’s what you are supposed to do! :)

It’s only HiDPI at 200% scaling. Otherwise it’s just 2160p, or whatever the implicit resolution is for some partial scaling value.

For 4K at 100% scaling you'd need something like a 45" screen at minimum, but that's not actually practical once you consider the optimal viewing distance for such a screen, especially with a 16:9 ratio.

> I get more screen real estate and still readable text with the 25" 1440p.

A 4K display should only provide extra space indirectly. With text on the screen looking so much sharper and more readable, it might be possible to comfortably read smaller font sizes, compared to equivalent 96 dpi display.

If you also need extra space as well, then that’s what 5K is for.

Though for things like technical drawings or detailed maps you can actually use all the extra 6 million pixels to show more information on the screen.

A single-pixel–width hairline is still thick enough to be clearly visible on a HiDPI display[2].

> but now you have to render 2.25x (vs. a 2560x1440 monitor) the amount of information for what I think is fairly little gain.

Yes, that’s an issue with things like games. However you can still display 1080p content on a 4K screen, and it looks just as good[3], and often even better[4].

Most graphics software will also work with 1080p bitmaps just fine. Vector graphics necessitates doing a little bit extra work, but for a very good payoff.

Overall though, for things like programming or web browsing, it shouldn’t matter. I have a netbook with a cheap Atom SoC (Apollo Lake) and it can handle 3K without breaking a sweat. That much more capable[5] GPU on your Surface Pro should easily handle even multiple 4K displays.

Pushing some extra pixels is not a big deal, if all you’re doing is running a desktop compositor with simple effects.

> going from a docked to undocked state (or vice versa) with monitors that didn't match the same scaling as the surface resulted in graphical issues that can only be resolved by logging out then in again.

Yeah that must suck. Still, it’s only a software bug, and you mustn’t let it keep you from evaluating HiDPI on its merits.

> Also still come across apps that don't know how to scale so that can be really frustrating.

That’s life on bleeding edge ;)

Sure, it’s annoying, but the situation is a lot better than it used to be. Even Linux is doing fine, at least if you stick to recent releases. Some distros like to ship outdated software for some reason :/

Still, in my opinion, the quality-of-life improvements of a HiDPI display very much outweigh the occasional inconvenience. Though obviously, YMMV.

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] Assuming you’re viewing at optimal distance.

[3] With the notable exception of 96dpi native pixel art.

4K has exactly 4 times as many pixels as 1080p, so it shouldn’t be an issue in theory. Nearest-neighbor will give you exactly what you want.

However in practice you need to force scaling in software, otherwise graphics drivers and most monitors' postprocessing tend to default to bicubic scaling. That said, pixel art is not computationally expensive, so it's mostly just an inconvenience. (A quick sketch of the integer-scaling idea follows after these footnotes.)

[4] You can use advanced scaling algorithms to upscale 1080p to 4K and it usually looks great. E.g. MPV with opengl-hq profile or MadVR on Windows. For that you’ll need something a notch over integrated graphics though, e.g. RX 560, GTX 1050 and on mobile Ryzen 2500U or equivalent.

[5] http://gpu.userbenchmark.com/Compare/Intel-HD-4000-Mobile-12...
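
To illustrate footnote [3] concretely, here's a minimal sketch of integer nearest-neighbour upscaling (just an illustration using a dummy frame, not anything from the thread): a 1080p frame doubled to 2160p by turning every source pixel into a 2x2 block.

  import numpy as np

  # dummy 1080p frame, height x width x RGB channels
  frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)

  # nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block
  frame_2160p = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)

  assert frame_2160p.shape == (2160, 3840, 3)
  # no pixel values are interpolated, so no resampling blur is introduced

Because 4K is exactly four times the pixels of 1080p, nothing gets interpolated, which is why forced nearest-neighbour scaling can leave pixel art looking identical.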


2560x1440 on a 27 inch at a reasonable distance is pretty darn close to optimal IMO, so 4k, to me, is for 34" monitors (but 27 inch I feel is optimal on 60-75 inch desks, which is what I usually work with, so 4k rarely matters).

I'm with you on accurately calibrated monitors though! God most of them suck out of the box.


If you don’t see any (spectacular!) difference between 4K & 1440p you need to have your eyesight checked.

I’m not being sarcastic. The last time there was a thread like that on HN a bunch of people figured out they need glasses.

I have a 4K @ 24in monitor (180ppi) and a 267 ppi netbook and when I switch between them the 4K starts looking like a blurry mess!


> The last time there was a thread like that on HN a bunch of people figured out they need glasses.

It's fair advice, but some eyesight issues cannot be solved with glasses... if they can be solved at all.

Also worth noting that with TVs the distance matters a lot. With monitors, laptops, and gadgets it is relatively stable.


For TV/Multimedia HDR makes much more difference than 4K in my experience.


No, my eyesight both far and close was quite a bit better than average as of last week (as I was just there getting checked). We have a lot of 4k and 5k displays at work and most people who say they can tell a (significant) difference when we compare (the topic comes up a lot) seem to usually either be on > 27 inch, have scaling higher than what's expected, or just fail to see it when we really test it out. Your mileage may vary :)

Don't get me wrong, I can see a difference, but not nearly as night and day, especially when it comes at the cost of other features (eg refresh rate... which isn't the end of the world for coding so if it's the only thing you do on the monitor it could be worse... otherwise ouch my eyes.)


> have scaling higher than what's expected

What scaling is that? 200% scaling is what you should have, 4K is exactly 4x as many pixels as FullHD. If someone is using lower scaling then they are trading sharpness for virtual space.


I never get comments like these, as if the text just gets smaller as the resolution increases, rather than what actually happens (the text gets crisper). 5120x2880 is so nice because you can’t see the pixels and words almost look like they are on paper.


Seconded. 2560x1440 on a 27" panel is only 109 pixels per inch. I use a ThinkPad with that same resolution on a 14" display, with a 24" 4K UHD next to it in portrait mode.

Both displays are around 200 pixels per inch, plus or minus. It's great not having to see the pixels, so much more pleasant and easy on the eyes.

Also the combination of a portrait display with the landscape display is really nice. I can read an entire PDF page without scrolling.


I agree that having higher PPI is great, but are you using scaling to make text larger? I was barely able to use a 28" 4k at 100%, can't imagine doing that at 24"


Yes, I should have mentioned that I'm using Windows 10 with 225% scaling on both the 4K UHD 24" display (187 DPI) and the WQHD 14" (210 DPI). Some people like a bit less scaling, some more, but in general you want a scaling factor that roughly matches your display's pixels per inch.

The original Windows "standard display" was assumed to be around 96 DPI. That's the monitor that 100% scaling (i.e. no scaling) is intended for. Round the 96 up to 100 and we can say that in rough terms, the percentage scaling should be in the neighborhood of the monitor's DPI.

So monitors in the 200 DPI range are best at around 200% scaling.

A 28" 4K UHD has 157 DPI, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

The idea with a high-DPI monitor isn't to make everything smaller on the screen, it's to make everything sharper and more detailed. When you double the DPI and scale appropriately, you get four times the number of pixels for everything you put on the screen.
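
For what it's worth, that rule of thumb is easy to check with a rough sketch (the panel sizes below are just examples, the exact PPI depends on the true panel diagonal, and the 96 DPI baseline is the Windows convention mentioned above):

  import math

  def ppi(width_px, height_px, diagonal_in):
      # pixels per inch from the native resolution and the panel diagonal
      return math.hypot(width_px, height_px) / diagonal_in

  def suggested_scaling(p, baseline=96.0):
      # round to the nearest 25% step, which is what Windows offers
      return round(p / baseline * 4) / 4

  for name, w, h, d in [('24" 4K UHD', 3840, 2160, 24),
                        ('14" WQHD', 2560, 1440, 14),
                        ('28" 4K UHD', 3840, 2160, 28)]:
      p = ppi(w, h, d)
      print(f"{name}: {p:.0f} PPI -> roughly {suggested_scaling(p):.0%} scaling")

That lands on roughly 200%, 225% and 175% for the three examples, which is in the same ballpark as the figures discussed here.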


> A 28" 4K UHD has 157 dpi, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

That’s not how it works. Lower dpi does not somehow give you more real estate!

You should still be running with ~200% scaling because you are viewing it at a greater distance.

Optimal viewing distance, assuming 16:9 ratio, is 120 cm vs 140 cm for 24" vs 28", respectively[1]. Accounting for the difference gets you ~155 ppd with both monitors[2][3], maintaining 25.0° horizontal viewing angle.

The closer your viewing distance, the more ppi you need for the same density. That 28" is not inferior to the 24" when you account for distance, despite the lower ppi, because the greater viewing distance makes the pixels look smaller, thus creating a denser image.

[1] https://en.wikipedia.org/wiki/Display_size

[2] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...

[3] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...
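
For anyone curious, here's a quick sketch checking those numbers (the 120 cm / 140 cm distances and the 16:9 ratio are the assumptions stated above):

  import math

  def pixels_per_degree(width_px, diagonal_in, distance_cm):
      # horizontal pixels per degree of viewing angle, assuming a 16:9 panel
      width_cm = diagonal_in * 2.54 * 16 / math.hypot(16, 9)
      angle_deg = 2 * math.degrees(math.atan(width_cm / 2 / distance_cm))
      return width_px / angle_deg, angle_deg

  for diag, dist in [(24, 120), (28, 140)]:
      density, angle = pixels_per_degree(3840, diag, dist)
      print(f'{diag}" 4K at {dist} cm: {angle:.1f} deg wide, ~{density:.0f} px/deg')

Both come out to about a 25.0 degree horizontal angle and roughly 154 pixels per degree, in line with the ~155 ppd figure above.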


I guess the problem is I value amount of information I can fit on the screen vs. quality of the information.

Also, apps that don't scale properly are a pain haha.


Scaling is usually on by default in most modern operating systems.


Bitmap text can look clear as crystal on a very low pixel density display.

You need a higher PPI to make anti-aliasing work on screen (finally looking nearly as nice as print).


Last year I had a 45” 4K screen with 150% scaled UI and was able to develop in VS code with two code windows open side by side, all with the crispiest text I’d ever seen. It’s the dream.


It's not about needing 4K, it's about the prices. While it's normal in the USA to get a couple of these monitors, for me the expense is impossible. And I really would use a 4K monitor. It's a big world, and most of it is not the USA.


Actually, it's not always the price. Serious developers and seasoned computer enthusiasts don't change rigs every couple of years. If one's system is performing well enough and the user is used to it, an upgrade can be deferred until some technology or computing resource becomes necessary. When something breaks, the broken part is replaced and upgraded in most cases.

Personally, I've just upgraded from 1680x1050 to 1920x1080. My old monitor was 10 years old, and was performing relatively well. I bought a new one, because it started to show its age (mostly backlight age).


I would __much__ rather have TWO 1080p or 1920x1200 displays than a single 4K display. For me, the quantity of visible 'glass' matters most. It's surprisingly difficult to drive double 4K monitors.

Maybe as the price comes down and more transistors end up in basic chipsets we'll see 2 and 3 heads of 4K displays become common.


Agreed. I've been using a Commodore or PC desktop since ~1986. Been through lots of hardware iterations, seen and pondered many different configurations. I found the 24" 1080P display is the best and the more, the better[0]. And no larger than 24" either, that's crucial. I've downsized from larger panels to 24".

I wouldn't trade my three 1080P 24" panels for 4K panels, unless the goal was to sell the 4K panels and rebuy the 1080P 24" panels. I don't do a lot of gaming anymore, but they're hardly a horrible experience in that regard either.

[0]https://pcpartpicker.com/b/dFmqqs


I almost agree, except that I definitely prefer 24" 1920x1200 over 1920x1080.


So I wasn't able to get this out of my mind and went to order three new 1200p panels today, but I actually backed out at the last moment after further consideration. I think it would drive me nuts to effectively have a 21.6" panel for 16:9-optimized content, which is everything; that would bother me more than having the additional 10% of view space for work, especially considering I have three panels and plenty of workspace.

I think at this point in the market, I'm going to stick with 16:9 native resolutions. If I do any swaps, I'll probably try out dual 27" 4K panels (also 16:9, great for 1080P content), with one mounted below the other. That'll be pretty nice in time as 4K becomes better supported.


Agreed. I couldn't find (modern, thin bezel + Displayport) 1920x1200 panels when I was shopping last time. I do see a few on the market right now, when one of my 3 goes out, I'll sell off the other two and order three 1920x1200 panels for sure.


Why do you prefer a 24 inch display to a larger one such as a 27 inch?


The dot pitch is too big, at least at 1080P and it's just not as sharp as I prefer. I had a couple 27" 1080P panels at one point and got rid of them. 1440P@27" is good but I've just been happiest overall with my three 1080P panels, the main factor is fitting three 27" panels on most desks (& mine is rather large) is harder than 24s. Also, less neck movement to focus on a panel on the periphery. Some trial and error to reach this point but as long as I run three panels, I'll never go beyond 24".


I think your experience may be an outlier. Having been at a university for the past 4 years, I observed that 90% of the laptops used by students were 1366x768, and the majority had 4GB, followed by 8GB.

I don't remember anyone having any problems running the latest chrome or Firefox. I think people even managed to run things like SolidWorks or Matlab perfectly fine on those machines.


None of the other things surprise me, but that screen resolution does a bit.

It's been about 8 years since I last had a laptop without at least 1080p I think.


Do you buy your laptop for screen real estate or for "small and not too heavy"?

Because the latter is what most people buy for, as far as I can tell.


Those things aren't necessarily mutually exclusive anymore. Something like a Dell 7270 will cost about £275 on eBay. 12 inch screen and 1080p touchscreen. Some question the wisdom of a small screen like that with scaling and whatnot but I've always found it fine.


That's fair. Though I wonder how many people buy computers on ebay as opposed to dell.com or amazon.com or an actual brick-and-mortar store.

That said, I suspect most people over the age of 35 or so can't read text at "normal" size on a 1080p 12inch screen, so if they want a 12 inch screen they have to either scale up their text or drop the resolution. And I believe the reported resolution in the Firefox telemetry is the one the OS thinks it's using, not the one the hardware has.


1366x768 just will not die. They're usually cancerous TN panels with 20 degree viewing angles too.


For work, the laptop stays at home or you put it in a rolling case and take it into the office once a day. College, it's in a backpack and carried to class 5 times daily.


It doesn't surprise me too much.

I develop for an organization that serves people of mid-to-lower economic strata. 10% of our users are still on Windows XP.

We get caught up in the tech sector echo chamber and don't realize that upgrade cycles for millions of people are a lot slower than we are used to.

I think that it's not surprising to see what Firefox is reporting. A lot of people go to Firefox seeking something that runs faster or better than what came with their Windows box years ago. Most of the people we serve on low-end computers are on Chrome, not IE/Edge.


Personally I won't touch anything with less than 32 GB and 1080p, but people that use a PC like a tablet with a keyboard and a larger screen have different needs. They are the vast majority of buyers.

A friend of mine is thinking about buying a new laptop. We were looking together at what's available in the sub 400 Euro range, basically everything is 4 GB and 1366x768. Add some 200/300 Euro and 8 GB and 1080p start to appear in the lists. Guess what people buy most? The cheapest stuff.

By the way, the new laptop must be Windows because anything different would be too difficult to cope with after 30 years of training. He's replacing the old one (only 2 years old) because it's getting too slow. Probably a malware party, but disinfecting that machine is a hopeless effort. That one and the previous ones were periodically cleaned and/or reinstalled, but there is no way to change those ingrained clicking habits. No adblocker because my friend wants to see ads and they could actually be useful in his business. I told him that he'll be back with a slow PC soon and that he's wasting his money, and I've been wasting my time.


Those numbers seem out of whack to me. 1080p is pretty bad in this day and age, and 32gb is massive overkill for virtually everyone.

My daily machine at home is still a 2011 Macbook Air with 4GB of RAM, which is admittedly not enough. My work machine was a 2012 Retina MBP with 8GB which was PLENTY for all of my everyday needs except when running VMs. To this day, 8GB is enough for 'most' uses, and my work machines only have 16 to get me over that "8gb-is-not-quite enough" hump. But I've got retina displays everywhere, and a 5k iMac at home. No clue how much memory it has, to be honest, but 32 is insane for not just the average user but even most power users unless they have really specific needs.


In Mac world 1080p is bad. In low cost Windows laptops 768p is normal. It's one of the reasons why they are cheap. One random laptop under 300 USD:

https://www.amazon.com/Performance-HP-Quad-Core-Processor-Gr...


>I won't touch anything with less than 32 GB

why would you need that much ram? are you running 3+ VMs all the time?


I run an IMac with 32gb because I’m a SharePoint developer. A 2012 VM running AD, DNS, SQL Server, Visual Studio, and a SharePoint farm all running takes up 16gb easily. The other 16gb goes to VS Code, XCode, Slack, multiple browsers, Outlook, Docker at times...etc.. Then I gotta shut it down and boot up the SharePoint 2010 VM sometimes which gets 12gb a little less I suppose.


I had 16 GB, got close to the limit working on a project with many containers, upgraded to 32. I leave open projects for 4/5 customers now which is very handy, no need to close and open. That alone is worth the cost of the extra GBs.


I run 64GB in a mac pro, with about 1/2 of it used as a cache for a zfs volume. With it taking half of my ram along with my other daily apps (safari, sublime text, etc) I’m only left with ~10GB free.


I had to move from 16 to 32 gb when I did hadoop development and needed to run a vm hadoop system locally. It all depends on your workload.


> No adblocker because my friend wants to see ads and they could actually be useful in his business.

This is the weirdest part for me, as I've never met a person who'd like to see ads. Installing an adblocker usually seems to be a revelation for people.


Preaching to the converted I know, but: A laptop that's in the shops today is going to be hardly faster than one from two years ago. In the 90s it would have been but the pace has slowed down a lot since then.

Possibly a good candidate for just doing a reformat?


"I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768."

I've used for years a CoreDuo system+4GB with Firefox then PaleMoon playing videos in FullHD on 23 and 27 inches monitors, and never got even close to hit the swap partition unless I was running virtual machines or heavy games. Now I'm on a i5 Debian+XFCE with 8gigs, one Waterfox (privacy oriented Firefox derivative, thanks to the HN user who pointed me at it) instance with 4 tabs opened and a shell with 2 tabs; here's free output:

  ~$ free
                total        used        free      shared  buff/cache   available
  Mem:        8037636     1194044     4182928      187332     2660664     6555172
  Swap:       8249340           0     8249340
And this one while watching a 1080p Youtube video at fullscreen.

  ~$ free
                total        used        free      shared  buff/cache   available
  Mem:        8037636     1453204     3761776      280824     2822656     6202520
  Swap:       8249340           0     8249340
The operating system probably makes a difference [1], but anyway Firefox runs perfectly on 4 gigs or less. That RAM amount is plenty if well used.

  [1] ~$ uname -svo
  Linux #1 SMP PREEMPT RT Debian 4.9.30-2+deb9u5 (2017-09-19) GNU/Linux
  (a bit older RT kernel needed for low latency audio)


You seem to assume a large portion of users are software engineers or tech people. You may consider a 16GB ‘bare minimum’, but not everyone needs that - nor can even afford that (or a laptop more than, say, new or used for $750).


My brand new thinkpad with 16gb ram and a 1080p ips display cost only £670. These specs aren't expensive any more.

That being said, my parents are perfectly happy using Google chrome on 2gb ram with a core 2 duo. And that's not with only a few tabs.


Not a single person in my office has a laptop that costs over ~400 GBP.

It may come as a shock to you, but there exists a world outside of California/London/random-tech-hub.


You can't walk into/click onto a high-street store and buy that though, and my assumption is that's where most people buy laptops.

£670 is much more than people I know want to spend on a new machine. None of them have ever mentioned screen resolution when I've asked what they want in a laptop.


£670 is not "only".


The average person won't spend that much on a mobile device.


> (or a laptop more than, say, new or used for $750).

To me, the hardware report shows technology is quite stagnant on display resolution and memory. 1366x768 is a resolution from 2002 or thereabouts. 4GB was common in desktop computers in 2005. ~15 years have passed and consumers in the $750 price range are still hamstrung with 1366x768 and 4GB of memory? I would expect better from a $750 phone, let alone a computer.

I think we're actually talking about computers that are very old and have never been upgraded or new computers in the $200 to $400 range.


>To me, the hardware report shows technology is quite stagnant on display resolution and memory. 1366x768 is a resolution from 2002 or thereabouts. 4GB was common in desktop computers in 2005

I think there's a huge disconnect between what you think is "common" and what people actually have. The Steam hardware survey from 2009[1] says only 10% of Steam (gaming) users had RAM greater than or equal to 4GB. I'm going to estimate that in 2005, that figure was less than 5%. Saying 4GB was common in 2005 would be like saying 64GB is "common" today.

[1]https://web.archive.org/web/20090415225520/http://store.stea...


> 1366x768 is a resolution from 2002 or thereabouts.

In 2002, the most common resolutions were 4:3, not 16:9, not only because notebooks were not mainstream yet.

> 4GB [RAM] was common in desktop computers in 2005.

That's the time when notebooks became mainstream iirc. I purchased a gaming notebook in 2006, it had 1 GB RAM. I also bought a midrange notebook in 2009 which also came with 1 GB. (I later upgraded to 2 GB aftermarket.)


> In 2002, the most common resolutions were 4:3, not 16:9, not only because notebooks were not mainstream yet.

Sure. I didn't really intend to get into fine-grained technical details, but yes, back then 4:3 was the predominant aspect ratio. In fact, some of the 17" LCD displays from the time were that awkward 5:4 resolution of 1280x1024.

But putting specifics aside, the broader point is that computing technology—at least with respect to the most popular display resolution and amount of memory—is stagnant. In this thread there are people arguing for technology asceticism and making the point that 4GB and 1366x768 are good enough. And maybe they are for some people. I am surprised there are so many who are satisfied with this particular flavor of "good enough." (Though I would like to see how well that correlates with mobile device technology asceticism.)

I want the hardware industry to do better, and I thought things had started improving a few years ago when 4K displays emerged and the shackles of "High Definition" (1920x1080) were cast aside. But this data shows adoption of 4K displays for computing is so low it's not even measured as such. That disappointed me, but it's unsurprising. What surprised me was realizing 1920x1080 isn't the top dog I thought it was—in fact, the situation is worse than I thought.

And to my mind, all of this feeds into the subject at hand: an operating system vendor getting stuck. The embrace of "good enough" is suffocating.


Notebooks were plenty mainstream in 2002 (they'd been accessible to the average middle-class household or small/medium business since the mid/late 90's). It's just 16:9 that wasn't mainstream at the time, at least for PCs (laptops included).


Many of these statistics may be from overseas? I know in Brazil for instance, computers are levied with crushing taxes that mean even new computers in stores are already obsolete. Add in high rent seeking from middle men and low salaries and most people just make do with something ancient.


> 4GB was common in desktop computers in 2005. ~15 years have passed and consumers in the $750 price range are still hamstrung with 1366x768 and 4GB of memory?

If anything it's worse now because the RAM is soldered in and they're using BGA ICs which makes it impossible for most people to upgrade.


> new computers in the $200 to $400 range

There's quite a lot of these, e.g. HP Stream; the Windows license is much cheaper for these systems because they have such low specs.


My 1-year-old MacBook Air runs 1440x900 with 8 GB and it cost >$1,000 IIRC. So neither "very old" nor $200-$400 range.


MacBook Air, Mac Mini and Mac Pro models from the last few years were ancient as soon as they were manufactured. Apple has been criticized for not updating these model lines.

So yes, the innards of your 2017 MBA are very old. It has Broadwell generation (circa 2014) CPU/GPU, and one optimized for power draw at that (i.e. a slower one).


My main home PC is a laptop matching those specs. I frequently use it for development and web browsing. You can comfortably fit two terminals in columns with that resolution. The RAM is mainly an issue when browsing, but killing all the ads and defaulting to no Javascript helps. A surprising number of websites still only use Javascript for "progressive improvements", i.e. they'll use it just for pop-ups, hijacking scrolling, randomly moving menu bars in and out of view, implementing some nonsensical partial loading of the main content, etc., but will happily show you the full text without a hassle, with a consistent user interface, once you turn it off. For a while I used Dillo but that became less viable as an option over time. That said, I do waste a lot less time on shitty websites on this setup.


It is worth contrasting with the Steam survey which covers several million gamers (of varying degrees of dedication) -- http://store.steampowered.com/hwsurvey

Of note, 8GB is most popular with 1920x1080 resolution.


Simplified Chinese being the most common language (by far at 52%) threw me for a loop, though it probably shouldn't in this day and age. (For context: English - 23%, Russian - 7.7%, Spanish - 2.8%).


That's probably due to the influx of Chinese players coming in to play PUBG or whatever in the last few months. A more "representative" sample could be found in the months leading up to the spike: https://web.archive.org/web/20170606030609/http://store.stea....


Would probably be higher if memory prices weren’t so damn high. Same goes with video resolution. I bet we’ll see a quick bump once video card prices start to drop.


Every time we've said "when video card prices start to drop", we've seen about 3 months of drop, and then the companies come out with some new thing that increases the prices yet again.


>disappointingly small 8GB. In my world, I consider 16GB a bare minimum

My main computer has 8GB RAM. I don't know why I would need more than that. Right now I have firefox with 15 tabs, qt creator and 3 terminals open and I'm only using 2.5 GB of RAM. Unless I suddenly wanted to get into heavy video editing I don't think I'll feel the need to upgrade soon.


The more RAM you have, the more your OS will use. My 16GB laptop is currently using 8-9GB and I don't have much open. What's the point of RAM if you don't use it? If I get low on free RAM, the OS will discard some unneeded stuff, but otherwise it properly uses the spare RAM as a cache.


Almost all operations I command are executed instantly by my computer, so I see no need to use more RAM.

I think a good SSD and GPU are much more important than having a lot of RAM if you want to improve general user experience.


> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768.

The laptop I'm typing this on is exactly that: A Lenovo Ideapad from 2010 with 4GB and 1366x768 resolution, running Bunsenlabs Linux and the latest Waterfox (Firefox fork with telemetry removed). It's blazing fast even without a SSD; the browser starts in the blink of an eye and it takes nearly 70 tabs open to start slowing down.

Yes, the resolution can be limiting for some websites, but most are designed well enough to work fine on it. In fact, on text-heavy sites like HN I find myself browsing at 120% magnification for less eye strain. I couldn't imagine what 1080p or higher on a 15" screen would feel like, but it would likely be painful. It's actually more comfortable to read on than my iPad mini for that reason as well, and of course typing is a dream compared to any other laptop I've used, to say nothing of klunky touchscreen keyboards.

Granted, it would be slower with a more bloated Linux distro and certainly was with Windows 7, but when it comes to the OS I prefer minimalist and stable to kitchen sink and bleeding edge.


When I bought my father a new laptop a couple of years ago, I specifically looked for a model with 1366x768 resolution because a lot of legacy apps he depends on for his work have terrible HiDPI support. "Upgrading" to 1920x1080 with HiDPI turned off (100% scaling) would have made everything too small for his eyes.

Moreover, even apps that do support HiDPI often look like crap at 125% scaling -- the default setting on Windows 10 for most 1920x1080 laptops these days. And if you go up to 150%, you only get 1280x720 effective pixels. That's even less information density than 1366x768. Most legacy Windows apps expect to be able to use at least 768 vertical pixels.
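
To spell out the arithmetic, here's a tiny sketch of the effective desktop area of a 1920x1080 panel at the common Windows scaling presets:

  for scale in (1.00, 1.25, 1.50):
      w, h = round(1920 / scale), round(1080 / scale)
      print(f"{scale:.0%} scaling: {w}x{h} effective pixels")
  # 100% scaling: 1920x1080
  # 125% scaling: 1536x864
  # 150% scaling: 1280x720  (fewer vertical pixels than a 1366x768 panel)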

The search turned out to be more difficult than I expected, because most high-end PC laptops with good build quality had already transitioned to 1920x1080 by early 2016. I finally found the perfect model for my father and told him that he should expect to use it for a long, long time since it's the last laptop that will ever support his apps. Fortunately, it's a well-built Broadwell i5 with a number of upgradable components, so it should last well into the early 2020s.


I think if more developers had to live on 768 line displays, slow CPUs and 4GB RAM, we would have much better performing software.


> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768

It seems that most people who make websites nowadays think similarly. I don't get why they are making websites for themselves, rather than for their users.


> It seems that most people who make websites nowadays think similarly. I don't get why they are making websites for themselves, rather than for their users.

Because most people making websites don't care about basic usability. Even though they have stats that show their users use small laptops, they'll still stick a fixed header that takes 1/5 of the screen real estate because it's the latest "bootstrap/material design fad". Google does this a lot, which is ironic with them pushing "AMP" on mobile; they can't even give regular people a good experience on their laptop.


There's always been a strong impedance mismatch between users and developers. Calling a website "Apple first" isn't a compliment.

Websites become a problem on slow internet, which is surprisingly common in companies. This is because they are often located in cheap rent areas with little infrastructure.


> I know many people economize on memory

Most just don't have a choice. Plenty of laptop models top out at 8GB, with 16GB either being confined to larger form-factor machines or being so outrageously expensive that no one in their right mind would bother.

I mean, look at the current line of macbook pros, you simply can't get 16GB without going up to the 15-inch model.


I have a 1366x768 laptop with 4 GB of RAM..

It's included in the university fees and there's no way of opting out of it, so I had to take it. Not only do they overcharge us for it, finding a buyer for it is hard too, so I decided to wipe it clean and use it as my daily driver.


Ouch, which university is that? I've heard of universities negotiating deals to get student discounts with certain vendors, and others requiring students (normally in engineering programs) have laptops meeting certain requirements, but this is the first time I hear of students having to purchase underpowered laptops.


This surprises me too, but I also must recognize my own inherent bias: the places I have worked in for the past five years all have "enterprise" laptops which are loaded down with all kinds of agents, clients, and background services that put a silly load on even an idle system. A measly 4GB of RAM would not be enough to run MS Office alongside IE.

The screen issue does bug me though. Not just a haxxor feature anymore, all basic Windows users can leverage the split-screen capability to put two applications side by side - on a 720p screen, there really isn't enough screen space to practically do this. I think 1920x1080 or at least 1600x900 should be the standard for 2018.


> Can you imagine the user experience of today's modern over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload but also because they cause your computer to start swapping memory pages to disk?

Yes. The laptop I keep by the TV has 4GB of RAM, a 1366x768 screen, and a quad-core Celeron, that I think is some descendant of one of the Atom families. I tend to keep 10 or fewer tabs open, avoid sites that use really heavy Javascript (well, I do that anyhow...), and usually don't have much else besides a Bash window running. It's got a spinning-rust drive, and swapping would be very obvious. With my usage patterns, it doesn't swap. Performance-wise, it's a fair bit nicer than my phone, at least.


I find it extremely sad that us developers really have no idea what our sites are used on.

Works great on my iPhone X and iMac Pro? Perfect!


If you look at Chromebooks, most are 4GB, often 2GB. These are the mainstream new laptops that people will expect to run for 5 years+.


I think it's 'expected to run for <5 years', since the device has 5 years of software support. I find that to be a bit of a bummer, but it's still better than Android's 3.


Yeah, but that link is a Firefox hardware report. How many Firefox users are on Chromebooks?


> 4K doesn't even get listed, presumably lumped into the "Other" category.

Could the chart be reporting CSS pixels rather than hardware pixels? Macs are only 7% of total Firefox users but the majority of Mac hardware has a "Retina" display with a higher resolution than these.

BTW, 2560x1440 is listed at about 1% but it's not visible because the line and text are black on a black background.


I've been using 2560x1080 for almost a year now and will never go back. 21:9 is the way to go.


What size is that? I run a 34" 3440x1440 21:9 and it's nearly a bit too wide.

EDIT: https://www.asus.com/us/ROG-Republic-Of-Gamers/ROG-SWIFT-PG3... 100Hz is awesome, even for just browsing the web.


I have the same monitor and it is the first time I've started to run Windows apps more in actual windows than full screen/full screen-like. Too many sites waste a bunch of horizontal space on it, so makes sense to shrink the window.

Great monitor though. Both for gaming and for general usage.


Its 29". I've had many monitors, both bigger and smaller, and this is by far the best I've had for gaming and everything else.

Exact model:

http://www.lg.com/us/monitors/lg-29UM68-P-ultrawide-monitor


Sounds like an April Fools' joke/Onion article: BREAKING: Bay Area developer discovers that California is surrounded by the rest of the world.


> * In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

In 2018, my computer only has 2GB RAM and runs more than fast enough for what I need on a 32-bit processor. I don't understand why some people always want bigger hardware numbers, rather than better software.

> * A surprising number of people still have Flash installed.

I do too, I don't see how it's abnormal.

> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768. Can you imagine the user experience of today's modern over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload but also because they cause your computer to start swapping memory pages to disk?

I can barely conceive of how narrow-minded and myopic your technological experience must be that you feel only the latest and "greatest" is viable. As someone who browses the Web regularly, I spend a lot of time on places like HN, YouTube, etc. My screen resolution is 1440x900 - one step above the 1366x768. As I said, 2GB RAM, etc. This is why I've argued, constantly, against the nonsense of Chrome and "modern" Firefox. That's why users of old computers are going to Pale Moon if they know about it, or back to Internet Explorer, or just not upgrading to the newest Firefox.

Your technological experience is very much a high-class experience, but you have to understand that there are users all over the world and in all economic positions. All this push toward expecting the best and most modern only works to force out the common person - in America it's the poor person in the ghetto who still is well above the position of people in other, even less prosperous nations. Most people, if they are lucky, have a used <$200 laptop which is often a decade or so old. Library computers are often almost as old - I know two libraries which are still running their 0.5GB 32-bit Windows 2000 computers as patron-access machines because they are in poor areas and simply cannot afford to upgrade.

People who have the view that everyone should have base access to the latest, greatest, and most wonderful hardware upset me greatly. I have one comment about it that seems constructive. If you think it's so important that they should have this, you should provide it. If you can't give every person who has an old reliable computer (that they can repair) the newest bleeding-edge tech (and give them service when it inevitably breaks), then you need to understand that they have valid concerns too - and help provide better software and help others like you to understand what reality looks like for most of the world.


> That's why users of old computers are going to Pale Moon if they know about it, or back to Internet Explorer, or just not upgrading to the newest Firefox.

Malware authors rub their hands gleefully when they see sentiments like this being spread online by people like you.

Don't do any of those suggestions. Disable multiprocess mode in the Firefox config if you find it eats too much memory, stick with the ESR if you're scared of upgrading and use Edge over IE.


Please show me a download for Edge on a 32-bit WinXP PC.


If you are running XP (32 bit at that!) then your choice of browser is the least of your security issues.

In any case you must be at least airgapping it (right?), so I guess it's academic.

Edit: you might be onto something here. Running an old, outdated browser on an old, unpatched 32 bit windows xp box might trigger some anti analysis defenses on whatever malware is sure to come your way. Security through being too obviously insecure!


This isn't specifically my situation.

But I know three families in low-income situations for whom that sort of PC is the only one they have, because they got it from either an office or a school when the place was discarding computers. So no, it's not being air gapped. It's the computer the kids look up homework info on, it's the one the parents use to search for jobs, and more. They can't afford to upgrade, and they're afraid that even upgrading the software would make the whole thing come apart at the seams.


There is no real need for 4K resolution; that's why it's not so popular.


People that think they don’t need 4K either haven’t seen a 4K display in person or have poor eyesight. It’s a leap like upgrading from HDD to SDD, in the display world.

Unless you just need your monitor for gaming. New-gen temporal antialiasing mostly makes up for the lack of pixels.


I have a system with a 7200 rpm HDD and a system with an SSD. Not much of a difference. It was a waste of money. Just like a 4K display.


I don’t believe you. There must be something pathological with your SSD or something else in your system.

SSDs have an order of magnitude faster random access performance (both read & write) and you can measure that. NVMe drives additionally increase parallelism by a significant margin. Typical workloads are very sensitive to this metric. At least at cold boot you should see significant improvement in performance.

Font rendering, likewise, you can just see[1]. See the full page[2] for more examples.

If you have good eyesight, you will see a difference. I could see how 4K to “retina” (>260 PPI) might not be worth paying extra to some (though I personally want the highest PPI I can afford), but 96 PPI to ~185 PPI is a world of difference. It goes from blurry to sharp.

That’s not all—because LoDPI text rendering often involves subpixel rendering techniques, HiDPI also brings significant improvement in contrast.

For crying out loud, some people even get headaches from the blurry subpixel rendering! [3]

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] https://www.eizo.be/knowledge/monitor-expertise/understandin...

[3] https://www.w3.org/WAI/RD/2012/text-customization/p7.html


The HDD is fast enough for my needs. Just like Full HD resolution. Of course there may be a difference, but only in very specific cases. So, just a waste of money, as I said.


Alright, you might not appreciate it, but it’s not a waste of money.

Maybe that’s arguing semantics, but to me a waste of money is something like a $150 HDMI cable… Or 64GB of RAM you’ll never use. Things that you can brag about but that don’t actually improve your life in any way at all.

However, the $100 extra I’ve spent on a 4K vs FullHD monitor was, I’m certain, one of the best purchasing decisions in my life.

My 4K display[1] makes spending time productively so much more pleasant, and so much less tiring, both on eyes and mind. And it just looks gorgeous.

The first time experience of watching BBC’s Planet Earth in 4K on this thing was mind-blowing. It must look absolutely incredible on an actual big screen with full range of HDR.

I also have a very cheap Chinese netbook with a 3K display (260 ppi), and I have, on occasion, used it to read books etc. I would have never even imagined doing any extended reading, of any kind, on my old laptop. The font would need to be huge to make that experience at all bearable. At least that’s an option for something like ePub or HTML, but a lot of things are available only as a PDF. On that little netbook it’s no bother to have multiple columns of tiny text on the screen, all at once, just like in a real book.

I wear glasses and with the old LowDPI devices I was always unsure if I need a new prescription or if it’s just the crappy display.

As for SSD I can strongly recommend getting one in addition to your HDD. For my workstation, I’m running a bcache setup (in writeback) with a 60GB old SATA3 SSD + 1TB HDD RAID1 and I’ve been extremely pleased with that so far.

I can only tell I’m not running an actual 1TB SSD right after I do a major upgrade. It takes at least one cold boot to promote files in the boot path to cache[2]. Every time that happens it reminds me just how much faster SSDs are for certain things.

Nevertheless, you almost get to have your cake and eat it too. Bcache (or dm-cache, a fine choice as well) will intelligently sync all the randomly accessed and most used data with the SSD and stream anything bigger (like movies etc.) directly to/from the HDD.

In writeback mode it will also use the SSD to buffer random writes, until they can be written sequentially, all-at-once, to the backing device.

It makes both your SSD and HDD so much more valuable. Strongly recommended :)

[1] http://www.lg.com/us/monitors/lg-24UD58-B-4k-uhd-led-monitor

[2] It’s something that could be solved, though. An explicit interface to promote certain extents would do the trick. So far I think only bcacheFS offers something like that.


Moving from a 7200 to a SSD would have made a big difference. Your boot time and program load time were cut in half or better. A lot of inconvenient actions like file search became easier to do frequently. I can only guess that you do almost everything via web on a newer machine to not notice much difference.


That's not a need though. My computer needs a CPU and by extension so do I, but at the end of the day, a 4k display is a nice to have.


Sure, but once you use one for even 15 minutes, it pretty much becomes a need ;)

That’s why I have an adamant rule about never trying hardcore drugs or displays capable of refresh rates higher than 60 Hz.


I went back from a MacBook Pro with 16GB to a MacBook with 8GB. I do iOS, web, hardware and TV development. I have not had a problem. I can have Xcode, Eclipse, Firefox, Safari, Mail, BBEdit and a few other programs running fine.


How is 16GB bare minimum when it's the max you can get on most laptops?


4k is overrated. 27" 1440p is good enough, plus current GPUs are able to push 100+ FPS with such resolution (they fail miserably with 4k resolution).


> (they fail miserably with 4k resolution)

I use an external GTX 1070 over Thunderbolt 3 on a 2017 Dell XPS 15, and have 2 2160p monitors. 4K gaming is no problem at the refresh rates that 4K offers (usually 60Hz)


> 4K gaming is no problem at the refresh rates that 4K offers (usually 60Hz)

Subjective. Depends entirely on the settings you are using and the resulting frame rate at that resolution.


640k ought to be enough for everyone


I will hope this post is ironic and move on.


Most corporations can’t afford to upgrade hardware and most likely run very old machines with Windows XP or, if they’re lucky, Windows 7.


Satya is having enormous financial success in chasing much needed markets, but the jury's still out on ditching MS's strengths in large software development in favor of a no-support permanent beta culture at a moment when the latter is showing its ugly side.


I kind of expect that the "let consumers be the beta testers and fire all our QA" approach is going to majorly bite them in the ass at some point and lose a lot of goodwill.


The thing I don't fully understand about Nadella's strategy is what Windows is supposed to be any more. Clearly if all these whizzy subscription-based online services are going to be useful, people need a device to access them. For simple communications and information consumption, the modern generation of devices like smartphones and tablets works just fine, but for anything more involved or with a significant creative element you still need something more like a PC.

However, it looks like Microsoft is isolating the online services from the client platform in order to de-prioritise Windows, which inherently makes Windows itself more replaceable. That creates a huge potential gap in the overall tech market, particularly once Windows 7 reaches end-of-support and a lot of people are going to be wondering about what they can do if they don't want Windows 10. (After all, if they did want Windows 10, why are so many people still not using it?)

There obviously aren't that many businesses that would be in a position to exploit such a vulnerability and start building not just a viable alternative client OS but a viable alternative ecosystem around it, but there are some. Apple could be a threat from one direction. Traditional big tech companies like IBM could be a threat from another. With the foundations available with modern OSS stacks, an unexpected alternative from (or sponsored by) the likes of Red Hat might be less likely but not totally out of the question either, or even a curveball from the likes of Intel.

So, what if one or more of them really did produce a compelling client platform and moved the whole industry back in the direction of local software and in-house operations? If the current focus on Cloud everything wanes, but Microsoft has voluntarily given up its long-standing dominance of the more traditional parts of the industry by neglecting Windows as a platform in its own right, where does that leave their online services five or ten years from now?

Maybe the likes of Satya Nadella are just much better connected in the big boardrooms than a mere geek like me, and they're confident that the emphasis on Cloud everything isn't going anywhere any time soon. But now we've been doing that for a while and the gloss has worn off a bit with all the real world problems of reliability, security, leaks, costs, and so on, I can't help wondering whether Nadella is mostly playing to Cloud-obsessed big investors and at some point those investors will prove to have been throwing good money after bad because they didn't know any better.


> So, what if one or more of them really did produce a compelling client platform and moved the whole industry back in the direction of local software and in-house operations?

I've been thinking about this too but I don't see who would stand to gain from it.

If Wine got some serious commercial backing it could be leveraged to run all "legacy" Windows apps on another platform (Linux). Desktop Linux is relatively far along but with solid commercial backing it could become a serious contender. Or something more exotic...

The only question is: cui bono? Who would have a reason to go that route?


The only question is: cui bono? Who would have a reason to go that route?

Someone like Valve, who has a massive investment in Windows software in areas that Microsoft has specifically targeted for "embracing."

What Microsoft doesn't seem to understand is that when they screw with Windows and deprioritize the needs of its users, they're also screwing with the rest of us who live, work, and play in their ecosystem. Gabe Newell, being an old-school 'softie himself, understands that very well... and he's one of the few interested parties with the funding to do something about it.


"What Microsoft doesn't seem to understand is that when they screw with Windows and deprioritize the needs of its users, they're also screwing with the rest of us who live, work, and play in their ecosystem."

That is the part that baffles me. Visual Studio Code is a fantastic advert for the Microsoft brand: every day it shows every developer who uses it that MS can produce genuinely great software. It makes Azure more credible, as well as providing on-ramps to services. If people have a bad experience with Windows, why would they be inclined to get other services from MS?


Visual Studio Code is a fantastic advert for the Microsoft brand ... If people have a bad experience with Windows, why would they be inclined to get other services from MS?

Ironically, we decided not to try out VS Code specifically because of privacy/security concerns about Microsoft that have been heightened since Windows 10. We checked Microsoft's written policies to see what sort of telemetry data was being uploaded. We were unable to find any detailed information about it, and the general wording appeared to allow for uploading more-or-less anything, up to and including our own code.


Put:

  "telemetry.enableCrashReporter": false,
  "telemetry.enableTelemetry": false

into your VS Code settings and you are fine wrt telemetry.

What concerns me more is that the .rpm build is consistently later than the other builds. In practice that means the user spends two out of every four weeks staring at an update notification that can't be acted on, because the build isn't ready yet.

It brings back memories of MSIE and WMP, which used to be available for Solaris and HP-UX, always slightly later... until they weren't, and the users were left in the dark.


Put: "telemetry.enableCrashReporter": false,"telemetry.enableTelemetry": false, into vscode settings and you are fine wrt telemetry.

... for the moment, as far as you know.

The problem is the trust thing, not the telemetry thing.


The problem is the trust thing, not the telemetry thing.

Yes, exactly, and trust is hard to earn but easily lost. Given the corporate philosophy of Nadella's Microsoft, and the fact that Nadella was made CEO in the first place when it was pretty clear what style of leader he was going to be, I don't see Microsoft ever regaining the kind of trust we used to place in it without fundamental changes that start at the highest levels. And sadly, as long as investors continue to buy into that vision, those changes are highly unlikely.

It's not often I feel so strongly about a business, but Microsoft's actions in recent years almost feel like a personal betrayal after being their customer for so long and trusting them as a keystone of the tech industry. In this case, I really do regard them as "the enemy" now, and I really do hope that someone else will drive a wedge through the gap created by their customer-hostile strategy, with painful and expensive consequences for their management and investors.


It's difficult to predict who might see what potential advantages in shifting the market back in that direction.

The claimed benefits of the Cloud have always been as much hype as substance, and a lot of the "sales pitch" in the early days has now been debunked. We're not going to run short of stories about major Cloud services having significant downtime, poor support, data leaks and so on any time soon.

Likewise SaaS is a great pitch if you're a software developer, and converting capex to opex makes it attractive in some contexts as a customer as well. However, I get the feeling that patience is starting to wear a bit thin in some quarters now, with customers looking at the overall costs they're paying, how many improvements they're really seeing in return, and also how many unwanted changes are being pushed out.

Depending on where you stand on those issues, combined with the general dumbing down associated with many online apps compared to traditional desktop software, I can imagine a variety of situations where businesses might be looking at keeping more IT in-house as we go round the cycle again. For personal use, I suspect there's less of an incentive because many of these users are quite happy with just using their tech for basic communications and information retrieval, but for those who do want to do more creative things or who enjoy more demanding applications like games, again there's surely a big enough market for a serious desktop OS with an emphasis on good management of local native applications and not just being an expensive thin client for online services.

So who potentially benefits from an industry shift? I guess the simple answer is anyone who is interested in developing software for the markets I just described.


> I kind of expect that the "let consumers be the beta testers and fire all our QA" approach is going to majorly bite them in the ass at some point and lose a lot of goodwill.

I wish I could agree with you. However, the QA layoffs at Microsoft were in 2014; if something like that was going to happen, it would have happened by now.


I am beyond sour on the Surface line.

I championed it when it came out, and despite being in a hard position to do so, I managed to pick one up.

I deeply, deeply regret it.

If anything, I have gone from being a massive supporter to actively doing what I can to spite the brand.

The product is an expensive laptop with critical issues that render it inoperable.

Then when you need to talk to MSFT support, they are about as capable as toddlers: politely unaware and unhelpful.

The Surface line went from having a great rating to having that rating revoked because it is unrepairable.

I want to love my device, I am using it right now, but there are serious issues with it, and this is a device that has been placed in the same context as Apple products and their legendary support reputation.


They also will never drop Windows, because it's probably going to be an extremely useful platform when the next generation of devices comes along; think of AR.

The fact that they dropped the mobile market (where windows could not fit as very different hardware was involved in the beginning) doesn't mean they will miss the next shot too.

They will have an entire ecosystem up and running, one as flexible as Windows 10 and ready to be tailored to whatever comes next. Of course it would be crazy for them to drop it now.


The fact that they dropped the mobile market (where windows could not fit as very different hardware was involved in the beginning) doesn't mean they will miss the next shot too.

Every single incarnation of windows mobile / phone was well tailored for the mobile hardware of the day.

What sank Windows Phone was not that it was slow; it was that it was late to go touch-first and failed to differentiate enough to make up for it. That, plus tightly controlling the vendors while up against Android, made the quality of Windows Phone itself irrelevant.


You seem to be suffering from the same delusional thoughts that affected Myerson.

This is such an utterly clueless explanation of why Windows Phone failed that it’s kind of stunning. Until, of course, you remember the culture-induced myopia I described yesterday: Myerson still has the Ballmer-esque presumption that Microsoft controlled its own destiny and could have leveraged its assets (like Office) to win the smartphone market, ignoring that by virtue of being late Windows Phone was a product competing against ecosystems, which meant no consumer demand, which meant no developers, topped off by the arrogance to dictate to OEMs and carriers what they could and could not do to the phone, destroying any chance at leveraging distribution to get critical mass…

Interestingly, though, Myerson’s ridiculous assertion in a roundabout way shows how you change culture…In this case, Nadella effectively shunted Windows to its own division with all of the company’s other non-strategic assets, leaving Myerson and team to come to yesterday’s decision on their own. Remember, Nadella opposed the Nokia acquisition, but instead of simply dropping the axe on day one, thus wasting precious political capital, he let Windows give it their best shot and come to that conclusion on their own.


Well, that's a very myopic view of what happened to Windows Mobile. Windows Mobile and BlackBerry owned the mobile enterprise market for years. Then the iPhone came around and, despite not having any ecosystem at all (or even native app capability), immediately started displacing both of them rapidly. Microsoft and RIM both gambled for a long time that the ecosystem would keep Windows Mobile and BB OS in place, and only realized that it wouldn't once the iPhone and Android had taken most of their market away. Microsoft then concluded from this that ecosystem didn't matter as much as people thought it did, and that if their offering was good enough the ecosystem would materialize by itself. Windows Phone was never good enough compared to iOS and Android for that to happen, so it failed. But the underlying theory was not crazy given what had just happened in the enterprise mobile market.


I still can't wrap my head around everything that was wrongheaded about Windows 8, and I was working there for most of its dev cycle.

One wonders what would have happened if they just sort of let Windows be Windows. I.e. if they had continued iterating on core stuff but left UI and general philosophy resembling Win7's trajectory and not tried to force people into WinRT/UWP/store.

Even though Win10 attempted corrective action it always struck me that it was still accepting the fundamental thesis of Win8. They still kept with the Orwellian redefinition of phrases like "Windows app" to mean very recent, immature and unproven technology. They were still talking about ARM devices that don't let you do a straightforward recompile of old code. They were still pushing the Store as the only means to distribute software, where even Apple has not succeeded in changing people's habits on the desktop.

I hope that with a weaker organization, people keeping the lights on for Windows will know to not shake things up too severely and ease up on pushing some of these silly ideas. I somehow doubt it. I've been hoping for that for... 7 years?


> One wonders what would have happened if they just sort of let Windows be Windows. I.e. if they had continued iterating on core stuff but left UI and general philosophy resembling Win7's trajectory and not tried to force people into WinRT/UWP/store.

With Windows 8 they tried to do too much too quickly and not well, but their motivations and intentions were spot on in my opinion. Security has always been a pain point for Windows and it has only been getting worse; the Store introduced a sandbox like those found on iOS and Android. ARM, iOS, and Android showed that modern hardware and software can deliver power efficiency that puts x86 and Windows to shame. UWP introduced hardware independence and a modern development framework that would put Windows apps on a level playing field with iOS and Android.

The problem was that Microsoft thought that, merely by virtue of their existence, developers would adopt UWP and the Store despite the path forward being a huge pain in the ass. We saw how that played out. With Windows 10 we've seen a course correction and a change in strategy: you can have traditional applications sandboxed in the Store, and you can have UWP Apps outside the Store. Microsoft bought Xamarin and is folding it into UWP; the intent is to have not just hardware independence with UWP but OS independence as well.


Is there any metric that doesn't show the Windows Store to be an abysmal failure? Have any Windows Store apps gone viral? Despite the enormous installed user base, there seems to be a definite lack of excitement around Windows apps (other than games) compared to the Android and iOS app stores. And for games, Steam feels like it's more dominant.


No, but recently I started using the Windows Store version of the Arduino IDE. It's been a lot less trouble to set up and keep working across devices. If more apps moved to the Store, I would consider using them over managing updates across all of my devices.


Has security been getting worse on windows?

That’s a bold claim presented without evidence.


> and you can have UWP Apps outside the Store

IFF your user has enabled side-loading.


That's... not what most users want. Even as a dev myself, I understand most people just want freedom and no hassle: the OS should be there in the background, not carving out a narrow allowed alley for its own users.

I will keep my Win7 for as long as the hardware and software keep running. I don't expect much more from an OS, honestly.


> That's... not what most users want. Even as a dev myself, I understand most people just want freedom and no hassle: the OS should be there in the background, not carving out a narrow allowed alley for its own users.

Android and iOS have demonstrably shown that most users don't value that freedom as long as the OS works for their needs. Most users are security nightmares, as IT admins and anyone supporting family members can attest. The Store is Microsoft's attempt to bring that stability and safety to the wider Windows audience.

That's not to say that there isn't a significant subset of power users who want the ability to do what they will with their machines; however, they are the exception.

Microsoft is struggling with how to cater to both markets simultaneously.


There's a big difference between phones and tablets and productivity machines like desktops and notebooks.

The former are designed to be casual, and apps grew out of a situation where there was previously nothing, while desktop OSes have had decades of external software being installed regularly. Introducing something new, especially something that opens up so much potential, is far easier than trying to change the existing habits of both consumers and vendors in an established landscape.


And still, the Mac OS app store failed as well.


IMO iOS’s success comes more from the quality of the applications than the security model per se.

The iPad is a good case showing how the same model doesn’t succeed without top class apps.


I second this.


I feel like they started gathering metrics and then had no idea how to interpret them, and the result was Windows 8. "The Start menu only gets clicked on a few times per day. That must mean no one uses it, so let's remove it". Are you kidding me? It's like they really have no clue there. Some of these things are just so blindingly obvious you are left with no words...


Yep. If this was a car, they’d have said: “clearly the brake pedal is needed but let’s get rid of the door handle, people only touch that occasionally”.


It would have helped if they hadn’t done everything they could possibly imagine to fragment Windows as much as possible. Six SKUs of Windows desktop, then on top of those two entire desktop UI environments in desktop and Metro, then Windows on ARM, then shifting from one new development stack to another, then another. Then mandating Store apps. This was foot-shooting on full automatic, clip after clip. I switched to Mac desktops in 2007, before Vista, so I watched all of this with stunned bemusement and, to be honest, have little idea how it affected users and devs reliant on the platform, except from reading the occasional article by Paul Thurrott.


> Six SKUs of Windows desktop

Here's hoping they'll rethink that at some point and follow Apple's approach of having just one desktop OS instead of nickel-and-diming the masses for features such as BitLocker.


On the other hand, compare Windows desktop market share with that of macOS. When you're tiny, limited choice makes sense. When you're huge, product differentiation makes sense. When you're a premium product, you can charge whatever you want. Windows appeals to people who spend $0 on an OS and also to people who can spend $300 on an OS. If they cut that down to one product, they'd be losing money.


This is true, but fragmenting your platform is an existential threat, and nickel-and-diming customers just isn’t worth it. The problem is Microsoft thought they were so powerful and had such strong lock-in that it didn’t matter. They were wrong, hence the radically simplified SKUs for 10.

But they just can’t bear to actually get it all right, so they still try to stiff people to switch out of ‘S’ mode.


There are two constant promises for each new Windows:

- There will be fewer SKUs

- Updates won't require restarting the OS

Well ...


>One wonders what would have happened if they just sort of let Windows be Windows.

Presumably, given that the opposite has happened, in that case, "targets" would not have been met by "Q4" and important team members would have to be "let go".

As you point out, this is not new. I would pin the start of this trajectory on the introduction of Windows Activation with XP. It's all variations on the theme of restricting the user's freedom. It hasn't been your software since 2001. It's theirs.


It felt like somebody at a very high level decided to have a unified experience for both tablets and desktops. Then the people who said this was impossible to do (well) in the given timeframe were ignored. Windows 8 was what they managed to scrape together before running out of time. Delay was not an option, as the tablet hardware launch depended on the OS.


One thing I love about Windows 10, which may sound silly: the Netflix app. I can download anything downloadable on Netflix. Sadly, I can't do this basic thing on my MacBook Air because they have no desktop app. It's a small thing that makes me enjoy Windows 10.


I honestly don't know the answer to this, but is that a Microsoft/Apple problem, or is it a Netflix problem? Apple has an app store for macOS, and there are tons of unofficial Netflix apps in there. Netflix has an app for iOS. So why hasn't Netflix made an app for macOS?


I don’t know either, but I wouldn’t be surprised to discover Netflix thinks people with Apple laptops are likely to have an iPad or iPhone as well.

With the possible exception of my non-Retina iPad mini I’d rather watch Netflix on any of my mobile devices. It never even occurred to me to look for it on my laptop, because the form factor is so ridiculous for movie watching.


Very possible. I've watched movies off my desktop for years, and my network isn't always reliable, so being able to predownload shows on my Windows desktop is perfect. It returns me to my experience pre-Netflix/Hulu. I wish the streaming services were not getting so fragmented; their greed isn't helping, it's only driving people back toward piracy all over again. I'm speaking, of course, of The CW. I'm not paying to stream only one network just to watch Arrow and co., dang it!


As a developer, the Windows target is baffling. I briefly considered porting a few apps I wrote over to Windows just to re-learn the platform. Might be fun! Back in the day, the choice was easy: MFC/Win32.

Now I have no idea where to even start! C#? C++? .NET? Win32? MFC? WPF? XAML? WinForms? UWP? What a rat’s nest! How did MS let the development ecosystem get so muddied? I know I can go online and research all of these, learn the trade-offs, and hope I bet on the right horse, but then again I could also choose to simply leave this crazy platform alone and work on more features on the platforms that are more comfortable.


Here's how simple I think it is:

Out (but not gone/forgotten): Win32 (hand-coded C, C++ MFC, C++ ATL and related alphabet soup [aka back in the day the choice wasn't easy, either], VB6, .NET WinForms, etc), WPF/Silverlight (a .NET only XAML platform)

In: UWP (C++ with XAML, C#/.NET with XAML, JS with HTML/CSS)

Just three UI platforms to consider (Win32, WPF, UWP), two of which are mostly deprecated and not recommended if you are building a fresh application or porting one to a new UI. The remaining platform (UWP) gives you language choice (C++, C# or other .NET languages, JS), but not UI design framework/markup language choice (XAML, unless JS and then HTML).

If you are doing a fresh port of a C++ application from another platform, the choice seems pretty simple to target the UWP and use XAML to define your UI.
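To make that concrete, here's a rough sketch of how small the UWP entry point ends up being once you've made that choice. This is illustrative only, not copied from any particular template: it assumes a standard UWP project where the XAML tooling generates the Main() bootstrap and the App.xaml counterpart for you.

    // Minimal sketch of a C# UWP entry point; assumes a standard UWP project
    // so the XAML build step generates Main() and the other half of this
    // partial App class.
    using Windows.ApplicationModel.Activation;
    using Windows.UI.Xaml;
    using Windows.UI.Xaml.Controls;

    sealed partial class App : Application
    {
        protected override void OnLaunched(LaunchActivatedEventArgs e)
        {
            // Any UIElement can be the window content; a real app would
            // normally host a Frame that navigates to XAML-defined pages.
            Window.Current.Content = new TextBlock
            {
                Text = "Hello from UWP",
                FontSize = 32
            };
            Window.Current.Activate();
        }
    }

The point isn't the specific UI code so much as that, after picking UWP, the day-to-day work is just C# (or C++) against one API surface, which is a much smaller decision tree than the alphabet soup above suggests.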


Except that with Win7’s dominance your UWP app isn’t going to run on most of your clients' machines for many years. In a corporate environment it sort of kills it from the outset. I think Microsoft made a big mistake by not making UWP Win7-compatible.


Windows 7 doesn't have "many years" left. Windows 7 has less than two years left in its support lifetime. (2020 is when extended support ends.)

In the meantime, if you, in a corporate environment, must support Windows 7 until the bitter end in 2020, you can use WPF as a bootstrap step towards UWP. The XAML is similar between them, a lot of the coding paradigms are the same (MVVM patterns), and if you architect well (.NET Standard 2.0 libraries) you can share 100% of your business logic between a WPF application and a UWP application (see the sketch below).

Xamarin.Forms also has an open source WPF renderer if you'd prefer to go further down the cross-platform route. (Xamarin.Forms supports UWP, iOS, Android, macOS.)

I agree that Microsoft should have had the .NET Standard, XAML Standard, and Xamarin.Forms-on-WPF development stories in place sooner to better support developers who need to target older Windows, but many of those pieces are in place now, if that is what corporate developers were waiting for.
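As a hedged sketch of what that sharing can look like in practice (class and property names here are made up for illustration), the business logic sits in a .NET Standard 2.0 class library that both front ends reference and bind to, with nothing platform-specific in it:

    // Hypothetical view model in a .NET Standard 2.0 class library,
    // referenced by both a WPF project and a UWP project.
    using System.ComponentModel;
    using System.Runtime.CompilerServices;

    namespace Shared.ViewModels
    {
        public class CounterViewModel : INotifyPropertyChanged
        {
            private int count;

            public int Count
            {
                get => count;
                private set { count = value; OnPropertyChanged(); }
            }

            // Called from a button handler (or an ICommand) in either UI layer.
            public void Increment() => Count++;

            public event PropertyChangedEventHandler PropertyChanged;

            private void OnPropertyChanged([CallerMemberName] string name = null) =>
                PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
        }
    }

In a layout like this, only the XAML and the project files differ per platform (WPF binds with {Binding}, UWP with {Binding} or {x:Bind}), so the later migration cost is mostly confined to the thin UI layer.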


I'm not developing anything for the desktop right now, but when I think about it, the conclusion I come to is to give up on MS technologies and go with Qt. It's much more stable, and you get Linux support for (almost) free.


Yea, this is where I ended up. If I’m going to port something to oddball non-Unix-based or otherwise less-important platforms, like Windows, Qt makes the most sense.


This reminds me of "How Microsoft Lost the API War": https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...


You forgot Universal Web Apps in your list, which, like UWP, is the least universal part of the Windows ecosystem.


You start at Electron. Having used most of the GUI frameworks listed above, making a GUI app in HTML/JavaScript is faaaar easier.


They bet the farm on “store”. Why? Because they must have done the math and realized that the enterprise and “creative pro” desktop with classic expensive desktop apps is a dead end. Was that calculation correct? Who knows. But I have to think they did that analysis very carefully.


It was wrong. None of those apps have been replaced. I use them every day and switched to Windows 3 years ago. I was stubborn and should have done it sooner. Maybe MS saw Apple abandoning the pro hardware market and was trying to play along “because Apple”.

A fun result of the whole mess is that after a new Windows 10 install, the first thing I do now is open the Store to install Ubuntu for WSL.


I think they saw Apple’s success with the iOS App Store, saw Apple trying to do the same on desktop and tried to leap ahead of them by forcing the pace. Of course when app stores on desktop, for both Apple and Microsoft, proved much more problematic to implement, ramming it through like that proved highly damaging.


It's funny because even if you put yourself in the mind of 2009, you can imagine a much more modest store feature doing good things for end users.

The old Windows desktop might have had a half dozen things prompting you to elevate and update. Flash player, Adobe reader, Java runtime, all these things wanted your undivided attention to update, not to mention were polling the network and using resources to do that. So maybe you could see a single piece of infrastructure allowing those third parties to hook in less intrusively.

But that's not what they asked for. They would prefer you trash your app and write for the thing they came up with this year, to be thrown away when they do the rewrite the next year.


> They would prefer you trash your app and write for the thing they came up with this year, to be thrown away when they do the rewrite the next year.

That hasn't been true for over a year now. Classic Win32 desktop apps have been in the Store for nearly a year and a half, and it's now my preferred way to install things like Office and Paint.NET. Even Photoshop is in the Store now.

(Even iTunes was promised at one point, and I'm still very hopeful for that, because Apple's Windows updater is still one of the worst updaters on Windows, where bad updaters were once an art form.)


I think it's more likely that there was a team, or more likely several teams, looking to make a name for themselves, and they concluded that forcing everyone in the Windows desktop monopoly to use their stuff was the way to get that work noticed. I am sure that a bunch of the people responsible were praised as top performers and rewarded for their victory. By the time the original people are cycled out there is already inertia, and it doesn't really occur to anyone that the previous blunder needs to be rolled back.


Yes, but the solution would be to do something like Steam, and sell normal Windows software in the store.

Selling only WinRT software made all their efforts worthless.

Forcing developers to provide Win32 and WinRT versions of software alienated software developers.

Some of the calculation was correct. WinRT was not.


While I agree to a point, I do think the "wild west" nature of a lot of Win32 apps kind of prevented it, as they couldn't manage the security. If there were some way to wrap Win32 apps in their own individual sandboxes without hampering function, then I could see it working. But with everything else they have to get going, I guess it is too tough a nut to crack right now (if it could be cracked at all).


The obvious counterpoint to that is Steam again.

They don't have to sell all existing legacy Win32 apps in the store, but they should at least allow some Win32 apps.

As for malicious software, I think it would be legally very dangerous for a malicious developer to expose their personal and financial information to MS and expect nothing to happen when caught.


True, but I think (I'm not certain) Microsoft's aim is security via sandboxing, so without thorough testing of the various Win32 apps, I don't see how it gets done. And I can just imagine the blog post from a Win32 developer when their Windows Store release breaks because it doesn't have the same access privileges as the proper version. Then users demand added security, but with no net impact on user experience/expectations and all that.

And all it would take is for the developer to claim that their machine had a virus, unbeknownst to them, that somehow made its way into their binary.

Dunno. Steam is in a position where they don't really have to care about the Windows brand, but Microsoft does, so Steam has a bit more freedom in that regard.


>immature and unproven technology

Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now..

Granted, the Windows Store still sucks ass years after its introduction, but building a new ecosystem with unified and modern tooling was probably a necessary step for them to take.


There was nothing wrong with their core desktop technology. Vista at least allowed them to solve the security issues, and with seven they had a really great, solid desktop OS. The problem was actively undermining all that with Metro and RT. What’s popular and effective about 10 is all the crud they stripped off it, much more than anything they added, just as with 7.


> with seven they had a really great, solid desktop OS. The problem was actively undermining all that with Metro and RT.

I sometimes wonder if maybe they should have done a hard fork between 7 and 8 and continued to develop 7 for commercial users, with 8 as the consumer version.

In the commercial branch, they could undo some of the moves they made years ago (like putting GDI in the kernel rather than user space) to eventually get a more stable, secure OS.


I think they moved tons of kernel-space code back into userspace in recent versions of 10, probably because the quantity of discovered security vulns was starting to become really ridiculous.


The security issues aren't "solved" as long as the expectation is that apps run with full user privileges; that's why the AppContainer/UWP model was and is needed.


Define "proven to be bad". What is the metric there? And note also who considers it "proven bad". It was winning, and it was also the better UI at the time. It did indeed have bugs, but not that many of them affected most users, who were fully comfortable restarting and moving on. The viruses were an issue, though.

The thing with "people making fun", though, is that a lot of it was a by-product of the culture war between the Linux and Windows worlds: between radically ideological open source and closed source (where Microsoft engaged in a lot of ugly tactics), and between techies who wanted tech focused on their needs and a commercial company that cared significantly less about the needs of admins, tinkerers and such.

Meaning the jokes were much more political and much less technical. They were often the result of people hating Microsoft (recall those ugly tactics).


>Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now.

A lot of that stuff could have been (and is now being) fixed incrementally. Lots of that tech is available to Windows desktop applications without needing to switch to UWP. Some of it is UWP-only (except for a backstage pass for Edge, which tells you everything you need to know), purely to push developers to the Store.


> Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now..

I feel like MS had mostly quashed these problems with XP. It was certainly not bug-free, but after a few years it was a solid enough platform that many used it all the way to (and probably past) its EoL in 2014.


Or if Windows 8 had set one UI as the default for tablets and the other as the default for desktops. The OS was faster and had better load times. It was a far cry from the failure of Windows Vista. It was just the Start menu situation that really killed it for most people.


Remember, at that time Windows 8 was the MS solution for tablets, and MS was getting killed in the tablet space as iOS took off. Windows 8 wasn't officially out until more than two years after the first iPad. They had to get it out ASAP.


Windows 8 may have been a response to iOS popularity, but don't forget that Microsoft had been pushing tablets for a long time by then. They had a huge head start over Apple. If you want to include Windows for Pen Computing, then it was an 18 year lead that they lost overnight.


Apple released their own hardware pen-based handheld computer in 1993.

https://en.wikipedia.org/wiki/Apple_Newton

Windows for Pen Computing was released in 1992.


Yes, but Newton's OS had nothing to do with MacOS, being a completely independent system with its own API and even programming language (Newtonscript). Windows for Pen Computing was Windows.


Good point. I guess I would respond with the difference being that Microsoft developed the technology continually during that time whereas something like the Newton was a relatively short-lived product.

For the record, I had a MP120 and absolutely loved that machine.


To be fair though, it’s not that a desktop app store couldn’t work, it’s that Apple’s implementation and set of restrictions are infuriating.


They took risks with Windows 8 and I give them credit for that, but yeah, I'm glad they changed course because it wasn't working.


What strikes me is that their latest bet seems to be universal web apps. I have nothing against UWAs in themselves, but I very much doubt that the mass of Java/VB/C#/C++ developers who write the bulk of Windows apps will suddenly move to JavaScript. And if you've already ported your app to the web, why bother with UWA?


Agreed. If I wanted to be a web/javascript developer, I would already be doing JavaScript. But I don't want to.


This seems like a ridiculous strategy. Who would abandon an ecosystem that they own over 80% of? If they're spending too much money on it, stop adding features and harden what they already have. It's increasingly clear that macOS isn't business-level stable, and Linux isn't polished... No one has an Android device on their desk at work. Maybe if they stopped trying to take over the world they would realize that they already own a huge part of it. If they don't care, they should sell it to someone who does.


They're not abandoning Windows or the ecosystem. This is just the end of it as the tip of the spear/driving force for MS strategy. Nadella is (rightly, imo) focusing more on o365 and Azure.

One of the points the article makes is that Ballmer's MS would do things like not release Office for iOS because that would potentially eat into Windows market share. That will no longer happen. MS sees it's more important to get o365 everywhere they can.

From my perspective, having switched small businesses over to Exchange Online and other o365 services, the future is bright for MS in that space. The SMBs we've switched over have gone from using a version of Office they purchased forever ago and an old Exchange license to paying $50-100 a month for Office 365.


I agree that getting Office to run on Android and iOS is a bright future, but that doesn't mean you put Windows in the corner with "some other stuff". Maybe there are too many people on it and they've gotten too comfortable? Spin it out into a company with half the staff and let it soak up its own sun.


The problem with Windows is that operating systems are a commodity. You shouldn't care about the OS. You care about opening your documents on your desktop and phone; you don't even care what program opens them.

Mac OS X, Linux plus a desktop (any of several), iOS, and Android are all operating systems that can do everything you need of an OS, except MAYBE run Windows programs.

Which is to say Windows as a separate company probably cannot make it long term.


If that were true, Linux on the desktop would have already won and this decision wouldn't be controversial. I think what's preventing this is that making a productive desktop operating system is more difficult than we think.


The desktop matters less and less.

The only computer I own at home (other than ipad/iphone) is a Chromebook. But I lease a cheap VPS that lets me do "computer stuff" otherwise.


Not at all; if you want to consume a lot and create a little then your thesis is right, but the desktop is king for creators. There are a slew of very mature Windows only software out there serving many fields of endeavour for which there are no competitors in either FOSS or other OSs.


Eh. Creator software comes in two tiers. The dilettante tier (e.g. lower-end Photoshop) is actively ditching the desktop altogether, while the higher end (think of Color, the motion picture color grading system) either runs on Macs or on such dedicated machines that they might run Wintel but are not "desktops" anymore.

(Then there's scientific software -- Stata and Matlab and that stuff. But ten years ago Matlab, like gagh, was best served on higher-end minicomputers, and now everything is Python and can be run on a cheap VPS or on AWS.)

Besides the sheer gobsmacking convenience, there's also the fact that it's way more fun for bean-counters to have higher OPEX in exchange for lower CAPEX. It really kicks the llama's ass.


> There are a slew of very mature Windows only software out there serving many fields of endeavour for which there are no competitors in either FOSS or other OSs.

Can you list some? That sort of software is my favorite genre and I'm always interested to hear about different ones.


Apologies for the delay in replying. I use MS Project and an add-on called SSI Tools daily, both essential for CPM based project management. Also MS Outlook plus Clear Context for PIM and communications functionality. There's also MindManager which is a highly functional mind mapper plus a couple of plug ins for it.


No. Linux on the desktop has its own specific list of pros and cons. Windows can be better by enough to matter, even if inertia is its only advantage.


Isn't that my argument?


You seem to be arguing that is enough to make windows not a commodity.


> Windows as a separate company probably cannot make it long term.

That is surely overstating it, at least for definitions of "long" meaning <= 20 years. If you assume that Windows is more or less "done", its engineering expenses could be low, and revenue significant, merely from serving the corporate and various niche (gaming, etc.) markets which are still heavily bound to Windows. Their market share is declining in some niches (such as CAD) and maybe even overall (to OS X), but negative revenue and market share growth does not necessarily mean quick extinction. Some dinosaurs take decades to bleed out.


Are you kidding? I'd be ecstatic to go purchase a new boxed copy of Windows 11 Pro 2018, so long as it was basically Windows 7 shell with Windows 10 kernel and security updates, no UWP/Cortana/store/telemetry addons, and settings that stay set.

I'd even buy the Windows 12 Pro 2021 version when the time came for fundamental internals shifting like wider WSL access.


You might but no one else would. Most people use the OS that came with their computer until it dies.


You might but no one else would.

I would. An update to Windows 7 that worked well with modern hardware and had modern security updates but otherwise ignored almost everything Microsoft have done in more recent versions would be a significant improvement on anything currently available for those of us who value stability and security over bells and whistles.


I would and have. My last PC was built with Vista. I bought 7 when it came out and then 8. I'm far from the usual user though. My point was that regular people will not.


Not saying that you are wrong, but Office for iOS was in development long before Ballmer left, so "never" isn't as true as "not soon enough."


> Nadella is (rightly, imo) focusing more on o365 and Azure.

Windows is a perfect example of no focus at all. Maybe this makes sense for a true mainstream OS.

To be honest, I wonder if Azure, Exchange Online, or Office 365 can really be their future. Why not use AWS, Scaleway, or a local service? It's cheaper and better.


> macOS isn't business-level stable

Amazing how many people here are missing the point. This is not about Windows vs. macOS. The client OS has become a commodity with prices approaching zero (macOS no longer charges for upgrades, and neither do iOS or Android), and users care less and less about how they get to Facebook.

Think about the server/mobile space. If MS tried to hang on to their dominance simply via desktop lock-in (because it worked in the past), that would be their Kodak moment. All Nadella is doing here is emphasizing efforts on staying competitive in cloud/o365, which is where he thinks they can continue to grow and bring in revenue from enterprise clients. Fewer and fewer companies want to lease/run their own Windows/database servers, and with technologies like containers a lot of companies are now going with open source solutions.


Established businesses aren't going to put their data in someone else's datacenter. Small to medium businesses perhaps, but all it takes is one downturn that tightens the purse strings, or one critical event that reveals their absolute dependence on a third party. SaaS solutions are fair-weather friends, and while we've had a lot of good weather, I don't expect them to be used for critical business functions in the long term.


> Established businesses aren't going to put their data in someone else's datacenter.

No, established enterprises (both private and government) are, in the real world, important customers for cloud offerings like Azure, GCP, and AWS.


During an AWS demonstration at work, the Amazon guy specifically mentioned that AWS is a more lucrative option for large companies than small ones, simply because it's an easy, one-time way of culling large teams in big companies who were adding very little to nothing in value in return.


Hmm, not convinced about that.

My boss is smart, and he'd wonder what the business plan would be should you, say, five years down the line, want to move away from the cloud for some reason.

You threw away decades of accumulated knowledge about the company and its systems to save some money upfront.

He thinks on a longer horizon than most bosses I've known, though (largely because his family owns the group, so he doesn't have to answer to shareholders).


But if your business is to produce, say, cars, why would you want to maintain this knowledge in house? You certainly want to understand your systems, but do you really want to deal with physical datacentres, updating the underlying OS, replacing drives, etc.? You kind of want it outsourced to specialists.


Yeah, that ought to be a great sales line.

I'd be wary about how the real world actually deals with that, but there is no lack of large companies jumping on short-term cost cuts without looking at the risks.


From the point of view of upper management in a large enterprise, paying a third party with whom they have an SLA and someone to sue for infrastructure that may go down, vs. paying their own staff to build and maintain infrastructure that may go down, often either isn't a difference in risk or, actually, the former seems to be the better-managed risk. It's the same reason they often prefer to buy support contracts even for OSS rather than supporting it internally. Part of the analysis that probably gets overlooked from the outside is that employees aren't just a predictable cost; they are also a risk.


> isn't a difference in risk

Really? People incentivised to look out for your best interests vs. a managed organization incentivised to dissolve your contract and come out ahead on average. I can't imagine anything that could create more different risks.

What people seem to do all the time is look at an SLA and believe the real world must behave the way things are on paper, because there's a signature there. People do this even after being recently burned by the same partner. Some people have the same trust in in-house staff, but the usual bias in upper management is the exact opposite.


With the developed world being increasingly dependent on AWS, Azure and Google Cloud, the next big fuck up is going to be interesting. We have seen multiple datacentres from the same cloud provider going offline simultaneously in the past. And it is safe to assume that hostile foreign state actors took notice of this critical dependency. Whether it is because of malice or mistakes, I can’t help thinking that this massive concentration is like a snowball increasing in size while it’s going down the slope.


Indeed. If/When another major conflict comes to the developed world, you better believe major cloud DCs will be pretty high up on the cyber and (if it comes to it) kinetic target lists. Never have so few facilities been relied upon for so much by so many.


It seems like my comment is being evaluated without proper context. Sure, some companies are moving some of their data into someone else's datacenter. Is that 50% of companies? Are they moving their core business data? Like when there's a network outage, they can't do business? When there's a partial data loss, they close their doors?

When we're talking about an entire ecosystem going away, the new one has to have almost 100% adoption. Windows desktop is going to exist for a really long time, especially when everyone else is giving up on desktops.


Those 2 statements are completely different issues.

And yes, companies are entirely moving their systems and data. Salesforce runs their sales teams. Oracle and SAP run their ERP and finance. And now AWS/Azure runs their IT infrastructure. They use email from Google or Microsoft, along with hundreds of other SaaS products used day to day by employees. New startups are even more invested and start using other vendors for every non-core activity from day one. This is what a strong global business ecosystem enables, and it is a massive advantage for everyone.


You have entirely missed the last 2 decades of this happening and at a rapidly accelerating pace. I'm not sure what you think "business critical functions" are but every company outsources at least some important components that could easily put them out of business if something goes wrong, whether that's cloud servers or payroll or manufacturing supply chains, etc. It's called risk analysis and yield management and is well understood by every successful team.

When it comes to "someone else's datacenter", I think you'd be surprised at just how few companies even own a datacenter, let alone their own racks. Even the major clouds lease their own space from actual datacenter builders and operators.


>Established businesses aren't going to put their data in someone else's datacenter

You are incredibly, unbelievably wrong here. I work for a vendor known for incredibly high prices and have knowledge of basically the entire infrastructure of every one of my clients, and nearly all of them have a massive cloud strategy that's in place right now. I even have clients shipping their security and compliance logs to the cloud, or sending them from cloud to cloud.

Actually, it's the medium-sized businesses that are the most hesitant to move to the cloud, because of the lack of cloud expertise on their teams. But even then... it's still a huge strategy. They're all consolidating physical datacenters and using Amazon or Azure or Google.


On the topic of cloud strategies and infrastructure, what are your thoughts on the Mulesoft acquisition? Does this foray into infrastructure software really make it that much easier for CRM to sell the rest of their SaaS stuff? Obviously a lot of CRM and potential CRM customers have hybrid models today, does this really help them dive two feet in with Salesforce for their pending 'digital transformation'?

Also, I don't want to get off topic, but how does the SIEM space fit into the infrastructure stack you see in the future? In security, what about the firewall vendors trying to become the central hub for managing the new security needs brought about by the cloud (CASB, endpoint, virtual firewalls, etc.)? What about SaaS-only identity and access management solutions like Okta? I hear they are a game changer, but is a cloud-only solution really the best-positioned seller for something like this, especially considering the hybrid structure of most large orgs today (on-prem and off-prem/public cloud)?


I can’t talk too much about Salesforce and Mulesoft since I’ve never used them and don’t have many clients asking me about them.

When it comes to next-gen security appliances, I see them enhancing the SIEM, but not replacing it. Too many regulations require centralized log management, and organizations depend on a central alerting and monitoring platform. I do see these security platforms making the SIEM cheaper and dumber, though. Whether it's storage-based pricing like Splunk's or event-rate pricing like QRadar's, licensing costs a lot, which means it costs a lot to feed a bunch of dumb logs into a system that makes information out of dumb logs. As long as you already have a Palo Alto, you might as well feed the IDPS logs into your SIEM and forget the rest: it's fewer logs, cheaper licensing, and less hardware needed on the SIEM. You already have the Palo doing the intelligence.

With regard to the hybrid clouds that are popular today, you see a lot of SIEMs go to the cloud. You can ship logs cloud-to-cloud, which saves the bandwidth of backhauling them to the enterprise. It is a challenge for security logging to get data out of there, though. Oftentimes it costs extra to send logs, or it can't be done at all. Amazon wants you to use CloudWatch. Microsoft wants you to use Security Center. It's a pain to centralize it all. That's going to have to change.


Thanks. Interesting....

To your last point: it sounds like it should change, but not that it necessarily will. I can't fathom the cloud providers allowing an independent third party to come in and take this business from them, despite how much easier it would be for the customer.

I thought Splunk was usage-based pricing, not storage-based? But yes, I have heard similar things about how quickly it can get expensive, sometimes without any warning, when they get a bill 10x what they expected...

Thanks for your time.


I suppose time will tell... Many companies will never do this.


they can still take advantage of existing cloud technologies running on their own servers.

And guess what, the CTO and CFO and COO will very much care about licensing cost of each CPU running windows or sql server, vs FREE on Linux.


You're thinking about your own app servers. No one cares about the cost of Windows when you're running someone else's proprietary desktop software on it.


> No one cares about the cost of Windows when you're running someone else's proprietary desktop software on it

This is a small and shrinking corner of the market. Valuable. But still a corner.


It may be shrinking, but it isn't small.


> the CTO and CFO and COO will very much care about licensing cost of each CPU running windows or sql server, vs FREE on Linux

They haven't really up until now, have they? MS licenses are not a big part of the IT budget compared to the people needed to run an in-house data center (typically staffed 24x7).


This is where Microsoft is going with Azure. Compared to other platforms, setting up a local instance of Azure is pretty painless and allows you to spin up resources locally that are identical to the Cloud.


Which is why Microsoft has on-prem and hybrid offerings for most of their products. Actually this is one place where they’re way ahead of AWS and GCE.


Same with SAP. They have a cloud offering (Hana Enterprise Cloud) which can connect to existing on-premise SAP deployments. That way, they can up-sell existing customers into the cloud and then slowly migrate them to even more subscriptions.

Disclaimer: I work at SAP, but not in the same division. And not in sales either, so take my word with a grain of salt. ;)


> Established businesses aren't going to put their data in someone else's datacenter.

I guess the 5 years I spent at a massive tech company shutting down our datacenters didn't happen, and all the recruiters offering me huge amounts of money to move their stuff into AWS aren't real.


> Established businesses aren't going to put their data in someone else's datacenter

[CITATION NEEDED]

This is very myopic. Established businesses are huge customers (both current and potential) for cloud providers.


"Established businesses aren't going to put their data in someone else's datacenter."

Established businesses do this all the time.


>No one has an Android device on their desk at work

If you mean as in a desktop PC, then no, but smartphones are becoming more and more important, and Apple is not strong outside the US. Around here almost everyone has an Android phone as a work phone, and I see more Chromebooks by the day.

I also want to point out that I believe saying they are abandoning the ecosystem misses the point. They aren't. They just don't let Windows dictate what the rest of the business does.


I agree. I wish Microsoft would focus on Windows 11 and not think of Win10 as the last Windows, thereby flooding it with adware nonsense.

There are still a lot of tech heads and gamers who depend on Windows and will continue to buy into the platform. I even know hard-core Linux devs who've switched back to Win10 and just work off of the new Linux subsystem and an X11 server.

With Android being a clusterfuck of vendor-specific bloatware without even a single concept of a clean install, Microsoft should focus on keeping a clean Windows install equivalent to a fresh Mac install.


I agree with what you're saying and I think they need to keep Windows 10 around at least for a little while longer.

Everyone hates it when Microsoft changes something core and foundational about Windows, and thus there has been little change in certain areas of the OS for the last two decades. That build-up of cruft has become untenable, and Microsoft attempted to address major pain points like the Control Panel with Windows 8, then 8.1, and still in 10. The problem is that rewriting something like the Control Panel would consume 100% of developer resources for an entire OS dev cycle. Can you imagine if Microsoft announced Windows 11 and the only difference between it and 10 was a new control panel?

Windows 10 is the perfect opportunity to start fixing cruft. It's already controversial but not universally hated, they got the majority of people to adopt it through free upgrades, and they've introduced the semi-annual upgrade cycle that allows them to introduce foundational changes gradually. With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable AND they improve a core feature silently.


> With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable AND they improve a core feature silently.

And with every shift towards ads or trying to force a mobile-first UI on a desktop/laptop, they alienate users off of it. I switched to OS X as soon as I saw the writing on the wall with Win8 (and Wine on OS X got stable enough to play GTA San Andreas, tbh).

Also, I wonder if it would be possible to get Windows 7 userland to run on a Win10 (and with it, WSL) kernel. Now that's something I'd switch back to Windows for.


> And with every shift towards ads

You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices? Or do you mean how I have to open the App Store to apply MacOS updates and click past the ads?

Windows 10 adds some shortcuts to the start menu during the bi-annual updates; they're annoying but once they're deleted they don't come back. They also have an ad in Explorer for OneDrive that also goes away permanently when dismissed.

If I had to choose between MacOS/iOS Ads, and Windows 10 Ads, I'd choose Windows but ideally they'd both stop it.

> trying to force a mobile-first UI on a desktop/laptop

Windows 8.1 and 10 don't require UWP apps to be full screen. And actually the Settings App in Windows 10 is fantastic because it responsively adapts and is usable whatever size you make it.

MacOS on the other hand won't allow you to resize most settings windows and they're fixed to a size that's comparable with a 90s era Macintosh despite MacOS only working on machines with high resolution displays.

> I switched to OS X ... with Win8

So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?


> You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices?

wtf? I don't use iCloud and don't get this nag. I only get a single question when I reimage my test computers or wipe a test device, that's it.

> Windows 8.1 and 10 don't require UWP apps to be full screen.

The start menu is enough to permanently turn me away. Or this stupid "tile" interface. What's that crap even for?

> So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?

No, it's these things that I always encounter when having to fix other people's computers. Plus I do not really like that Windows 10 has un-opt-out-able telemetry (yes, you can turn it off by hacks, but every random "update" will turn it back on again).


> I don't use iCloud and don't get this nag.

Are you signed into it though? I'm not and it nags me to sign in.

> The start menu is enough for me to permanently turn me away.

Which one? 8, 8.1 and 10 all have very different start menus.

> Or this stupid "tile" interface. What's that crap even for?

In Windows 10 it's not unlike Dashboard in MacOS.


> Are you signed into it though? I'm not and it nags me to sign in.

That's not an 'ad' - there is a ton of OS-level functionality that's tied to your iCloud ID and account, you can ignore it if you want, but it's not an 'ad' if they want you to use it...


> Are you signed into it though? I'm not and it nags me to sign in.

No, I don't use it in any form.

> Which one? 8, 8.1 and 10 all have very different start menus.

And none of them is the Win95-to-Win-7-era one. I need a working menu with nothing else, especially not tiles or ads.

> In Windows 10 it's not unlike Dashboard in MacOS.

Which Apple doesn't force one to use, thank God. Programs folder in the Dock, that's it (although, I admit, I'd prefer a list/menu...).


>You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices? Or do you mean up I have to open the App Store to apply MacOS updates and click past the ads?

I don't get asked to use iCloud every time I unlock a Mac. Additionally, I don't recall a Mac ever displaying ads in its file explorer asking me to upgrade my SkyDrive storage. Nor do I recall Mac OS ever installing third party bloatware apps and games and making them front and center in the Start menu. I also don't recall ever having to proactively prevent the mass collection of "telemetry" data when installing Mac OS.


> I also don't recall ever having to proactively prevent the mass collection of "telemetry" data when installing Mac OS.

To be fair, it will ask you once at profile creation if you agree to sending telemetry. To the best of my knowledge though, Apple respects the choice and does not reset it quietly like Windows does.


Windows 10 privacy controls are intentionally designed to be skipped by the average consumer.

Let's have a look at the "Get Going Fast" screen

https://media.askvg.com/articles/images5/Customize_Privacy_S...

You'll see at the very bottom a Customize Settings button that is nearly invisible. It's located on the left side, shaded in a color that is different from the other buttons and the text size is substantially smaller than the standard button text size.

The majority of users will have missed this and just clicked on the Use Express Settings button.

Let's have a look at what is turned on by default:

Personalization

1. Personalize your speech, inking input and typing by sending contacts and calendar details, along with other associated input data to Microsoft.

2. Send typing and input data to Microsoft to improve the recognition and suggestion platform.

3. Let apps use your advertising ID for experience across apps.

4. Let Skype (if installed) help you connect with friends in your address book and verify your mobile number. SMS and data charges may apply.

Location

1. Turn on Find My Device and let Windows and apps request your location, including location history and send Microsoft and trusted partners location data to improve location services.

Connectivity and error reporting

1. Automatically connect to suggested open hotspots. Not all networks are secure.

2. Automatically connect to networks shared by your contacts.

3. Automatically connect to hotspots temporarily to see if paid Wi-Fi services are available.

4. Send full error and diagnostic information to Microsoft.

Browser, protection, and update

1. Use SmartScreen online services to help protect against malicious content and downloads in sites loaded by Windows browsers and Store apps.

2. Use page prediction to improve reading, speed up browsing, and make your overall experience better in Windows browsers. Your browsing data will be sent to Microsoft.

3. Get updates from and send updates to other PCs on the Internet to speed up app and Windows update downloads.


>Windows 10 adds some shortcuts to the start menu during the bi-annual updates; they're annoying but once they're deleted they don't come back. They also have an ad in Explorer for OneDrive that also goes away permanently when dismissed.

Also ads on the lock screen, ads in the start menu ("app suggestions") and (Edge) ads from the toolbar. Just off the top of my head, maybe I missed some.


Win 8.1 does require UWP apps to be fullscreen. Win 10 does not.


> With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable

And some folks get the really big money for "driving business initiative", "spearheading new projects" and stuff like that.


> hard core Linux devs who've switched back to Win10 and just work off of the new Linux Subsystem and an X11 server.

What's the use case for a "hard core Linux dev" and WSL? I've tried it. It failed miserably for Rails development when it was released. I logged bugs. I went back to what I was doing. Maybe it's gotten good now?


How is Android relevant? It's not a PC OS, and doesn't seem like a real Windows alternative in the same breath as MacOS or Linux.


Project Treble should help for your last point.


>With Android being a clusterfuck of vendor specific bloatware and not having a single concept of a clean install, Microsoft should focus on keeping a clean Windows install equivalent to a fresh Mac install.

Speaking of OEM bloatware clusterfucks, here's your typical HP OEM Windows clusterfuck:

https://www.youtube.com/watch?v=B1GyAI5d-Yw&feature=youtu.be...


The author didn't characterize his position accurately. He meant "Windows Server", not "Windows Desktop". Similar to how IBM sold off its laptop business to Lenovo, the strategy is basically saying "We're a business services company; there's no growth left in the end-user ecosystem anymore, because all of that growth is in mobile, which we suck at, so let's just double down on where we are growing: cloud services and office productivity services".

The Windows we all know, both desktop and server side, is not dead. The economics on the server side are diminishing (on a 20-30 year outlook) and Microsoft won't be around if they don't do something about it now.

IMHO it's a very smart strategy; people just didn't understand what Ben was saying because he didn't articulate it correctly (which I do occasionally find with his content, though it's usually not the case).


If that's true, yes, Windows Server has been irrelevant for some time. It wouldn't explain the remapping of Office, though.


> Windows Server has been irrelevant for some time.

Most definitely not. There are a ton of Microsoft Certified Somethings that will refuse to administrate Linux servers unless their life depends on it.

Source: I work at SAP's internal cloud unit. We have to support Windows VMs (begrudgingly, if I may say), even for payloads that could run on Linux.


Inflexible workers will be separated from their jobs eventually, it's just a matter of time.


What makes you think that? Those inflexible workers are also those who make the purchasing decisions. A shop full of Windows admins will not switch to Linux unless forced to (and vice versa).


And many will eventually be outsourced. In fact, systems administration as a profession is on the downswing due to the cloud. This site itself is largely focused on "disrupting" inefficiency with automation.


Not only due to the cloud; even if you have a local bunch of immutable OS instances like CoreOS or RHEL Atomic running containers, you don't need a big staff babysitting server OSes.


Yes, I conflated cloud and containers in my head when writing that comment.


Office is Office365 rebranded - which just means the lock-in is a monthly subscription instead of a license. The lock-in is further exploited because email is the cornerstone of any business productivity suite and it'll be hosted on Azure now (which is a huge lock-in to their ecosystem).


This article is way too long to make a simple point - which is that Windows is not being abandoned but is no longer the flagship. Cloud infrastructure and services are leading the way, so Azure and AI are now at the forefront.

Sticking with something just because it's a large majority is a great way to fail when that something fades into the background.


> macOS isn't business-level stable.

Can you elaborate on this assertion? As far as stability goes, Macs are top notch. I'd say in the business context Windows machines are prevalent because of successful MS strategies (in terms of marketing, not ethics) in the 1995-2005 era, and it has kind of stuck around.


The enterprise tooling for macOS is woefully insufficient.

Windows has provisions for performing all sorts of tasks across a fleet of desktops. I can automate deployment of a Windows desktop to the point where all you have to do is boot it up and join it to the domain (which makes it easier to hire someone to do it as well).

What's more is that these settings will work across updates. I've gotten really sick of the screwy changes Apple makes to macOS with little warning. Windows goes to great lengths to ensure backwards compatibility between major version upgrades. (For example, did you know VB6 apps still run just fine?) Also, I can get fine grained control of updates to Windows.

But maybe most importantly, I can virtualize Windows and create a virtual desktop environment. Now I can hand out toasters to users and they can just log in to their box from anywhere.

Windows is simply much better at making desktop units fungible.


I was going to reply to the original thread with [citation needed], but thankfully someone already did it. The fleetwide maintenance is the issue, not stability.

That said, everywhere I've worked the past 8 years, across 3 jobs, was virtually Mac-only. But yes, the valley is an outlier.


> The fleetwide maintenance is the issue, not stability.

From the medium to enterprise range, maintenance and stability issues are often one and the same. Breaking automated maintenance and/or deployment leads to instability.

Keep in mind I don't mean "kernel panic" levels of instability or even "programs crashing" levels of instability. The smallest of changes can cause a mountain of work in large environments. Simply changing the icon of an often used program can cause a deluge of tickets.

This level of stability and control is something that macOS doesn't even come close to providing.


Agreed. To get anything close to this with Apple you need to go to third party tools such as Jamf.


I can agree on the tooling, since Office products have become the de facto standard of "enterprising" and MS doesn't exactly sweat to keep the OS X tooling pristine. The original statement sounded like a statement about the stability of the operating system itself, which I'm sure is good enough to support personal and business computing needs.


Business is full of "just good enough" solutions, and loves control. Apple has shown that breaking backwards compatibility doesn't bother them, they're not going to build IT control features to the same level, and ... maybe macOS is going to be sacrificed at the altar of iOS? We're not sure these days. Meanwhile, custom Windows software from forever ago works just fine today, and that's good for business.


It's possible they mean stability in terms of length of support. macOS as an operating system is more "user-stable", if you will, than Windows - i.e. less crashing. But Windows is that way because of the massive legacy interoperability in that codebase. Windows didn't just work because of marketing; it worked because when Microsoft supports something, they do it for enterprise lengths of time, and with the proper enterprise tooling ecosystem that Windows has. Ironically, what probably makes it attractive to enterprises (some of this is assumption) also probably makes it difficult for them to deliver an OS as solid to the user, High Sierra excluded, as macOS.


They literally broke enterprise provisioning between the last beta and the final release of High Sierra, pretty much preventing updates of huge fleets of enterprise MacBooks. That's not something business customers appreciate. And that's only the tip of the iceberg, ignoring things like my MacBook deciding to fail to boot after a High Sierra security update.


I always thought one of Apple's strengths was that they ignored the enterprise market; Windows has always been compromised by the need to support both home users and corporations.


> As far as stability goes, macs are top notch.

My MBP and its periodic gray screen of death begs to differ. Apple really needs to shore up its QA.


Re: seems like a ridiculous strategy. Who would abandon an ecosystem that they own over 80% of

I suspect it's a mere marketing strategy to jack up their stock price and gain a "cool factor". MS will go wherever the market/profits leads them.

Their cloud is commercially successful, a rarity for new MS products actually, and that's where they currently see future growth. But if Windows stays a nice cash cow, they'll quietly milk it.

"Productivity" applications still need a real GUI, something the Web has yet to do right cross-browser: CSS/JS/DOM is a mess. I believe we need a WYSIWYG standard to get GUI's right. The server side can still scale the window & widgets per device size under WYSIWYG, so it's a myth that WYSIWYG is anti-mobile. The client-side auto-flow approach sucks for efficient UI's.


> Who would abandon an ecosystem that they own over 80% of?

They aren't abandoning the desktop Windows ecosystem; they've just demoted Windows from a single desktop-and-server org that the rest of the firm orbits to something that serves the areas where growth is actually expected.


I'll be interested to see how this bifurcation of the Windows team plays out. Windows 10 has been a mixed bag - some nice new features but also a lot of half baked buggy stuff. The evergreen nature of Windows 10 feels like the right move, but I'm worried they're going to let the OS stagnate. There's already a lot of low hanging fruit they could clean up with an update, but haven't.

I'm not a huge fan of using my tablet or phone for things. I almost always prefer using a "real" operating system. I know that puts me in the minority to an extent, but I'm not the only one. The problem is, neither major desktop OS is moving in a direction I like. I guess I'm just getting old and cranky.


> also a lot of half baked buggy stuff

This morning I noticed I have a 3D Objects folder pinned to the This PC item that I can't remove. I also have Mixed Reality Portal and Mixed Reality Viewer that I can't remove.

Why on earth are those apps part of the default install? Worse though - why doesn't Microsoft make it easy to uninstall them? I don't want their crappy news, weather, contacts, maps, music, and other default apps. They should make the apps so good users want them, not force-install them and make them impossible to uninstall.


That's to be able to claim an 'installed base of 100's of millions of users' for technology 'x'. Even if none of those people installed it voluntarily or has ever used it.


In their conference they presented that shit as part of their VR strategy or some similar completely insane BS.

In the same vein, I don't know what they intend to mean by their AI division (esp. as a platform, service, whatever) and probably neither do they, but hey, you've got to make the boss's speech somehow vaguely believable, even if the whole earth knows it is just for show.


> There's already a lot of low hanging fruit they could clean up with an update, but haven't.

In all honesty that's been my impression of Windows since its inception. The only time I can recall ever feeling like Windows was a deserved market leader was with Windows 2000: Linux wasn't mature on the desktop, OS X wasn't yet released (or had literally only just been released) and OS 9 was god awful (worse than Win 95 in my opinion). The only platform that really competed on technical merit was BeOS, and we all know how that ended up.

Win 3.x felt like a huge step backwards from what Atari, Amiga and Apple were doing. Windows 9x was an improvement but slow and buggy. Windows XP took a long time to mature - years and two service packs in fact - and before then it was basically just a slower (bar start-up time), uglier version of Win 2000. And by the time Microsoft recovered from the clusterfuck of Longhorn, OS X and Linux were both streets ahead in terms of stability, usability and even just basic features.

There are obviously people who genuinely warm to Windows and prefer that as a platform - good for them; variety is the spice of life and all - but from a technical standpoint Windows has always felt like it's had a multitude of low hanging fruit which isn't trendy enough for Redmond to invest development time in (case in point: why did it take until Windows 10 before shortcut keys were added to cmd.exe? And then why did MS stop there? The terminal emulator still totally sucks compared to nearly every other one ever made!)


I think it's because there's no career advancement at Microsoft in cleaning up existing features. Only by working on new stuff can you get noticed.

So, interns work on 'old' stuff (like office apps, Windows, tools). And they always look pretty rough.


Interns don’t do maintenance work in general. They build new stuff, often flashy new stuff that never ships [1], because the intern program is about evaluating potential hires and giving them a great experience so they want to come back. Giving interns tedious unrecognized maintenance work is a great way to convince them to go elsewhere.

Source: Microsoft employee

[1] The goal is still to build features that ship, but because they get new, shiny stuff, sometimes it’s speculative development and the winds shift before it goes out the door.


That's pretty much how it works everywhere. There's no career, or even pay, advancement for people who like to do this sort of work. You're called a "maintenance programmer", which people take to mean you're not very good. On the contrary, a lot of those developers are very good. Better than the people who initially wrote the code and jumped ship when the bugs they couldn't fix started rolling in. They understand the system and the business much better than the talky-talky system architecture wanna-bes.

Whew, that turned into a rant quickly. I've just seen plenty of strong, smart developers get screwed because they don't do flashy work.


...as attested to by my relatives and friends who work there.


Am I remembering the timing wrong? XP came after 2000 and before Vista, right?


Yup literally that. Was my post ambiguous?


Linux on desktop is pretty amazing. Haven't looked back in 3 years.


A tiny linux/windows story:

For the last year I ran an x200 (c2d p8400, passmark 1458) + below average ssd. Win10, almost no cruft, some cortana disabled etc.

It's ok but became worse and worse (regular pointer lag for a minute, pointer lag is very very sad and indicates very low level perf issues)

I don't use full fledged linux Desktops, if at all. I run a bare i3wm + emacs + $browser + mpv setup.

Last week I booted my old x61 (c2d l7500, passmark 967) with an old intel 80GB SSD, a linux machine. I was stunned..

chromium and firefox are seriously more reactive. They're only slower for graphics/video/intense workloads. emacs and git fly. Everything is tinier and leaner. Even on an older, slower cpu.

It's even mentally easier since I have fewer things to handle under linux (still too many for my tastes, but windows has so many components and moving pieces..)

I have to run that archlinux/i3wm setup on the x200 to get more data.


> I have to run that archlinux/i3wm setup on the x200 to get more data.

Well, it's Linux; unless you're running Gentoo, you can probably just swap the hard drive :-) Unless of course the SSD is an important part of this test.


no you're right, I was just not willing to swap the disks


Caddies same size, so just one screw on each machine and a quick swap.

X220/8Gb/xfce4 runs fine for my modest needs.

At one of my employers they are using Windows 10 for Education. We had some Atom-based small form factor desktop PCs that worked really nicely with Windows 10 just after the upgrade, but then, as the great-grandparent post says, they began to become noticeably slower as the weeks and months went by. They have been replaced with i5/8GB machines now which seem fine.


Crazy how the free market 'improves' things..

ps: btw I just did ssd swap. I'll see how the x200 does. First difference, wifi nic has a much stronger signal/bandwidth.


Outside of dev work, unfortunately, Linux is not that much fun. If you need to do accounting, graphics or other work there are options on Linux, but they are usually way behind what Mac or Windows are offering.


Even as a desktop developer: often Linux is 2% of the user base, very vocal, used to software being free, and with infinite OS variations. That's not very compelling.


Infinite variations is the bane of free/open software. It is the primary reason it hasn't taken over everything, IMHO.


And yet, it has, for definitions of "everything" that aren't a desktop computer, or an iDevice.


If I was in the market of workstation software such as CAD, Music DAWs, industrial automation etc. I would much rather have control over the base OS by shipping a customised version of Debian or similar. This would be very compelling for professional users also, who don't want the consumer-oriented cruft and bloat associated with Windows and Mac OS.


How would you deal with customers' corporate IT banning any non-certified OS from the network?

At work, we have a big TV screen showing an alert overview (a Grafana dashboard). A Raspberry Pi with Raspbian and Firefox would be enough to do that, but if we were to connect one to the LAN, its MAC would be blacklisted within less than an hour. (I've heard a report of a contractor plugging his own notebook into Ethernet, who had said cable ripped out by the local admin a few minutes later.) We ended up using an old notebook from one of our colleagues running Windows 7 Enterprise. Every few days, we have to RDP into it to click through app updates or restart the browser or something like that.


iptables: block all incoming traffic aside from ARP and replies to known traffic?
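
A minimal sketch of that idea (assuming a stateful netfilter setup with conntrack; ARP itself is layer 2 and isn't filtered by iptables anyway):

    iptables -P INPUT DROP                  # default-deny all inbound traffic
    iptables -A INPUT -i lo -j ACCEPT       # keep loopback working
    iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT   # allow replies to connections we initiated

Whether corporate IT would accept an unmanaged device on the LAN just because it firewalls itself is another question entirely.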


And what do you do when your customer wants to use on the same machine two professional apps from different vendors?

How do you reconcile Adobe Linux for After Effects/Photoshop/Premiere with Autodesk Linux for Maya/...?


This has nothing to do with Linux, rather with the publishers who decide not to support Linux, or at least any particular distro.


Yeah sure, let's not wonder why maybe developers don't want to develop for your platform or anything.


Nothing to wonder, it's easy to understand why they prefer to invest in 90% of the market, but they shoot themselves in the foot by investing in non-free platforms at the end of the day.


You're ignoring how they got to be 90% of the market. It is a comforting lie to say that it was all shady business practices and marketing, though those certainly played a role. Here's a clue:

"One of the things, none of the distributions have ever done right is application packaging ... making binaries for Linux desktop applications is a major fucking pain in the ass. You don't make binaries for Linux, you make binaries for Fedora 19, Fedora 20, maybe even RHEL5 from 10 years ago. You make binaries for Debian Stable…well actually no, you don't make binaries for Debian Stable because Debian Stable has libraries that are so old that anything built in the last century it doesn't work. …and this [“Don't Break Userspace!”] is like, a big deal for the kernel, and I put a lot of effort into explaining to all the developers that this is a really important thing, and then all of the distributions come in, and they screw it all up. Because they break binary compatibility left and right."

--Linus Torvalds, DebConf 2014


"they shoot themselves in the foot by investing in non-free platforms at the end of the day."

When is that end of the day? Over the last decades it would have been extremely difficult making money with Linux desktop software. Windows and Mac were the only viable options to run a business.


...because Microsoft is good at marketing, has no ethics, and successfully leveraged its advantages in the 90s to kill all the competition? Is this a trick question?


Possibly, or maybe that's an excuse the community uses because they're unwilling to see how openly hostile they are to proprietary software on their platform.

Let me ask you a quick question: How can I distribute an application in a way that will work on all Linux Desktops?


A cross-distro way to do that with (almost) no dependencies is via AppImage[1]. This creates a single executable that contains your application and all the necessary libraries and data files.

There is also work towards a more modular approach with Flatpak and Snaps, but those require support from the distro side. AppImage doesn't need any specific distro support.
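
From the user's side it's about as simple as it gets (the file name below is hypothetical):

    chmod +x SomeApp-x86_64.AppImage    # an AppImage is a single self-contained executable
    ./SomeApp-x86_64.AppImage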

[1] https://appimage.org/


And even AppImage, a solution I frequently praise mind you, doesn't always work right.

It's 2018 and the Linux Desktop still doesn't have a consistent way to... well do anything now that I think about it, but that you can't even reliably distribute an application is pretty insane.

Flatpak and Snaps are a typical Linux Desktop solution: Over-engineer the shit out of it.


AFAIK the only issue that could make AppImage not work is lack of FUSE support, but any modern distribution should have that enabled.

Beyond that, AppImage is really just a container. It is still up to the developer to bundle the necessary dependencies (libraries, runtimes, VMs, data files, etc), but this is something they'd need to do with Windows too.

What problems do you have with it?


I have only ever used one, personally, but users of NixOS don't seem to be able to use them out of the box.

> It is still up to the developer to bundle the necessary dependencies (libraries, runtimes, VMs, data files, etc), but this is something they'd need to do with Windows too.

Only for libraries not present in the base system. Problem with Linux Desktop world is that they seem to actively loathe the concept of a consistent base system.


>>> Let me ask you a quick question: How can I distribute an application in a way that will work on all Linux Desktops?

It's a trick question without a clear answer.

Many people have tried, no one has ever succeeded.


Steam on Linux works very well.


The next question would be on which distributions and which versions of which distributions?

Linux is a very large family of operating systems, with hundreds of members and variations. It's really a meaningless term.

https://support.steampowered.com/kb_article.php?ref=1504-QHX...


That KB entry is misleading: Steam works on most distros (at least all the ones that I've tried) because it includes its own runtime (glibc etc.) and thus avoids most of the idiosyncrasies of distros.

AppImage/Flatpak/Snap/Docker is mostly the same idea, just executed differently.


Yeah, it speaks volumes that the best way to ensure applications work on your platform is for the dev to bundle their own version of the platform in a container with the application.


Well, it's also how OS X apps work, and have worked for a long time already.


...those are called folders. DOS applications worked that way. Only UNIX people seem to think that not hardcoding all your paths is some kind of special voodoo.

Oh, you meant that they package the libraries with the application? Yeah. If it isn't part of the base system, they do. That's how it should work or you get DLL hell.

Problem with Linux Desktop is if you put any two of its community in a room you'd have 4 different incompatible base systems within a week.


At least with platforms such as Steam, the problem is one of porting the platform to each distribution, not all its applications.


Why bother to try to sell desktop software to users that want everything for free, and with source code available?

Usually the only ones willing to pay for Linux software are enterprise shops.


Linux users expect not to be locked-in and controlled. Perhaps if vendors tried different revenue models, such as sponsoring new features or charging for support/bug fixes. Over the last few years I have spent several hundred dollars in donations to free (libre) software. We are not all cheapskates :)


Hardly anyone can live from donations or bounties.

Try to get a mortgage, pay a monthly rent, pay the kids' school,.... just with donations.

One needs a regular source of income.

Desktop users are usually not keen to pay for consulting, training or documentation; they'd rather just pay once.

Vendors that have pushed for web solutions get constant revenue, but that solution makes the desktop irrelevant, so regular users see no benefit in running e.g. Chrome on an OS other than the one that was already installed on the hardware they bought.

Additionally such hardware is as proprietary as it ever was, even if built with a FOSS stack, as no one gets the source that is running on the server.


I didn't actually suggest donations was a lucrative revenue model. Support could be though, for example, RedHat is a billion dollar business.


"We are not all cheapskates :)"

Most are though.


Sure. It's not the direct fault of Linux but it prevents wider adoption by non tech users.


Let's not forget that most non-tech users can do almost everything they need nowadays in the browser.


Yep, the other day I was also pretty amazed how using VS Code alongside Rust, for writing a Gtk-rs application, managed to hang X forcing me to reboot the whole thing.

With the open source AMD drivers.


One of the more amazing bits of technology in Windows is that which allows the graphics drivers to crash and restart without taking down the system. They've spent so long fighting buggy third parties that this is the solution.


One thing I love about debugging on Windows is that a program being debugged cannot grab the mouse and keyboard. On Linux, if you break a program that has a context menu open, you can say goodbye to your ability to interact with the PC.


You can enable grab break actions with "setxkbmap -option grab:break_actions", then use Ctrl+Alt+KP/ to break grabs and Ctrl+Alt+KP* to kill the applications holding grabs. Note that "setxkbmap -option blah" adds the blah option to the existing options, which might interfere with this; you can reset all options by running just "setxkbmap -option" without a parameter and then running "setxkbmap -option grab:break_actions".

This is useful both when debugging and when dealing with broken applications.
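
Put together, the sequence described above is roughly (a sketch; the keypad bindings depend on your layout):

    setxkbmap -option                       # clear any previously set options first
    setxkbmap -option grab:break_actions    # enable the grab-break actions
    # then Ctrl+Alt+KP/ breaks an active grab, Ctrl+Alt+KP* kills the grabbing client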


I believe this is one of the things that Wayland fixes. (Unless, of course, neither your debugger nor your application supports Wayland natively and they both end up running in XWayland.)


QnX did that in ... 1983.


I too remember the QnX demo disk with a web browser and desktop environment on a single floppy. But Linux still doesn't do this kind of isolation, partly due to philosophical opposition.

Edit for the audience: http://toastytech.com/guis/qnxdemo.html


In the famous Torvalds - Tanenbaum discussion Torvalds got it terribly wrong, but Linux was good enough to get off the ground anyway so in the longer term might made right. But a microkernel based Linux would have been a serious game changer rather than a re-run. After all we already had plenty of free Unixes back then, and the descendants of those are all still more or less alive today.

The major sticking point in that whole discussion is that Tanenbaum may have been right in principle but Minix was not a very good illustration of what a micro kernel could look like. QnX on the other hand was awesome and to this day powers a very large chunk of the embedded sphere.


Tanenbaum was certainly right, given the adoption of micro-kernel like architectures across the industry, specially on embedded.

Even containers are a kind of workaround for achieving the same goals.


I still have a clone of 32 bit QnX sitting on my harddrive, maybe one day I'll have enough time and energy to attempt to port it to x64 and/or Arm 64.


I'm not sure comparing an RTOS to a really old school monolithic kernel is fair. Windows wasn't designed to be like QnX. I really wish QnX was open source though and had a thriving community behind it.


It's perfectly fair. QnX had a fully functional desktop environment and a large chunk of functionality that experimental OS's like Plan9 popularized.

Whether or not Windows was designed to be like QnX is immaterial; the question is what would be the better OS to serve as the engine behind a computer interacting with its users, and my vote definitely goes to a real time operating system using a microkernel. Because in my opinion that's by far the best match for that workload.

Stability, speed of interaction, driver isolation and so on are much much better when using something like QnX vs a monolith. And a GUI is a real time affair.


I have been quite impressed with VS Code. I’m....honestly shocked I’m using it but it’s doing what it advertises and is doing it pretty well. Sublime is getting less use these days.


Memory consumption and speed are apparently not important for you then, and that's fair enough. To each their own. I can only use so many resource hogs before my laptop says no. It starts with battery life and memory consumption. Speed (and benchmarks) is a good early warning sign.


I only use it because it is the best Rust experience on GNU/Linux and I am not willing to pay for CLion just for the debugger.

For everything else I do use native editors.


Tip: when X hangs, press Ctrl-Alt-F1 or F2 to switch to console mode, then Ctrl-Alt-F7 (or F8 on some distros) to go back to X. While in terminal mode you can investigate which process is responsible for the X issue (high CPU? memory? disk I/O?). If that does not fix it, Ctrl-Alt-Backspace will kill X and every X application, and you won't have to reboot.
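
If you just want to find and kill the offender from that text console, something like this usually does it (a sketch; the pid is whatever process turns out to be misbehaving):

    top -o %CPU                     # see what is pegging the CPU
    ps aux --sort=-%mem | head      # or what is eating memory
    kill <pid>                      # kill the offender instead of the whole session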


That only works if console mode and X escape keys are configured.

Which is usually not the case on desktop oriented distributions.


X escape is usually not configured, but personally I haven't seen any desktop distribution disable virtual terminals. Regardless, even if that is the case, it only takes a single PC restart for you to notice that the terminals are disabled and enable them again, so next time you need them they'll be there.


Yep, now explain that to a regular macOS/Windows user.


You gotta be kidding me. Good luck using multiple HiDPI monitors. The only reasonable way to use Linux is a tiling WM with as little graphical interface as possible, in my case only xterm and emacs. Which is crazy, since it's 2018 and I am using a modern computer with a modern OS the way computers were used before I was even born.


It doesn't sound like you have used Linux since 1998. It works "out of the box" with multiple screens no matter the DPI.


Ehh, screen zoom could be better on 4k monitors. I always find the display scaling to be WAY off by default and have to mess around with xrandr settings in the terminal to change fractional scaling. This (and many other basic setup issues) doesn't make for a great first-hand experience when first starting out with Linux.

https://wiki.archlinux.org/index.php/HiDPI#Fractional_Scalin...
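
For the curious, the usual trick is to let the desktop scale to 2x and then downscale the output with xrandr, something like this (output name and factors are illustrative only):

    # desktop set to 2x, output downscaled so the effective scale lands around 1.5x
    xrandr --output eDP-1 --scale 1.33x1.33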


I tried an Ubuntu 17.10 live image yesterday on a Thinkpad P50 with a connected Samsung 4K display, and my experience was not as you describe.


I'm not having any problems running two monitors in HiDPI with my Ubuntu Studio workstation .. everything just works and I enjoy it immensely. Curious what your issues are...


To be fair, some applications on some distros have HiDPI issues: LibreOffice [0], icons in GTK2 apps, anything that uses Java Swing, etc. Particularly galling is the oft-suggested fix of temporarily reducing desktop resolution and fouling up all the other apps that are supporting the proper DPI.

[0] https://bugs.documentfoundation.org/showdependencytree.cgi?i...


> anything that uses Java Swing

OpenJDK 9 supports HiDPI out of the box[1].

[1] http://openjdk.java.net/jeps/263
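
And if an individual Swing app still comes up tiny on JDK 9+, forcing a scale factor by hand sometimes helps (the jar name is hypothetical; exact behavior varies by platform and toolkit):

    java -Dsun.java2d.uiScale=2 -jar SomeSwingApp.jar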


That he hasn't actually tried Linux.


Using Fedora and two HiDPI monitors, one landscape 4k screen and one portrait 2k screen, Gnome works great in this setup.


The more common setup of one HiDPI screen with another normal-DPI screen (laptop + external screen) tends to break horribly in many cases though.


Don’t know about GNOME but recent KDE can handle that case just fine.


Sounds like a pretty edgy edge case to me. In a more typical deployment, like a laptop with a built-in display or a desktop with a single 1080p or 1440p monitor, you can pick from a variety of distributions that are on firm "it just works" territory.


I have a nice laptop, with a 3k display. I plug it to a normal screen, with a 1080p display. What happens? Well, it works but it looks like ass. Wanna fix it? Sure, just google around for 2 hours until you find a complicated xrandr command that you have to run every time on startup (but the OS will break every time you unplug the screen). Also, high DPI support is still garbage, which means that half the apps will have their text looking too small anyway.

So no, I don't consider it an "edge case" if everyone in the office with a Linux laptop has the same problem.
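
For reference, the workaround that usually gets googled up looks something like this (output names and geometry are hypothetical; it upscales the 1080p external so it visually matches a 2x laptop panel):

    xrandr --output eDP-1 --auto --pos 0x0 \
           --output HDMI-1 --auto --scale 2x2 --pos 3200x0

...which is exactly the kind of thing that shouldn't be necessary in the first place.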


It still is an edge case when the case is rare, and people with high dpi Linux laptops (or monitors in general) combining them with regular dpi monitors are rare. Hell, high dpi monitors by themselves are still rare.


Pretty much any MacBook connected to an external monitor falls in this "edge case". Or any modern MacBook-class laptop. They might not be the majority, but they are still a group of people counted in millions.

It's only an edge case on Linux because anyone trying to use such a laptop on Linux is met with pretty much unacceptable breakage and has to move away to Windows or macOS.


It's as rare as high dpi laptops, which are less and less rare nowadays. If you have a laptop, you're gonna plug it into an external screen at some point, and regular dpi screens are very much not rare. This shortcoming is indefensible and should've been fixed years ago.


The intersection of people that care about high dpi laptop screens, but not about high dpi external screens is very, very small. Also, if you are in that group, why wouldn't you just turn down the resolution on the laptop or even turn it off (since you connected a low dpi external monitor, you don't seem to care about the additional sharpness of high dpi, right?)?

High dpi is important as is low dpi (for a while still at least), mixed dpi is an edge case even if you disagree.


I'm running linux with a 4k monitor and a 2560x1600 monitor, and it's plenty smooth. It is worth noting that I have a GTX 1080, but everything else in the system is 7 years old.


You don't need a GTX 1080. Any modern low-end GPU like the RX 550, GT 1030 or the new Ryzen APU would be fine. Even higher-end Intel iGPUs should work[1].

Basically all you need is at least DisplayPort 1.2 (or 1.2a if you want Freesync) for 4K @ 60 Hz.

The RX 550 has 1.3 support, so it should handle up to 2x4K @ 60Hz with MST (daisy-chained or with a splitter) or a single 5K @ 60Hz.

Some 4K monitors can also work with 2×HDMI, but I don’t think you’ll find that on any low-end card. Maybe high-end used.

As for actual performance, there is a huge gap between what you need for gaming and what you need for office work.

A low-end graphics card from a few generations ago with 1GB of VRAM would even be fine (like a Quadro 2000), except that it will only be able to drive the display at 30 Hz.

The biggest gotcha with a 4K display is actually your host RAM. If you try to run anything like GIMP, Photoshop, or vector editors in particular, it will gobble up much more memory than usual. Your regular web browser will as well, though to a much lesser degree.

Getting something just a little better, like an RX 560, might be a good idea though; this gives you enough GPGPU power to run most expensive video filters and scalers at native 4K[2].

[1] My Apollo Lake netbook with an entry-level iGPU can handle a 3K display easy

[2] Your source material might be 1080p but you end up postprocessing it at 4K. This might surprise some people.


To some, that is good. Despite fancy displays, the terminal is still far more efficient in my opinion for many operations, although graphics are nice for some things, and those things work great on Linux.


Linux is definitely ready for full time Desktop use as long as gaming isn't needed. 7 years only Linux and I feel great about it.

I highly recommend Linux Mint with Cinnamon to anyone just starting out.


I play KSP and Civ on Linux just fine. I stopped playing games shortly after civ 3 (which I installed windows specifically for, lasted about a year) - until about a year ago when I decided to check out KSP.


Solus is great for gaming, and very user friendly, too.


Funny - I use a linux desktop in a VM only when I absolutely have to. Because I can't get much done without a struggle. But to each his own!


Opposite. I use Linux in VM because I can't be bothered to do everything in Windows (when I have to be on Windows).


Reverse here: I only boot my Windows VM when I really need to build something with Windows. Other than that, its Linux all the way. I find it an utterly pleasant experience compared with either Windows or MacOS, which is interesting..


Oh, I use Linux all the way usually, except when I am stuck on a Windows workstation - hence the VM solution.


I should wade back in and try again. I attempt to use a Linux distro as my daily driver every couple of years or so. I can usually get pretty close, but inevitably I end up going back to Windows and MacOS.

There are small UI affordances I miss, there's no MS Office (which is becoming less of a problem for sure), Spotify is iffy, etc etc. So many small things that add up.

Finally - Windows Subsystem for Linux, or whatever it's called, has really made CLI work on Windows painless. With oh my zsh and CMDer I can do just about everything I can on my Ubuntu machines. Even though neither Windows 10 nor MacOS is perfect, the pain isn't quite great enough for me to seriously make a change.


I used it for a while a few years ago at work. My impression was that 90-95% of it works well, but there remains a persistent and very stubborn 5% of edge case bugs.

The problem is that when you hit one of these issues it's infuriating and often costs a lot of time. It also has a way of happening when you really don't want it to, since when you're traveling or in a hurry you're more likely to be doing "different" things with your machine and thus are more likely to hit a problem. Linux desktops are like a car that runs great for your daily commute but blows up on road trips.

IMHO the reason for this is that the remaining 5-10% of the polish that's required to reach excellence in software often takes >50% of the time and is by far the least fun part of programming. As a result people generally have to be paid to do it. Free software rarely does unless it's heavily sponsored or incubated in-house by someone with resources.

This is why I still like Mac. I think Apple as a company is a little past its prime but Mac still offers by far the most reliable and consistently productive desktop experience. Windows has improved but they've decided to encrust it with advertising. Ads in a paid app or OS are a total deal breaker. Windows also still has more edge case issues than Mac, mostly due to driver issues.


Same. Linux desktop all day every day.


Say what you want about bloated Electron apps, but they've improved the Linux desktop experience. Slack, VSCode, Atom, KeeWeb, among others. Also the modern web, PWAs, etc.


Electron apps are by definition packaged web apps, the desktop where they run is irrelevant.



I just went back to using KDE a few days ago after years on GNOME and boy, did KDE change! Plasma is a welcome change and KDE feels a lot more customizable and flexible.


If I still have to open the terminal and edit text files in /etc for essential things like sound, WiFi, accelerated graphics, and so on then it hasn't really evolved in the right ways.

Linux on the desktop has always been a niche experience for a niche audience. ChromeOS and Android (both based on Linux) are mainstream desktop experiences, as are MacOS, iOS, and Windows.


That hasn't been the case for a while for me, even for graphics stuff - do you use a desktop setup with gnome or similar? I'm pretty noobish at this stuff but gnome, unity, etc should be coming with gui settings baked in.


If you still have to, except you don't have to.


Except I have in the last month, I was following this article:

https://help.ubuntu.com/community/SoundTroubleshootingProced...

Someone should tell Ubuntu that this is no longer needed.


This is for when you're having issues, which is quite rare nowadays. Similar things happen on Windows, for example fiddling with drivers to get sound to work or stop popping.


Well, you can do that if you want to, but on the more desktop-friendly distros, I have not had to edit a configuration file in years.


Is there a laptop you would recommend?


Dell XPS 13.

Works with Linux out of the box, as Dell made all of the drivers work for their developer editions in Ubuntu. Battery life and the 3200x1800 screen are phenomenal. Well made, and Dell's warranty process is actually fairly good. It's no Apple warranty, but does the job.


Lenovo Thinkpads.


I’m, at-best, a mid-level Linux user and I’m coming from MacOS.

I bought a Lenovo Carbon X1 4th gen a few months ago. I was originally going to keep a Windows partition, but wiped Windows 10 Home after about 20 min. The fresh install of Linux Mint was even simpler than expected. I’ve been really enjoying Linux ever since, with the only issue being the HiDPI support mentioned elsewhere ... a 1.5x would be perfect and once that happens, I’d recommend my build even to my parents who are long-time mac/windows users.


FWIW, I run openSuse Tumbleweed on an Asus Zenbook; all in all, it has worked very well. I have the low-end model with a Core m3 CPU, so it's not blazingly fast, but it plays 1080p videos smoothly.

The only problem I had that I might blame Linux for was that it randomly dropped the connection to my Bluetooth headset, and it took a reboot to connect the headset again.


I've used Linux on the desktop and laptop since about 2000, always with thinkpads.

Back in the late 90s there were genuine hurdles to do with Linux desktops; my 1999 desktop didn't work in X with Red Hat 5.2 because of the graphics card.

Today, when most productivity programs are web based and so much more is cross platform, it's far easier to run Linux than Windows.


I like trackpoints, so I've been using various thinkpad models for a while and they all seem to work well.


I use an xps13 9350. I am running fedora on it and quite literally everything works out of the box on install.


I've used recent enough Dell and Asus laptops, no issues whatsoever.


Try to avoid any with optimus graphics (eg a dedicated GPU) if you can.


A 2010 thinkpad works well.


2018 is the year!

(I exclusively used Linux on the desktop from 1995 until I bought my first Mac in 2002).


>I'm worried they're going to let the OS stagnate.

An OS should be stagnant. Its job is to get out of the way as soon as possible, not to constantly wow its users with cool stuff. Any user-facing change will break existing workflows and force people to learn new habits.

Change is bad. That goes for every piece of software, from OSes to web browsers to libraries. Stagnation should be your goal.

Microsoft can't do things properly. Without constant pointless novelty they couldn't convince people to buy a new version every few years. That business model has made them a lot of money, but making money and making good software are nearly orthogonal goals.

EDIT: Maybe I've misunderstood you. It sounds like you're talking about bugfixes, which is certainly something Microsoft should work on.


> Smartphones first addressed needs the PC couldn’t, then over time started taking over PC functionality directly

I still don't understand how anybody gets any real work done on a smartphone.

Has work suddenly gotten so simplistic that most people really don't need a keyboard? They're dragging and tapping and writing IM-like sentence fragments riddled with autocorrect typos? And this is acceptable? Seriously?


I think that's the mistake Ballmer made with Windows Phone. He and the other leadership saw the rise of the smartphone and thought "that's the future". The reality is more nuanced; the smartphone created a new market, but it definitely hasn't supplanted the existing "home computer" market, or even really hurt it in any way.

Microsoft should have recognized that, recognized they were late to the party, just let Google & Apple have it, and instead focused on integrating with their products and doubling down on the "home & professional computer" segment they already lead in. But they were late to that as well because of Ballmer's mistake, which gave Apple an in to position the iPad as an up-and-coming product in this segment.


> The reality is more nuanced; the smartphone created a new market, but it definitely hasn't supplanted the existing "home computer" market, or even really hurt it in any way.

This is demonstrably wrong. The home computer market isn't even a market that exists in many parts of the world. To see that phones can supplant the home computer market almost entirely in Africa and India and say it has no effect here is crazy.

You can already watch it happen. Will people need a computer to do long form typing? Maybe, although for many a tablet with a Bluetooth keyboard will suffice.

Will an average family of 4-5 need more than one though? Will many people just get by sharing a laptop one family member got from work, or an old clunker they use just to edit essays for school? Absolutely.

Will smartphones and tablets kill home computers? No, not for a long time. Will they pretty much cap any growth in that market, and turn them into trucks when they were formerly cars? Without a doubt.


> The home computer market isn't even a market that exists in many parts of the world.

That doesn't make GP's statement demonstrably wrong. The home computer market isn't a market that exists in many parts of the world because it never existed there in the first place.

The smart device market may have affected the growth of the home computer market, but I also believe that it hasn't supplanted its existing users by any meaningful measure. At least, not yet.


I can do a lot of writing on a Huawei Mate 9. Droid Vim works amazingly well for editing. For writing, the Swype keyboard feels faster than typing on a full keyboard. I can have my GIT repo on the phone. Jira also has an app. So basically a lot of low level tasks are no problem. Of course a big project with a lot of files, compiling and building, is not going to happen on the phone, but you can have a CI/CD pipeline and not have to compile locally.

"Management" needs email + spreadsheets; spreadsheets are not good enough on a phone screen, but on a tablet I think those would do. Just like reading code files with a lot of lines.

I think most of the work is thinking about the meat and core of the problem. Writing it down, with typos or not, is not a problem. You do not write a corporate communication statement that will go to millions of customers every day. I do not mind getting email from a coworker with typos. When I ask my boss for a 5 minute conversation I am not going to write "Dear sir or madam, can I have 5 minutes of your time. Sincerely, O.". More like "Hi, can we talk for 5 mins?". Work is not simplistic; whether you write it down roughly or improve it later depends on context. You have a great idea and a phone: write it down; if you have to show it to millions of customers, then at least 2 other people have to read it anyway.


I think it's lots of jobs that didn't need full-on computers in the first place.


So are mom and pop businesses using Chromebooks and Ipads to do their accounting now? Last I checked Windows was still pretty dominant in both Enterprise and midsize businesses, though I could see a food truck being run from a phone (theoretically).


Anecdata: most of the SMBs I know are using web/cloud-based accounting products, and like it for varying reasons (not having to manage / upgrade the software, easy to collaborate with their accountant) but a major one is being able to access their accounts from anything with a web browser.


As a counterpoint, my wife runs her own bookkeeping business and pretty much all of her clients are using Quickbooks on Windows workstations, with no real desire to try anything new. I know a couple people myself that are using cloud-based accounting products but they seem to be the exception. In both cases they are tech businesses, which perhaps makes them more open to trying something new than a typical small business.


Can you even buy a new QuickBooks desktop license now? I got the impression it had all moved to the cloud.


Yes. They don't make it very obvious, but if you look for QuickBooks Desktop you can find the versions you install yourself instead of the SaaS.


I'll take a guess and say that these are all older businesses.

Businesses that started before the advent of cloud software will continue to use QuickBooks because that's what the owners know, and why fix something that isn't broken? But newer businesses that started in the last few years will be much more likely to use cloud software because it's so much more convenient.


A lot of new businesses (most?) just go with what their accountant and/or bookkeeper tells them to use. A lot of accountants continue to prefer QuickBooks.

One of the accountants I know considers herself computer savvy (she's not, but that's a different topic). She's pretty young (30s), and she insisted I build her a server for her business to run QuickBooks off of. She won't trust a cloud solution, or even a remotely hosted server. Some of this fear seems to be based on the seminars she goes to, which are either held by or attended by FBI special agents. They are pretty good at telling you all the ways you may be hacked, and offer precious little practical advice on how to avoid that while staying sane.


There are non-technical reasons to prefer locally hosted accounting software. Privacy is one of them.


Do you mean privacy, or security? It would be a massive breach of trust if SaaS accounting products were harvesting your business data a la Facebook...


What about harvesting your data for the IRS?


And there's the catch-22: how do you access the web when the desktop OSes are all gone? Tablets and phones don't cut it, and there isn't a single stable desktop OS that hasn't been getting progressively worse for the last 5 years.


I don't agree with the premise that phones and tablets won't cut it for this, but if you need a hassle-free OS for accessing the web then ChromeOS is it.


They still do 'the work' in Excel.


Windows 10 is really difficult for businesses entrenched in Windows to deal with.

The work that you need to do in order to unwind from legacy Windows stuff like fat apps, IE-optimized websites, and legacy Office-based workflows makes it pretty easy to move to an iPad, to Google, etc. The only thing that Microsoft has going for them is that every other vendor is also fucking up client computing in different ways.


I know I'm in a minority here, but having used every Windows since 3.1, IMO Windows 7 was the absolute pinnacle of Windows design, and from a UX perspective I think it was the best desktop OS, period (it could still be improved; there are definitely individual features that are better in other OSes). I really wish they had stuck with that UX/design and mostly just improved performance and added useful stuff rather than throwing it all out and starting over.


You're not as much of a minority as you think. Windows 7 is still the most used desktop OS by quite some margin. macOS hit its pinnacle at about the same time (Snow Leopard, summer 2009).


I've seen problems with websites as IE versions are upgraded, but that happens with or without Windows 10.

I haven't seen any issues with apps though, so I have to dispute this claim. We just moved 5,000 employees from Windows 7 to 10, and it went far smoother than anyone really expected - everything just works. It's a wide range of apps too, everything from in-house WPF apps to CAD and engineering software.


Janky enterprise crap optimized for IE <whatever> is one big example. The other is old Office stuff like Access databases or workflows dependent on conversion of old file formats. (Last year Office 365 ProPlus purged legacy dBase import for a few months.)

IE11 support will go away at some point. Many of those apps are now being updated to work on Chrome -- I know of a few large shops that aren't testing IE or Edge anymore for internal apps!

I do agree that they put heroic effort into making the update process incredibly reliable. Most non-prehistoric applications transition seamlessly.

Note that I personally equate Windows 10 and the subscription "Office 365 ProPlus". The layers of complexity are astounding. I present to you what I call the "Office 365 ProPlus Build Decoder Ring" - https://support.office.com/en-us/article/Version-and-build-n...

Windows 10 itself has about 8-10 releases that an enterprise needs to think about. Transitioning from release to release has gotchas, plus additional Windows-Office gotchas. (For example, the way Office authenticates changes in certain builds.) (See: https://www.microsoft.com/en-us/itpro/windows-10/release-inf...)


Haha. I'm living this right now, but we're getting there. We'll have a few thousand PCs on Win 10 soon and will finally deprecate IE8 for Chrome :p. I'm dead set against using iPads for anything serious although I keep losing that battle.


Agreed. They are undermining the areas where they had most strength. It seems rich desktop apps are being pushed aside in favor of the app store model. For new stuff we now usually choose web over WPF.


I see iPads everywhere at smaller restaurants. Especially in food courts. I've even seen a few Android tablets (including at one upscale food hall in Plano, TX). Not just restaurants, too; independent stores are picking up tablets at a huge rate. If a store has opened in the last three years or so, they're probably using a tablet as a cash register.

Tablets are rapidly taking over the market for cash registers.


Many of those tablets are actually running W10.


There are a lot of these about. The license cost reduction for "tiny devices" (small screen, <=32GB flash, <=2GB RAM) has caused them to proliferate. In some ways they're the real replacement for WinCE.


Sources?


Personal anecdote of what I see being sold in consumer stores here in Germany.

Or the restaurants or cocktail bars that happen to have tablets instead of paper menus.

They started with iPads, but as their chains got bigger they moved to a mix of Android and W10 tablets.

Source: personal anecdote, from getting out of the menu app and into the OS.


I guess that's just a difference between Germany and the US (or at least Texas).

Most of the tablets I've seen in stores and restaurants are actual iPad hardware, the virtual keyboard used when entering information is the iOS keyboard, and the Square app follows the Apple design guidelines.

And the rare occasions when I encounter an Android tablet, I recognize it from the virtual keyboard. Case in point: the food hall I mentioned in my above post. When you buy something, it asks you if you'd like to enter your phone number so they can text you when your order's ready. When I chose to enter my number, I instantly recognized the keyboard for that as the Android 4.x keyboard.


I agree, in that iPads are perceived as too expensive here in the EU, notably because of the favorable EUR/USD conversion rate that Apple doesn't really follow when it comes to pricing its products.

On top of that, shop owners tend to rely more on local vendors for cash registers, and many of those are still heavily built on top of the Windows ecosystem. In fact I wouldn't be surprised if W10 surpassed Android for this kind of appliance. Here in France I often see waiters using devices with an old-school stylus on a UI with really small buttons. I'm not even sure what OS it is, to be honest.


I'm pretty sure all the iPads I see are not. In all honesty, I can't recall the last time I saw a Windows laptop in the wild.


Come to Europe, Africa or Asia and you will see Windows laptops everywhere.

Or, as an alternative, go to a local game developer meetup group.

There were plenty of them in San Jose at GDC, from what was shown on the live streams.


Windows is not going away. It's just been demoted from being the primary strategy at Microsoft.


As a long-time developer on the Microsoft stack (approaching 20 years now?), it seems to me like this makes Microsoft and its services less relevant and more easily replaceable in the long run. It's clear Azure and the related services are very lucrative, but I feel this is because of the prevalent use of Windows, not in spite of it.


Well, as a long-time developer in the financial enterprise space (18 years), I think MS has already made a lot of right moves since Nadella took over. Betting everything on Windows dominance proved to be a faulty (and dangerous) strategy, especially since a lot of enterprises started using Linux / open source. For example, I know a very major hedge fund that went from Windows everywhere to using every open source database they can get their hands on. The whole container strategy is just much easier with Linux as well. It's a very hard battle to win: one big company vs. every startup betting on open source.


Microsoft could kill Linux's foothold in the server + SMB space right now just by simplifying licensing.

Get rid of CALs. Charge a fixed price for a certain number of simultaneously running Windows instances, no matter how you run them. Even charging an ongoing subscription for this would be acceptable. Simplify the licensing situation for SMBs wanting to run on-premise redundant systems, without needing diagrams to explain the difference between running, standby, CALs, virtualization, etc.

If you're developing software for a platform, trying to figure out how to optimally license that platform should be the least of your concerns, so you can focus on your product.


This is blatantly false. You're assuming that people only use Linux because they don't want to pay for or deal with the licensing complexities of Microsoft, which is not the case.


That's not the only reason. If the Windows platform brings more to the table than Linux, then for a developer that would weigh more heavily than the complexities of licensing it.

I've supported several Windows-centric SMB accounting platforms, including QuickBooks, and trying to figure out the right way to license just Windows in order to support high availability or redundancy is headache-inducing at the SMB-sized server level.


>a lot of enterprise started using Linux / open source.

Any major players? The entire healthcare industry is married to Microsoft, at least the big names. The same goes for lots of major enterprises that I know of.


Same here, but I don't see any rush to replace those Windows desktops with something else, beyond eventually some Macs.


You're thinking about desktops. Think servers/mobile. That's a highly competitive space you won't win by making Windows 10% better every two years.


Currently Android seems to have lost the tablet war in consumer electronics shops.

People are either going for iPads or hybrid laptops running W10.

Android devs are to blame, given how little effort they put into adapting phone apps to tablets.


>Currently Android seems to have lost the tablet war in consumer electronics shops.

And yet they still account for the majority of tablet sales.

>People are either going for iPads or hybrid laptops running W10.

The W10 tablet market share is so low they barely even register. In fact, Microsoft Surface tablets aren't even visible on the following graph:

https://www.statista.com/statistics/276635/market-share-held...


Most statistics don't classify the Surface Pro and other full-desktop 2-in-1s as tablets. The small blip for MS was their ARM tablet.


I don't see how the Surface Pro could be classified as anything other than a tablet, since that's what it is. The Type Cover keyboard is a $160 option. It's also losing money, and there are rumors that their Surface line will be cancelled in 2019.

https://www.theregister.co.uk/2017/10/04/surface/



So what? If I find a source that contradicts yours, would you just post yet another link?

I know what I see when I go into a store, travel across Europe and look around for what the other passengers are using.

If you feel vindicated by posting some random link asserting you are right, so be it.


Here's another link that contradicts your anecdotal stories:

https://www.statista.com/statistics/276635/market-share-held...


What you see and reality can be different. I am from the third world and I do not see many iPhone users at all. And from the link, the iPad does have a commanding lead over Android tablets.


I'm not posting a "random link" to prove you wrong. I googled around and saw a few sources, all of which seem to agree on current market share, which made me genuinely confused by your comment.


Actually, if you drill down to tablets, which is what the GP was referring to, iOS has a 2/3 market share.

http://gs.statcounter.com/os-market-share/tablet/worldwide


In my experience... yes. At least in the Pittsburgh area, every food truck, small boutique, expo vendor, etc. that I've encountered has used Square and an iPad to handle payments.


For me, the experience is that it's getting harder to find accounting software that is Windows-only. Many of those products now run inside a web browser. Your experience may vary depending on where you live, however.


Anecdotal, I know, but my business employs 4 people and a few of us (including myself) are full-time Linux users. I use Xero for our accounts and I'm very happy with it.


I own a small business and I run my accounting with FreshBooks, which I can easily use via my phone, iPad, or MacBook. I haven't used windows for 18 years. I'm probably an outlier, but it can be done.


It can be done, but usually going with Windows is the path of least resistance for most non-tech users.


I can agree that Windows can be the path of least resistance because of legacy business software. But please don't even start to argue that Windows is easier to use than an iPad.


For a lot of professional things, Windows (or Mac or Linux) is better. For data entry a tablet is a real pain; I wouldn't want to do accounting on an iPad, for example.


How is doing small business accounting on an iPad, with a keyboard, any different than doing small business accounting on a Surface? You tap on the screen where you want to put data and type. Also, when using FreshBooks, which is what my comment was about, once you set up your customers, line items, and rates, creating an invoice involves about five taps and maybe typing in the number of hours/quantity, if it differs from the default of 1. That's it. I'm not sure how Windows could possibly make that any easier.


How does "going Windows" make doing small business accounting easier?


You have a lot of mature software available for that task.


I am looking at small business accounting. What are your thoughts on FreshBooks?


Personally, I love it. It is easy to set up your logo, custom web portal, customers, line items, and rates. Once you've got that done, creating an invoice only takes about five clicks. FreshBooks also emails invoices and reminders for you, allows you to easily accept payments electronically, lets you send satisfaction surveys to your customers, and gives you a weekly report on how much you've earned, how much you've collected, and how much customers still owe you.

I evaluated a bunch of other products, like QuickBooks, Curdbee, etc. but always ended up coming back to FreshBooks and have been with them consistently since about 2010.

Their customer support team is also great. I've only had one or two issues with the system, but in each case, they were quick to respond and help me resolve the issue.

The hardest part of running a small business for me is collecting money from customers, and FreshBooks makes that easy. FreshBooks allows me to connect with my customers in ways I wasn't able to before, and it has reduced the process of billing to just a few clicks I have to do once a week, allowing me to focus on the services I provide instead of data entry and invoice tracking.

I hope that helps.


Microsoft is squandering the decades of work humans put into drivers and into an OS codebase that is unique and omnipresent, and if they can't see how useless and crappy all their other services are without the conceptual core that powered their rise to success, then they will die.

- windows domains
- group policy
- roaming profiles
- drivers written for in-house equipment, ranging from machining to mass spectrometry, done over decades, and basically compatible back to NT 4, until M$ screws everything up...

They just don't know where their bread is buttered, and their bean-counting practices are evidently obscuring it, as they keep making stupid decisions like Office 365.

They should stabilize and streamline what they already built...


If they play their cards right, this will actually strengthen that part of the universe. Windows was led by people who wanted it to be a consumer operating system in every home. That led them to do a bunch of stuff that was nonsensical to the business world (the Win8 start screen on Windows Server? Seriously?). This move makes Windows subservient to people who actually understand those users.

Their continued dominance in business is contingent on them preparing their products for the next wave of workers. The generation that grew up typing papers out on their phones is going to have a different expectation of what productivity means. They're going to have to scramble to meet it, since Salesforce is not so quietly buying up a modern productivity portfolio to take it from them.


It's not the end of Windows. The title of the article is a bit clickbait-y.

Without the "we need to dominate the market and be everywhere" people pushing devs to implement features nobody wants and then has to deactivate anyway, Windows will get better for the people who love (and need) to use it.


> The title of the article is a bit clickbait-y.

The article itself implies the same. It's very clearly a case of putting the cart before the hearse.


I don't see anywhere in the article where Windows is going away or no longer being invested in... if anything it's going to get more focus as a pure client OS rather than being some mixed bag of services + built-in apps + OS.

This should be good for Windows, as it will focus on what it's good for: being the base system for drivers, windowing, app launching, etc., while the services/cloud get decoupled from it, allowing more flexibility for their developers and programs to work across multiple platforms.


Everything that has drivers for Windows now has better drivers for Linux. I've plugged in 4x12m CNC machines that just worked out of the box. We did it for shits and giggles, to see how badly it would screw up.


When my child gets older and leaves home, I won't be saying it's the end of him. He has just matured.

Similarly, this just marks the maturity level of Windows, where there really is no significant innovation in operating systems release after release. Nothing like Win 3.1 -> Win 95, etc. It's been that way for a while. Obviously it's cash that motivates this move from Microsoft; however, I wouldn't want Microsoft changing the location of the Start menu every release just to stir up the market and justify a release (for example) every year.

I remember a cherished time when the next operating system release felt like an upgrade to myself as well as my PC. I could do more than before. No longer. We have it all nowadays and have become spoilt.


The more comments I read on here the more it seems people have only read the headline, or have only a surface level grasp of the content, without really understanding what the author is saying.

"The End of Windows" = "The End of Windows as the core foundation of the various Microsoft divisions/services/apps"

It does not mean the end of Windows as an important part of Microsoft...at all.

Rather it's about decoupling Windows from the rest of the other divisions to reflect the reality of today's non-PC-centric world, where Windows is merely one client among other mobile/cloud/AI/IoT/etc. clients.

As others have pointed out, the perfect example is how Ballmer didn't want to release Office for iPad until Windows had a touchscreen version to compete with it. It means more flexibility for both the company and for Windows, which is a win-win for Microsoft consumers and Windows users.


Generally I think we call that "clickbait," no? :)

Headlines like that make some people less likely to actually read the article, especially on HN.


Just because Microsoft is desperate to unshackle themselves from a loss leader doesn't mean it's about to happen anytime soon, although an announcement like this will certainly goose the stock. The only person angrier about missing the app-store, cloud-based money train is likely Larry Ellison, who's already furiously trying to saw off the Oracle Database boat anchor.

You're going to need to convince your channel partners this is a thing that has to happen, and you should expect a revolt, because so many tertiary industries are contingent upon Windows and its licenses. The tear-down for everything from the Windows app store to the Ballmer-era brick-and-mortar "Windows Store" is going to be significant. There are also laboratories, power plants, hospitals, and prisons that all rely heavily on Windows, so expect to be on the hook to state-level governments for quite a while... the "end" of Windows also confirms the wastefully squandered effort to get Germany to obsolete its massive and very functional Linux deployment.

You'll also need to have a substantive market share in, say, cloud in order to start deprecating Windows. Outside of Redmond's own inflated reporting, Azure isn't exactly a competitor. Most devs would rather walk off a cliff than learn a new API (one that only Microsoft uses) for a cloud that is not EC2-compatible in any way. Conversely, there are more than 40 providers of cloud services that all managed to offer an EC2-compatible API for objects.


I think it's impossible to overstate how much damage Ballmer did to Microsoft. That being said, I think Microsoft is in a fantastic position right now under Satya.

Windows isn't going anywhere, and they're accepting the reality that it might just be a "portal" into web technology in a similar vein to Chromebooks. This is a reality Apple refuses to accept, for better or worse. But, Windows still comes with those massively powerful internals that enable more powerful professional experiences for the users that need them.

This is, really, the only platform in the world that champions both. macOS obviously has those powerful foundations and can run web apps natively, but it's not something christened by Apple. iOS is the same way, but with less powerful foundations. ChromeOS? Android? Linux desktop? They're not even considerations in this world.

Then you consider the massive success of their Xbox and Azure divisions, and I think anyone who bets against Microsoft right now isn't doing it cogently. They have a lot of legacy and Ballmer's strategic mistakes to get right, but I legitimately think they'll come out of the next 5 years in a better financial position than Google.


A better title would be "the end of the Windows division at Microsoft". It's certainly not the end of Windows - it's an enormous revenue generator for Microsoft. But I believe it shows that they believe the future lies with Azure, AI and Microsoft 365. This will probably help them focus.


I'm surprised no one has mentioned this, but the biggest takeaway for me is that it moves Windows one step closer to becoming open source. We're still many steps away, and it's going to be more like the hybrid model of Android or even OS X/iOS/Darwin, but it seems much more feasible now, even if it starts with baby steps.


Windows is touted as the largest Git repository. What service like GitHub or GitLab could handle something that massive? I'm sure they could scale eventually, but if that ever happened MS would likely have to house the repository themselves, and at that point they might as well keep it closed source, or open source with a subscription cost that goes into the infrastructure upkeep.
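(For what it's worth, Microsoft's own answer here was, as I understand it, GVFS / "VFS for Git" layered on top of their hosted service, and stock Git has since been growing partial-clone support that points the same way. A rough sketch of the latter, with a hypothetical mirror URL and assuming the hosting server supports partial clone:)

    # Minimal sketch, not Microsoft's actual tooling: a "blobless" partial
    # clone fetches commits and trees up front and downloads file contents
    # only as they are checked out, which keeps an enormous repo workable.
    import subprocess

    REPO_URL = "https://example.com/windows-mirror.git"  # hypothetical URL

    subprocess.run(
        ["git", "clone", "--filter=blob:none", REPO_URL, "windows-mirror"],
        check=True,
    )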

I'd honestly love for more portions of Windows to become open source, slowly working up to the full repository. I think there would be an immediate influx of 0-days as people find holes, but I would hope those would get closed quickly. It's still a very big gamble to let go of something you've literally built your foundation on, but I do hope that instead of Windows silently going into oblivion they'll open it up for the masses, even if they would no longer have the resources to support it. I feel like someone or some company would step up.

Having said all that, I'm going on 4 years of mostly macOS. The missteps in High Sierra are pretty abysmal, but not the end of the world. I hope Apple really changes their processes to keep the OS stable, but I see this downward trend in just about all PC-like operating systems. It's like everyone sees the end of the line coming and they're just half-assing every move they make. I think Meltdown/Spectre was a major blow that has probably single-handedly slowed all momentum to a standstill. Granted my sample size is very small, but I really hope I'm just looking at a big lull right before an evolution/revolution.


I think that started when Ballmer finished his reign. Even with his "developers, developers, developers" mantra, the .NET ecosystem was not as awesome as it is now: open source, with the .NET Core runtime on Linux and macOS.


Early Windows was a great platform for the users that it served. But every few years, the Windows product itself had one of those "off years." It arguably started with NT 4.0 for workstations, then again with Windows ME, then Vista, and then arguably 8.

This interleaving strategy was okay when they were the only market player, but now that there's actually good competition in the computing space (read: smartphones and Chromebooks), it's not so good. All of the little sacrifices that Windows has to make to move itself forward are cuts that make people reconsider using it in the first place.

I honestly wonder what world it would be if every Windows release had "stuck the landing." If we had a super solid Vista and a super solid Windows 8, would Microsoft even be in remotely the same position as they are now? Probably not, because it would actually have been the preferred choice, and not just the default.


Unarguably 8! My mother owns a Windows 8 laptop and she still gets stuck every time she falls into the "blocky start" menu or any of the Metro apps.

She just doesn't understand what is happening anymore, or how to exit this mode which she never asked for. What helps here is that all those apps are absolutely useless.

It's a disaster disaster disaster.


I would have stopped using Windows years ago if I could get games to run decently on Linux.


I think people underestimate this issue. One of the reasons Windows is so dominant in the business landscape is that it's the OS people have been used to since their early years, mostly because it's bundled with most PCs people buy. Kids love to play games, and while a lot of games also come out for Mac these days, Windows is still dominant in the PC gaming market.

It's a bit of a reach to say this, but it's a little bit similar to McDonald's selling toys with meals to entice children.


The reason Windows is so popular in the home market is because it was in the 90s: businesses bought Windows/Office, and people brought the floppies home, which led to schools following suit, which led to a generation thinking Windows = computers.


I grew up in the 90s and 00s, and schools were all Apple but homes and businesses were all Windows.


Yes, on paper Apple was the future: kids were learning it at school, and thus would recommend it and require it both at home and then in the office a decade later.

However, Microsoft capitalised on their position in the office, and it was far cheaper to buy a fully functioning computer for home with a copy of Windows and Office from work. Then schools were under pressure to provide what was normal.

Piracy made Windows what it is.


There are many games that run well in Linux, including a large portion of the Steam library!

If you meant getting a specific game to run, well, then you need to weigh your pros and cons and stop whining once you've made your decision.

Alternatively, dual boot.


Or he could ignore people like you and just use Windows. If you're going to have a bunch of headaches you may as well stick to the ones you're familiar with on the system that actually does what you want pretty well.


> Or he could ignore people like you and just use Windows.

GP is the one who said he'd move to Linux if there were games. I didn't imply he was wrong for using Windows. I think you (and all the other siblings) are reading more into my comment than there actually is.


I do that for GTA 5. It's the only time I switch back to Windows. I don't even restart; I just hibernate Linux and then everything is still there when I come back!


Try running Dota 2 on Windows then run it on Linux (it is available to install from Steam for Linux) and then get back to me.

Last time I tried it, a few years ago, it was ~10fps on Linux and a solid 100+fps on Windows.


Alternatively, grow up and stop playing games.


That's not really what "growing up" means.


I hope you don't watch movies or TV shows, or play sports.


Ever played a card or board game as an adult? Why would playing video games as an adult be any different?


Reading things like this and everyone's comments, you would think that Microsoft isn't making a mega-ton of money every quarter, which they are. Everyone keeps talking about Azure and AWS as if there isn't room for both and as if Azure hasn't been growing like crazy.

Amateur tech punditry at its finest.


I would love for Microsoft to launch its own Android-compatible ecosystem, starting from the open source base of Android and building alternatives to Google's proprietary technology (location & mapping, email, browsing, app store, etc.). They are one of the few companies who have the capacity to do that, and they already have products in place to cover 80% of what is required.

While I'm not exactly a fan of Microsoft, there is nothing I dread more than a closed platform controlled by a single company, and Google is moving more and more aggressively in that direction. There is now a whole industry dedicated to installing hacked Google products on devices Google does not approve of.


I'm always baffled whenever someone comes up with these wacky Google Android replacement suggestions. It's as if they selectively choose to ignore history and the billions that would be futilely squandered pursuing something that has absolutely zero chance of ever succeeding or even being profitable. I'm not sure what fantasy world you reside in, but replacing Google's proprietary apps and services with Microsoft's proprietary apps is not only a significantly weaker value proposition for the consumer; there's also absolutely no incentive for the consumer or OEM to even remotely consider such a platform. You will never have the ecosystem, you will never have the developer support, and whatever developers and OEMs you do manage to bribe over will quickly abandon the platform once the payola dries up.

>There is now a whole industry dedicated to installing hacked Google products on devices Google does not approve of.

I think you meant to say there was until Google closed that loophole.


But even Google is moving away from Android! ;)


Microsoft's Windows and Office products are now in the product position of commercial trucks. Every business of any size has some commercial trucks. Businesses buy them, use them, maintain them, and replace them when necessary. They're not exciting, but they get the job done. Nobody thinks about them much.

Microsoft should just accept that they sell a commercial product for businesses and make a good solid product that needs little attention and gets the job done. Like Mack trucks.


You can have my Windows when I can game properly somewhere else (and no, no you can't).


Xbox and PlayStation. The "problem" is that you want to game on your PC, not that there isn't another viable option.


I'd be happy with console gaming if there were a $2000 console with mouse and keyboard controls. I don't want to play with a controller and I don't want three-year-old hardware.


Arguably, gaming on an Xbox is still gaming on Windows so it's a funny option to pick if the goal is to avoid Windows. ;)


Can you point me to the console version of Dota 2?


I mean, that particular game does run exceptionally well on Linux. I think even quite a bit better than on Windows, unless your GPU's drivers for Linux happen to be shit, of course.

But yeah, the overall point still stands. If you have a particular game that you don't feel like you can substitute and it's not available on a platform that you'd like to switch to, you're probably not switching to that platform.


Mouse/keyboard input. Though that is opening up more and more nowadays...


Xbox One supports it natively (via good old USB) as of a "silent" update a year ago or so. Not all games support it (for various reasons of engines and developer preference), but you might be surprised the ones that do.


I know; I worked on the project that enabled it. This was more about the latter part of the comment: that games will not be designed around that modality and will always be altered accordingly. Not necessarily a downside for everyone (and more of a presumption on my part), I should point out. :)


I figure that so long as there are PC-using developers building the engines, keyboard/mouse support will make its way into a lot of Xbox games naturally (because why disable something that works great for testing/debug?).

I've also seen enough people who are so skilled with a controller that they perform better than the average keyboard/mouse player that I think we'll see fewer people worry about cross-modal play.

I love having the option to game with an Xbox One controller on my PC, and I love that keyboard/mouse preferring players sometimes have that option on the Xbox. Let everyone play what they find fun.

I've been playing a bunch of Sea of Thieves lately, and it's almost surprising there isn't a controversy about cross-platform/cross-modal play, and yet it is open PvP. I don't feel like it matters if I'm using keyboard/mouse or controller, much less if an opposing player is, and that's a weird sort of magic. (It matters more if the opposing players are jerks that don't follow the Pirate Code.) Props to Rare for pulling that off, and maybe that will be increasingly a thing. (The Gears of War cross-platform/cross-modal playlists have gotten good reviews as well, from what I've heard, though I've not specifically tried them.)


Why game on a console if you can afford to do so on PC?


There's also game development, with both art and engineering departments running Windows.


What I don't understand is: with Apple and Microsoft moving to "services" business models, who do they see as providing the hardware for this future they envision? I don't think we are near the point where either laptops or phones are a commodity. If nobody wants to make hardware their first priority, how is anyone going to ensure the best experience for the customers using their services?


Apple is not moving to services; they are very much a hardware company. Services, for them, are more a way to make the hardware more attractive.

Microsoft has not really been in the hardware space before (except consoles; otherwise it was mostly accessories). I believe they jumped into laptops, tablets and desktops to have full control of the user experience.


Right, given the *soft in the name, Microsoft has never seen itself as a hardware company, and every move it has made into hardware (except Nokia) was reluctant, made to fill a need for its software that it couldn't convince hardware partners to meet at scale.

Microsoft started building mice because they wanted to add a wheel to them to make working in Office applications better.

Microsoft started building 2-in-1 tablet laptops (Surface) because they'd been calling it the future for a decade and no one was paying attention.

For the most part Microsoft's hardware strategy is at its best in "if you build it, they will come (and copy it as soon as they understand why you built it)" mode.


What's your citation or evidence for Apple moving to a "services" business model? They are hardware-focused, so that's not a correct statement.


I'm not claiming to have insider knowledge of Apple's strategy, but there have been many reports over the last year or two that speculate about the extent to which services will provide revenue instead of hardware.

https://www.cnbc.com/2016/06/08/apple-will-show-off-its-move...

https://stratechery.com/2017/apple-and-the-oak-tree/

https://www.macrumors.com/2018/03/22/apple-services-revenue-...


Is there any evidence that Apple are moving to a services model? More than anyone else, they have succeeded at making services add-ons for their own hardware.

(edited to fix a typo)


Not sure how true it is. It's been a popular narrative recently, including discussion on Stratechery[0]. Some say that Apple has been pushing the idea as a way to distract from slowing hardware sales for the last 2 years or so.

[0] https://stratechery.com/2018/apples-middle-age/


So clearly the market has chosen, and walled gardens are now the default policy: users don't have the freedom to run their own code or apps, their own OSes are prevented by locked bootloaders, apps are locked down (no ability to interfere with apps running on YOUR device, e.g. read their data), and so on.

Never again will one app be compatible with others without a business agreement between them... because that just can't be allowed (dixit the $100-million-per-year-plus managers at these huge companies; and I'm sure it's 100% coincidence that such compatibility would allow small companies to compete with them, totally unrelated to this decision). Can't work in the cloud, can't work on Android, can't work on iOS. And for that matter, can't work on UWP.

And of course, let's not forget the massive cost: everyone's data is now 100% accessible to law enforcement and civil courts (and thus to anyone with the money to sue you), and that is treated as just acceptable damage. After all, it doesn't affect $100-billion-plus companies much at all.

Their argument is that it can make web banking unsafe. Yes ... that's true. It can.

What it does 99.999% of the time, however, is enforce the market dominance of really large players.

But, like people walking into a camp during WWII, nobody holds a moment of silence when they enter the compound with the large barbed wire. Nobody regrets what's happening when they lock the gate. Only years later, when all the bodies are found ... then we stop and think.

This is them locking the gate. Now come years and years of really, really bad applications, and the exploitation of the lock-in.

And yes, the only system not in a locked ecosystem is the PC ecosystem (needless to say, this is the ecosystem that ALL of these large companies, MS, Google and Apple, use for themselves. And now they're locking it down for the rest of us).


The Windows desktop was always driving demand for Windows in the enterprise; why else would you need Exchange servers and CIFS/SMB file servers? Who would buy MS SQL Server licences without the larger lock-in effect? Maybe they think that the cloud has broken this dependency, but I think they are bringing ruin upon the house of Redmond (in the long run).


About that reorganization:

"Today, I’m announcing the formation of two new engineering teams"

"Windows: Joe Belfiore will continue leading our Windows experiences and will drive Windows innovation in partnership with the PC and device ecosystem"

How exactly does this mean they are giving up on Windows?! Seriously, wishful thinking much?


They are not giving up on Windows. They just adjusted the organization to reflect the fact that Windows itself is no longer as important as it was in the 90s.


Aye, 'tis the baiting o'the click.


Windows is still a mess... inconsistent interface, multiple ways to configure the same thing, the registry...

I know the market circumstances aren't the same, but this feels a lot like the decision to stop development of IE6 after Microsoft beat Netscape.


Windows used to be the biggest market differentiator for Microsoft. What does Azure do that Amazon or Google can't? What would make Microsoft uniquely different if it had no Windows?


Microsoft really should have pursued creating a version of Windows specifically for gaming. I imagine there are a lot of users who only use Windows for that purpose.


They do have the Xbox..


Yeah, I meant for PC gaming.


I think it would be better titled: Windows is dead, long live Windows!

I think this article explains things better: http://www.zdnet.com/article/heres-how-and-why-microsoft-is-...


No mention of Windows' popularity as a crypto-mining server OS, due to superior GPU driver software.


Ballmer must be the #1 executive who has gotten the farthest on the least intelligence.


Ballmer had a perfect 800 score on the SAT math section. He kept the company's revenue and profits expanding for a decade and a half, which is an eternity in Internet time. His Windows-on-mobile strategy was only a few years too late to be a huge success (agreed, that is also an eternity).


Are we seriously equating business intelligence with a high school aptitude test that covers rudimentary mathematics?


Intelligence is not just your SAT math score (and for the record, I had a 770). His business/tactical decisions were obvious ones once Windows achieved hegemony.


There's a book, "Zone to Win" by Geoffrey A. Moore, that describes these various transitions for Microsoft, and it suggests that transitioning one business unit at a time is intentional.


So Windows will basically keep on existing to run Office and Visual Studio?


People are still loyal to Windows, because that's what they are used to. And PCs and laptops still have Windows pre-installed.


Mac OS is far closer to death than Windows.


Still the best desktop OS, sadly.


If Windows 10 is the end, then I think it's safe to say I will never be returning to Windows for as long as I can help it.

Between its spyware and its tendency to just do things without telling you, and without giving you an easy way to stop them, it is just not something that feels comfortable enough for power users to use anymore.

It feels like whatever I gain in convenience is lost when trying to figure out what Windows is doing with all that CPU/RAM/network usage, or how to disable some new service that I have zero interest in.


Dear Microsoft,

If you’re not using the PC, can we have it back?

— General Purpose Computing


General purpose computing using PCs is a thing because of Microsoft (and IBM). Without those two there would never have been a PC in the first place.

The forces against general purpose computing are to be found at 1 Infinite Loop and 1600 Amphitheatre Parkway.


The Microsoft of the 1990s and the “Microsoft” of today are very different.

The modern version is trying to copy closed ecosystems, instead of building on historical strengths in stable APIs, backward compatibility, diverse hardware and ISV licensing models.


I don't think they are all that different. They have just shifted their lock-in strategy to a different front, one they didn't have access to back in the day, now that their cash cow is drying up.

Microsoft missed the mobile revolution in just about every way they could have, so now they have to make a number of forced moves.


Different, but only partially. In many ways they still cling to their bad habits like lock-in and standards poisoning.

Some good exceptions exist (HTML5 and AV1), but many things are as bad as they have been for a long time (Outlook, DirectX, etc.).


Azure (or is it Office 365) seems to be growing rapidly.

Docker/containers and Linux (WSL) have modern ecosystems that intersect with Azure. There’s also a Microsoft open server hardware design for Open Compute.

What are some modern/emerging standards where Microsoft is actively involved?


Rather not involved. MS is notably lacking participation in Vulkan for GPU graphics and compute, because they are pushing their lock-in alternative.

For other lock-in examples, check who holds patents (and charges for them) on exFAT and its usage on common Flash storage like SD cards.


I rode a PC wave that nearly completely bypassed IBM/Microsoft:

1. The "home" computer boom: ZX Spectrums, Acorn, Commodore, Atari with maybe an Apple II or a CPM machines for more serious use. 2. The Atari ST or Amiga wave. WIMP and productivity software as well as gaming and content creation 3. At this point I did switch to Windows but I could just have easily switched to Mac. Apple didn't cease to be a force at any stage. They maintained a strong base in home use and content creation only losing the enterprise.

In our alternate history there was also room for NeXT or BeOS. You never know.

Now, what MS/IBM did enable (accidentally, to some degree) was an open hardware platform. However, if they hadn't been around then other players might have stepped in. Maybe CP/M would have succeeded in going 16-bit, GEM might have caught on, or the Apple clones might have found a way to legally survive.

On final reading I realise you might have meant "PC" in the narrow sense. If so then you're correct in a trivial sense. If not then I disagree.


I've had computers since the KIM-1 and the TRS-80 were around so I'm well aware of the world of personal computing before the PC.

That also makes me well aware of how much a fully decked-out computer cost before IBM released a hardware spec that allowed companies to compete; the vast economies of scale this brought drove prices down to where mere mortals like myself could afford very powerful machines, unlike the various 8-bit offerings of the time, which were relatively expensive for the computing power and storage options they provided.

Even though the initial PCs were expensive and anemic, and did not even have the same capabilities as many other machines aimed at the home market of the time, the platform had staying power because of IBM's formidable manufacturing and marketing capabilities, and Microsoft made it useful enough that non-technical users would dare to make the jump. Tandy/Radio Shack tried (and to some extent succeeded) to do this too with their TRS-80 models. Competition by the clone builders then did the rest.

So I'm sure there would have been a 'computing scene' without Microsoft and IBM, and it likely would have been an interesting one. But the machine under my desk would likely not have existed in that world, other than maybe as manufactured by DEC, Apollo, SGI or some other player of those days, and it would have been marketed as a scientific workstation and cost north of $10K instead of 10% of that.

General purpose computing for the masses rode in on the back of economies of scale that IBM and Microsoft enabled.

(And trust me on this: I'm not exactly a Microsoft fan but that bit they did actually made the world a better place.)


> Apple didn't cease to be a force at any stage. They maintained a strong base in home use and content creation only losing the enterprise.

They almost went bankrupt around 1994.

If it wasn't for Microsoft's money and the merger with NeXT bringing Jobs back, they would have been gone.

Until the iPod's success, Apple was meaningless pretty much everywhere outside the US.


It's amazing how close they came to losing it all. In the late 80s and early 90s Apple was a fairly common sight in education; 10 years later it was Windows everywhere outside of degree-level STEM (where it was *nix of some sort).


Given the context of this thread, note the irony that Microsoft actually bailed out Apple to polish up their image for antitrust reasons.

https://www.wired.com/2009/08/dayintech_0806/


> In our alternate history there was also room for Next or BeOS. You never know.

Now that you mention it, I kind of wonder what would have happened if Steve Jobs had never left Apple in the first place.


There would have been no OS X (a NeXTSTEP descendant) and no iPod, nor would there be an iPhone or an iPad.

Jobs leaving Apple was in a very roundabout way the best thing that happened to Apple because it forced him to learn a bunch of tough lessons which he could then apply to Apple when the time was right.


> Without those two there would have never been a PC in the first place.

Can you clarify that statement? My memory of computing history includes other types of personal computers during and around that time.


Yes, but the market was super fragmented, with each and every platform trying to lock in their customers as much as possible. The PC changed all that by virtue of the clones, which offered compatible systems and components from a large variety of vendors. Compaq was arguably the biggest force in establishing the clone market, and even though as a brand they are gone today (though some of their legacy lives on in HP's server line), the computing world would not look even close to the same without the effort of those clone manufacturers.

The economies of scale engendered by the standardization of the PC platform really made things move.


What do you think of http://opencompute.org? They provide open blueprints for whitebox servers and network gear, for manufacturing by ODMs, with cloud provider purchases driving economy of scale.


I really hope they will succeed, but I also wish that they had at least one product aimed at the desktop. It's understandable that they focus on the datacenter when you look at who is behind the project, but one of the things that I love about Linux is that I have a degree of continuity between my desktop environment, where I develop software, and the servers where it all gets deployed.


At least one desktop or laptop ODM would need to sign up to manufacture an OCP desktop design. Purism is trying to open up laptop hardware. We need open specs for a NUC form factor.


Uh, I think you need to review the history of computing. The market was full of competing platforms in the early 80s. The PC became the dominant platform due to complex network effects, not the least of which was the IBM brand. DOS was bought from a third-party guy in a garage and slapped on top of the x86 platform.


And I think you need to read the whole thread before jumping in hours later and repeating lots of stuff that has already been said.


Sure,

Which locked-down version of Apple's or Google's systems do you prefer?

-- Microsoft


> If you’re not using the PC, can we have it back?

Sure, install Linux and feel free.



Luckily most motherboard makers allow disabling this, which makes installation quite easy.


Off-topic: what font is used on that webpage? It's very comfortable to the eye.


Firefox Dev tools say Freight Sans Pro


Interesting, I actually changed the font on the webpage because I found it too thin.


The end of Windows is the year of the Linux desktop


Linux desktop will never thrive. I'm sorry to tell you that.

It's more likely that some other player just releases a completely new OS to replace Windows, just like Chrome replaced IE (and not Firefox).


Linux desktop will never com

Exactly.


I'm excited by the end of Windows, or really the end of dominance of desktop OSes. This means we will have the freedom to innovate again and do some really wild things.


I won't support OSes such as Windows and MacOS until they give me the same core user freedoms that you get with GNU/Linux.



