The bizarre role reversal of Apple and Microsoft (backchannel.com)
227 points by steven on Oct 28, 2016 | hide | past | favorite | 210 comments



> Just check out its striking video for the Surface Studio — it is so influenced by Apple’s playbook that I’m surprised there’s no Jony Ive narration.

What's happened is that other companies have figured out how to emulate Steve Jobs's playbook, which is now several decades old, and there's nobody at Apple to write new plays.

Jobs always talked about how Microsoft didn't have taste and when you look at MS products up until around 2005ish in comparison to Apple products, he was right. At some point a lot of business people realized what was making Apple successful - they weren't just selling computers, they were selling a lifestyle, they were selling cool. Oh, you are a creative? Well, shouldn't you have a Mac? It's what Einstein and Gandhi would have used.

Jobs was an absolutely brilliant marketer. Since 2005 other companies have gotten successively better at emulating Apple's design, experience, and marketing. They've distilled what Jobs knew intuitively into a formula that they can iterate on. It's not just MS: it's Dell, it's every medium and high-end manufacturer, it's Google. Please tell me what the difference is between this Pixel commercial [1] and any Apple commercial made within the last decade, with the exception of the logo.

It's the Applefication of tech production and marketing, and Apple doesn't have anything to stand out anymore.

1. https://www.youtube.com/watch?v=HCI1tcu8tQw


Microsoft lacked taste and the ability to execute. With the Surface Book, they absolutely nailed taste. But they still have no ability to execute (the SB suffered at launch from firmware glitches that took months to iron out: http://arstechnica.com/gadgets/2016/02/new-firmware-finally-...).


First, I want to state that I have a ton of respect for what Microsoft is doing. This is the Microsoft I've been waiting to see for over a decade.

That said, here's the truth. Just a week ago I tried out a Surface Book, and within the first few minutes I could immediately tell it was not as polished as a MacBook Pro.

The first thing that stood out is that the screen suffered from a considerable amount of ghosting. It looked great as long as I wasn't actually doing anything. Dragging a window around or even just moving the cursor though revealed clear visual defects.

I also immediately noticed that the trackpad, while high quality physically, struggled with gestures involving multiple fingers. Maybe this is configurable through software, but I just didn't find the trackpad's recognition of gestures to be nearly as good as that of a MacBook Pro.

On the same topic of gestures, doing a four-finger horizontal swipe in Windows 10 to switch between virtual desktops resulted in a very visually jarring animation. The experience just isn't polished like it is in macOS. This surprised me because I use Windows 10 extensively on my home desktop and generally find it very polished, but that's all using a normal mouse, not a trackpad with gestures.

Next I noticed that the laptop's screen was physically wobbling. I suspect this is because the screen is so much heavier than normal due to its unique detachable design. But I found this quite annoying. It was apparent that even just typing on the laptop with it on my actual lap would lead to screen wobble.

All this popped up within minutes before I even tried out any of the laptop's unique features. The version I was looking at had a price tag of $2800. At this price range, I think Microsoft still has work to do. None of this is to say it's a bad product. I was just expecting a bit more from something with such a premium price.


Ask them about their enterprise licensing and see if you still have the same opinion.


When Cortana was launched on Windows, I went to a Microsoft Store to try it out. I found a Surface and tapped the Cortana button, and it opened a "Microphone Calibration Setup Wizard" that looked like it was from Windows XP.

So yeah, taste is one thing, but execution is a different ball game.


Windows 10 still has two different control panels. Last year, a Microsoft PM tweeted he hadn't had to use the old control panel "in months": https://twitter.com/brandonleblanc/status/650519136196366338.... Microsoft had been shipping two control panels for three years at that point. People would be apoplectic if Apple did something like that.


This is kind of what I like about Windows 10; it can be a dumbed-down experience for people that don't care, but if you want, there are gradually escalating levels of configurability. Settings -> Control Panel -> fuck it, I'll edit the registry directly.

I would definitely not want to maintain this codebase, but as an end user, I'm happy.


macOS has something better: all the settings in one place.

Windows used to have something like that too, I think it was called the Control Panel.


> macOS has something better: all the settings in one place.

All twelve of them.


I know that's snark, but I just went and counted them anyway. There are at least 413 individual settings in macOS Sierra's System Preferences, excluding Flash Player.


No, it doesn't. There's System Preferences, but also 'defaults write ...', or just editing a plist...


Technically correct but very few people care about "defaults write". I can't remember the last time I used it, but I do use RegEdit often enough.


One place I've found it useful is for scripting the setup of a fresh machine, something like this:

https://blog.vandenbrand.org/2016/01/04/how-to-automate-your...

I agree it's likely very few people use it. It can be exactly the right tool for the job in some circumstances.
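A setup script along those lines is essentially a list of `defaults write` calls. A minimal sketch, using common, well-documented domains and keys (these are illustrative examples, not taken from the linked post):

```shell
#!/bin/sh
# Sketch of automating macOS settings with the defaults CLI.
# Each call writes one key into the app's per-user plist
# under ~/Library/Preferences/.

# Auto-hide the Dock
defaults write com.apple.dock autohide -bool true

# Show all filename extensions in Finder
defaults write NSGlobalDomain AppleShowAllExtensions -bool true

# Restart the affected apps so they pick up the new settings
killall Dock Finder
```

Running this on a fresh machine replays the same preferences without clicking through System Preferences.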


Yeah, that's a good reason to use it, and it makes sense that the tool is there. But there's a big difference between having a CLI tool for digging around when you need it, and having two separate GUI control panels with some settings in one, some in the other, and other settings in both.


Which is a thing of beauty compared to Windows and Linux. Fine-grained control, yet all per-user and file-based, and thus easily reset to factory defaults.


And per user vs. system overrides.
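The file-based model described above is easy to see by working with a plist directly. A hypothetical sketch using Python's stdlib `plistlib` (the `com.example.myapp` domain and keys are made up; a local path is used for illustration rather than ~/Library/Preferences/):

```python
import os
import plistlib

# A per-user preference file, like those under ~/Library/Preferences/
path = "com.example.myapp.plist"

# "defaults write com.example.myapp WindowWidth -int 1024" amounts to
# serializing a key into the app's plist:
with open(path, "wb") as f:
    plistlib.dump({"WindowWidth": 1024, "DarkMode": True}, f)

# "defaults read" amounts to parsing the same file back:
with open(path, "rb") as f:
    prefs = plistlib.load(f)
print(prefs["WindowWidth"])  # → 1024

# "Reset to factory" is just deleting the per-user file; any system-wide
# defaults (e.g. under /Library/Preferences) then apply again.
os.remove(path)
```

The per-user vs. system distinction falls out of the search order: the user's own plist overrides the system-wide one, and removing the user file reverts to the system defaults.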


Any thoughts about Cisco?

We're literally trying to be the taste and execution. I work for Cisco. Curious about candid thoughts.


Of routers?

I'm seriously asking. Cisco either lives at the top of a rack that maybe 10 people in a datacenter ever see, much less touch more than once; or it lives as a home router that most people want to hide but still need a good wifi signal from.

IMO these are two different use cases. Cisco could execute almost a million times better in the datacenter by embracing better standards and making first boot and reconfiguration easier (I could write a book on this). In other words: stop pushing people to use your proprietary management systems for your gear; make those tools generally applicable, and then maybe people would use them.

On the consumer side, I have less experience with Cisco, but making it dead simple to deal with things like multiple wifi access points would be big, with as little setup involvement as necessary. Apple had a really simple-to-set-up router, though it was generally underpowered and didn't allow for more advanced configurations.


Cisco AnyConnect and the ASA web interface reek of an outsourced, enterprise-developed, massive legacy codebase haunted by developers pissed all over by meddling middle managers. If you were going for the opposite of taste and execution, you knocked it out of the park.


ISP owner here. I wouldn't call their hardware visually appealing. As for UI, we mostly use the CLI, so I'm not sure what more can be done in that regard. A visual way of looking at live traffic info, similar to iftop/htop, would be nice.


I want to add that Cisco IOS is an absolute clusterfuck. There is so much room for improvement I wouldn't know where to start. Please have a look at MikroTik; they are doing a pretty decent job for a small company. If their devices were powerful enough to handle our edge router traffic, I would drop Cisco like a brick. I am not kidding.

Someone really needs to fund Mikrotik big time.


Personally, I say ditch IOS completely and give me a flat ecosystem of Linux or BSD well tuned to the hardware and completely open sourced. Allow interesting features like Snabb Switch-style userspace packet management. Throw in some (k)ASLR and other such security mechanisms. Perhaps most important: make an ecosystem that can compete with Ubiquiti for both "average" consumers and developers, in terms of hardware but especially software.


Ironically (because of the Macbook announcement) if Microsoft had chosen to go with previous generation Intel processors, they probably could have avoided those firmware issues. Those are mostly ironed out now, no thanks to Intel.


Do they still have tons of high-DPI bugs where some apps work with it and some get all garbled?


Well, to be fair, HiDPI issues are application issues as opposed to core Windows issues.


It's first and foremost a badly designed DPI-independence API, largely due to the terrible backwards-compatibility burden of an even more terrible legacy stack. It is fixed somewhat in their modern stacks, such as WPF and UWP, but the vast majority of Windows software is not written with these frameworks.

While it is understandable why the legacy baggage is there and cannot be removed, the end result is a big mess, still to this day. Most (but still not all) first-party software is OK to good. Third-party software (not apps from the store) is a mess, usually either blurry, tiny, or downright broken. The Windows ecosystem being the Windows ecosystem, there are many alternatives to almost everything, and some software is improving. But it is still a big mess.

Apple's transition was a lot smoother due to vertical integration, a much smaller software library, a more dedicated software developer base (or rather, one much more willing to be early adopters of new APIs) and, most importantly, a much better designed API without 20-30 years of baggage.

It should be interesting to see people's reactions to the software after jumping ship from macOS to Windows on these fantastic displays. I couldn't stand the mess after a week with Windows 10 on my MBP. (But I do use Windows 10 regularly on my desktop, on a "normal" display.)


> a much better designed API without 20-30 years of baggage.

Apple's API dates back to the 1980s, just like Windows'. If you want to nitpick, Windows was developed in the early-to-mid 1980s while NeXT was developed in the mid-to-late 1980s.


I mostly meant the software support baggage, but yes, indeed, so many of the NeXT concepts still hold true today in Cocoa API design.


I have a number of Mac apps that don't scale. The biggest offender, although also on Windows, is Steam. I don't know what it is about game developers, but they just can't seem to manage UI scaling.


As most new apps start to be built on UWP that issue will go away.

Also, I am not sure what DPI issues you're facing, but I have run W10 on my rMBP for quite some time and it's been thoroughly pleasant. Perhaps your issues are fuzzy text due to a mixed-DPI, multi-monitor setup?


UWP apps have a lot of limitations compared to WPF/Win32 apps. As I understand it, UWP apps can't record audio in the background or act as a Bluetooth accessory, to give two examples.

So, even if a developer is willing to invest the time to rewrite an app to use UWP, the app will lose a lot of functionality. Microsoft has again poorly architected their APIs.

Unlike macOS, where there's no penalty to adopting the latest APIs. It's not as if, to use the Touch Bar, you have to give up access to other capabilities, like recording audio in the background.

More information: https://kartick-log.blogspot.in/2016/08/how-powerful-is-uwp-...


I mean, you can just use WPF + the desktop bridge in these cases. There's still some work for the desktop bridge to catch up, but it's perfectly feasible (aside from, you know, feeling like a traditional Windows desktop app vs a "Modern" one without a lot of work styling WPF controls).


I am not using UWP apps. Most of the software I use on Windows is either MFC or Windows Forms. These can also be DPI-aware and render correctly, but they require proactive developer support, which is not something that is happening in the Windows developer world right now. Either software is abandonware (but still works), or issues are open and remain open or are simply closed with "won't fix" status. Most developers see this as a non-issue or very low priority. I mean, even Chrome does not support high DPI, from what I remember. It looked terrible.

The Linux world is in the same boat. It's even worse there, because a lot of the toolkits don't support high-DPI modes or are buggy, and developers there are even less interested in fixing issues, or lack the know-how.

With the emergence of cheaper 4K monitors, perhaps we'll see a change for the better, who knows.

Multi-monitor support is another issue entirely.


What's funny is that supporting high DPI in a WinForms app isn't even hard. Even if you size all your widgets in pixels, if you play a little with the scaling options and try out the UI on high DPI, it seems to work pretty well except for the old icons. I forgot what the setting is called, but there's a (non-default) scaling option or two that seemed to "just work", as it were.

The harder part is consistently fixing everything that uses the old Windows 2000 8pt default font instead of a modern one.


UWP is dead, imo. Microsoft is ignoring the signs that should tell it to accept that gracefully. There is not one UWP app that I use. They are just not reliable. A splash screen will show up and then disappear. The app won't launch. You open the FB Messenger app and then switch desktops to work. Two hours later the Messenger is nowhere to be found. It just disappears. No error, no notification. Nothing. First I thought it was FB incompetence. But this happens with the built-in Photos app. The built-in Mail app. Even shell elements (Start menu, notification shade) are buggy. Windows 10 is a great OS as long as you avoid any interaction with UWP stuff.


Also, it's not just scaling. Often even if an application scales somewhat correctly, some elements don't, because it's using some open source library that draws something in a terrible manner. Or assets look blurry because of a "16x16 is all you really need" mentality. This used to happen on OS X in the first months, but very quickly and uniformly developers fixed their custom code and assets. Not so with Windows developers, sadly.


Microsoft took ages to fix first-party applications that shipped with the OS: https://www.thurrott.com/windows/windows-10/83964/microsoft-....

Microsoft's inability to execute is mind-boggling.


It's not. Win10 doesn't even have a proper HiDPI installation dialog.


Why was the transition to HiDPI so much less painful on macOS, then?

Because Apple designed the core OS to handle it much better and made it easier for developers to support (which was years in the making, long before the launch of the rMBP).


While I agree that the OS X APIs were superior at the time, it is important to remember that Windows was and still is an enterprise OS first and foremost, and that legacy base is a big mess, which slowed progress a lot. Modern Windows UI frameworks are good too, but they are used very seldom, or if used, often used incorrectly. Since high-DPI displays have been rare and expensive, I think developers also have very little awareness of the problems, and even less will to work on them. Linux is in the same boat.


How many apps from the 90s do you know that still work on macOS?


Sure, but their hi-DPI approach was much more backwards compatible than Microsoft's. Having apps still work, but be tiny and unreadable or have their UI exploded all over the place, isn't a great example of backwards compatibility.


Yes, backwards compatibility is great. But it does not come without a cost.


No, it was easier because they also control the hardware and defined HiDPI to be always exactly 200%, compared to the various scaling factors that Windows needs to support.


I think people don't remember some of the first gen issues Apple products have gone through over the years.


I always found it offensive that they used long-deceased, mostly anti-materialist socialists to peddle consumer electronics (as if Mother Teresa would be using an iPad with her name etched on the back). They should have tried using a living legend without permission, like Ralph Nader or Noam Chomsky, and seen how that went...


You seem to be mis-remembering the facts a little. The Think Different campaign was nearly 20 years ago (1997). If you look at the list of celebrities[1], almost all are/were not "anti materialist socialists."

1. https://en.wikipedia.org/wiki/Think_different


The socialist leanings of the people used is incidental to what I was saying. I may not have stated myself well. For a productive conversation here, are you unclear on the position or do you fundamentally dispute it?


I have never gotten the impression the "Think Different" campaign claimed the depicted people would use the laptop; rather, that Apple's innovation is analogously innovative to... whoever. These are cultural allusions, not spokespeople.

I'm not a fan, but you're interpreting the campaign differently than I.


A computer is just a tool. It's not as important as what it enables you to do.


Sure, but if you verbalized what they were saying, "Here's our spokesperson, Mahatma Gandhi," and they, say, brought out some actor dressed as the guy telling you how great the iPhone 7 is at the keynote, most people would probably find that in terrible taste.

The ones with civil rights leaders were particularly offensive. Yes, lining up for a new laptop is just like apartheid and the march on Selma. I mean honestly, how garish and impertinent can you get.


This is exactly right.

Apple's rising tide lifted all of us: it forced every other computer manufacturer to take a look at the quality of what they were selling, not just feature lists and requirements.

We've all benefited from this tremendously: Android was quite literally unusable crap until the last few years (cue rooters and apologists), laptop PCs were similarly crap, and you just never got the feeling that someone who cared put these things together.

Now? Everyone is starting to put effort into the little things: the packaging, the presentation, quality of materials, usability, etc. It's a tremendous win. I'm almost tempted to say we owe Apple, but then I remember their coffers are already full.

I imagine there are other stones still left unturned, and the next Apple will be the ones to pay similar attention to detail and craftsmanship in an area where there previously wasn't any.


IBM did care about their ThinkPad laptops; they really built solid hardware and software add-ons before the brand was sold to Lenovo. One example is the full-size butterfly keyboard they built, along with their generally robust designs.


One thing you're missing is how the market came to them, partly through their own making. Laptops and PCs looked "corporate"; Apple products (blue, pink, green computers, anyone?) looked decidedly uncorporate. The iPhone looked nothing like a BlackBerry (with its accompanying belt holster!). However, the "consumerization" of IT meant sleek personal design came to be preferred over corporate design. The iPhone helped break down that wall, and now MSFT and others have gone out the side door and are reentering with consumer-looking devices.


People were selling brands long before Apple was around. And the "brand" isn't the logo. It is everything that a particular entity means to you, which includes every customer service, sales, and advertising contact. That includes selling things as a lifestyle. This is marketing 101.

I'm not going to pretend that Apple did not significantly influence the direction of product design. It absolutely did.

However, you're making it sound like Apple invented product marketing. They didn't. Microsoft and Google are new to the marketing game because they're entering the consumer hardware business, which is a new venture. When you're the de facto software solution (Windows, Google Search), you don't need marketing. You don't need to build a brand. Your brand is so ingrained in culture that everyone, everywhere uses it.

It's a much better spot to be in than trying to reach that goal. Google and Microsoft are now going to try to leverage their software platforms to launch hardware in the same way, but that needs marketing.


Apple absolutely invented competent technology marketing.

If you look at the old issues of Byte, you can immediately see the difference. Most of the ads are generic spec/price lists with maybe some crappy hand-drawn graphics. A few are photos and professionally drawn images with some attempts at metaphor, but they invariably look crude, heavy-handed, jokey, or cartoony.

The primary focus in the early Apple ads is the Apple user at home, with his partner. It's the first thing you see before you get drawn into the copy. It's an attempt to sell soft values, not just hardware specs.

This ownership of soft values is a huge difference that most of the industry still fails to get, and which Apple itself seems to be forgetting.

Apple is the only company ever to create technology products that appeal equally to men and women without relying on any traditional gender branding. (At least that used to be true until "rose gold" appeared.)

MS is sort of catching up now, two decades later. But it's not a natural cultural fit for them, so I'm not expecting a stream of great things.

Google is - well, the brand mascot is an android. That's all anyone needs to know.


Indeed, I believe Apple invented "competent technology marketing" as well. But to market to a cynical and highly skeptical group, they wisely chose the "inside-out" marketing route. So e.g. when addressing a skeptical crowd (e.g. developers), if you tell them how great you are, they'll infer the opposite, if they listen at all. On the other hand, if they overhear one of their own talking about how great the Mac is, they are far more receptive.

Jean-Louis Gassée (head of Mac development and later global head of marketing, '85-'90) seems to have been the intellectual force behind this at Apple.


I thought rose gold was more about appealing to asian markets?


Yeah, Asian women. I know my ex rushed out and bought a pink iPhone as soon as they were released.


Well, Gold appeals to China, but is Rose Gold maybe its own thing?


They didn't invent product marketing, but they perfected the current, most popular industrial design aesthetic and marketing methodology.


I still find the Surface Studio not exactly breathtaking. I dislike the chrome arms; they are part of the computer and shouldn't stand out like that. I like the computer being in the base, usually a piece of dead space, but that makes the design of the arms more complicated, as they need to carry signals back and forth.

It's good, but nothing I would be surprised to see coming out of HP or Sony.


The arms are probably chrome to invite attention and imply motion. "Hey, play with me, I move!" It's a signal to potential buyers that they're not looking at yet another all-in-one computer. I think it's a smart choice. If you're staring at the arms enough to be bothered, then you're probably not the target market for this device; the arms are completely hidden when used in drafting/drawing profile.

For what it's worth, Apple created a similarly attention-grabbing hinge on their G4 iMac back in the day.


> For what it's worth, Apple created a similarly attention-grabbing hinge on their G4 iMac back in the day.

Yes, but that was 14 years ago.


> "Need a new phone? ... Like new new?"

This line seems so superficial, so fluff. New new? No Google, I want them hot off the tables of production, not yesterday's cooled phones. Really, is new-newness the factor here?

Sounds more like a drug dealer:

"Hey bro... Need some shit? ... Like, new new shit?"


>they weren't just selling computers, they were selling a lifestyle, they were selling cool.

I work for a 100+ person company. Outside of the admin staff, everyone has souped-up PCs doing a huge load of 3D work. The only people with Apple devices (iMacs and MBPs)? The four 2D graphics designers and the executives. Why? The graphic designers would be in uproar and the execs believe clients expect them to present a certain 'aesthetic'. The MBPs are mainly used for email and powerpoint presentations.


> Apple doesn't have anything to stand out anymore

They have execution and commitment.

When I buy a MacBook Pro I can expect to get many years of use out of it. The hardware execution is almost always exemplary and they will continue to support it on the software side during that tenure. Buying a Windows or Android product feels like you're only going to be supported for a year or two and that's it.


I'm typing this on a 2013 MacBook. I use it every day, all day. I also have a fully functioning Asus laptop from 2009. I'm reasonably confident my 2005 Inspiron is still kicking. Those two laptops are still running on their original chargers. I'm on my third MacBook charger in 3 years. I have a box full of broken Apple chargers from merely glancing at the cord. I think you're drastically overestimating the MacBook's ability to stand out of the crowd in hardware.

In regards to software, Windows XP support lasted 13 years. OS X comes out with a new version every year, and over the last few years, each release has been buggier than the last. Ever since Mavericks, I've waited 6 months into each release for them to fix the load of bugs before I actually do the upgrade. El Capitan, especially, was just an all-around mess.

I love this computer, but I can still effortlessly criticize it in both hardware and software. They are simply not what's made the macbook stand out against a sea of black laptops.


> They are simply not what's made the macbook stand out against a sea of black laptops

The major change is that the laptop and desktop market has matured enough that, indeed, this year's MBP needs to compete with the 2012 MBP and every other laptop.

Before that, just re-releasing the same box with improved internals was enough to sell, hence an ocean of generic laptops above which a brand like Apple could rise with a superior level of polish and attention to detail.

The market changed, not because Apple lost its mojo, but because you can't sell a laptop on its specs alone, meaning only the Apple way is really profitable now. Apple's and Jobs's genius was doing it before everyone else, allowing them to reap enormous benefits for years.

This year I'm looking at the new MBP to replace my old 2007 MBP. I happen to be happy with the update (unlike many here on HN), but I would have had no problem buying last year model in case of disappointment. Even a refurb 2012 model like my main machine would have been perfectly fine.


> I have a box full of broken Apple chargers from merely glancing at the cord. I think you're drastically overestimating the MacBook's ability to stand out of the crowd in hardware.

Having to buy a $70 charger every year vs. having to buy a new laptop every 18 months?

I'll stock up on chargers.


> having to buy a new laptop every 18 months.

What on Earth kind of violence are you putting your poor laptop through? I haven't had a desktop in about 10 years, and I just bought my third laptop in that time about a year ago. (The previous one was doing fine, actually, but I gave it to my mom.)


I have a 2006 and a 2007 MacBook still in use. They still work, but Apple stopped supporting them with macOS & security updates years ago, understandably... so I installed Windows 10 on both of them. And not only do the machines run Windows 10 quite well, but Microsoft will be providing security updates through at least 2020. That's the point where I started to question how much support Apple provides vs Microsoft.


The Surface Studio is a major coup for Microsoft not because it'll necessarily sell well (although I think it has the potential to) but because it's made Microsoft cool.

I showed the video to my wife and she was so impressed that she looked at all the videos Microsoft did detailing the development. She's been a Mac user for more than 20 years, and it's the first time she's ever looked at a Microsoft product.

Apple during the Steve Jobs era had a reputation for taking risks, for coming up with new products that impressed and that people just wanted.

A lot of people I know were excited by the announcements of new products and tried to watch the keynotes or follow the live coverage. In the past year, none of my friends have cared about them anymore. I only cared about yesterday's event because I've been waiting to upgrade my MacBook Pro, and the event was lackluster. For better or worse, by not taking risks and not introducing new products, Apple seems less cool than it used to be, and Microsoft showing off the Surface Studio highlighted this. It's a huge blow for Apple in terms of marketing.

(As an off-topic aside, Apple even sucks at doing boring incremental improvements; the iMac, Mac Pro, and Mac Mini are languishing, and the new MacBook Pro 15-inch design choices are nonsensical for professionals.)


>the new MacBook Pro 15-inch design choices are nonsensical for professionals

You can't imagine how disappointed I was that they top out at 16GB of RAM. I don't use that much memory every day, but sometimes it would be really nice to have. I haven't bothered watching it again, but the announcement gave me the impression that it started at 16, and going to the purchase page was a sad time.


I totally can, as I'm in the same boat. And the 240 CAD increase just for an extra 8GB of RAM - what a ripoff!


I don't think it's a role reversal at all; rather, everybody has been chasing Apple for so long that the competition has caught up and surpassed it in some aspects. Who doesn't have a laptop that's been modeled after the MacBook? Also remember that not everything Apple introduces turns to gold; they have plenty of product and feature failures.

The only innovation I saw yesterday that will be copied really came out of Microsoft, with the Surface Studio and Surface Dial. I'm sure Microsoft would really like to see a surge in high-end desktops. The only worthwhile technical feature Apple added was Touch ID in the Touch Bar, but even then, how often would I use it? Also, it seems like the more logical place for Touch ID is actually on the mouse, where my fingers are a majority of the time.

The only commonality between these two is their marketing departments thinking people want to spend $3k+ on products; no reversal there.


>Who doesn't have a laptop that's been modeled after the MacBook?

As soon as they can pack the power I'm looking for in my laptops into an Apple-style case, I'll be there. Until then I'll be sticking with my boxy black Clevo.


Surprise: virtually all laptops are modelled after an Apple machine. The boxy black ones are PowerBook clones. Apple established, in the early 90s, the design language used by every laptop manufacturer since: clamshell case, hinge at the rear, keyboard towards the back of the lower half with the trackpad front and center, and all that wonderful blank space on either side usable as a wrist rest.


Surprise: Apple wasn't the first with a "clamshell case, hinge at the rear, keyboard towards the back of the lower half...all that wonderful blank space on either side usable as wrist rests."

1983: http://cosy.com/language/cosyhard/cosyhard.htm - http://cosy.com/language/cosyhard/ampropn.gif (note: no trackpad - but then, the original powerbook didn't, either: https://en.wikipedia.org/wiki/PowerBook_100, and that round trackball is in just the same position as that round lid latch...)

Apple does a great job integrating elements that may have existed before, and is definitely a trendsetter in styling, but they also get credit for creating a lot of elements that actually predate them. ("Great artists steal.")


OK, the palm rest and the pointing device, I'll give you those, but saying that all laptops are PowerBook clones is a big stretch. The "notebook" form factor originated with the NEC UltraLite, released 3 years earlier.

http://old.chuma.org/ultralite/index.shtml

https://en.m.wikipedia.org/wiki/NEC_UltraLite


I dunno, I have to agree with the other dude that they pretty much just cribbed from the NEC UltraLite.


I was thinking recently how Bill Gates often said the biggest competitor to Windows was the previous version of Windows. Alternate OSes didn't have enough market share or compatible software to really be a threat, but customers always had the option of just not upgrading.

iPhone seems in a very similar position. Sure, there are some people who will switch from iPhone to an Android phone. But Apple's biggest challenge seems to be convincing their current customers they need a new phone at all.


They've long solved that challenge. After a few generations of phone (read: a few years), all new iOS upgrades simply refuse to install on your old phone. Then, after a short while, your apps auto-upgrade themselves and then stop working, one by one, because they won't run on such an old iOS. You then have no choice but to buy a new phone.

I found this out by getting stranded on the side of the road somewhere after I discovered that Uber had auto-upgraded itself in the background, and the new one refused to launch on my old phone.


That's not supposed to happen - Uber must be doing their own OS version detection and not using Apple's supported method to declare the supported APIs. Apple have gone out of their way to add a way to install old versions of apps when the new version has stopped supporting the OS you're using (the developer can even specify in iTunes what versions to serve up for what OSes).

https://support.apple.com/en-lk/HT201377

> If there's a compatible version, a message appears and you can choose Confirm to get the latest version of the app that works for your device
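
For context, the mechanism described above works off the minimum OS version baked into each archived build's Info.plist (derived from the Xcode deployment target), not any OS-version detection the app does at runtime. A hypothetical entry, with an illustrative version number:

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Set automatically from the deployment target at build time.
         The App Store compares this against the device's iOS version
         when deciding whether to offer an older archived build. -->
    <key>MinimumOSVersion</key>
    <string>9.0</string>
</dict>
</plist>
```

If an app instead gates on the OS version with its own runtime check and refuses to launch, the store has no way of knowing that an older archived build is the one it should serve.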


> Apple's biggest challenge seems to be convincing their current customers they need a new phone at all.

That's not much of a challenge for Apple. There is always a laundry list of features that people want but that aren't necessarily mature (e.g. wireless charging, NFC, waterproofing), and Apple keeps those incubating in its back pocket. Occasionally it pulls one out and slaps it on one of their devices to boost sales.

They also drive sales by adding or removing features from their devices (e.g. Firewire 400/800, USB-C/MagSafe, Lightning) or dropping support for legacy systems.

Mechanical failure is also a common reason to upgrade but Apple seems to be cannibalizing this revenue stream by replacing mechanical devices (e.g. switches, buttons, touchpads) with solid state ones that use haptic feedback. I'm curious how that will play out, will Apple's Taptic engine be the point of failure?


Pretty sure iPhones today have NFC, it's just not available for anything other than Apple Pay.


They do but it's locked down. They might one day open that up or create peripherals to use it. Similar to how the 3rd or 4th gen iPod Touch magically gained Bluetooth via software updates.


That's why so many payment terminals work with Apple Pay


> But Apple's biggest challenge seems to be convincing their current customers they need a new phone at all

They solved this by just releasing a new color every year.


They have a killer feature for that! Glued-in batteries you cannot reasonably replace, unreliable buttons that fail easily, and the like.


You mean batteries that rarely if ever need replacing (this isn't still the 1990s) and now fewer buttons to fail (the home button hasn't been a problem for years now anyway)?

Not everything is a conspiracy, FFS.


The buttons failing easily is a myth. There was an issue in one early generation, and that was fixed many years ago.


Thursday's reveal of new MacBooks was so lackluster that I am rethinking my loyalty to Apple. The Apple platform has always been the most productive platform for me, but other firms seem to be innovating and iterating on things quicker than Apple has. Rumor has it that Apple will more or less copy the edge-to-edge display of Chinese handsets or the rounded edges of the S7 Edge. Wtf, what happened to doing really bold things that nobody else has done, or can? I guess we might have just hit peak-phone handset. In any case, Apple wants to push into the enterprise but won't make a Surface-like device (what the iPad Pro should have been). Oh well. OS X is still the killer feature for me, so I will remain securely on Apple, and I'm not leaving my iPhone for any other handset, but this is still very disconcerting.


> I guess we might have just hit peak-phone handset.

This is what happens in mature product segments. The first iPhone was released 9 years ago, and that was the last real revolution -- a smartphone whose front was pretty much all screen. Everything Apple and everyone else has done since then has been incremental -- better screens, larger screens, better cameras, faster processors, thinner chassis -- but definitely the same paradigm.

The same thing happened to laptops before that. Some of the older laptops had pretty odd and uncomfortable designs. Sometime in the 90s, everyone standardized on the clamshell laptop with a 4:3 screen (later moving to widescreen), a low-profile keyboard, and a central trackpad below that. If you compare a 20-year-old laptop to a new one, you'll see the same incremental changes that happened to phones -- larger, better screens, faster processors, thinner chassis.

I think the bottom line is that the smartphone market has become as mature and predictable as the laptop market.


> I think the bottom line is that the smartphone market has become as mature and predictable as the laptop market.

I actually think that is a good thing. Commoditization of gadgets usually means lower prices and common standards across models.


> Sometime in the 90s, everyone standardized on the clamshell laptop with a 4:3 screen (later moving to widescreen)

How did that happen? It used to be that a normal laptop had a 4:3 screen and a ridiculous, huge laptop had a huge, 16:10 screen. Now both of those are gone, and normal laptops have even shorter 16:9 (!) screens. It's hard to think of a more user-hostile progression. I don't want to work in a series of cramped side-by-side windows. I want a screen that can display more than one paragraph of text at once.


Because most people use computers to consume content (ie video), which is largely produced in widescreen.


You don't lose anything by watching 16:9 video on a 16:10 screen.

Nothing's stopping you from watching it on a 4:3 screen, either, although at that point you've shrunk the image pretty noticeably.


Things like Touch ID were amazing features in their own right, though.

Now the best they can do is a second lens for kinda bokeh and adding a touch bar instead of physical keys on the macbook.

While dumping the headphone jack and magsafe (and HDMI and SD).


> Things like Touch ID were amazing features in their own right, though.

Touch ID is cool and all, but in 2016 calling it "amazing" is a bit of a stretch. ThinkPads first got a fingerprint reader more than a decade ago: http://www.technewsworld.com/story/37017.html


So what? The technology doesn't matter -- what matters is the implementation of the technology as a user-friendly feature. Apple did it better, as they often do in these cases.

As a feature, the Thinkpad fingerprint readers kind of sucked. All you could do was log into Windows with them, if you had the Thinkpad crapware installed. They weren't very reliable, either. I had one, and I gave up on it after a while because typing my password was faster than attempting the fingerprint reader several times, then giving up and typing my password anyway.

Touch ID is built in to iOS. You can use it to unlock the phone, and to authenticate yourself for various Apple applications like the App Store and Apple Pay. There are also APIs for third-party applications to accept Touch ID instead of passwords, if the user wants to do that. It is fast and generally reliable. It's light years ahead of the fingerprint reader on the Thinkpads.


>what happened to doing really bold things that nobody else has, or can?

This mentality is what resulted in them shipping what is essentially a portable media player that won't let you plug in almost any pair of headphones manufactured in the last 30+ years.

Bold!


Hyperbole much?


The thing is, hardware is important but software is arguably more important because it's what you interact with every day (yes, you interact with the hardware too but not directly, besides the keyboard and touch pad). In that respect, I trust Apple more than either Microsoft or Google. I will not trust Google with any of my data unless it's necessary and Microsoft isn't so great in my eyes either due to their ad and data collection policies for the new Windows OS.

I think Macs are designed very well, even though I question the decisions behind the new MacBooks, and I would prefer a bit more than 4 USB ports and a thin laptop. I also really like Mac OS, even with its thorns, because its design is much better than whatever version of Linux I would use (not a big fan of Ubuntu's Unity). Most likely, my next laptop purchase will be either a 2015 MacBook Pro or some nice non-Apple product with Ubuntu installed.


A paid Linux! That's what I said on this comment about "Ask HN – What innovation would you want to see", which 84 people upvoted: https://news.ycombinator.com/item?id=12570030

I wish we had executed on that wish a year ago, because with the two Microsoft and Apple events, it's clear there is demand for "a Surface Studio, but not with Windows 10". Yesterday people effectively said they stay on Macs because of macOS, but envision switching to PCs because the innovation and specs are much better. Had we worked on that, we could have released a credible alternative to Macs: a distro that would have embraced the Surface Studio while providing 1. the design, 2. the experience, and 3. the privacy that everyone is looking for in an Apple computer...

As a reminder, the idea of a paid Linux is to fund the open-source community with the same flow of money that Apple and Microsoft get from their OSes (at $200/yr), in order to provide the same "red carpet" experience for specific profiles of users (3D workers, graphic designers, whatever profile we target first). At the market level, one great experience for one type of work would drive adoption of Linux on the desktop. At a more selfish level, the benefit of paying for Linux instead of MS/Apple is that the new improvements are open-source, so we're effectively raising the baseline of what every other distro can do. The way to make people pay is by only providing upgrades through authenticated PPAs: professionals will pay because it's easier, and hackers will redistribute versions on torrents, which we don't mind, because hackers are a benefit to Linux. Besides, even hackers understand the value of funding open-source, so they might still pay. A lot of people would rather pay for open-source than closed-source.
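
For what it's worth, the authenticated-PPA piece already exists in stock apt; Launchpad's private PPAs, for instance, put credentials directly in the source line. A sketch with a made-up hostname and credentials:

```
# /etc/apt/sources.list.d/paid-linux.list  (chmod 600, since it holds a token)
# Paid subscribers get this line with their personal credentials;
# the server refuses package downloads without them.
deb https://subscriber-id:license-token@private-ppa.paid-linux.example.com/ubuntu xenial main
```

Revoking a token ends upgrades without breaking the installed system, which matches the "pay for upgrades, not for the bits" model described above.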

I didn't execute on that wish, because I'm not an OS-level person, and I don't have the UX design background necessary for this endeavour. Nor the marketing know-how to execute at a high level. But I really wish someone would do it.


I feel the exact same way. I've been a Mac user for over ten years largely because of OS X, but I've been disappointed in Apple's direction for the past few years, and yesterday's MacBook Pro announcement was the last straw for me.

Linux is a wonderful server OS, and in fact I do most of my development inside of a VirtualBox VM running Debian, but in order for me to move to a full Linux workflow, I need to use some proprietary software packages like Microsoft Office and Apple Keynote (while I find LibreOffice Writer to be a suitable replacement for Microsoft Word, Microsoft Excel fits my needs better than Calc, and Impress is behind both PowerPoint and Keynote).

In line with a paid, polished Linux experience, another thing that would be nice for me and other disgruntled Mac users is a Wine-like compatibility layer that allows Linux users to run Cocoa programs. There's already a project called Darling (http://www.darlinghq.org/introduction/) that has some of the basic functionality implemented. If this project had more contributors, then it could develop into a working solution for running my Mac programs.

My dream OS would have a Unix-like foundation (like Linux or FreeBSD) with an interface similar to Mac OS 8/9 (with various features from OS X added like Spotlight, Expose, and the Dock) and with Don Norman's UI advice (http://www.jnd.org/dn.mss/apples_products_are.html) taken seriously.

I'm actually interested in contributing to such efforts toward an alternative OS for disgruntled Mac users; I have experience with systems and kernel programming. If there is enough interest, maybe an alternative OS will materialize.


I actually think that this is likely to happen over the next 2-3 years. Every single one of these companies is now going in the wrong direction in a specific way--which opens up a real desire for a real alternative.

We all used to love Mac, we all see that Apple has lost its way and seems to be heading more and more confidently in the wrong direction, and that may be just the push the community needs to actually build something.

And what an amazing achievement it would be--an actually open alternative that runs on lots of (powerful) hardware with the beauty of Mac OS (before Lion lol).

I really think it's going to happen because I think a very large percentage of the community now realizes that there is no existing private company heading in the right direction. So we now have to take the steering of the ship into our own hands.

Abracadabra.


I agree. In my opinion Snow Leopard was the high water mark of Mac OS X (although there are some features of later versions I like such as the auto-save feature of open documents between reboots). From a UI standpoint each subsequent version of OS X has been a deviation from the ideal user interface.

A fully open community "Mac" operating system that runs on any x86-64 hardware would be an excellent thing. I believe the best way of getting there is contributing to the GNUstep and Darling projects so that the underpinnings are fully functioning, as well as working on a Snow Leopard-esque interface.

I have some free time over the next week or so; I'm going to start developing a plan for making this idea a reality!


I'd buy it!

Who's going to fund the upfront development? "We're going to make a new desktop OS that takes on both Apple and Microsoft" sounds like a hard sell for the VC crowd.


And that's the hard part... OS/2, NEXTSTEP/x86, and BeOS were all alternative OSes that failed to successfully take on Microsoft in the 1990s. It would be difficult for a VC firm to fund a business model that has already failed multiple times. Moreover, the desktop market as a whole is shrinking, and my understanding is VC firms are interested in growing markets. An alternative OS dedicated to power users is definitely a niche market.

I wonder, though, if there is enough interest in the FOSS community to make such an idea a reality, where the project could be started by volunteers and funded by donations?


It could be, but it seems like that would depend on the brand awareness / credentials of the team writing the software. Note that KDE, Gnome, etc. have been trying to do exactly that for decades, with only marginal success.

So if you got the team that wrote the Apple Aqua UI on board to write it, and have great PR, then maybe. If you get the KDE or Gnome team, no way.


Your paid Linux idea won't work, because money is not the only thing holding back the open-source community. Other reasons are:

- Design isn't appreciated or valued. A successful OS effort can't be driven only by engineers.

- An obsession with choice as an end in itself, and being unable to say, "This is what we think the best user experience will be. You can't change it."

- Fragmentation with too many APIs, GUI toolkits, and so on.

- Participants focusing on users like themselves (tinkerers and geeks) rather than typical end-users who want to use the device to get their real work done, rather than messing with the system.

Until you have a plan to address all these, throwing more money at the open-source community won't produce a macOS-quality OS.


If it's a paid Linux though, then you can get a team that:

-Appreciates design.

-Makes choices.

-Supports the most powerful and popular hardware.

-Yes, focused first and foremost on tinkerers and geeks, who are the main people losing out in the current environment. That's a group that's increasing in size and will likely continue to for the foreseeable future.

How's that plan sound?


"Makes choices" and "focused on tinkerers" are at tension with each other.

Because tinkerers want choice, such as with multiple window managers, sound subsystems or what have you. If you have N window managers, now you need to build and maintain all of them at a high bar, which is N times as much effort as one.

Plus the combinations: window manager X doesn't work with sound subsystem Y.

An awesome Linux would have just one supported window manager, filesystem, sound subsystem, and all the rest.

You will also need one dictator who understands eng, UX, product design, sales, marketing, and so on. The dictator listens to everyone, and people can present information and perspectives and debate as much as they want, but at the end of the day, it's the dictator's decision.

If you do all this, yes, you can succeed, in theory.


> I also really like Mac OS, even with it's thorns, because its design is much better than whatever version of Linux I would use (not a big fan of Ubuntu's Unity).

I would love to hear your thoughts on the Cinnamon DE. Genuinely curious.


I didn't really go into my preferred Linux environments, but my preference is generally KDE. Cinnamon is a fork of an older GNOME that's reskinned, in my understanding, and I guess the whole GNOME look isn't my preference in general. I think a big part of it is the interactions, and from what I remember there were things I didn't quite like about GNOME that I liked a lot better in KDE.

Caveat: I haven't used a Linux GUI in ~3 years, so my opinions might have changed a lot since then.


Apple hardware is good, but I'm not a fan of how Apple treats its OS. For me, Ubuntu is now a far better developer OS. I just want similar hardware with a Mac-quality touchpad to make the switch for my laptop.


I don't use Apple products because of the shiny, sexy hardware, though it doesn't hurt to have a nice-looking device when you are spending most of your day on it. I use Apple computers (I don't use an iPhone) specifically because of the OS. I just can't see myself working on Windows ever (unless it makes some major leap of improvement).

MacOS has its problems, but it's still many times better than the POS that is Windows. The only other option is Linux/Ubuntu, which is nice but wouldn't be my first choice; it's definitely a second choice.

You can't cover a shitty OS with a shiny dress and fool most people who have been burned by it. They can copy all they want; at the end of the day it still has Windows installed...


I can agree with that and felt the same way until I realized (or was forced to admit repeatedly) that 90% of my time using Windows or OS X is now spent either in an SSH session, a web browser, or a virtual machine. Frankly all of these releases, from both companies, come off as rather boring to me. I think all the most interesting innovation, from my perspective, is happening in software now and less "flashy" embedded systems.


Part of the reason that macOS works so well is due to the good integration with the hardware.

Looking at the poor experience of other commenters in this thread, it looks like Microsoft hasn't realized that is how Apple has managed to do it, and the Surface devices are really "just another PC" and not first-class Windows hardware.


I feel like Apple is trying to force innovation with their latest offerings.

I'm not against change, but change for the sake of change is annoying.

The idea well must be running pretty dry at Cupertino


Nearly 20 years ago, Apple released the iMac, which did away with SCSI, ADB, RS-232, etc. in favour of USB.

What they have done now with the MacBook Pro is exactly the same thing. They are trying to establish USB-C as the one port to rule them all, just like they successfully did with USB-A.

Yes change can be hard and some people like yourself clearly struggle more than others. But change is needed sometimes to push the industry forward.


Unifying to one standard port is kind of cool, except for the fact that not even Apple's other devices use that type of connection and users will have to buy dongles for almost everything. That represents a concrete step backwards in user experience.

There is a time to be bold and throw your weight behind a better standard, but there's also a time when doing so makes things inconvenient with little to no benefit.

That wouldn't be so bad if the MacBook were significantly more powerful, but here we have a machine with the same max RAM as the MacBook they sold 4 years ago, while PC laptops are shipping with 4 times as much.

It just seems like Apple is focusing on making a sleek and visually pleasing device, rather than a device that will be most useful to those who want to work on MacOS.


The next iPhone will likely have a USB-C port. They couldn't have changed the Lightning port after it had only been on the iPhone for a single generation.


More likely IMO is the next iPhone will have a lightning to USB-C cable.


You think they're going to ditch lightning after everyone buys a bunch of accessories? They could, but I doubt it. What would switching to USB C do for Apple?


They're not about to switch to USB-C on mobile. They've been doubling down on Lightning recently with the Apple TV remote and the new Magic peripherals.


"They could, but I doubt it." – you mean exactly like they did with the switch to Lightning already?


If you're going to piss people off by removing the headphone jack, you don't do it again by changing ports next year. You tear the bandage off in one go.


Yes, I mean it exactly like that. Apple could and would switch if it provided them significant benefit (smaller, more robust connector, ability to make the phone thinner, new capabilities). I don't see that USB-C provides them with anything except a loss of control over the connector and a bunch of annoyed customers.


The original iMac was a new product for a new market at a relatively low price.

The new MacBook Pro is an update to an existing product, with a large base of current users who already have expectations about what the product should do for them, at a relatively high price.

That's a big difference, and it's the main reason people are so peeved about the total switch to USB-C. Apple should have added USB-C ports to the existing slate of MacBook Pro ports for at least a generation, to help everyone bridge the gap. Nobody cares about the 2 or 3 mm that Apple shaved off the thickness by discarding all those ports for this generation of the product, and anybody who does can go buy the little MacBook instead.


The original iMac with its 2 USB ports was unusable for anything but the simplest tasks. It was equal to a dumb terminal.


Cutting edge Macbook Pro: 16GB RAM "because battery". o_0

My 2011 MBP has 16GB.


I'll keep buying Apple hardware (even if the latest rev isn't great) since macOS is the only usable operating system. Windows is (still/increasingly?) a tire fire that's only useful if you're a gamer, and Linux is for programmer-masochists.


Why do you think Windows is a tire fire and OSX is usable?

I used my first OSX machine ever earlier this year. I wanted to add some extra keyboard commands, so I had to download a 3rd party program that had to unlock accessibility controls and essentially take full control of the machine. That's absurd to me.

I also hit tons of external display problems. Things wouldn't connect or sync right, desktops not correctly moving to the right screens, no window snapping even though it's a high res screen, etc.

I hit way more beach balls/lags than I do with my modern Windows machines.

Windows 7 was solid, Windows 8 was a bit of a mess (as has been every 1st iteration of Windows [98, 2000, ME, XP, Vista, 7, 8, 10]), but Win 10 is really enjoyable.

It's unfortunate that the BSODs caused largely by shitty drivers in Windows 98/XP has hounded the Windows ecosystem for over a decade, even though the driver verifier has fixed the vast majority of those.


You can customize almost everything using the Keyboard panel of System Preferences. Yes, 3rd-party tools like Karabiner are needed for control of absolutely everything, but that is pretty rarely necessary.

Also note that accessibility must be “unlocked” because it fundamentally adds a security risk: processes that can inspect inputs in arbitrary unknown applications have a lot of power. What macOS does is actually a feature and it prevents one of the “tire fires” in Windows.

Beach balls in my experience are usually produced by problems in drivers, or programs that are unnecessarily complex (like the ones that are 400 MB installs with entire virtualization layers instead of being all native code). The OS can only do so much.

External displays: yes, definitely buggy on Macs these days, and with some obscure settings. Something to try: with at least 3 desktops/Spaces defined (need to use "+" in Mission Control), you can right-click on the Dock icon of an application to specify the default space to use for windows in that app. These menu commands are not available with less than 3 Spaces created.


> These menu commands are not available with less than 3 Spaces created.

Well, discoverability certainly has been a problem with Apple software lately.

Take Force Touch as an example. Perhaps the menu items will show up with fewer than three Spaces created if you just press it harder?


Last year I bought a new Macbook Air that came with Yosemite, so after I got home and went through the setup process, I tried to update it to El Capitan through the app store. The laptop went to reboot... and didn't come back up. I thought it was completely bricked because pressing the power button did nothing: no LED light, no startup chime. Apple support guides said to hold down the power button for ten seconds and then re-press it, which at least made the laptop turn on and chime, but nothing happened beyond that. Booting into recovery mode didn't work, but luckily booting into internet recovery mode did. I had to reinstall Yosemite over the internet, create a USB image of El Capitan, then install from the USB.

Even after that, things never went smoothly. There was some sort of bug where I couldn't enter my Apple ID security answers in the OS dialog box. Like you, I had problems with external displays, especially with waking up a display when the laptop itself woke up. The very first time I tried to restore from a Time Machine backup I got errors. Problems with iTunes and Photos sync. Not to mention more or less subjective issues I had.

I had more problems with this laptop than I've had with any Windows machine.


In my experience Macs have never worked very well with non-Apple displays. Switching from a Dell to an Apple display was like night and day.


>Why do you think Windows is a tire fire and OSX is usable?

Because OSX comes with a terminal environment that is almost identical to GNU.


How do you edit which keys do what on Windows? I had to do registry edits to get Caps Lock to function as Control, which is baked into macOS's Keyboard preferences. You can also add as many shortcuts as you like that control anything on macOS in the Keyboard pane. Sure, you can't remap every key to something different out of the box, but I don't think you can do that on Windows either.
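
For reference, the registry edit being described is the well-known Scancode Map value; a sketch of a .reg file mapping Caps Lock (scancode 0x3A) to Left Ctrl (0x1D) looks like this (apply at your own risk; it takes effect after a reboot):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
; 8-byte header of zeros, then a 4-byte entry count (2 = one mapping
; plus the null terminator), then one (new, old) scancode pair:
; 1d,00 (Left Ctrl) is sent when 3a,00 (Caps Lock) is pressed.
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,1d,00,3a,00,00,00,00,00
```

Compared to a checkbox in the macOS Keyboard preferences that is indeed a rougher experience, though it is at least a one-time, system-wide change with no daemon running.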



That's a tool you have to download. It doesn't come with the OS.


And yet OSX doesn't have such a tool from an official or even unofficial source without a daemon running and wasting CPU cycles.

I bought a non-apple keyboard and still can't map home+end keys in all applications without a daemon running.


Ukelele? I used it to create a hybrid custom Swedish layout that shows up among all the system key layouts: http://scripts.sil.org/ukelele


Linux is most definitely not for masochists; when did you last use it? Ubuntu, elementary OS, Mint, etc all offer pretty polished user experiences.


I try to install Ubuntu every 18-24 months or so. Within about 2 days a problem occurs that I have to google and then spend hours on in frustration. No, thanks. 2016 is not the year of Linux on the desktop.


You will be disappointed with any OS after 2 days of use. It might be because of real problems, or it might be because things are not how they used to be on what you were using before.

Despite disliking GNOME 3 when it came out, I got used to it and I honestly think it's the best desktop out there.


Did you try it on a desktop or a laptop? It doesn't work as well on laptops because of the graphics cards drivers and sleep/hibernation.


Intel only has an open-source driver, so any laptop with integrated graphics works great.

Also haven't had sleep or hibernation issues in years. Though I've used Thinkpads and Dell XPSs which are probably the best supported laptops for Linux.


Thank goodness you shared your opinion.


"A day later, Apple ended the long wait for laptop users yearning for an upgrade by unveiling a new line of MacBook Pros."

Meanwhile, they did nothing to end the long wait for desktop users yearning for an upgrade. I was expecting at least a cursory spec bump for their Mac Mini and iMac lines. I was hoping to replace my 6-year old Mac Mini sometime soon, but I don't want to do so with a 1-2 year old product.


I keep reading about how unhappy people are with Apple's latest reveal. I feel the same way. No way am I upgrading my 2015 MacBook Pro to the new one.

Still, Apple will not change a thing unless people actually vote with their wallets. If you're really so upset, do NOT buy this damn computer!

If this latest line sees very few sales, they'll get the idea.


> I keep reading about how unhappy people are with Apple's latest reveal.

That's probably because a lot of the people who are happy no longer bother to read or comment on these types of stories. There's just too much negativity and it's the same for Apple, Microsoft, Google, etc. People with positive opinions are basically no longer welcome to participate in the discussion.


Suppose you need a new computer for work. Your other options are Windows or a Linux desktop experience, both of which are vastly worse in other ways. (No, I don't want to dork around with spending days compiling kernel patches and installing soundcard drivers before I can use the computer.)

I'd gladly buy any other laptop that was simple and just worked out of the box in the sense that a Mac does. But there aren't any.


> No, I don't want to dork around with spending days compiling kernel patches and installing soundcard drivers before I can use the computer.

You are badly misrepresenting the current state of mainstream Linux distributions. I've done probably 20 Linux installs, and by and large it always just works. If there is some necessary proprietary driver, it's almost always as simple as Menu --> Administration --> Driver Manager and then clicking once or twice.

Kernel patches? I wouldn't know how the hell to do that, but somehow I've been using Linux happily for a decade.
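For anyone curious, on Ubuntu-family distros those Driver Manager clicks also have a command-line equivalent. A rough sketch, assuming the `ubuntu-drivers` tool that ships with Ubuntu and Mint; it's wrapped in a function so nothing actually runs until you invoke it as root:

```shell
# Command-line equivalent of the Driver Manager clicks on Ubuntu/Mint.
# Defined as a function so sourcing this snippet changes nothing;
# call install_recommended_drivers yourself, as root, when ready.
install_recommended_drivers() {
    ubuntu-drivers devices       # list detected hardware and suggested drivers
    ubuntu-drivers autoinstall   # install the recommended proprietary set
}
```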


I use Linux distros on both laptops and a desktop as the only OS, and you are definitely misrepresenting the current state of things. It takes a lot of work.

For laptops, the clickpads never work well. You have to mess with synaptics settings a lot, and eventually you get a config slightly worse than the Windows default and a lot worse than the OS X clickpad. This is coming from someone who looked into hardware compatibility and bought a laptop that didn't seem to have any problems.

For desktops (and laptops, tenfold), you have a lot of issues if you want to:

A) Use CUDA in general for neural networks.

B) Resize VM encrypted hard drive after creation.

C) Dual boot with Windows (things like updating Windows, or reinstalling a Linux distro after digging yourself into a hole with the CUDA drivers mentioned above, can wipe GRUB in a way that leaves you unable to boot).

D) Allow hibernation in a dual monitor setup with proprietary drivers.

E) Use a tablet for drawing on a system with multi-monitor setup and proprietary drivers.

F) Mess with Compiz settings too much when you have proprietary drivers.

G) Dual boot with Linux full-disk encryption and non-default partitions on one hard drive and a regular Windows install on another.

H) Dual boot from one disk, encrypting the Linux partition with LUKS and forgoing a swap partition.
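As an aside on (C): in my experience the wiped-GRUB state is usually recoverable from a live USB by chrooting in and reinstalling GRUB. A rough sketch; the device names are assumptions (here /dev/sda2 is the Linux root and /dev/sda1 the EFI partition), so adjust them before calling the function from a live-session root shell:

```shell
# Sketch: restore GRUB from an Ubuntu live USB after a Windows update
# (or a botched driver reinstall) clobbers the bootloader.
# Wrapped in a function so nothing destructive runs on sourcing.
restore_grub() {
    mount /dev/sda2 /mnt                 # Linux root partition (adjust!)
    mount /dev/sda1 /mnt/boot/efi        # EFI system partition (adjust!)
    for d in dev proc sys; do
        mount --bind "/$d" "/mnt/$d"     # expose kernel interfaces to the chroot
    done
    chroot /mnt grub-install /dev/sda    # reinstall the bootloader
    chroot /mnt update-grub              # regenerate the boot menu
}
```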

Having said all this:

I would still use Ubuntu/Linux because almost all non-.NET/Java tools are easier to use on Linux. It lets you customize your system and code without VM overhead and inconveniences.

The system doesn't get slower over time and nothing unexpected randomly happens (except after updates to graphics drivers).


Most of your list is either a specialized requirement or something that probably wouldn't be much easier on Windows/macOS. I meant that simple things like browsing the Internet, video chatting, and playing music (the needs of 99% of computer users) work out of the box. GP was saying that you literally can't install Linux without kernel patching and command-line wizardry, which is totally false and was the only point I was refuting.

I'll add one to your list -- dealing with Linux audio. I have a stable music production setup now, but it took me a long time to iron everything out.


Not at all an exaggeration.

I recently had to binary patch a video card driver -- using a patch I found on a forum -- just to get it to start X, after I had specifically bought the hardware (Foxconn, AMD - major brands) because they specifically advertised that they fully support Linux and X, which I had cross-verified with the Linux and X documentation.

Why? The video card was a Radeon model "1234" (I forget the real exact number), but the hardware was a newer minor revision that identified itself as a "1234-A" to the OS in its ID string. Linux (Ubuntu) then simply refused to load the "1234" driver, because "1234-A" was not in the list of approved card ID strings in the driver.

Absolutely nothing on AMD's site, nothing on Foxconn's site, nothing on Ubuntu's sites. Contact tech support? Ha.

Eventually -- after days of googling for a solution -- I found some obscure forum where others were discussing the exact same problem. Someone provided a binary patch ("just load up the video driver in a hex editor and change the following 20 bytes to these other opaque values".) It worked.

That is why I don't want a Linux desktop.


I'm not sure when you last used Linux, but nobody compiles their own custom kernel with patches anymore.

You don't have to make up stuff to make a point; yes, Linux is probably not as polished as MacOS (though I doubt that), but it's much more user friendly now than you make it out to be.


I use Linux on a daily basis, including recently building a Linux desktop machine from scratch. Its usability is much better than it was in 1999, yes, but it's still a very far cry from the polished, just-works user experience of the Mac.

No, I should not have to edit any text files to get the GUI to come up.


I've been using Linux as my primary OS since 2009. Not once have I had to compile anything at all. It works surprisingly well with almost any hardware you throw at it.


It's not just hardware support (though even that still requires stuff like editing text files before the GUI will start on certain common machines.) The UX of the GUIs leaves much to be desired - certainly relative to an Apple machine. Linux GUIs were all designed by committee and coded by volunteers, and it shows.


> editing text files before the GUI will start

What text files are you referring to? I've been using linux for 10+ years and have never had to do this.


Initial indications are that they're selling well. Touch Bar model shipping estimates were already slipping 3–4 weeks within 6 hours of going on sale:

http://www.macrumors.com/2016/10/27/macbook-pro-shipping-est...


I'm honestly not surprised. The 13" model looks like a great replacement for my now 4-year-old 11" Air. It's thankfully not much heavier, but it brings increased screen space, a retina display, better battery life, and support for two external 4K displays.

I've been holding out because until now all the Apple laptops on offer have required huge compromises for me. The MacBook is too gutless (especially its GPU, which apparently can't drive big external displays smoothly). The recent Airs still don't have retina displays, and the old MacBook Pros are so big and heavy. After owning an 11" Air, I can't imagine travelling with the previous-generation 15" MBP.

The only downside is the exorbitant price. But to replace a machine I've used just about every day for 4 years, I can easily justify the cost. And I hope (and expect) the price will drop over the next few years. The Air, MacBook, and retina MBP all started out pretty expensive and then dropped steadily over the lifetime of the design.


> If you're really so upset, do NOT buy this damn computer!

Many of the people complaining were never going to buy a mac anyway.


People talk about games/3D hardware for MS products, but I do a lot of work in 2D graphics, animation, and typography, and the amount of work Apple puts into these compared to MS is superior, IMHO. Even more so when it comes to compatibility: Apple just works, and works beautifully, while MS is a mire of APIs that perform merely adequately. 2D is the interface most people see, and the focus on this is really why 'creatives' flock to Apple, IMHO.


Apple focused on creating a specific experience.

To achieve that, they started by targeting only specific hardware. Android supports varied hardware, and so does Windows. Do iOS and macOS do the same? No. Because it would compromise the experience, which is tied directly to the value of their brand, which is ultimately what allows them to price their products the way they do.

Apple focused on creating products people want to buy. I am not from a wealthy country, and I have seen people put basic needs aside to purchase an iPhone. For a much lower price they could have bought an Android phone, but they did not care. This is the power of a consistent, pleasant experience, something that Microsoft and Google now seek to obtain through the Surface and Pixel respectively. Let's see what happens.


Bizarre role reversal? I still remember MSFT and IBM doing the same thing in the '90s.


Microsoft was always more innovative than Apple - it's just that Steve Jobs did a great job at marketing, borrowing Porsche design, etc. Let's face it: Jony Ive is boring now, while Jobs really inspired people. I miss Jobs; Ive is no match.

It's so funny how Apple doesn't release a dual-mode touchscreen laptop because they are afraid their iPad sales will tank. So cowardly - unlike Jobs' Apple!


Apple should have had touchscreen Macs a few years ago, with a pressure-sensitive pen and other devices. They should have given the iMac a touch screen, and the MacBook series as well.

Microsoft won over artists with the Surface Pro tablets, pressure-sensitive pens, etc.


> A day later, Apple ended the long wait for laptop users yearning for an upgrade by unveiling a new line of MacBook Pros.

Was this a coincidence? I'm guessing not but no idea really.


I was actually fine with Apple's "courage" to take away my headphone jack on my phone. But the courage to take away my Esc key crosses the line for me.


[deleted]


>a developer who needs a Unix-based OS.

Windows 10 includes binary compatibility: bit-for-bit, checksum-for-checksum Ubuntu ELF binaries running directly on Windows. All of Ubuntu user space.

A full copy of Ubuntu, built in. "Windows doesn't work as a Unix-like OS for developers" isn't reality anymore.
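To make the claim concrete: once the WSL beta is enabled, launching `bash` (the "Bash on Ubuntu on Windows" shortcut is the assumed entry point) drops you into an ordinary Ubuntu userland. A minimal check, which just inspects what the kernel interface reports:

```shell
# Run inside WSL's bash: the subsystem presents itself as Linux to
# unmodified Ubuntu binaries, which is why they run bit-for-bit.
kernel="$(uname -s)"
echo "Kernel reported: $kernel"   # under WSL this prints "Kernel reported: Linux"
```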


I have Windows 10 on my desktop machine, so I'll try it out. However, all that means (assuming it works perfectly, as you seem to suggest) is that Windows is usable to me. That's not enough of a reason for me to get a Surface Book.

I'm less than thrilled with Windows right now because just the other day the built-in Mail app started crashing within a few seconds of opening; it's now unable to connect to my Exchange account. After spending 10+ hours trying to diagnose and fix the problem, I've accepted that nothing short of a complete wipe of the machine will fix the problem.


My MBP periodically crashes any app that tries to invoke an open/save dialog. It's great when I try to download a file, accidentally make Preview think I edited something, or try to save a document.

You can pick individual bugs to complain about all day. Every OS has plenty. Sorry to hear about your issues with the mail app though.


I think the Surface Book is Microsoft's NUC: a way to raise the bar and make the Windows market competitive. They are fine with you buying a Dell, Lenovo, or HP as long as you use and develop on Windows.

If you log into a new user profile, does Mail work? If so, you don't need a machine wipe.


> Windows 10 includes binary compatibility. bit-for-bit, checksum-for-checksum Ubuntu ELF binaries running directly in Windows

Windows 10 tries. It doesn't run the actual Linux kernel, and doesn't implement some significant portions of it. There is no boot process, so system services can't be run the way they can on a real Linux install. The VFS is buggy and slow. The install sometimes hangs and borks, with no clear way out. It still can't run Ubuntu 16.04. And the biggest disparity: no GUI support.


Well, I missed that announcement - http://thehackernews.com/2016/03/ubuntu-on-windows-10.html or http://blog.dustinkirkland.com/2016/03/ubuntu-on-windows.htm....

Are you "Kirkland"? You used the exact same phrasing.


If you take a close look at Windows, you'll see that WSL is already doing a pretty good job of giving you that UNIX environment (really Linux) on Windows. Things like proper apt-get, and not having to deal with Homebrew, are nice.

It is early days, but things keep getting better. The 'I need Unix' crowd will soon find that Windows is a much better Unix than macOS ever was.


> Windows is a much better Unix

Cutler must be crying right now.


Tears of joy, I presume. This certainly feels like a golden opportunity for NT to reassert itself: a lackluster team at Apple clearly can't maintain quality standards for macOS, while Windows 10 offers strong performance, faster iteration, and a full Linux environment that's better than macOS's.

Now if only Windows Phone wasn't being completely ignored (who knows, maybe it could have been a better Android than Android)...


I'll down-vote you just for pointing out that you'd like not to be down-voted. If you weren't so over-protective of your ego, you wouldn't feel the need to bring it up, which inherently drives the discussion away from the things you claim to want to drive it toward.

Treat your upvotes as a currency that you can spend on controversial opinions and phrasing. Accusing down-voters of being 'angry people who don't engage in discussion' is certainly immature and hypocritical of you. Generally speaking, there are dozens of great reasons to down-vote someone and not type a reply.


[flagged]


Well, given that Microsoft's profits were once double that of Apple, yes, it is quite a reversal.

(irony of using "M$" when you're talking about them not making money is great, btw)


more like $pple, am I rite?


App£e, or Appl€. Ideally you could use the $ symbol, but there's no s in Apple. Sure, you can just throw in whatever character you want, but !nevets doesn't make a lot of sense.

What you're shooting for here is a pun. You want something that looks like a regular character, so @pple is kind of cute, but there are two obvious problems. One, it kind of looks like a direct message to pple, so it falls apart. &pple isn't bad, but & doesn't have the association with money, so you don't get the pun.

There are a bunch of currency symbols you can use, there's a list here [1]. Check it out ₩₦€v€t$, then all of your currency puns will be money.

[1] http://www.xe.com/symbols.php


This is HN humor at its finest.


Yeah, I was going to do that, but Apple is an American company and it doesn't make sense to use the euro. Using $pple is just as lazy as using M$ to call Microsoft greedy.



Which makes Appl€ particularly fitting!


> M$FT

I'm sorry, are you from the past?


"Hacker" as in "Hacker News" is from an even more distant past. The word just got appropriately decaffeinated, so as to be acceptable within our current iteration of groupthink.


I tried going to a 'hacker' meetup in my city. It was disappointing.

I was asking people if they were programmers; the median reaction was "a programmer? Oh lord no, I'm a [marketing/UI/non-technical] person".

It seems the term was co-opted and gutted by some in the startup scene.

(I'm aware there are legitimate hackers in the startup scene, this was just one particular event in one particular city.)


In my city there are 'growth hacking' meetups. They're too nakedly marketing-driven to even consider seriously. I fear that in English the word 'hack' is often being used as a synonym for 'shortcut', the Lifehacker 'hacks' being prime examples.


This one was advertised as for developers/coders/engineers/designers, but I'm in a second-tier (for developers) Canadian city.

Many talented tech folks, if they aren't tied down, move to first-tier Canadian cities, or to first-tier U.S. cities, which are zeroth-tier by Canadian standards.


> I'm a [marketing/UI/non-technical] person".

They are obviously confusing a hack with a hacker.

/me ducks


Even if it wasn't non-technical people, it seems the pop culture of "hacker" attracts all the worst stereotypes of geeks. The really sharp ones don't seem interested in self-identifying as "hacker".


Is the groupthink in this case laughing at the anachronistic usage of M$?


No. The groupthink is what causes the usage of M$ to be anachronistic now, and also what made it fashionable at some point in the past. Treating subjective opinions as objective facts and laughing at members of the out-group are symptoms of being engaged in groupthink.

Nothing wrong (or right) with any of that. I'm just observing.


Some of us are. Once, a long time ago, using the $ was "cool." And "so it goes" with all forms of online culture. (Has Kurt Vonnegut gone out of fashion yet? I used to coxswain crew boats past his summer home.)

I would be quite amused if young "Microsofties" suddenly decided to co-opt "Micro$haft" but with a Richard Roundtree vibe.


I was debating on saying Mi¢rosoft, but I decided against it at the last minute. Oh well.


Let's be clear: using the $ was NEVER cool by any stretch of the imagination, it was just a lot more common in the Internet of the '90s. The people who thought it was cool were always the same people who think it's funny to make up derogatory puns for the names of things they don't like, such as "Internet Exploder" or "Crapple". This is a category of humor that will be forever associated with bitter, angry, condescending-yet-clueless, neckbearded, fedora-wearing morbidly obese IT guys with phone holsters and Bluetooth earpieces that continued making "all your base" references well into the 2000s.


So let's offend them? Because denigrating products/companies (as childish as saying M$, CrApple, et al.) is, in your opinion, the same as denigrating other human beings? Why not just ignore the bad jokes?


> I'm sorry, are you from the past?

Isn't everyone?


Sure, and that's part of the joke. It's also an IT Crowd reference.



Yea, things have changed... a lot. Back in the anti-trust days of bundling IE, bullying OEMs, and undermining standards, MS was the evil empire of the day.

Now you've got Google backtracking on privacy and siphoning up user data to create ads that know you a little too well, Amazon as a more threatening Walmart of the digital age, and Facebook trying to make sure the only way you can experience the online world is through the censored lens of Facebook itself.

It makes all of the once-held fear of Microsoft dominance seem quaint.

At least Apple is still building pretty gear and killing useful ports, just like back in the day.


Heh heh heh... that'll show the author.



