Big surprise. Video Pros now, developers next, common users last.
While Apple is focused on creating the thinnest notebook with every generation, other companies are actually making useful computers, laptops or otherwise.
Right now, I've decided to take the money I'd spend on the cheapest Macbook to buy a desktop system, plus a chromebook. I can have mobility and a lot of performance, for a fraction of the price.
Even Windows is becoming more viable again, Ubuntu userland and all.
What's the viable alternative? I'm looking for a replacement for my 2013 rMBP. Every PC laptop has a fatal flaw. The Spectre x360 15 comes close, but has less battery life (despite having a slightly larger battery), is bigger and heavier, has only a dual core processor, and has a worse screen.
The XPS 15 is another option. Even with a significantly larger battery it doesn't seem to have better stamina than the MBP. It's apparently been plagued by QA problems, and offers a choice only between a 1080p and power-hungry 4K (can't get high-DPI and 10 hour battery life in the same package).
I suspect, looking at the sales numbers, that you're wrong. Apple is giving up on a certain group of users (those who need fast GPUs). They gave up on those users years ago by shipping crappy OpenGL drivers. But they've correctly perceived that common users don't want ports or expandability. They want a fast machine with a great screen and all-day battery life, and they want to hit those metrics in the smallest package possible.
I recently trashed my 2011 MBP (oops) and spent a month without it. I decided to try using my corporate-provided Surface Pro 4 instead to see how I could get on with a similar-spec Windows 10 PC for personal use.
By the end of the month I had gone slightly loopy (seriously - MacBook withdrawal is real), gave up and bought a 2015 rMBP. Personally, I find macOS, and its deep integration with Apple's hardware, too intuitive and 'invisible' to the way I work to give up on it.
I use both Linux Mint and MacOS on a daily basis and I can't say that one really is better than the other. I'm sure it would be the same with Windows if not for the fact that I need some unix-y stuff.
I do have a use case that I would really like to get on my primary laptop that neither Mac nor Linux currently offers: detach the screen and use a stylus to annotate PDFs. If I were using Windows on a Surface this would be a major part of my workflow, and I would find it impossible to switch back. That doesn't prove Windows is superior. (Maybe I will end up dual-booting Windows and Linux on a Surface or Yoga or some such thing eventually.)
>> Have you tried the Windows Subsystem for Linux? You might be able to get by without a separate Linux install.
I have been meaning to try the Windows Subsystem for Linux since they announced it, but Cmder (ConEmu) has been working so well for me that I have gotten too complacent to even try it. Cmder was the one thing that truly made the transition painless when I switched away from Mac -- I thought I would hate not having the OSX Terminal. I don't do super advanced unix shell stuff, and it was just perfect.
I've been using it since it was available on the insider fast ring.
It works pretty well for command-line use. If you need X at all, you're probably SOL.
I've been doing mostly C++ & Python development. I've run into a couple of issues around networking and a couple of UI issues; the UI ones were fixed after I reported them on GitHub.
All in all, it's been a great experience, and the developers have been very responsive to issues reported on GitHub.
Not yet, but I'm following it closely. Currently I don't have a Windows install on any of my work machines, so it's something I will evaluate when I update computers next.
>> By the end of the month I had gone slightly loopy (seriously - MacBook withdrawal is real),
After being on Mac for about a decade, it only took a week of using Windows full time before going back on the Mac felt foreign to me.
If I spent a week or so full time on the Mac, the reverse would probably happen.
These days I only care that my main tools work on whatever operating system I use (it's a great era for end users now that so many tools are cross platform). There are things about every OS's main UI that bug me, so I don't find myself being loyal to any one particular OS.
Mac hardware is great! But I find using OS X is like trying to juggle with handcuffs compared to Windows.
IMHO, in their quest for simplicity, they have made a very good GUI for users, but hell for professional developers who DON'T only use the command line. One of my lifelong friends swears by his MBP, but he only uses the command line.
I think that depends on your view. IMO, Apple generally does make some of the best hardware in terms of quality and design, but there are exceptions.
The Mac Pro garbage can, for example. I don't know why they chose to make an art piece to replace the incredibly functional and expandable tower that existed before.
Another example is the early 2011 Macbook Pro 15" with discrete GPU, which had serious overheating problems that they just pretended didn't exist for more than three years.
I feel like I'm in prison or an insane asylum when I have to use my Mac or iOS devices after enjoying the freedom that I have using Windows all day.
I could go very far into detail here and list all of the extremely annoying limitations that I run into, but instead I'll respond to your vague complaints with my own. Apple quite obviously wants absolute control over their devices and their software, whereas Microsoft lets me do whatever I want with my computers and my software.
I have to have a Mac to make iOS apps, but as soon as those are no longer a thing I'll toss all my Mac stuff straight into the garbage.
I've had the opposite experience. OSX is just a weird BSD with a great window manager and UI framework. You can easily sidestep Gatekeeper, if that is your complaint.
The hardware fit and finish is also second to none, and runs windows just fine if that's your thing.
The alternative on windows is a machine that spies on me, has horrible ui bugs and inconsistencies I run into constantly, and decides to auto update and reset all my privacy settings in the middle of the night while I am using the machine.
Not to mention it is often used with some awful trackpad. I haven't tried them all but I have never seen a non-mac trackpad I could go back to.
> The alternative on windows is a machine that spies on me, has horrible ui bugs and inconsistencies I run into constantly, and decides to auto update and reset all my privacy settings in the middle of the night while I am using the machine.
This exactly. I'm still baffled by claims that Windows 10 has a good desktop UI when I see its iconography [1], huge click (touch) targets [2], and wildly inconsistent use of whitespace [3]. That's not even mentioning the forced updates and always-on telemetry. I'm not sure how one can say macOS is more of a walled garden than Windows at this point, at least macOS's security features will get out of your way if you ask nicely.
> I'm still baffled by claims that Windows 10 has a good desktop UI when I see its iconography [1], huge click (touch) targets [2], and wildly inconsistent use of whitespace [3].
This almost reads as sarcasm. Some people really don't care about those things, and using them as examples of why you're baffled just highlights that disconnect between you and them. I don't use any bundled windows apps, and I'm rarely in the settings (and I just search for what setting I want), so iconography and whitespace design decisions in windows apps don't even factor into it for me.
Neither OS X nor Windows feel as comfortable as my customized FVWM config did, but windows gets a lot closer nowadays. I had to use OS X at work for a few years and it always grated.
> That's not even mentioning the forced updates
It's possible to disable them, you just have to put some effort into it (it requires regedit). I think this is the right decision. If you want to disable updates and you don't know how to change a registry setting, then for the good of us all, the answer is no.
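For reference, the change in question is the documented Windows Update policy key; something like this sets it (a sketch -- Pro/Enterprise honor it, while Home may ignore policy keys entirely):

    # Run from an elevated PowerShell. NoAutoUpdate=1 under the AU policy
    # key tells the Windows Update agent to never check automatically.
    $au = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU'
    New-Item -Path $au -Force | Out-Null
    Set-ItemProperty -Path $au -Name 'NoAutoUpdate' -Value 1 -Type DWord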
I want automatic updates. I think this is a good thing. Chrome automatically updates whenever I restart it. This is great.
I don't want updates when I am in the middle of something full screen like a game, forcing a restart of the machine on me. This is madness.
I don't want ads for Office 365, Cortana or Edge on my desktop. I don't want to learn how to block them. I don't want to use an OS that feels like it is being milked for all it's worth in its dying breaths.
It always asks me, and I can delay it. You've actually had it force an update right then while playing a game, and without having told it "no, delay it" multiple consecutive times (I believe it will only let you delay it 2-3 times)?
Edit: Also, have you set your active hours? Windows allows you to define the times you use your system so it won't attempt to update during those times. Additionally, you can set a specific custom restart time for when it will restart.
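If you'd rather script it than click through Settings, the active hours values appear to live in the registry (an assumption on my part from poking around -- verify the value names on your build; hours are 0-23 local time):

    # Assumed location/names of the Active Hours setting (check your build)
    $ux = 'HKLM:\SOFTWARE\Microsoft\WindowsUpdate\UX\Settings'
    Set-ItemProperty -Path $ux -Name 'ActiveHoursStart' -Value 18 -Type DWord  # 6pm
    Set-ItemProperty -Path $ux -Name 'ActiveHoursEnd' -Value 1 -Type DWord     # 1am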
I tried to. I play games for a few hours either at night or early am -- say 6pm-1am or 5am-9am are my possible slots. Unfortunately, Windows update will not accommodate this -- you are only allowed to set one window with an 8 (maybe 12) hour max, and it must be consecutive. I had to dig around to find this, and it still is not sufficient. I ultimately solved my problem by using the registry editor to convince Windows I was on a metered internet connection. Unfortunately, this broke updates altogether. I turned it back and now it's still broken -- apparently the magic auto-updater is the only way to get updates -- there's no button I can click to just download the latest update and install it? (at this point I gave up)
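(For anyone wanting to try the metered trick anyway, this is roughly what I did -- a sketch; note the key is owned by TrustedInstaller, so you have to take ownership before a write will succeed:)

    # Mark wired Ethernet as a metered connection (2 = metered, 1 = unmetered).
    # The DefaultMediaCost key is ACL'd to TrustedInstaller; take ownership first.
    $cost = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\DefaultMediaCost'
    Set-ItemProperty -Path $cost -Name 'Ethernet' -Value 2 -Type DWord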
What about setting the specific update time to something like 3 AM?
I was just in the Windows Update settings and there was a way at the top to check for updates right then. I didn't use it, so I'm not sure if there are some other gotchas involved with that.
I'll have to check that out, thank you for the tip. What I would really like is a shutdown button that actually does "check for updates, install if found, then turn off". I'd click that every time I was done using it.
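In the meantime, something like this approximates that button (a sketch using PSWindowsUpdate, a third-party community module, not anything built in -- so treat it accordingly):

    # One-time setup, elevated: Install-Module PSWindowsUpdate
    Import-Module PSWindowsUpdate
    Get-WindowsUpdate -AcceptAll -Install   # download and install whatever is pending
    shutdown /s /t 0                        # then power off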
The update prompt interrupts my game inconsistently. In some games it is able to rip me out mid session to tell me to restart; other times it silently times out in the background despite my computer running at full blast.
I have set active hours, but for some reason my Windows partition - and not my Ubuntu partition, so it isn't a hardware issue - does not reliably remember my time zone. It is often reset without rhyme or reason to some random default (I think NYC). I don't always notice and fix the time zone when it boots up because I have Steam launch in big picture mode.
Also, why does it even need to ask for active hours by default? I am using the machine at full throttle. That is a really easy metric for "maybe wait until later". It's already logging everything I do and sending it to Microsoft; it would be nice to see some usability features come out of all that data.
Inconsistently working is a common theme of my experiences with Windows. I am routinely baffled that I paid $100 for this experience and wish that there was better Linux game support for AAA titles; I throw my money that way whenever possible.
> why does it even need to ask for active hours by default? I am using the machine at full throttle.
Some people run things all day long. My brother sometimes leaves the video game 7 Days to Die running all day at home while he's at work. Not updating when activity is detected is a good way to have it never update, and a good way to allow a virus to trigger a condition that prevents automatic patches to the holes it likes to use.
> I have set active hours, but for some reason my Windows partition - and not my Ubuntu partition, so it isn't a hardware issue - does not reliably remember my time zone.
That's odd. Is it actually changing your time zone, or is it just off by a few hours? If it's just off by a few hours, my bet would be that it's a difference in how Linux and Windows set the system clock (one prefers to keep the hardware clock in UTC, the other in local time). If it's the actual time zone that's changing... I dunno, maybe some location service helper and a poorly mapped IP address? I haven't heard of that, but it does sound annoying.
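If it is the clock convention, the usual dual-boot fix is to tell Windows to keep the hardware clock in UTC like Linux does (a well-known tweak, but use at your own risk; some guides suggest a QWORD on 64-bit systems):

    # Make Windows treat the RTC as UTC, matching the Linux default
    $tz = 'HKLM:\SYSTEM\CurrentControlSet\Control\TimeZoneInformation'
    Set-ItemProperty -Path $tz -Name 'RealTimeIsUniversal' -Value 1 -Type DWord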
I'm not arguing that postponing restarts until there is lower load wouldn't miss some people; I'm arguing that this shows less respect for the user and is a poor experience. It does not feel like my machine, contrary to the great-great-grandparent. No other system I own forces restarts, and they all seem much more secure.
It's definitely the time zone not persisting. I've navigated through seas of menus to change it, to no avail.
I agree it's a good feature to ship enabled by default. Grandma, who leaves her computer on for years at a time, needs to have her security updates up to date.
I wouldn't call myself an inexperienced user at all, but even after months of trying to disable the auto updates in W10 through all sorts of settings and tweaks, I gave up on it. Whatever I did never lasted for long. When I'm not in control of my own machine, what's the point? I have switched to a Linux distro and haven't regretted it.
I have a 2015 MBP and used it for a year before switching to Windows because I got fed up with the Apple dev environment.
It doesn't just run Windows fine; it runs Windows with amazing speed compared to macOS. Everything feels (and is) snappier. On macOS, I'm still left with a very expensive and under-powered machine.
PS. Installing Little Snitch, one becomes immediately aware that macOS does talk to base... a lot!
Sorry but that's just blatant lying. Apple takes great care of optimizing its OS and apps for its hardware, to the point where it's possible to use Final Cut Pro on the anemic MacBook 12 somewhat comfortably(!)
Yes, Apple does optimize. I'm not disputing that, nor starting a silly "Apple vs" debate. I am no fanboy of either Mac or PC - I just use whatever is best.
The point is that "equivalent software" runs faster. I would wager that if Final Cut Pro were available for Windows and you ran it on the very same MacBook 12, it would run faster on Windows. That said, Final Cut Pro is a bad example, because it was developed by Apple itself (and can therefore be assumed to be highly optimized for the OS) and it is not available for Windows.
It is more useful to compare an equivalent 3rd party application (made by neither Apple nor Microsoft). I'm a developer, so I use a lot more software than just "apps" -- maybe if you try a wider set of software, the difference in platforms will become apparent.
Admittedly, I don't have numbers. But I don't need numbers, because you can feel the difference as a user. Install it and see for yourself!
Microsoft takes great care for optimizing its OS and apps for any hardware, to the point where it's possible to run Windows 10 on 10 year old hardware somewhat comfortably. To the point where the Windows 10 Kernel and a lot of the Windows 10 Core (OneCore) runs on mobile and IoT hardware...
To me, MacOS just seems like the best desktop UI and UX out there. I'm talking about stuff like font rendering, gestures, multiple desktops, etc. All of these things on the Mac have clearly had a huge amount of thought put into them. And since these things mediate your entire interaction with the computer, it's a big factor for me.
Windows seems to be a combination of 90's era throwbacks and 2edgy4you Metro design. If they've managed to improve this stuff recently I would be interested in switching, since there's a huge tax on hardware specs with Apple computers.
How recent is recently? Windows 7 was the first windows version I didn't dread using for years, Windows 8 was a small refinement on that, and Windows 10 is a vast improvement on that.
Windows 10 feels like it just wants to get out of my way, which is really nice, since for me the gold standard is a customized FVWM config I refined over a decade to be minimalist and extremely usable for work.
If you haven't used Windows 10, it's probably worth at least a look (as long as you don't mind or have ways to mitigate the privacy concerns).
I'm reading all of the comments here (not just yours) and I'm wondering: do you spend that much time in the OS alone?
I use MacOS, Windows, and Linux each and every day, interchangeably. What I do is start my productivity application(s) and spend my time in that. I see the OS when I'm copying files or when I'm in a shell doing shit. Even in a shell, with Babun on Windows it's more or less the same experience across all of them.
The only thing I want in all OSes that exists only in MacOS is Preview. That thing is damn awesome. Everything else is invisible to me.
> whereas Microsoft lets me do whatever I want with my computers and my software.
Not in my experience. My Windows 10 automation/unscrew-up script alone is like 5 kloc. And I bet 70% of that script could be replaced if I had real control of the system, like dropping a .config file in some folder instead of having to find hidden settings with nonsensical names deep down the regedit hole. Another example: you have to use some stupid hacks to make sure there are no Flash DLLs on your PC. No matter what you do, they always come back in some security update.
> And I bet 70% of that script could be replaced if I had real control of the system, like dropping a .config file in some folder instead of having to find hidden settings with nonsensical names deep down the regedit hole.
You should see my Ansible playbooks for our Windows Server systems. I don't mind PowerShell per se, but it takes a lot more effort to get anything done compared to my CentOS systems, where I can template a config file and be done with it.
I really hope our vendors start supporting .NET Core soon; the SDK for our ECM software is the only reason we're still stuck on the full framework and having to manage a bunch of Windows VMs for our integration software...
It's based on several scripts from GitHub. A lot of lines are just regexes and lists (apps, services, tasks) of things to disable or remove.
I recommend you build your own script by choosing what you want from each type of script. I would release mine if I were sure it wouldn't break random people's computers, because IT WILL. I'm also running Windows 10 Enterprise because I want as little telemetry and as few things shoved up my ass as possible.
Some Windows updates can change registry keys or disable certain policies. I monitor the commit log of other repos to know what I need to update, but they don't always cover everything. Feels like a lot of work but it's actually not.
Here's how I structured it:
- admin.ps1 calls:
--- admin-config.ps1 (policies, tweaks)
--- disable-services.ps1
--- remove-flash.ps1
--- ...
- user.ps1 calls:
--- user-config.ps1
--- disable-gamedrv.ps1
--- disable-services.ps1
--- ...
Because if you're using a regular user account (like you should) you need to run 3 things:
- admin.ps1 as admin
- user.ps1 as admin
- user.ps1 as your regular user
I gave up on using runAs or any of the things recommended on Stack Overflow; something always goes wrong, so it's easier to do it this way.
For a fresh install, I recommend that the first thing you do is update everything and let Windows install the 200 apps you don't want. Then run the 3 things like I mentioned, reboot, run them again, reboot.
My installation is months old and it runs like new, even after heavy usage, hardware changes, and tons of apps and games installed/uninstalled (this kills Windows 7). Just be careful what you remove, and don't ever install CCleaner or any shit like that. All you need is the Sysinternals tools.
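To give a flavor of it, the top-level scripts are little more than this (names from the tree above; the sub-scripts are where the actual lists live):

    # admin.ps1 -- run elevated; just calls each tweak script in order
    #Requires -RunAsAdministrator
    $steps = 'admin-config.ps1', 'disable-services.ps1', 'remove-flash.ps1'
    foreach ($s in $steps) {
        Write-Host "Running $s"
        & (Join-Path $PSScriptRoot $s)  # each one is a flat list of tweaks
    }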
I'm too lazy to proof-read/make this shorter, hope it helps somebody.
Even assuming that issue was overblown, I distinctly remember that Catbert-style perma-nag message appearing on each login asking me to upgrade. Microsoft doesn't even deem its users worthy of a simple "Don't bother me again" close window.
Let us conservatively say about 10% of the 300 million people who supposedly got the Windows 10 update didn't actually want it. That's about 30 million folks who would disagree with this notion that Microsoft "lets me do whatever I want with my computers and my software".
The best thing Microsoft and Dell have going for them in the laptop space... is the 2016 MacBook Pro w/Touch Bar. It was a dud. The GPU issues were the final straw for me.
I thought that switching back to Windows would be a HUGE hassle. The Windows Subsystem for Linux took the pain out of it. That, and I no longer have to fight the we-don't-have-a-macOS-version of $APP issues.
If Apple doesn't care about their computers, why should their users? ¯\_(ツ)_/¯
There is a difference between "design tradeoff" and "design flaw." E.g. I'd rather have an SSD-only machine with a bigger internal battery than use that space for a 2.5" drive bay, but I don't consider that a "flaw" in my T450s. Coil whine (XPS), crappy screens with backlight bleed and PWM (Lenovo), and not using Microsoft Precision touchpads (HP): those are flaws. That's not a trade-off between different peoples' use cases, just the manufacturer skimping on some part.
One person's tradeoff is another person's flaw and vice versa.
As someone who has a lot of trouble using any trackpad (including Apple's) that doesn't have physical buttons, should I view buttonless trackpads as a tradeoff or a flaw?
I don't care about design tradeoffs made for esthetic reasons; I only care that I struggle with them and get stressed out by them. So to me, the lack of physical trackpad buttons on a laptop is a design flaw, even though it might be a completely sensible tradeoff for 99% of the population.
> One person's tradeoff is another person's flaw and vice versa.
If some people view a design decision positively, then it's a trade-off, not a flaw. For example, I hate moving parts on my computing devices (they break), so I'm a big fan of the non-clicking Force Touch trackpad.
What I'm talking about is flaws. Nobody prefers a laptop with coil whine to one without. Nobody prefers a cheap-o IPS display with an uneven backlight or bleed. Nobody prefers Synaptics drivers to Microsoft Precision drivers. Those are flaws, not trade-offs.
>> Nobody prefers a laptop with coil whine to one without. Nobody prefers a cheap-o IPS display with an uneven backlight or bleed. Nobody prefers Synaptics drivers to Microsoft Precision drivers. Those are flaws, not trade-offs.
But like anything, you have to be sensitized to these things as a negative. Clearly, you've got plenty of things you're sensitized to.
Virtually none of my non-technical friends would even think to complain about uneven backlights, bleed or coil whine, simply because they haven't been sensitized to them.
Coil whine and a little backlight bleed have never been dealbreakers for me in choosing a laptop. All other things being equal, maybe they would be dealbreakers. With the wide variety of laptop options out there, the "all other things" never tends to be equal anyways.
You can make the argument that I have low standards, and maybe that's true. But I am not you and you are not me. Let's not assume that everyone is sensitized to the same things as you or has the same tolerances as you.
If you aren't CPU constrained, max out the RAM and swap in an SSD... you should be fine for a while.
I'm running a 2014 and am so disappointed they soldered on the RAM, and their SSD is a weird format nobody else is using. I'm not planning on upgrading any time soon though.
I think the Mac Pro market is really being abandoned in favor of the laptops... but how much of their base can they piss off before it starts taking other segments with it?
I'm looking hard at the T570 (I've got a T450s docked at work, but 14" is too small for working on the go). It sucks that Lenovo's screens are such crap. Even /r/Thinkpad complains about Lenovo shipping cheap crappy screens.
I also use an HP ProBook 450 G2. This is another awesome machine where the screen is subpar. On the Lenovo I am not bothered by it, but the ProBook is annoying when unplugged.
The upper end of the Surface line seems decent hardware-wise, and the detachable keyboards give the option to trade thinness vs. keyboard. One should be able to put Linux on them, most likely.
That would depend on your criteria. For example, my MSI laptop blows my MBPs out of the water performance-wise (I have two MBPs btw, a 2014 and a 2016, so no Mac hating here), but it's about 4 times heavier :) I don't have to haul it around too often, though.
If I had to haul a PC laptop around in an MBP-comparable form factor/weight, I'd look at the high-end Razer laptops. I'm also not a fan of glossy screens; Razer has a matte screen option available (so do MSIs, but again, they are BIG).
Compared to the 13" rMBP, you get a smaller display and almost two hours less battery life (and people are up in arms about the 13" rMBP being a regression on the battery life front compared to the previous model). It's not a replacement for the rMBP, just a less good, but cheaper, alternative.
Not sure why you decided the Razer Stealth @1.4K is closest to what Mac has to offer when they have a "regular" Razer Blade in the same price range as the MBP, with the only upside of the MBP being better battery life (which is most likely due to a superior but hungrier video setup on the Blade).
Speaking of battery life -- once you get into a 6 hrs vs 8 hrs type difference, I'm not sure it's all that important.
I used to recommend the Razer Blade, but tons of horror stories on PCMR have steered me away. The only other laptop in the same class that I know of is the Gigabyte Aero.
I'm done with Apple. The only MBP I have is from work, and I leave it there. My personal laptop is a Chromebook Pixel 2015 running Debian. I absolutely love it, but I'm planning to upgrade to a new Alienware 13. Yes, it's a gaming laptop, but it has removable components (SSD, RAM), high-end graphics (GTX 1060) and an OLED screen, all for less than $2000.
I'm researching new laptops, and the two laptops that keep bouncing between the #1 and #2 spots on my short list are the XPS15 9560 and the Alienware 13R3.
I think the Alienware 13 is an ugly, heavy brick (in fairness, you can get uglier), but it basically ticks off everything I need on paper.
My max acceptable weight for a laptop - irrespective of size - is around 5.5 lbs, so it barely makes that, but once you get past the esthetics, it's a fantastic machine. It's available with a quad core CPU with multiple options for GPUs (including the 1050Ti which kills the battery less than the 1060), a nice array of current and future ports, user replaceable wifi, RAM and SSD, and the option for an OLED screen. Oh, and it's also got physical touchpad buttons, which are something I've been missing for a long, long time.
Because my last two laptops have been quad core, I struggle with the idea of replacing my laptop with an ultrathin laptop that has an ultrabook processor that would be slower than what I currently have.
Every time I think the XPS15 is the one for me, I look at the Alienware 13 and change my mind.
The one thing I really hate about the Alienware 13R3 is that it's a "gaming" laptop with the matching gaming bling, but that's a rant for another day.
> Or, crazy idea here, they could licence their OS if they don't want to make pro gear anymore.
Maybe some are too young to remember Power Computing, the company that licensed MacOS in the 90s. That was juuuuust before Apple ran into major problems and came within sight of bankruptcy. One of the first things Jobs did when he came back was take a $150 million investment from Microsoft, buy back all the licenses, and shut down production of all Mac clones. One of the few real assets Apple has is its reputation for quality. If you wanted to bleed that out, you would do it by allowing clones, thus bringing about the destruction of the brand as a whole.
Power Computing clones were great, though. I had one and it was very reliable, excellent value for money. They were built like PCs and at the time, PCs were a lot better-made in many respects than low-end Macs. (High-end Macs were nice but very expensive, and still not as fast as high-end PCs.)
A clone program now would have a totally different dynamic, since Apple is incredibly successful rather than close to bankruptcy. I still don't know if it would make business sense for them, but for Mac users like me it would be absolutely terrific.
Sell the licenses directly to consumers then? Their reputation doesn't take a hit that way. Just sell it on a "use at your own risk" basis and just let people have it who want to use it on their own builds. I figure the support for the bazillion possible hardware configs would make it a nightmare though.
Or they could license the OS and review the products from licensees(?) before they ship. Float a trial balloon: license to 5 or 10 hardware manufacturers, offer them a reference design and review, and run with it.
If Apple doesn't care about high-end desktops anymore (and it certainly seems that way), they could license MacOS to third parties for desktop use only. This wouldn't cannibalize their laptop sales, but would keep power users in their ecosystem. In the end they'd probably sell more laptops too.
>> If Apple doesn't care about high-end desktops anymore
I think Apple makes high end computers for consumers now.
They just don't cater to special audiences like they did before, which is unfortunate for people who don't want computers that cater to consumers (i.e., content creation pros, developers, etc).
I can't blame Apple for following the money, but it is what it is, I guess.
Another one chiming in that I don't consider this crazy... any more.
It didn't work the first time around partly because the cloners had much smaller volume and could therefore always scoop Apple on the most profitable high-end gear, where components weren't available in the volumes Apple needed.
The licensing model was always a gamble, with the hope that the loss of margin would be offset by an increased base. The problem was that it was "bet the company" stakes at the time, and when it didn't quite work out as hoped they had to shut it down or shut down the company.
Now the Mac line in general and the pro line in particular is a sufficiently insignificant part of the overall business that they could afford the gamble.
The question is whether the licensing fees + revenue from the app store would be enough to offset Apple's ludicrous hardware margins. Their market is shrinking, but it's still a profitable line of business for them since, just like the iPhone, their margins are way higher than the competition.
The mistake isn't the trash can Mac Pro (which is a perfectly fine machine that 100% fits my use case); it's not keeping it up to spec down the road (I'd buy one today if it were, even knowing I won't upgrade it for the next 3-5 years).
The mistake was the trash can: they locked themselves into a physical design that doesn't allow easy upgrades. Graphics cards come in different shapes and sizes.
If they had sold enough of them, there would be a market for Mac Pro-shaped graphics card upgrades. Swapping out the AMD FirePro D300 looks to need only a Torx screwdriver set, so it's certainly doable by a user.
>Some people just need or want a beefy desktop built with upgradable components (in addition to a laptop).
Apple is a hardware company. What you're suggesting doesn't make business sense for Apple. They rely on being able to sell new hardware, so allowing upgrades wouldn't make sense. A laptop lasts about five years for most people; being able to slap in extra memory, a new graphics card or just a new SSD would "rob" Apple of the sale of a new computer for an additional two or three years.
This might all be completely true, but it doesn't account for mindshare or loyalty. Apple has historically had a fanatical fan base, and should be spending a small fraction of their huge cash reserves to make sure they keep it.
Yes, Apple could license their operating system to just one high-tier desktop hardware company, and ensure that the power users will stay in the Apple ecosystem.
Somebody who will sell a high-end desktop system with: dual-socket Xeons, gobs of ECC memory, lots of storage, multiple 16x PCIe slots, and a very limited selection of high-end graphics cards.
That would satisfy the power users, and wouldn't cost them anything.
They would pay the same price Microsoft does to support random hardware. More importantly, it also wouldn't gain much. As others pointed out, the high-end workstation is barely a business for Apple.
Sure, they are losing customers, but compared to the hordes of students buying Apple laptops or iPhone customers, it's not really an issue.
They have so many patents on hardware that they could license the system and clones would still not be able to have, for example, an Apple-like touchpad, the Touch Bar, MagSafe or even Continuity (they could block it for licensed hardware). They will not do that because they realise their fantastic proprietary features mean nothing to the average pro user. Stuff like that is nice to have, but more important is being able to run all the applications you need, reliably. Currently the only people on Macs are iOS devs and people who hate Windows. That second group would go to Linux if only the Adobe suite were available on it.
You're being a bit aggressive: I use it daily and it definitely works (but I work with it: no games, almost no 3D, just PyCharm, git, LibreOffice, Firefox, irssi and Claws Mail on the Openbox WM, plus scanning/printing a document from time to time; nothing fancy).
The latest problem I had was with the printer driver (a Brother HL-1110; the driver doesn't seem to know that this printer has much less memory than advertised).
I think that's the key to Linux on the Desktop. It generally ranges from very limited to terrible for your average end user. Your video professional might fare slightly better, but realistically is only going to use Linux if they're working at a company that's invested in Linux for their video production stack.
Where the Linux desktop is truly unparalleled is for a developer. As a developer I'm comfortable enough with the technical details to handle the quirks (like doing things from the command line), and the strengths of Linux truly shine (again, the commandline, as well as choices in DE, software, etc). One great example is Arch. On Arch I can install any developer tool directly from the command line with very little effort. Anywhere else and you have to click through endless menus.
The Mac Pro is barely a business for Apple. There's almost nothing about that line of business that makes any difference to Apple. Frankly, the entire macOS part of the business doesn't really matter much to Apple from a financial point of view.
I think they should give people what they want - a high powered upgradable machine. If they don't want to make it upgradable, then they should at least refresh it annually.
That seems to be Apple's thinking but I think this article shows the reason that that's a flawed approach.
Having content producers using macOS is a huge benefit to their brand and I think it's a big part of driving sales of their iOS devices. At the very least, if they lose the content producer market there will be significantly more friction to producing iOS apps.
Yes, there's a halo effect, but I don't think it's all that huge especially with respect to video pros and I bet it mostly works in the other direction - people like their iPhone so they are more likely to buy a macOS laptop or desktop. Nobody is buying a Mac Pro because they like their iPhone.
The driver for iOS development is profit and as long as iOS users are willing to pay more for apps, it will continue to dominate. And if you want to make an iOS app, you are going to buy a Mac, but probably not a Mac Pro.
The Mac Pro is like the iPod in that it exists in this weird space where it continues to be sold but is clearly not something that Apple thinks is important.
I agree, though I think persuading Apple's decision makers--particularly those who were present for the clone program in the 90s--would be hard as hell. It was a poorly designed program, and allowed clone makers to beat out Apple on price and speedy updates with new hardware. And those who remember it at Apple probably remember all of the problems it threw at the company's feet back then.
Personally, I think letting the trash can linger on for over 3 years is a pretty loud indicator that Apple's not exactly concerned with the line and that, internally at least, they're well aware of just how big a mistake the design was. Of course, it's possible that it's just a reflection of far larger problems with how the company views and supports its Mac engineers.[0] Plus, the Mac Pro's manufacturing here in the US is problematic for the company. That little political ploy can't be easily reversed, especially with the anti-trade FUD that's so prevalent now.
If Apple were willing to acknowledge the problems, they could certainly design a restrictive licensing program that would allow third-parties to build Macs that target higher-end professionals without cannibalizing their consumer lines too much while avoiding some of the problems from the 90s. Strict design requirements for case aesthetics, limitations on software bundling, and other requirements could be used to limit third-party machines to high-end machines to avoid eating their consumer and laptop sales.
Not that I expect it to happen, or even be considered. But something has to change if Apple wants to keep their hold on creative professionals, who are now either being forced to switch, doing extensive upgrades on their old Mac Pros to bide as much time as possible, or building Hackintoshes.
Not so crazy; Apple licensed their OS before, and then ended up paying out the clone manufacturers' contracts years later, before they first released OSX. The model worked for Microsoft in the 80's (I bet Big Blue is still kicking itself for that one), but I don't feel confident it'd guarantee their survival now. For one, the operating system isn't nearly as separate a concept anymore -- it's all about the ecosystem.
It's one of the main things I see resulting from the Touch Bar: for me, it's about Apple ensuring there's a natural connection between the software and hardware, making any kind of separation impossible.
I don't mind using macOS; it's a pretty nice experience, definitely nicer than Windows. But when I buy an Apple machine, the thing I'm buying is definitely the hardware, and I'm not convinced clones would do a good enough job of matching that. Can I get 14 hours of battery life from the leading air-clone?
I am a Windows, Linux and macOS user (plus some PC-BSD for good measure). I wonder why you prefer macOS so much over Windows? I would agree that Apple has gotten 4K/retina displays right, and that is much, much better than on Windows. Is there anything else that you can identify as an advantage?
I did the transition from Windows to Mac last year after many years of having both and hating OSX while loving Windows. Now I love it.
The event that changed everything was getting a 34" ultrawide monitor; the Mac experience was just so much more solid and respectful of the screen space. I can't quite put my finger on it, but I felt more in control.
I have a Surface Book on my desk, but my Macbook Air is what I use daily. For me, Mac wins with:
- Predictable window management - everything stays where you put it
- Tidier fonts and app bevels so it doesn't waste screen space
- Access to apps like Sketch
- No UI delays when using Adobe Premiere, despite the less capable hardware - definitely an issue on Windows
- Much better command line support (not used the Windows Linux subsystem)
- Apps are easier to handle, and there's less bloatware
- Notes app that is actually helpful
- Fast with an SSD - absolutely rubbish without (my mac mini is unusable)
- I have gotten used to clicking the window first before doing something in that app - so fewer accidental actions
But, I do still love Windows, it wins on the following:
- Better graphical support (ok, hardware specific)
- Windows Explorer is much better than Finder for me
- Window snapping is easier (but might be also frustrating, can't put my finger on it)
- I can open multiple calculator instances
- Wider app support, but less of an issue these days for what I use
Windows does have a lot more bugs, such as incorrectly scaled cursors in Premiere, or apps that are impossibly small or stupidly large when using a non-retina second monitor.
I've used bc for decades. It's versatile, ubiquitous (Unix, Linux, MacOS), and easy to use. I find it so much better than some calculator I have to use with a mouse.
A developer friend I know used to always prefer the even older unix command line tool, dc. It operates with reverse Polish notation.
Now, I often just start the python repl (type 'python' at the command line) for quick calculations that might require slightly more power than bc.
To bring this back to the original topic, it is this easy access to the underlying unix tools that made me switch to Macs in the first place. Windows, IMHO, lost some points with the emphasis on touchscreen based UI and gained some points with the new Linux shell support.
I would describe the command line support in OSX to be sub-par to what is offered in Windows. The terminal itself feels like a toy compared to the Windows 10 command prompt. And posix doesn't seem to hold a candle to Powershell.
After having given PowerShell a chance, I don't want to go back to bash.
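The thing that won me over is piping objects instead of text. A trivial illustration (nothing fancy, just the built-in cmdlets):

    # Filter and sort processes on a typed property -- no awk/cut text surgery
    Get-Process |
        Where-Object { $_.WorkingSet64 -gt 100MB } |
        Sort-Object WorkingSet64 -Descending |
        Select-Object -First 5 Name, Id, WorkingSet64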
> I would describe the command line support in OSX to be sub-par to what is offered in Windows. The terminal itself feels like a toy compared to the Windows 10 command prompt.
Pretty much anyone I know on a Mac uses iTerm, which is definitely more capable than the Windows 10 command prompt.
> And posix doesn't seem to hold a candle to Powershell.
In what specific ways? It seems way more intuitive to me and powerful enough for any task that it's not worth bringing a language like Python/Ruby/Go etc. for.
That's really interesting, thanks for taking the time to give a detailed reply.
I love the iOS platform; it just works and has never failed me (I am leaving iTunes out of this). However, even with the scaling/DPI issues, I find Windows 10 fast and easy to use. I must be biased, as I've used Windows more, but I thought I'd love macOS more than I do, having loved my iPhone & iPad experience. I do always have to put the taskbar at the top of the screen, but that, I think, goes back to my Amiga Workbench days!
Not entirely. For text fields and buttons, clicking them will usually work even if their window is in the background. But clicking them will also focus the background window.
If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.
>If you don't want to focus the background window -- or you want to use an element that can't be used if the window is backgrounded -- then holding 'cmd' will pass the click through.
Damn, is there any equivalent functionality (be it integrated or third party) for windows? I'd love to be able to run games in fullscreen exclusive mode without having them lose focus every time I want to look at a different chrome tab on my secondary display.
My desktop computing happens on Windows because I need graphics card support (games at home) and because my work PC is Windows. I chose the MacBook Air for my travel machine because it's extremely light, its battery lasts more than a day, and macOS is unix-y. I initially planned to install Debian on it, but I couldn't convince myself that the power management would be as good, so I left macOS on it.
Window management is vastly superior in my opinion. I frequently just full-screen everything and swipe between the full screens; this is particularly notable in my case since I almost exclusively use the keyboard (I get bad RSI from mice).
Updates feel like they have much less friction.
But basically it comes down to the underlying unix-y system.
I'm never going to do serious development on a laptop, but if I pick up a macOS machine it has the unix core tools that I need to get stuff done: vim, ssh, etc. (There are some pain points, but brew mostly fixes them on machines I'll use for any length of time.)
Also, I helped someone set up Parallels the other day on their MBP and tried out AirDrop for the first time. Show me friction-less 4GB ISO copying over the air between Windows laptops that aren't on the same network!
I wouldn't mind buying something like a Xiaomi Air if it had native MacOS support. I'd be picky about display quality, though. I've never gotten more than 7h of usage out of my new 2016 MBP 15".
The design would have worked just fine if they'd supported it. They won't sell you an SSD upgrade, which is absurd. The design should have made CPU and GPU upgrades possible, even if they were only available from Apple, but those never materialized.
We live in very different bubbles if you see common users as the most sticky Mac users. Almost every common user I know, with the exception of a few students, considers Macs ridiculously overpriced and buys <£500 laptops. Apple are moving further away from those users by increasing prices, and I think they'd rather they buy an iPad Pro anyway (though I've never heard anyone non-technical say they want one).
While it doesn't tell the whole story, just looking at things like sales numbers and user agent stats gives you a pretty good idea of the relative popularity in different areas. Apple just has very little presence in some of these places, _especially_ for desktop.
This has nothing to do with the focus on laptops and everything to do with the simple fact that the new Nvidia cards are not supported, and there doesn't seem to be any indication from either Nvidia or Apple that they are committed to providing that support.
It's a strange problem, because there is plenty of demand for it, and I think we all appreciate the outsized role that GPUs play these days; not offering support for 50-80% of the GPU market seems like a rather poor strategic decision, particularly since you're really only talking about a team of 30 or 50 within Apple to help Nvidia -- the drawbacks are minimal.
This has been an ongoing problem since the summer. Some have reverted to using 9xx cards (which have spiked in price) while others have switched platforms. Absent any real progress, I suspect many in this situation will abandon OSX permanently by the end of the year. And if you give up OSX on your desktop, the incentive to stay in that environment on your laptop, tablet, and phone goes way down.
This is a serious problem and the only outcomes are either a) Nvidia GPUs are supported, or b) OSX is abandoned, because the simple fact is that Nvidia GPUs are more important long-term than the entire sum of Apple's hardware; I can replace a tablet or desktop or laptop, but I can't replace a Pascal TITAN X.
> While Apple is focused on creating the thinnest notebook with every generation, other companies are actually making useful computers, laptops or otherwise.
I've purchased the second-generation (new) MacBook this year and, I must say, I'm delighted. I honestly don't get what all the fuss is about. It just works.
Why would I change something that works really well with something that might work well?
My new MacBook randomly crashes, and the touch bar is a real nuisance. The wretched USB-C-only ports are a real pain for me. It's slower than my former MacBook Pro (which is strange), and the battery life is worse too. The most ridiculous thing, though, is that I need a bunch of adapters to charge and sync my iPhone 7 using my machine. If only there were a positive side to it I would probably be less unhappy, but I cannot think of a single one. I do. not. understand what Apple was thinking when releasing it. I'm not the only one at the office that has these issues either. And no: we don't have any weird stuff on it.
It leaves me super conflicted: MacBooks used to be the safe choice for a high-quality laptop. Now I don't know what my next device will be, but it certainly won't be an Apple product.
Maybe you should have done a little research before buying it? I replaced my 2012 retina MBP with the maxed-out 2016 one. It's a blast; the touch bar, while not that useful, just works, and IS more useful than the old function keys. And it's fast -- I don't see how it could be faster, actually.
It is very painful for me to develop on Windows. The development tools seem very disjoint and not integrated with the OS. Some projects are cygwin, some are Visual Studio. A lot of the Visual Studio options are tucked away in different configuration dialogs. Command prompt is bordering on useless. On *nix I can do everything I need to outside of an IDE, if I run into a problem, there is documentation for just about every program. OSX is pretty nice with brew. At the end of the day though, Linux just feels like it was made for developers.
Those are fairly new and most projects do not use them. If a project was started before Bash on Windows was released, it will most likely be built with Visual Studio.
I don't think that's what he meant. As someone who has done some development on Windows, doing anything was a pain; even getting .NET packages wasn't as easy and intuitive as a Linux package manager, pretty much every JS framework is easier to get, update, and configure on Linux, etc. Even working with PATH on Windows seems harder than on Linux.
At least in the PS4 case, it's based on FreeBSD, but that doesn't matter because you do not develop directly on it, but use a dev kit/SDK.
> WiiU OS & XBox 360 and ONE OS
You can't develop directly on these. You use Windows-based dev kits with custom SDKs mostly.
I am not even going to continue, since you clearly didn't want to understand OP's or my arguments, which was that Linux/UNIX is usually the best dev platform. Even if you target embedded, you actually develop on UNIX. There are exceptions for highly proprietary platforms of course, but that's not what most people on HN tend to develop for, (i.e. it's mostly web dev, mobile dev and such).
I kind of agree with you but at the same time I've yet to see a system that sucks less than UNIX style systems.
Unless we're talking game dev, real-time 3D graphics and such - then Windows is IMO unparalleled - I have not seen comparable tooling/driver support on other platforms - although I have not tried using Metal.
>but nothing that cannot be done better in language repls.
You're thinking on different abstraction layers - sure, given a library for runtime X, a REPL will be better than a CLI. But when you're talking about processes, having functionality exposed via a CLI vs the Windows way of monolithic apps with GUI-only functionality is a huge difference for automation/testing/workarounds/re-usability/etc. - and you don't really care what the underlying platform/runtime is.
Uh, most of Windows is exposed via PowerShell and can be run completely from the CLI. .NET, the primary development stack, is completely scriptable from the CLI. It's quite powerful. I'm not saying I love the PowerShell language; I don't. But I don't love Bash either.
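For instance, you can call arbitrary .NET types straight from the prompt (trivial examples, but the point stands):

    # Poke at framework APIs interactively, with no compile step
    [System.Math]::Round([System.Math]::PI, 4)   # 3.1416
    [System.IO.Path]::GetTempPath()              # e.g. C:\Users\me\AppData\Local\Temp\
    [System.Guid]::NewGuid().ToString()          # fresh GUID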
Again, you're talking about completely different abstraction layers - unless you're saying that you can use a scripting language with something like the Windows automation API to click GUIs and such.
My point is that Windows has a culture of black-box monolithic apps with GUI-focused (if not GUI-exclusive) ways of exposing functionality. Nothing is stopping you from exposing stuff as cmdlets for PS and composing them to build higher-level systems - but people don't really do this, and they do in UNIX systems, so things end up being more transparent.
Nothing to do with any shell or anything really - more about UNIX philosophy of small focused apps composed together vs monolithic apps.
When you're developing stuff, the UNIX approach is IMO way preferable.
Developing on Windows is only OK if you're a .NET, Java, or node.js developer, which arguably encompasses a lot of developers. However, it's a pain if you're using anything else.
Sure, but according to many on HN, being a .NET, Java, JavaScript, C++, Swift, Objective-C, Android, PL/SQL, Transact-SQL, PG/SQL, PS3, PS4, WiiU, XBox 360, XBox ONE, Cobol, RPG,... developer isn't being a developer, as it implies for whatever magical reason having to use POSIX and a UNIX CLI.
My point is that there's no need to list 'PS3, PS4, WiiU, XBox 360, XBox ONE' separately when it's covered by 'C/C++'. Just being pedantic.
I also still don't see HN's bias against using Windows as a dev environment, especially when a lot of people on HN create stuff on .Net. Sure a lot of people like nix and Apple here, but it doesn't mean they look down on people who develop on Windows just because they don't want to do it themselves.
Because I'm assuming MS invested a lot in making both Node and npm first-class citizens on Windows. I was pleasantly surprised as well when I started working with it last year on Win 7; everything worked out of the box, without any tweaks such as the use of Cygwin.
I was free in choosing my tech stack but I couldn't choose my OS, even Windows 10... which means I don't have access to Docker. I couldn't use a VM either because the app needed to run on Windows.
If you can get away with using a Chromebook for a laptop, you might not really be the target user for the Macbook Pros.
FWIW, I have found the new MacBook Pros fairly incredible. They are thinner and lighter, which is quite valuable for people who use notebooks on the go, while still being more powerful than their predecessors.
If I had one complaint, it's that they didn't keep around the old 15-inch chassis for a model with more RAM and a bigger battery. Instead, they just have one model aimed at a very specific niche.
But overall, they are very good notebooks. You can power multiple 5K displays from them, for God's sake.
Over the past 2 months I tried hard to "make the switch". I built a nice shiny new PC, installed Windows and Arch Linux, and told myself I can do this. Two months later, I removed Windows from the box (Ubuntu on Windows is decent, but still doesn't run a default Rails app) and am running Arch exclusively. While I like the setup, I realized that I will ALWAYS be at least 20% less productive on a Windows/Linux box.
The keyboard shortcuts, the UI inconsistencies, and on Linux the lack of native apps (specifically Evernote) just didn't cut it for me. As trendy as it is to hate on Apple right now, MacOS is far and away the best operating system for developers and power users. It's not even close.
Having a single main modifier key on apple (⌘) is SOO much faster and more productive than switching between (alt) and (ctrl) modifiers for switching windows, copy/pasting, new window/tab, etc. It just kills my workflow.
I love my PC build, and Arch Linux is pretty good. But I'm currently scouring craigslist for used iMac 5Ks. The productivity loss just isn't worth it.
Except that new apple applications have to come from Apple users. If the devs aren't happy with what's available, that means fewer applications. Fewer applications means fewer users, fewer users means a less attractive base for devs... the spiral continues ever-downward.
Why would developers switch from mac to windows? If they're alienated by what Apple is doing, they're unlikely to be happy with Microsoft. Linux, sure.
Of course, I think this is exactly the sort of misprediction people on HN tend to make, where it turns out that (surprise) Apple actually knows more about market demands than random internet commentators. Remember how we were predicting that the new Touch Bar MacBook Pro would fail hard, but it ended up being the best selling MBP ever?
Oh come on, there are plenty of reasons; it's a case of whether they affect you enough personally. Telemetry, a closed-source kernel/base system, Microsoft's history of nasty practices in the computing industry, embrace-and-extend, having to pay money vs. free as in beer, and, outside of the VS ecosystem, a generally poor development experience.
Well, the new Microsoft is actually one of the biggest open source contributors.
I'm a UNIX developer (mainly Go), and I use a Surface Pro 4 with Win 10 for development. Bash on Ubuntu on Windows already runs great, and I have the ability to sketch diagrams whenever I want to. Additionally, I have Docker running flawlessly and a system that just works 99.9% of the time (as opposed to Linux, where I have to hack my way through most things).
What software do you use to sketch diagrams? Pre-Surface I was switching between Visio and yEd. Now that I have a Surface I am using OneNote but have a feeling I could be using something better.
For me that picture indicates all manner of problems. The main window does not follow the Windows user guidelines, which is bad enough in Windows 10 already. For decades, double-clicking on the top left corner of a window has meant that you want to close it - it was explicitly mentioned in the 3.11 manual. With 7 (or was it Vista?) they removed the icon from Explorer for no obvious reason, but kept the functionality there - you could still double-click the invisible icon and it'd do what you expect.
With early Windows 10, the Settings app didn't even have a title bar, so there was no clear demarcation between the window grey and the titlebar grey - how would I know that the grey bar at the top was the way I moved the window, but the grey bit just beneath it DIDN'T move the window??
With current Windows 10, the Settings app finally has a title bar coloured like in all other applications. But the paradigm we learned 20+ years ago (double-click top left to close the window) and the ability to get the window menu HAVE GONE. I can still press Alt-space and get the window menu, but the ability to get the window menu from a Metro app or a Metro-style app using the mouse has completely gone.
So now I have to learn another method of interacting with the basic building block of the system - windows. When each window behaves differently according to the "metro or not" measuring stick, this does not make for a great user experience. And this is me coming from using Windows since the 90s. How do you think new users and my mum are going to get on??
What's the point of having window guidelines if Microsoft themselves don't even follow them??
So although the screenshot is aesthetically pleasing, I remain very concerned about the direction usability is taking on the platform.
I don't know, when Apple decides to change things to go against the established norms they call it being courageous.
Somehow, I'm not bothered about ignoring paradigms learned so many years ago. For example, my 12-year-old daughter could care less how you learned window management 24 years ago. She's learned to use Windows as it is today.
Just picking up an odd expression you use - are you American by chance? I notice this expression in films and interviews and online and it's wrong: "I could care less".
This expression implies that you currently have a high level of care about a subject, and that you have a range or distance to travel until you reach rock-bottom/zero regarding your level of care for the item.
In fact, if you were to care 80% or 100% about an item, the expression would still be true: eg. "I could care less about my health" which would mean I care GREATLY about my health and therefore have range to drop my level of care from its current position.
The expression that everyone should use is "I couldn't care less", or "I could NOT care less".
This implies that you are at 0% of your care about an item, thereby having an inability to move below your current level of care. It is already at rock bottom. It cannot get any less. It cannot go any lower.
Consequently, I am assuming that you meant that your daughter COULD NOT care less about how I learned window management, i.e., she wouldn't care in the slightest.
I had to ask because I am hearing this expression more and more and it's incredibly irritating because it is wrong.
I never hear any native speakers here in the UK saying it, but judging how manners of speech drift over here I imagine it is only a matter of time before I hear people saying it, much the same way I now hear the word "like" sprinkled into sentences needlessly. When I hear this expression over here I think I'll suffer an apoplexy and drop down dead.
EDIT: By the way, I don't think Apple's changes have been courageous. I find that they stuck their head in the sand for too long and finally did sensible things. For example, on Snow Leopard and prior, you COULD NOT resize the window from any corner but the bottom right. This was officially STUPID. Thankfully they realised how stupid this was and enabled the ability to resize a window from any edge. Only took them 30 years.
Also in recent versions they have been making a big fuss about the ability to go full screen with applications. Windows have always been resizable, so the ability to get a window fullscreen and not see the menubar is not really a big feature. Nor is the ability to run two fullscreen apps side by side - surely we have been able to put windows side by side since the 1980s? Just because the menubar has gone does not make it a feature. This is not courageous, it is dumb.
They have also made regressions. For example, now when you want to use Mission Control you must move to the top of the screen to see the desktops. Hang on - just tried it again and now it works again. Hurray!
Their new filesystem is still significantly behind NTFS on features. Sad to believe we're all still hobbling along on HFS+, byte-swapping metadata between big-endian and little-endian on every read/write.
I am not saying that Apple does software development any better, but the feel across the OS is generally consistent. This is not the case on Windows 10. E.g., how do you find out which build version you are on? Press Windows key + Break to see System Properties... oh, it's not in there. I suppose I need to run the Settings app > System > About to display a duplicate window. How do I join a domain? I could use the Settings app, and it'll mention joining a school. I could also use the duplicate domain settings from Advanced System Properties and join there, but there's no mention of the word "school". Why do the two sets of windows look different? Why are there even two sets of windows? What the heck is going on?
So now I have Control Panel, the Settings app, Microsoft Management Console and its snap-ins. They all do the same thing. Are the teams not talking to each other at Microsoft? Is nobody at the helm?
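(Aside: the quickest way I've found to dodge both of those UIs for the build-version question is to ask programmatically. A minimal sketch using Python's standard library; purely illustrative, and the strings printed will vary by machine:)

```python
# Read the Windows version/build without touching Settings or System Properties.
# Uses only Python's stdlib `platform` module.
import platform

print(platform.system())    # e.g. "Windows"
print(platform.release())   # e.g. "10"
print(platform.version())   # e.g. "10.0.14393" - includes the build number
```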
For extra fun, look at shell32.dll and see how many icon styles are inside that thing. That's not even an architectural change - that's just pictures in a single DLL, owned and controlled 100% exclusively by Microsoft. All of the icons are from different eras. If they can't get even little pictures to be consistent and uniform, I do not hold much hope of uniform system behaviour.
So you can see the cobbled-together nature that Windows 10 exudes, despite a fantastic opportunity to make the entire system coherent again. And I say this as a C++ developer on Windows as my day job.
It's not about whether Windows is aesthetically pleasing. It's that it's 2016 and some first-party Windows apps still upscale fonts rendered at half resolution. It's that there are two control panels. That kind of messiness is endemic to the platform.
I just ran Win10 for the first time yesterday to work on a security guideline, and I was blown away by how janky the whole thing was. Using it is like crawling through 20 years of computer history; there are things from every phase of Windows since '95 in that UI, and I don't mean UI hints, but entire applications.
I bet somewhere in there there's still UI that dates back to Win3.1.
Well, I think it is quite possible that the devs on the first-party team are not the same devs on the platform team. First-party or not, if the platform provides the tools and the devs refuse to use them, it is on the devs. Not the platform, the devs. Unless you can show me that the company demands its devs do so for no good reason.
>> I bet somewhere in there there's still UI that dates back to Win3.1.
I'm sure there is, that's a mix of backwards compatibility and don't mess with stuff that works. Some people want it, some people don't; can't make everybody happy. Can I complain that OS X provides a terminal that looks like it dates back to the 70's?
> Especially since Terminal.app is far more modern than CMD.EXE?
To be fair, a lot of Terminal.app dates back to ancient Mac OS too. The menu bars, for example, are all still Carbon. There is still a full classic Mac OS main loop running alongside the Cocoa main loop in every macOS application. Apple's done a good job hiding it, but it's still there.
Anyway, cmd.exe is a bit of a special case. What you're really seeing is conhost.exe, which is kept the way it is because it's part of CSRSS, which occupies a similar place as PID 1 does, and Microsoft doesn't want to bring new UI into it for fear of increasing its dependencies [1].
The 3.1 UI that still exists is silly stuff like file open dialogs that aren't "desktop aware" or whatever. There's no reason at all to retain such "compat", because it just requires users to learn where all those magic folders live in the filesystem.
I'm with you there. Windows 10 looks like developers designed it. That's not meant as a knock on developers, but sometimes we just get lazy and use solid colors.
Windows is beautiful compared to the Mac OS where the software all looks like it was inspired by a 1970's era stereo unit.
The ugly UI isn't even the worst part about the Mac OS though. The worst part is that it just doesn't even come close to offering the same sort of freedom that you get on Windows where Microsoft leaves hooks in to let developers actually do what they want.
Most of the problems with the Mac OS are by design too. I think it's hilarious that Apple folks think it's a really good idea to hide the label on most buttons. I guess you have to "just know what it is" before clicking it, or hover over it and hope for a tooltip to pop up and tell you what the thing will do. Real efficient.
There's really no wonder in my mind why most people don't use Apple anything.
I have to agree despite your downvotes. From 8.1 onwards, mobile and desktop, Windows has been the most beautiful OS. Using my iPhone 7 feels like stepping back in time in terms of the visuals.
I dual boot Ubuntu and Windows on my desktop, and I only switch to Windows when I need to use an application that is Windows-only. I find Ubuntu to be a more pleasant and user-friendly desktop experience. Switching back to Windows honestly feels like stepping into the past.
I dunno... part of why I use Linux or MacOS is the knowledge that under the hood they're well designed.
With Windows, no matter how good it looks, I feel like it's an ugly system, not to mention insecure.
How so? It's definitely an unfamiliar system, for me, and there might or might not be vulnerabilities in the current implementation, but I have no reason to believe that the Windows NT model is intrinsically less secure than the Unix model.
The security infrastructure behind the Windows desktop is a decade ahead of the Linux desktop. Yes, kernels are okay in both cases, but e.g. Xorg is a security disaster.
I just want to randomly poke into the discussion to say that out of the three big OSes, the MacOS user interface is the most vomit-inducing. I know this feeling isn't shared by most.
Now that Windows has nice workspaces and an already usable UNIX environment (the Windows Subsystem still has things to iron out, but it's very nice for a beta), my personal ranking became:
Linux > Windows > MacOS
I see nothing that MacOS offers that Windows doesn't, whereas Windows has potential for much better hardware specs, games, bigger user base.
The only thing left for Apple is their 16:10 screen and their per-monitor scaling.
Same ranking for me here. At home I use Linux and would never go back to the alternatives, but at work I had to use Mac and Windows. If the Mac did not have a good terminal (iTerm2 is what I used) then it would be no competition at all.
I do miss a good terminal on Windows, but I assume the Linux subsystem will fill that hole.
It's not the best terminal, but give Hyper a shot. It seems to work fairly well, plus it's cross-platform so you can take all your configs with you. The downside is that it's an Electron app.
Windows already has an app store (although it's pretty terrible). It's had non-networked package management for decades (through .msi files that end up as line items in Control Panel). Nowadays there seems to be a move to NuGet and Chocolatey.
It's not awkward at all. I use cmder with the Linux subsystem running beneath it.
For example:
- apt-get install php
- php -S localhost:8081
- Launch Chrome on Windows, go to localhost:8081 (it works)
There is zero awkwardness or extra configuration there. The only problem is that some things do not work properly yet (because it's still beta). For example, `go get github.com/mattn/go-sqlite3` will not work because of some (gcc?) compilation issue. But it's getting fixed in the next release.
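To make the "it works" step checkable without opening Chrome, here's a minimal sketch run from the Windows side, assuming a server is already listening on localhost:8081 inside the Linux subsystem (e.g. started with `php -S localhost:8081` as above); it uses only Python's standard library:

```python
# Sanity-check that a server started inside WSL is reachable from Windows.
# Assumes something is already listening on localhost:8081.
import urllib.error
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:8081/", timeout=5) as resp:
        print("reachable, HTTP", resp.status)
except urllib.error.HTTPError as e:
    # Even a 404 proves the WSL-hosted server answered the request.
    print("reachable, HTTP", e.code)
except OSError as e:
    print("not reachable:", e)
```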
Any statistics to back that up? Just because some random Hacker News commenters dislike the new MacBooks doesn't mean other developers do.
I find it hard to see any realistic alternatives to MacBooks in software development. They give you all the tools you need from the Windows world that are not available on Linux, like Office, Photoshop, etc., and it is still a UNIX system which can do all the stuff that Linux can without kernel panics, crashes, bugs and other crap. Also, unlike Windows or Linux, a single macOS installation can last pretty much forever and doesn't need to be reinstalled every year because of lost performance, like Windows. And last but not least, MacBooks are well optimised for the hardware they run on, which gives them 1.5-2.5 times the performance of the same hardware on Windows or Linux.
The whole complaining thing about the new MacBooks is the same as with iPhones. People complain because it's trendy to whine about anything Apple does; even if Apple created eternal world peace, people would still whine, and everybody would still buy the products because there is nothing better or even similar on the market.
>Also, unlike Windows or Linux, a single macOS installation can last pretty much forever and doesn't need to be reinstalled every year because of lost performance, like Windows.
You need to update your arguments. It's 2017, not 1998. I haven't reinstalled Windows on my desktop system in years and it's running perfectly fine.
As a developer on Windows, I'd have to disagree with most of what you said.
Windows hasn't been less stable than a Mac for years, if you buy equivalent hardware (instead of scraping together a system with junkyard parts or getting a cheap Walmart laptop).
Windows does not need to be reinstalled every year if you don't treat it stupidly. My current work Windows install is 4 years old, has seen two machines (disk swap) and two major Windows versions (in-place upgrade), and it runs as well as when I first used it.
OS X is only faster than Windows on Mac hardware because Apple deliberately ships bad Windows drivers with Boot Camp. On equivalent hardware from other vendors it is just as fast.
It sounds to me like you don't actually know Windows all that well. I use a Mac mini at home and a Windows laptop at work, and I'd say that they're pretty much equivalent for my purposes.
You mean direct, market-research-supplied numbers? No. I am just one of thousands of people who make a living in the tech world. What I can say is that
1) A very large portion of other devs I know use(d) MacBooks, and
2) A very large portion of them said "F this, I'm out" when Apple released their most recent machine. Many switched to Surfaces.
Your hyperbole against Linux and Windows does your point no credit. Linux has been rock-stable for a very, very long time, and Windows 10 is a pleasure to use, particularly on touch-enabled devices. MS went through severe growing pains with Win8, but those days are past. And the OS/hardware combinations that have emerged since make Apple's offerings feel old and out of touch.
I agree with you except for "Linux has been rock-stable for a very, very long time". This depends on what you mean by "rock-stable", but things begin to get really annoying really quickly once you move on from relatively simple tasks. Even basic stuff like getting good battery life on laptops, waking up from sleep/suspend/hibernate, or using more than one monitor can be flaky or unpredictable. If they've worked fine for you then congrats, but when they don't it can be infuriating.
This is not to disparage the kernel developers, package maintainers or the distributions themselves - there are a lot of people doing excellent work without being compensated at all. However, let's not kid ourselves - Linux can still be tough.
I run Ubuntu 14.04 as my daily driver on a custom-built PC. I agree with you that Linux is not necessarily rock solid, but I would instead suggest that the compelling difference these days is that it can be if you put in the work. In years past, that wasn't even the case.
Unless you buy something from System76 or do very meticulous research on hardware support and build your own, Linux is not going to be seamless out of the box. But the real highlight is that it can be seamless at all these days.
My PC has four GPUs, 128GB of RAM and a deca-core CPU. I have four monitors, a bluetooth mouse, and I almost never turn it off. Instead, it automatically suspends. When I'm away from home I can ssh into my computer if needed. I have daily backups that are essentially seamless.
Now, it was difficult to get to this place. I would say that Linux is, and has for a long time been, rock solid if you're not using a GUI. In my opinion, the single greatest impediment to "Linux on the desktop" is xorg, which is terrible (to put it mildly). Nouveau is absolutely crap, especially since Nvidia puts out its own drivers. But my point here is that after a few days of careful installation and setup work, I have had a seamless desktop experience for something like eight months.
What needs to happen for Linux is for a software company to take something like Debian, fork it, and dramatically bring it up to wider compatibility with modern hardware across the market. In some sense that's Ubuntu, and it shows in how easy my daily experience has been.
Well, let me rephrase: Linux is as stable as you want it to be. If you want an ideal Linux laptop experience, you need to do the research and get a model made with Linux in mind. Otherwise, you're just rolling the dice.
That being said, I only use Linux for server-side work, and I can easily say that it is every bit as stable as any other server OS I've used, and has been for a very long time. There are plenty of weedy areas to play in if you want to, of course. But if your goal is It Just Works, you can do that, too. Perhaps most importantly, you can do it easily, with just a little foresight during the spec stage of planning your project.
They switched to Surfaces because they saw what I was doing with mine, wanted to be able to do the same, and Apple had no comparable answer. Simple as that.
But that is just the people around you. Anyone amazed by the Surface (the tablet and touchscreen side of things) switched, but that is OK; it is not so much Apple's loss as it is MS's gain. I do not want a touchscreen, I don't need it. I need a classic laptop, and many Apple MacBook users want just that: a laptop, no touchscreen.
For what it is worth, I used to feel exactly the same way until I got a machine that had one. Now I miss it when touch is unavailable. Being able to use touch, mouse/trackpad, & keyboard all at the same time is, perhaps surprisingly, a very pleasant way to work.
To pick one example: being able to flick through long documents with a finger while keeping focus on other parts of the screen alone has been a huge productivity booster, and I do not say that lightly.
It turns out that it is less effort to flick/move directly on the item you want to adjust than to do it via proxy on the touchpad.
Like I said before, I was firmly in the "touchscreens are a waste of time for me" camp until I actually started using my Surface. Not trying to sound like an ad, it just has been my actual experience.
The touchpad gets that done for me. The only way I can see what you're describing being more "comfortable" is if you hold the device in your hand, or flat on a table, while reading.
Most of my "work" time I spend in terminal. White text on black background, keeping my fingers on home row. I cannot see scenario where touchscreen makes me more productive except in some edge cases, and btw they don't gain over what I already have now. But that's just me. Each to their own, that is only fair thing to say.
But making assumptions and claims based on your own point of view is not the best thing to do. With Apple especially, it is always the same story: customers leaving and rarely satisfied (there was a period like that around Snow Leopard, IIRC 2009-2010), but I still can't find anything better than a MacBook. Dell and Lenovo came close, but when I install Linux on them I make a few more sacrifices; summing everything up, Apple just works for me. (I won't even mention Windows. 100% not my cup of tea.)
Developers on HN still count. You know that Apple is at least losing some developers. Then there are people like me who are skipping this upgrade, waiting to see Apple's plans more clearly.
So the only question is whether they are winning over some who were on other systems. Objectively, the MBP has more or less stagnated from a performance point of view, it has regressed in battery life, and the new features it gained are at best neutral for developers. Developers are typically the slice of the population that will be affected by reviews, and the reviews are not favourable to this new MBP. Last but not least, the competition has not stagnated either. I don't believe there is a better laptop on the market for my usage pattern, but it took me much longer than before to come to that conclusion.
It is reasonable to think that the new MBP is less successful than the previous one with developers. It could still grow in raw numbers even with the developers, but it is reasonable to think its growth has been dampened a bit.
I'm not strongly disagreeing with you, but you are not making a better point than the parent. I do believe that if Apple has a spec-bump release of the MBP ready by the end of 2017 and is back on the regular 1-year cycle, the whinging will die down and you will win the argument.
Most servers run linux, which means many developers are bound to access them remotely, which makes the desktop OS a bit irrelevant. And many people may work faster with windows desktop (win7 for me please, none of that tablet crap).
Suggesting linux as a viable desktop is not equivalent to predicting that $CURRENT_YEAR is the year of the linux desktop. You being dismissive of the idea on this ground is illogical.
Why would a developer be on a Mac in the first place? Because you're a fanboy? Because you're against monopolies?
If I need to build code that Apple says can only be built on a Mac, I simply spin up my Mac VM and point some tools at it. I don't need to drink any Kool-Aid to get the job done.
MDI + keyboard skills beat vim ninjas any day of the week.
Apple has never, so far as I am aware, deliberately targeted developers as a primary audience of a laptop computer.
Apple's laptops became popular because they ran an operating system that was Unix under the hood, with a nice UI and good set of non-developer applications, which made them good as machines a developer could use for work and for non-work (Linux on the desktop being not so great at the second part of that). Apple has certainly been happy to make money from developers buying laptops as a result of this, but has consistently ignored the expressed wishes and desires of developers when updating its laptop lines, meaning Apple was popular in spite of Apple's indifference, rather than because Apple lovingly catered to this audience.
In other words: if you were looking for an executive to run around the stage screaming "DEVELOPERS! DEVELOPERS! DEVELOPERS!", I think that was probably some other company, and they got ridiculed for it...
"Science" is not "developers". Apple wanted to sell them on a full stack all the way up to clusters of expensive Mac hardware for parallel computing. And the article you link even admits that companies which have targeted scientific users with hardware have mostly failed -- that market is more likely to buy either extremely specialized hardware or, more likely, commodity machines that they slap Linux on.
> Right now, I've decided to take the money I'd spend on the cheapest Macbook to buy a desktop system, plus a chromebook. I can have mobility and a lot of performance, for a fraction of the price.
This is EXACTLY what I did. I built my own PC (after more than a decade of not doing it) and bought an Acer R11. Couldn't be happier!
I did kinda the same thing, except I kept my good ol' MacBook Air instead of a Chromebook. Ubuntu on the desktop, which I can run stuff on remotely, and I got a lot more performance than if I had spent the money on a new MacBook.
Most developers use Macs to write Java, Ruby, PHP etc.; these are the developers who would leave first, and that would already cause a lot of damage to the ecosystem.
In college I bought an iMac to learn to edit video in FCP7. It was nice, not really fast but functional; a lot of my professors had been Apple fanboys since the Jobs era. Then came FCPX. Even if it's good right now, the first version released was basically a stab in the back to those video pros who had been faithful to Apple for years, the ones who kept buying during the company's hardest times.
I saw a lot of video professionals jumping to the Adobe Ecosystem or to things like DaVinci for the GPU rendering.
In my case I ended up selling my Mac, and with the money built a good PC for the price. I feel stupid for not building one from the beginning. Goodbye external disks and weird peripherals; hello SSD, RAID storage, expandable RAM, and a GPU with CUDA that I can upgrade in two years without trashing the rest of the system.
This is not a debate of Mac vs PC (that can go on forever), or PC Master Race propaganda, but the Mac Pro is a bad-taste joke. I get the idea of the "Apple tax", but it's ridiculous here. The CPUs, SSD, and RAM are old, but especially the fact that the GPU is soldered to the motherboard is just stupid. GPUs are one of those rare components where you can still see performance double with each new iteration.
I love my Macbook Pro because reasons, but in desktops, the PC is king.
Great summary. The Macbook Pro is still, arguably, the highest quality laptop on the market, but when it comes to the desktop, there are absolutely zero reasons to consider Apple. I know that there are those who swear by OSX, but I find that on a functional level, the UX is essentially interchangeable with Windows 10, and in a few cases Windows is even preferable.
I'm aware of the Apple ecosystem synergy, and I don't find it surprising that an individual who uses all Apple products experiences a kind of lock-in dysphoria that makes any other alternative appear unattractive, but at the end of the day, when hardware performance is a critical factor in the type of work being done, I just can't justify the insane rigidity and mark-up of Apple's desktop products.
In a dream world they'd make all iOS devices USB-C, release new Mac Pros, all types of laptops, and all of them able to use the same Thunderbolt 3 eGPUs, RAIDs, etc, albeit at different levels of performance.
When you look at the public reaction to the iPhone dropping the headphone jack (which I agreed with) and the revenue from wireless headphones, we know nothing compared to the analysts locked in a vault at Apple who knew more about making money for their company.
And yet, I still think of the not-quite dichotomy of the two Steves. Making out-of-reach tech available to everyone, and making tech available to cool people, where cool people turned out to be a large superset of 'everyone'.
I was under the impression that most people were very aware that removing the headphone jack would become a revenue stream for Apple? That's what all the fuss was about - a lot of people didn't want to buy adapters or new wireless headphones.
I really value the existing versatility of my Macbook Pro, and lament that it falls short when it comes to 3D rendering in C4D/AE.
However, I’m aware that my case is abnormal. The only thing preventing me from owning a high-GPU Windows machine for certain kinds of work is that I haven’t yet figured out how to use it for rendering remotely, since I move around a lot and I can’t bring it with me due to the bulk. So far I’m just sticking to 2D.
If I were a stakeholder, I’d be against Apple trying to dominate the professional market. People increasingly have multiple computers for different tasks so there’s more win in focusing on whichever market has thicker margins.
I have an iPhone because it works nicely with my Mac.
You might be thinking "well, all phones will work well with your Mac", but before I started with iPhones I had Nokias for years, and they didn't.
I've had generation after generation of iPhone because of my Macs.
If Apple gives up, or continues to put out poor-value machines (they don't have to be cheap, they just have to be good value to my business), then I'll move everything to other vendors.
If that happens I'll probably never return to Apple's ecosystem regardless of which segments of the market are most profitable to Apple.
For me the "thicker margin" argument only holds if Apple were cash constrained or could get a better return elsewhere.
Apple has a massive massive pile of cash sitting around (over $200 billion) and I'd be surprised if standard investments make a higher return than actually investing in their Desktop line.
That's what confuses me about Apple's approach. Just because you make less margin in a line doesn't mean you shouldn't do it...
I think that Apple forgets that video pros are also consumers. And so are their families. Apple products usually thrive in families where at least one member is a professional who uses Macs for work. It just fits the whole idea of an ecosystem, so these people and their family members have iPhones, iPads, an AppleTV and maybe an Apple Watch.
Once these people start working on Windows machines, the benefit of an iPhone is reduced. It's still a great phone, but they might go for an alternative. The same applies to iPads.
Apple should wake up quickly. The extraordinary sales numbers of the iPhone seem to have made the company live in a bubble where only the phones matter. But who knows how long smartphones are going to stay relevant, and whether a new piece of technology is going to replace them...
> It was Apple's most profitable quarter ever, raking in $78.4 billion. It also sold 78 million iPhones in the three months leading up to New Year's Eve — more than any previous quarter.
> Net income was almost $18 billion. In other words, a license to print money.
Designers, developers and gamers have always been the opinion leaders in personal computing. Apple showed they could maintain and resurrect their business by keeping their appeal with designers and making an attractive platform for developers. Gamers always stuck with Windows or consoles.
Abandoning designers and developers means gradual erosion of your position as leading personal computing platform; the opinion leaders are no longer on your side telling everyone how great your products are.
Now this might be a really smart strategy when you factor in mobile and cloud computing's impact on personal computing. Hard to say how that will play out. My guess is it's a big mistake to leave designers and developers behind, especially as we enter the era of VR content creation.
I think you're significantly overestimating the degree to which people will follow recommendations that are explained by complaining about features they didn't even know existed, even if they are much less informed than the one giving the advice.
When friends ask me "What laptop should I get?" I don't tell them about the differences between various versions of Visual Studio or why I prefer Windows 7 or 10 over 8.
They don't. They look around for somebody with an informed opinion, ask "Hey, I'm switching my computer, what do you think I should buy?", and go with the answer.
And, of course, they'll trust the answer much more if it's the same thing the expert is using.
This is more of a long-term issue; of course you don't feel it right away when some trend-setters leave your product behind for something else. But these people are trend-setters...
I'm personally still 2 versions behind on OS X because of Aperture. Now I hear Photos can do many of the things Aperture can, except that you can't organize your library. Wut? Yeah, a real pro would switch to Lightroom, right? And Lightroom is also available on Windows. As is Bash. As are multiple desktops. And finally, Windows machines are getting quality trackpads, and it is easy to buy upgradable Windows PCs. Still, I wish the Adobe suite were available on Linux though...
This was my last Mac, especially now that Dell and HP seem to finally understand that people are willing to pay for quality hardware.
This isn't a perfect analogy but if my mechanic said he only drove brand Y because it was reliable and then I heard anecdotally from Uber drivers they drove brand Y for similar reasons it would be an input for my next car purchase.
If I may so humbly suggest it: people on Hacker News and the like are (please excuse the corporate speak) thought leaders and trend setters for similar reasons. Developers are well ahead of the public on what's actually useful as their jobs require much more out of a system than a casual user. It's obvious there is some amount of the HN community that are unhappy with Apple's offerings and are switching to other vendors.
Sure Apple is printing money now: the general public are ready buyers of Apple product. No doubt Apple's incredible marketing is largely responsible, but there's an undercurrent of opinion based on what people like HN members use and some small percent is shifting away from Apple. I don't think anyone can really predict the long term trend here: we'll only really know in retrospect, but it is possible if that trend grows then public sentiment will eventually follow suit.
Exactly. And it seems every king of the mountain makes this mistake. Meanwhile, Microsoft, who many of us shunned for years because of them doing exactly this, is turning around and starting to show promise.
I do predict a visible shift of developers from Apple to MS - not to do Windows development, but to do general development on a Windows device... and possibly in a Linux shell within Windows.
Until they clean up Windows wholesale, top to bottom, there's no way it will ever be the same experience.
There's just so many layers of abject garbage in Windows. These are there for legacy reasons, but they're also a giant hassle if you're trying to work as a developer.
Some Linux distributions are rough around the edges, but few of them are as tired and worn out as Windows is.
macOS at least burns off the underbrush once in a while. There's few 32-bit apps in the wild now, most everyone's gone 64-bit. PowerPC is history. The driver landscape, while significantly more limited than Windows, isn't filled with broken garbage and junk that works on Windows 7 but not 10 or vice-versa.
Microsoft's strength is legacy support: Applications from fifteen, even twenty years ago still run. This is also their biggest problem.
You are right. I'm not saying Apple is doing bad. It's just short-sighted to let the Mac (Pro) die. They might need a more colorful palette of products in the future. It's easier for them to keep the Mac alive than having to rebuild it if they abandon it. It will take them years to get the professionals back if they do so.
Phone alternatives are so bad today that I think the iPhone is not endangered whatever they choose for their computers. Android just sucks. If you have no plans to root your Android phone (and that's like 95% of people) then there really is no advantage to that famous "Oh, but I want to do with my phone what I want".
It's hard to believe, but security is where Apple won the war. My friends with Android phones are still rocking Android 5 Lollipop, and that tells you something.
The ability to browse a legible file system on the computing device that I own is not some trivial preference which can be so lightly dismissed. It is a huge deal to a lot of people. There are at least half a dozen reasons why I vastly prefer my Android device over iPhone (after using the latter exclusively for many years), but this one is enough all by itself.
That is a legit case, but I would still trade that for the security that iOS offers. I am sorry that I didn't make it clear, but with that sentence I was referring more to those people who talk about aesthetics, widgets, launchers, etc., etc...
The current Mac Pro design seems like such a massive misstep in product design; it will be really interesting to see if they walk it back, or just drop the pro market completely.
I wonder what thinking led to that product. Did they think the pro market would be worth it to them to keep investing in keeping the product up to date, and then it turned out not to be worth the investment? Although even before then, the Mac Pro was only updated sporadically... Was it all just Jony Ive design hubris? Who is the Mac Pro designed for?
> Was it all just Jony Ive design hubris? Who is the Mac Pro designed for?
Apple consistently chooses form over function, but they don't usually let the scales tip this far. The last time they fucked up this bad was the Mac Cube, but back then Apple was proactive enough to kill the Cube within about a year. It really surprises me that Apple hasn't come out with something to replace the current Mac Pro.
Microsoft realized that the niche creative-type markets that Apple has historically owned tend to be broad-market influencers, and they've started to target them with products like the Surface Pro and Surface Studio.
Competition is great, and it will be interesting to see if Apple will rise to it or fade away.
The Cube didn't replace the Power Mac. It was sold in addition to it. If you compared them on specs alone the Power Mac was cheaper and faster so literally the only reason to buy the Cube was because it looked nice. (Or maybe because your desk was really small.)
The current Mac Pro replaced the old Mac Pro, though. That's the difference. If you don't like it, your only other choices are iMac (built in monitor) or Mac Mini (horrifically underpowered).
Apple could kill the Cube without consequence, but the current desktop Mac lineup is so limited there is no alternative.
It's not the built-in monitor that's the problem; it's the overheating, with the attendant noise and possible thermal throttling, if you use it for something that actually maxes out the CPU/GPU.
Well, and the lack of expandability. While the portable MacBook Pro equivalents are about the same thing (look at Dell XPS 13 pricing), Apple right now has nothing for developers/power users on the desktop side.
Minor correction, but it's the Macbook with one port, not the Air. The Air that is still available has two USB and one Thunderbolt port + an SD slot and the Magsafe connector and so on.
And quite honestly, the single-port Macbook isn't a completely horrible idea; Apple is just trying to push a new standard, like they always do. It's an entry-level computer that does just the basics. How many peripherals are necessary on a computer with those specs? You certainly can't play any games on it, most printers nowadays have a wireless function to connect to, it has Apple's trackpad, which is pretty darn good for navigation, and it has Bluetooth for external mice, which can be had very cheaply now with good performance.
I'm all for more options; I don't have a Macbook because my old 2012 Macbook Air feels about right to me in terms of minimalist form with enough function to still do what I want. But I can see how you'd work with a Macbook without running into issues with the port. The majority of the time I use my laptop is just having it plugged in, or I have it when I'm traveling and don't want a bunch of stuff plugged in anyways.
Apple's idea isn't far off from how most people use the laptop.
Having only a single port on one side is indeed a horrible idea; and the MacBook is in no way an entry-level machine (at well over a thousand dollars). Try and step out of the RDF.
I'm not really sure why it's automatically reality-distorting when the majority of my use time with my current laptop, which has more ports than the newest MacBook, is almost exactly in line with Apple's idea behind the single port.
My needs certainly aren't everyone's, and I've never claimed as such. But really, with the specs on the current MacBooks what else are you going to be doing besides Facebook and some movies? I'm not even confident it can play some high quality YouTube videos much less higher quality rips.
With the price, this is how they've always operated; their initial adopters always pay a premium to be first, and the second or third iteration is accompanied with a price drop and design changes to address initial complaints. The MacBook Air gen1 was filled with issues. The update handled all of these pretty gloriously.
I like being able to plug in a thumbdrive without mucking about with my power cable.
Or plug in to a monitor without mucking about with my power cable.
Or... just about anything, really. I don't get the choice of a single port over 2 or 3, beyond the fact that they wanted to push a "unified" (read: confusing) standard but didn't have the board space to drive 2+ of the ports (e.g., if they had two, you probably couldn't charge from both).
Then get an Air or a Pro. Or a Dell, or a Lenovo. There are these things called market segments. For some people the MacBook is a great design. For others, it isn't. No developers were harmed in the making of this laptop.
> Then get an Air or a Pro. Or a Dell, or a Lenovo
Exactly! And that's the entire premise of the original article, people who used to be highly satisfied Apple customers are leaving for other, more suitable hardware after feeling that Apple is neglecting their needs.
Seems almost assured that they will just drop it. Basically, I can't imagine any engineer at Apple who is or was working on it being super excited about something that probably made a grand total of like 50 million USD over the Mac Pro's lifetime, compared to like 80 billion in iPhone sales per quarter...
Considering Apple's significant profits and cash in the bank... Apple is well-suited to fund loss-leader products that bolster their cachet in the professional community -- putting their platform on the front lines of culture creators.
Furthermore, today's demanding professionals are tomorrow's Joe Schmoes: having the hearts and minds of pioneers would help them better see what's coming on the horizon. It just seems silly not to invest money to keep the power-user niches happy.
> Apple is well-suited to fund loss-leader products that bolster their cachet in the professional community
They could do this, but they have yet to prove that they will. Ever since Jobs, Apple has been hyper-fixated on focused, profitable product lines. The Pro is out of step with that, and I bet, with laptops making up the bulk of their computer sales, we'll start seeing them move away from the iMac too.
My only guess right now is that they still feel they need the Pro to maintain their computer business, but I fear they are slowly becoming just a gadget company and will forget their roots.
I wish Apple management folks would read the sentiment here. Because I believe that if not for Apple, the industry wouldn't have progressed this far, at least in the build quality department (we had Sony, Dell and IBM pro laptops, but they never nailed the combination of price/build quality/looks).
I think this is a very interesting perspective. If I understand correctly, your argument is that a Mac Pro product need not be a direct profit maker so long as it perpetuates Apple's position as a top end computer manufacturer. Have I understood correctly? Or are you perhaps prognosticating a change in market demands rendering such a product less relevant?
You got it. It's a strategy employed by the automobile industry as well: high-end loss leaders that "signal the brand" and cultivate loyalty among enthusiasts.
Apple's claim is staked on high-margin products, so it needs an elite cachet much more than say... Xiaomi or Dell. And you can signal that to consumers with one-of-a-kind high-end products that operate on the absolute cutting-edge of various creative, technical, and scientific niches.
Porsche for instance isn't earning any money with their LMP1 race cars. They need them to sustain the illusion that they're a sports car company, despite the majority of their sales being SUVs (Cayenne, Macan) and sedans (Panamera).
That, plus the effect that people in the market for pro computers tend to be influential. They create software and influence the purchasing decisions of others.
I think it's a bad idea to drop things based on how excited your engineers are. If you go that route, then plenty of companies producing boring software and industry-specific products would just die.
Yeah, I totally agree in general. In this case I just mean that it's so obvious right now how much focus Apple puts on its money-making products that engineers at Apple would see the writing on the wall, and they too would not be that interested in pushing the product forward; the people who worked on it, if they were smart, likely tried to bail out of that group long ago. So if both the management and the engineers don't want to work on something, then both the money and the effort get dropped and the product is dead. When Jobs was alive, people could get him to approve a pet project, and they would get money and people on it and it would move along. But nobody seems to be pushing those pet projects at Apple anymore.
You got the point. All Apple wants from its pro line is enough computing power to develop apps for mobile... For Apple, developing laptop systems/applications is not economically interesting anymore.
All said, Apple wants pro developers only as long as they are mobile pro developers.
There doesn't seem to be anyone at Apple speaking for creatives any more. Certainly no one who gets really excited about working with musicians, artists, film directors, and so on, almost as an equal.
Cook has the mindset of an accountant who is a media and trinket consumer, not the mindset of an inventor with creative aspirations. Likewise the rest of the executive team.
To be fair, there aren't many people like that around. But it would cost Apple virtually nothing to host some product focus groups with creative talent to ask "What do you want...?"
That's incredibly short-sighted. This way they will lose many of their pro users, which by itself might not matter. But then, when asked what to buy, those users will tell their friends not to get an Apple computer. I believe pro users are what made Apple big, and they're a large part of what its success stands on.
The Mac Pro design itself is great - Xeon CPU, dual GPUs, infinite storage expandability via Thunderbolt 2. The problem is that they never upgraded it. The GPU is 4 years old now, for example.
Right, you either choose an integrated design, and commit to expending the resources to keeping it upgraded right there on the bleeding edge, or you build an expandable machine and let your customers do that work for you.
But even before the integrated Mac Pro, they regularly let it go two years between updates of the pro model. They must have known when designing the machine that they wouldn't be keeping it up-to-date and that it would quickly become obsolete?
Apple also has the issue of trying to align their releases to their PR schedules. When a component (like a CPU generation) slips, it really messes with their update cycles (see: MacBooks). This doesn't matter so much to consumer devices, but pros are more sensitive to this, and letting pros upgrade their machines mitigates this to some extent.
Except the GPUs were crap even on release. And there are like 3 applications that can use both? So unless you were in that niche, you'd end up paying a ton of money for two dubious GPUs of which you could only use one.
A new version every year, and space for multiple internal SSDs, would have made it desirable.
However, Apple is now milking the old glory and failing to build tomorrow's glory. It's the classic bean counter mistake. It's probably time to short Apple soon.
If you want a PC with the design of the Mac Pro (actually better, in my opinion), look at the Samsung ArtPC Pulse. It is the same concept, but way smaller (about half the size), with modern specs; I think you can upgrade at least the RAM (not sure about the SSD, though), and for the specs the price is OK.
Actually, photos do not do it much justice. In real life it is way more impressive. The upper part looks like it floats on a colorful aura. The design is an obvious copy of the Mac Pro, but they made it way better.
> The current Mac Pro design seems like such a massive misstep in product design
Seriously? Perhaps I'm biased because I own one, but it's pretty much the most awesomely-designed desktop computer ever IMHO, at least from a hardware/cooling/looks perspective. The real problem, I thought, is that Apple seems to have left desktops behind to rot.
Sure, it's pretty, and if you want a completely un-upgradeable appliance, it's probably wonderful.
The point is, most professionals upgrade the same computer multiple times over its life, particularly the fast-moving components (the GPU, in this case).
Making them basically impossible to upgrade significantly reduces the long-term value of the Mac Pro.
Some pros expect to upgrade, but many replace systems for simplicity: if you make money off of your computer, spending a few grand every couple of years just isn't that significant. If a 4-year-old system breaks after years of heavy (i.e., heat-generating) use, the downtime cost can quickly hit the same amount, and it's less predictable compared to simply budgeting n thousand a year as a business expense, just like you do with a desk, coffee, etc.
The problem is that Apple has chosen not to put any resources into non-portable devices and so the problem isn't an expense but an absolute lack of options other than switching platforms.
Upgrading components has never been a Macintosh thing. Arguably because simply upgrading 1 component just exchanges 1 bottleneck (graphics) for another one just behind it (motherboard or memory clock speed etc.) It's better and far more long-term cost-effective to simply upgrade the entire system at once IMO. I believe you're vastly overestimating the number of professionals who upgrade individual components (other than memory or hard drive which were always upgradable on Macs).
Why did they make the PowerMac G3/G4 series so easy to open you could do it with one finger if upgrading components "wasn't a Macintosh thing"? To dust the machine out?
I think from a design standpoint it was interesting, and might have been forgivable if there had been slightly more diversity in the GPU lineup and if they had kept it up to date.
The latter is the big one. You can't just leave an un-upgradeable, professional-focused machine sitting on the shelf rotting.
I agree the hardware design is amazing. Product design is more than the hardware - it's where does this fit in in the market? Who's the customer? What's the roadmap?
The real problem is people aren't buying desktops like they used to, and as a percentage of Apple's revenue they're almost non-existent.
They're in a bit of a bind here. They can't cater to every niche need, they're not going to make these quirky, narrow-use spec machines. They can't compete on price compared to hand-built. They can't compete on performance compared to the ultra-high end workstations from other vendors.
Maybe if they opened up a limited licensing program for macOS on desktop machines to a handful of premium vendors that are willing to adhere to certain constraints they'd have a way out of this jam.
Jony Ive has become too senior to be effectively challenged by the more engineering-minded types. That's why all the Mac hardware since he was elevated to his current position errs in favour of design at the expense of high-end functionality.
I've purchased Apple products for 12 years (and for my entire company for 7 years).
It always seemed to be the obvious choice; it just worked, reliably and with a better user/developer experience than alternatives.
This year both my team and I have started moving away. My next phone will be an Android device, and we're no longer buying Macbook laptops for new recruits.
Windows, Ubuntu or Elementary OS offer a better experience. I personally can't take the restrictions I'm getting from MacOS and iOS. I'm also infuriated to see my machine become close to unusable a couple of times a week while "kernel_task" eats up 120% of my CPU.
The only reason I was still sticking with Apple was the hardware, but that too went downhill. The iPhone's screen is brittle. The battery is capricious. My latest 2 visits to the Apple Store resulted in an unusable trackpad and a damaged screen on my Macbook (which were then claimed to not be covered by AppleCare).
Others around me share my frustrations.
It may be anecdotal, but 3 years ago I would never have considered buying anything but Apple. I've reached the tipping point and I'm not the only one it seems.
Ehhh, for me personally, I have no working mac, but use iOS - it's mostly just things like having to completely reorient myself around the UX, find equivalents for all of my prior things or repurchase/migrate, stuff like that. Minor stuff but you know -- people hate tiny annoying shit.
Ultimately it's just that I'm very attuned to my phone, so I want to delay and minimize hassle with it as much as possible. The same reasoning isn't always true of every device I own (I run Linux, hack things, I'm quite used to playing with my devices), but I think this is fairly reasonable. I don't really blame anyone who uses Android and feels the exact same way (by all means, I'm sure it's a perfectly reasonable alternative since the Android 2.0 days of yore, when I first went to iOS).
I might pick up a refurb Nexus as a work/burner phone for traveling though.
Did I mention I don't want to have to re-purchase FFVII again?
My problem with Google products is that (it seems) every single product released is designed to collect data in a different manner. It's almost as if they only release products so they can collect new data. And you can tell exactly how badly they want your information by just how attractive they make the product offering. Gmail, for example.
I always see these types of comments, and it seems like they are either talking from pure bias or outdated experiences.
Flagship Android phones are close to flawless, not only in hardware but in the integration with Google services, which are far beyond anything Apple has the ability to provide. As a Samsung S7 Edge owner, I simply cannot relate.
The only thing Apple has left going for it at this point in the pro video market is the ability to encode video using the Apple ProRes codec, which is an amazing codec. But other than that, they have totally abandoned the pro market. Final Cut X was not up to the job of professional editing when released. The trashcan Mac Pros are terrible for pros because you 1) can't install internal PCIe cards like Blackmagic DeckLink cards, 2) can't install desktop GPUs, and 3) can't install multiple internal HDDs. PCs are just so much cheaper / more powerful / more expandable, and Adobe Premiere is basically a Final Cut Pro replacement. There's not much reason to use a Mac for pro video anymore.
I'd guess it's because the external chassis is really expensive, and it's silly.
All the busses are there, inside the device, but nope, you have to drop several hundred dollars (or more) on a box that squirts them over an expensive wire because marketing and design said so.
It's true, you could add an external ThunderBolt chassis, but when the trash can Mac Pro first came out I recall that not many companies were making the chassis and they were pretty expensive maybe around $1000 from Sonnettech.com (which was added to the already very expensive Mac Pro). The expansion chassis prices have dropped now. But the expansion chassis won't allow you to add a second CPU which is very helpful when compressing video.
In a way it did. Now instead of installing the BlackMagic Decklink cards, I can buy the Blackmagic UltraStudio or Intensity and get the same function over Thunderbolt or USB3. What was once locked to one machine is usable by every machine I have, including taking it on the road and connecting to my laptop.
Apple saw where the market was going: an iMac or MacBook Pro with Thunderbolt/USB3 peripherals can do the work that required a fully decked-out desktop machine 6-7 years ago, which is great for the vast majority of creatives. But in doing so they have left a segment of the creative community without the expandability, upgradeability and speed they need (3D, CGI and some VR, etc.). But some of those were never really strongholds for Apple anyway.
As a 20+ year veteran editor who has used just about every NLE on the market, my money and time go to FCPX unless my clients specify otherwise. It's the fastest, most versatile and most stable product out there, and the only one that feels like it is truly evolving away from the original late-80s NLE paradigm. And while its use in the high-end market (film, TV and commercials) in the US is minuscule, its global footprint is growing.
It's estimated that there are 20 million developers worldwide.
Stack Overflow's survey found that 20% of developers use a Mac.
So let's say there are 4 million developers using a Mac. And let's say they buy a new Mac every 3 years. Then about 1.3 million developers buy a Mac each year.
Apple sold 20 million Macs last year. So let's say 6.5% of those buyers were developers, and 10% of those developers are complaining about the new Mac series. If I'm right, that means 130 thousand developers are complaining. That's 0.65% of Apple's customers.
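The same back-of-the-envelope estimate as a few lines of Python, so you can poke at the inputs yourself (all figures are the rough guesses above, not real data):

    # Rough guesses from the comment above, not real data.
    developers = 20_000_000
    mac_share = 0.20                    # Stack Overflow survey figure
    mac_devs = developers * mac_share              # 4,000,000
    buyers_per_year = mac_devs / 3                 # ~1.33M on a 3-year cycle

    macs_sold = 20_000_000                         # Macs sold last year
    dev_fraction = buyers_per_year / macs_sold     # ~6.7% (rounded to 6.5% above)
    complainers = buyers_per_year * 0.10           # ~133,000
    print(f"{dev_fraction:.1%} of buyers are devs; "
          f"{complainers / macs_sold:.2%} are complaining devs")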
Now, I don't believe Apple is stupid. They've been in business longer than some people here have been alive.
So I can very well imagine that Apple decided to go after 2% more customers, accept losing 0.65% of the existing ones, and make even more profit in the process.
The article was about video editors, but that doesn't change the essence of your point. However.
That little group is power users. Early adopters. Influencers. In the late 2000s, if you saw someone using a MacBook Pro, you knew they were some kind of professional, and most likely something "sexy": a developer, photographer, designer... This user base, professionals who care about quality over price, helped create Apple's company image. They loved Apple products and were voluntary Apple advocates - I know I was. I convinced several friends and family members to switch to Mac because I sincerely believed the extra cost was worth the ease of use. And because I was a power user (in their eyes), they believed me. And that same company image helped sell iPhones - the main Apple profit driver.
But now those same high-paid professionals are switching away. "Regular" users are going to notice, sooner or later; how do you convince them that this "Apple quality" still exists when they see the power users trashing it?
Agree - today's video editors are tomorrow's video editing teachers, and they'll teach the ecosystem they know and work with best, informing opinion down the line.
I would be surprised if Apple didn't augment their lineup with other power products in the medium term though.
Losing 0.65% of customers that work in IT could very well cause long term damage that the 2% of regular customers can't make up for. People working in IT often have a much bigger influence on others in that area and you don't want to lose that.
Look at Microsoft. They've been there and are currently investing quite a lot to get developers to use their software.
If Apple were clever, they'd look ahead, but it seems they go for good quarterly results instead and repeat the same mistakes others have already made. It has all the markings of a company in decline.
I think this pains people because we used to believe Apple had values above pure profit and would see the non-monetary value in supplying a high-end system for people working at the fringe of technology.
All sorts of decisions that are actually bad for consumers can be justified by pointing to money; that doesn't mean they're good decisions. E.g. I bet dongle sales have skyrocketed since the iPhone 7 and the new MBP.
> Now I don't believe Apple is stupid. They are in business longer than some people here are alive
They were in business while Jobs was there, and they almost went out of business when he left the first time. Now that he's gone for good, don't believe for one second that their position will last forever.
I'm moving to an ASUS laptop with a GTX 1070 GPU. I should be able to do all of my game development on it, be able to play games and do VR, and keep my old Macbook Pro around just in case I do any iOS stuff.
The fact that a full-blown Windows gaming laptop with a GTX 1070 in it costs about the same as the entry-level MacBook Pro (with integrated graphics) is a good reason to.
Game dev, video rendering, machine learning, statistical analysis, CAD/3D work, graphics design, a little cryptocurrency mining on the side, etc, etc. GPUs are used for a lot more than games.
Right. "The GPU is irrelevant to my day to day work," is actually saying that exploiting large amounts of embarrassingly parallel processing is irrelevant. It's more likely in the medium to long-term, that the relevance just hasn't been discovered yet.
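To make "embarrassingly parallel" concrete: it just means running the same independent computation over lots of inputs with no coordination between them, which is exactly the shape of work a GPU eats. A minimal Python sketch (the shade function is hypothetical, standing in for any per-element kernel):

    # Each input is processed independently: no shared state, no ordering.
    from multiprocessing import Pool

    def shade(pixel):
        # Hypothetical per-element kernel (color grade, physics step, hash...).
        return pixel * 0.8 + 10

    if __name__ == "__main__":
        with Pool() as pool:  # a few CPU cores here; a GPU runs thousands at once
            out = pool.map(shade, range(1_000_000))

That shape is why "irrelevant to my work" often just means "I haven't hit a problem with this shape yet."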
I was responding to: "If you're a game dev obviously this is different, but most people aren't." A GPU may not be relevant to you specifically, but it's relevant to far more people than just game devs.
This seems to be a common story. I've got a lot of friends (new graduates) who are in the same position.
Their other motivator for moving away from the MacBook is that buying a new one would cost an arm & a leg, whereas you can get a much more powerful machine for a fraction of the price from someone else.
This has been true for years though; I'm curious as to why I hear it more often now. You've always been able to get more bang for your buck with a Windows machine, but that didn't stop MacBooks from selling.
> You've always been able to get more bang for your buck with a Windows machine
Actually, Apple sold price-competitive or even cheap machines compared to other offerings with similar performance for years. From 2007-2015, Apple consistently had the best prices for pro laptops and desktop machines among offerings with comparable build quality.
The $300 price bump for equivalent machines in 2016 and disappointing spec upgrades in 2015 and 2016 are a new deviation.
The price performance advantage of Windows machines is returning from a very long hiatus. And for Mac users it's unwelcome.
Tim Cook's Apple, of course, is all about high margins and raising average selling price. They're eating the seed corn to squeeze out more short term profits. We'll see how long it lasts.
There has been amazing news from Apple since Steve died. There was the introduction of the new blazing-fast 64-bit ARM (AArch64) two years before everyone else. There were Retina notebooks. There was the design and the fingerprint-reader tech on the iPhone 5s.
But all of those were built and designed while Steve was leading Apple. They're like little amazing tech gifts from beyond the grave. It just takes a while to get to mass market.
There has been nothing amazing that started while Tim Cook has been running the place except raising margins by making base products less useful. Keeping the crippling 16GB limit on base iPhones for three extra years while camera images grew and grew was a bold move to drive up average selling price at the expense of customer happiness.
That's the fundamental Cook flaw. He prioritises short-term margins over long-term product farming and ecosystem management.
And he's very good at it. But it's a strategy that literally has a limited shelf life. First you get an exodus of early adopters and creative users, and eventually you get a catastrophic product failure that does a lot of damage to the brand.
Apple isn't there yet, but I'll be surprised if that failure hasn't happened by 2020.
Of course there may be One More Thing in the R&D labs to save the day. But Cook has a poor record with original R&D, so it's optimistic to assume that's going to happen.
I'd love to be wrong about that, but the smoke signals suggest otherwise.
WWDC this year will be a turning point. The theme is the liberal arts and we'll see whether that means a return to pro hardware, or just some hand-wavey presentation rhetoric.
There was almost zero "amazing" news from Apple while Steve was alive. The vast majority of Apple keynotes were "this year's machine is a bit smaller/faster/thinner", etc.
And a lot of the very successful products were panned when they came out, including OS X, the iPod, the MacBook Air, etc.
Most of those only took off after a few iterations and changes. The iPod only took off after it went USB for Windows. The iPhone only took off after it got 3G and the App Store. The MacBook Air only took off after a redesign and big price drop.
Tim Cook has just been repainting Jobs's toys for the last five years. Cook's success is a testament to how good the toys were, but nobody wants Woody and Buzz Lightyear anymore.
I'm an iOS developer. I'm sure if Apple were making decent machines I would be more optimistic about my job. But the current situation is making me prepare for an escape route. I'm sure there are many others who feel the same, and that will have a knock on effect on the iOS world.
For a company that understands computers are very similar to fashion (their SVP of retail was poached from Burberry), they show a disappointing lack of foresight by not keeping their taste makers happy.
Introducing OS X as the mobile Unix platform of choice, on top of the first ever drop-dead gorgeous notebook (the PowerBook G4 Titanium), declared Apple the platform for discerning power users. Those power users could confidently brag about their systems, giving free advertising to non-power users. The halo effect of that can still be seen today, trickling down to Joe Coffee in your local cafe. How much longer it lasts in the wake of missteps like the Touch Bar is questionable though, even with massive unit sales to the broader public.
This article is about high end video / graphics users, and workstations are a slightly different use case. But not much. Once these users move off Mac, who is going to be left to champion the desktops?
> ...declared Apple as the platform for discerning power users.
Incorrect. There are plenty of discerning power users who prefer the power of a good UI over a good CLI. What you meant to say is "for power users who happen to use UNIX".
Even at peak reality distortion Apple had a whole 6% share of the desktop market. Just enough to be annoying to the rest of the world. They definitely never had a monopoly on these so-called "discerning power users".
I use Windows 10 as my gaming PC, and I own two MBPs, one for work and one for personal use. I also make music on both: Windows with FL Studio and macOS with Logic Pro X. Lately I've been feeling that Windows 10 is really smooth and easy to deal with, unlike my previous Windows experience, so I've started coding on Windows again. To be honest, apart from some built-in macOS applications like QuickTime and Preview, I don't really mind going back to Windows at all. Unlike with Windows 7 or 8, I feel that Microsoft has started listening to users, which is really good.
So they had to switch to Nvidia for their amazing 1080 GPU, which makes sense. I guess I wouldn't blame Apple for not appealing to a niche market; as a reply on that post said, they chase big markets and do really well in them.
It's interesting that in some ways Microsoft may be doing the same. They cut support for older CPUs, they cut a lot of their QA workforce, and they seem to be narrowing their focus too.
It'd honestly be kind of cool if the margins on specialty users became so small that they stopped getting supported, so that Linux could fill in the gap. I mean, the transition is bad, but the end opportunity for Linux, and for diversity in general, seems good.
> It's interesting that in some ways Microsoft may be doing the same. They cut support for older CPUs
Isn't cutting support for old hardware the exact opposite of what Apple is doing? These guys are mad because they can't get the latest and greatest hardware from Apple, not that their obsolete system won't run macOS.
People with the silver 2009 Mac Pros are also mad because Sierra won't run on them. It will run on a 2010 Mac Pro which is, for all intents and purposes, identical except for a firmware version setting. Apple decided 2009 users now have obsolete hardware. It's only 6.5 years old, and mine is far from obsolete: 2x quad-core 3.33GHz, 32GB RAM, 2x 1TB SSD, 2x 4TB HDD, USB 3, GeForce 970.
When you solder the SSD and RAM onto a machine's logic board, that doesn't exactly make your very expensive investment future-proof. Let alone gluing the battery into the case; battery life is horrible on my Touch Bar MacBook Pro anyway.
Apple machines are now appliances that cannot be modified. They are throw away, so spend your dollars wisely.
The idea of video pros, programmers, and IT guys being "tastemakers" is laughable. People used to ask me for computer recommendations 15 years ago, when specs mattered and computers were scary. Today, people just get a Mac because their fellow non-technical people have one, or because they have an iPhone or Apple Watch already.
Indeed, in our household it works the other way around: my completely non-technical wife has prohibited me from buying non-Apple products. ("You know how whiney you get every time you buy a non-Apple product.")
As someone who is a video pro, cutting commercials in NYC and LA (and a former post facility engineer), I'm not seeing it. I don't know of one editor or post facility that has moved from Mac to Windows or Unix. One River Media (the company that posted the blog post about switching) is using DaVinci Resolve as an NLE, a far more niche choice than cutting in FCPX. Resolve is a color-correcting tool (a very popular one that I've used to color grade features) that has added editing support. I've yet to meet anyone in the wild using it for editing.
Even the editors I know that cut on Adobe Premiere which is available for both PC and Mac aren't switching from Mac, which honestly has surprised me a bit because of the greater choice in hardware. But for most video editors at this level, you're just trading speed in one area for problems in another. Editors whine and complain every time there is a tiny change in the interfaces they use, they hate change. They have been forced to embrace FCP and Premiere over the years (and complain about it incessantly). Very few will choose to make the jump to Windows for the same reason.
As you step down the ladder, the move will make sense for some. Your all-in-one facilities or one man bands (production and all aspects of post handled by one or two people). But in my experience, this group has already been heavily invested in the Windows side because of the cheaper initial costs (that money you save early will be spent later and the Windows post-house will cost as much or more than a comparable Mac post-house, at least it did when I was an engineer).
And the other aspects of video post production, the CG, 3D and compositing sectors already heavily lean toward Windows or Linux and have for over a decade.
There just isn't a huge need for massive speed increases in the hardware side for most video editors. We've gone from needing very fast, high end systems with fast (and expensive) SAN storage to laptops and SSDs that allow us to do more, faster than ever. iMacs or MacBook Pros are all the average editor needs, with more and more working remotely from home. I cut a project for the NBA over the holidays on the first gen USB-C MacBook and years ago cut a project for REEBOK on the just released MacBook Air. Both these projects came up unexpectedly while I was traveling but went off without a hitch on underpowered hardware (that I bought for web surfing and writing).
That's not to say that I wouldn't appreciate (and most likely purchase) a new and expandable Mac workstation. But for the most part, I'd be spending money to just spend money. It wouldn't speed up 98% of my job. And that other 2 percent isn't slow enough to cause me any issues.
Does Apple really need or care about the "Pro" market?
It seems like all of their priorities are geared toward everyday consumers (Beats headphones, AirPods, several new phone models every N months, etc.).
Why would Apple waste their engineering man-hours building products for any market except the common consumer who aspires to be part of the wealthy crowd?
I guess this might be their direction now, but at one time Apple did market heavily towards creative professionals.
I think this helped Apple's "hip" image among consumers. When you, say, see your favorite electronic band in concert and notice the Apple logo on their laptop, that reinforces the impression that these computers aren't "stodgy" Windows PCs (a la the ads from 10 years ago in this direction -- https://en.wikipedia.org/wiki/Get_a_Mac).
Many "common" consumers have a creative hobby to be honest as well, and I think developing pro tools can help develop consumer / prosumer versions. See: Garageband, a consumer oriented DAW from the folks that made Logic Pro.
"Creative Professionals (or students)" has been a very visible, if not core, segment of Apple consumers. To me, this may seem desirable due to the perception that these people prefer to use tools that are 'the-best-of-the-best'.
They aren't stupid - it's entirely possible that Apple have run the numbers and found it uneconomical to keep pro gear up to date. So they simply give up the share of the market that contains video pros and similar. Obviously they'll lose the whole families of those professionals, and perhaps their whole businesses too - but Apple must have calculated that risk.
So most likely it's simply better business to cater to the consumer/prosumer part of the market and ignore "true pro" gear.
History proves again and again that yes, big companies - even when they are presently turning a record profit - can indeed be stupid. Very, very stupid.
Thinking that a tech company can't go wrong just because it's big and therefore has the brightest minds running its products, and that it's us, the users, who just don't understand their fantastic strategies, is, as we can clearly see from the past, just not true.
Just look at Microsoft in the 2000s, for instance. Or at Yahoo at the end of the '90s.
Usually companies make the money with the boring products and then they showcase their talent with superbly designed and not-all-around-versatile demos that make a loss.
It's kinda neat that with Apple it might be exactly the reverse.
I don't understand why Apple doesn't offer a clone program for workstation Macs. Dual socket and above only (so it doesn't compete with the iMac or the Mac mini), restricted to Intel, but give Dell/HP/custom builders the opportunity to build the pro systems that don't make much of a profit themselves but support the army of content creators for Apple's other devices.
For everyone who wants Apple to bring back an expandable machine like the old Mac Pro with dual CPUs, PCIe slots and SATA bays, I think this is the best plausible scenario.
I still don't think it'll happen though, and even if it did, you'll be paying Mac Pro prices for a Dell/HP.
I've begun painfully shifting away from Mac Pros for scientific computing work as well, which six or seven years ago I wouldn't have believed. But my lab recently bought a high-end workstation and the MP didn't make sense, and my current MP is getting a bit long in the tooth; unless something very unexpected happens, its replacement won't be made by Apple.
I've contracted for video work from time to time for a few decades now.
Windows was where the majority of video editing happened until about 2009, when Final Cut became the buzzword. I HATE Final Cut; even its vocabulary was frustrating. Most shops were either on Premiere or some special Linux farm for HUGE projects. Apple won on the idea that creatives used Apple and nerds used Windows. I was always the Amiga guy, then the Linux nerd (Lightworks is great but the plug-ins are limited), and now I'm just happy not to have to use Final Cut for anything.
Has anyone else realized this article is almost a year old? I agree with the whole macOS exodus, as I was a Mac user of 9 years before switching to PC in mid-2015 (you know, when it wasn't yet cool to do so). At first I thought I made a huge mistake, but since then I built a powerful rig last summer and I love it. Windows 10 is fast, stable and reliable, three words I used to describe OS X when I switched to Mac in 2006.
As a former developer now Project Manager, I don't code much anymore, but I tinker with stuff every now and then. I got into gaming for the first time since I built a custom PC back in 2004, and you know what? I love it! I thought I was "past that" phase in my life, but I enjoy my Windows 10 Desktop PC.
I still miss the Mac, and man do I miss the Retina screens! But I feel macOS is now simply a legacy product to Apple, one they can't afford to ditch, not so much for business reasons (obviously that would be bad) but because it would be much worse from a PR perspective than anything. And the whole thing bothers me. I feel like macOS is now an obligation Apple has to string along, while its real focus is iOS. We all know iOS is the future, who are we kidding? At least with Windows 10 I feel Microsoft is trying to add neat features and updates annually now, and since mobile left them in the dust they have to scramble desperately to get Windows 10 in front of tech enthusiasts again. I think for developers it's working (or at least they're trying very hard). I'm kind of excited for the built-in VR functionality, and ubuntu core (WSL) seems really promising for developers.
Man I'm the same. I ran PCs since the 90s and put my own together through the 00s, but switched over to Apple fully in the late 00s. Having gone back to a Surface Pro and putting a desktop PC together again in the last year...I don't know what I was thinking moving away from it.
Windows 10 is amazing, and being able to pick and choose components is personally very freeing.
I'm finding problems with the Escape key. Not the hit/miss aspect, which is not a problem at all, but while coding I like to rest my finger on the Escape key on certain occasions, and that's not possible on the MacBook Pro with Touch Bar. I also have to adjust to my Launchpad button now being where Siri sits by default, next to the Touch ID sensor. Everything else is better, including the trackpad, which I thought I'd have no use for. Also, the up and down keys are somehow easier to miss, which is strange, probably due to the keyboard elevation. As for the keyboard: finally, this is a keyboard. I started developing on the Mac as an iOS and Android developer in 2012 and had a hard time getting used to the gummy MacBook keyboard. The Touch Bar is something keyboard manufacturers in the PC world tried to make for years, ending up with horrible keyboards no developer would touch. The industrial design is better as well; my 'old' MacBook Pro Retina now seems big and clumsy (both are 15"). That's the feeling I get when I use it.
A lot of pros are also moving from the Mac Pro to the iMac, where the current processing power fits their needs. But video / CG pros seem to have an infinite appetite for CPU and GPU processing power, and Apple's lineup isn't catering to them.
Not only do I wish they would rethink the Mac Pro trash can, I wish they would design the Mac Pro with server racks in mind.
I agree that Apple shouldn't be chasing every bit of chickenfeed and producing lots of similar products. In fact, I would like to see Apple go back to 4 main macOS products: home desktop, home portable, pro desktop, pro portable. Getting rid of the MacBook Air/MacBook/MacBook Pro/iPad4/iPad Air/iPad Pro/iPad Pro2 confusion would go a long way to making it possible to recommend an Apple product to my friends & family.
As it stands now, when someone asks me what computer to buy, I have to interrogate them on their exact usage pattern and then spend a couple of hours looking at all the different Apple products to see which one might serve them best. 10 years ago it was simple: you just want to send email and browse Facebook from home, get an iMac. If you wanted to compile code at Starbucks, get an MBP. Goofing off at the library, get a MB. Simple.
The Apple product line has expanded because their user base has expanded. They're no longer selling a few hundred thousand Macs to a few categories of creative professionals. They're selling many millions of them to people all over the world.
Even so, Apple to this day doesn't chase after every conceivable ecological niche in the market. They only sell two headless Mac models (ignoring internal component options). The Airs are likely on the way out, so their clash with the MacBook is likely just a temporary transitional thing. Many of the products you listed are actually just different generations of the same thing.
Personally I edit video in Blender on Linux. But from what I understand, these days Premiere is kind of the default for general-purpose, non-legacy video editing on Windows. Resolve has a free-as-in-beer version that's supposed to be pretty good.
For the external GPU route on a Mac, does anybody know how fast the bus would need to be? Like, could you use two USB-C ports in parallel on the latest and greatest Mac or something??
External GPUs are explicitly supported by Thunderbolt 3, and some people have had success with it on TB2 as well. The 4x PCIe 3.0 lanes offer enough bandwidth for most purposes. It's kind of poorly supported on Mac OS though, and the enclosures aren't cheap.
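To put rough numbers on those 4 lanes (nominal PCIe 3.0 line rates; real-world Thunderbolt throughput is lower because of tunneling overhead, so treat this as an upper bound):

    # Nominal PCIe 3.0 figures; assumptions, not measurements.
    GT_PER_LANE = 8            # 8 GT/s per PCIe 3.0 lane
    ENCODING = 128 / 130       # 128b/130b line encoding
    gbps_lane = GT_PER_LANE * ENCODING   # ~7.88 Gb/s usable per lane

    x4 = 4 * gbps_lane         # what Thunderbolt 3 tunnels: ~31.5 Gb/s
    x16 = 16 * gbps_lane       # a desktop slot: ~126 Gb/s
    print(f"x4: {x4 / 8:.1f} GB/s vs x16: {x16 / 8:.1f} GB/s")

And as far as I know you can't gang two ports together for a single GPU; each enclosure hangs off one Thunderbolt link.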
It seems like the Akitio Thunder 3 would work -- did you try that one? I guess you would need a MBP with thunderbolt 3 then.. so 2016 models from like 3 months ago then?
In theory, the Nvidia web drivers (the ones shipped outside the stock OS X drivers) are for people using their cards in the pre-trashcan Mac Pros. In practice, they let you build yourself a decent hackintosh desktop. Or used to. Maybe the 1xxx cards don't work in the old Mac Pros, in which case I don't think Nvidia can justify continuing to put out those drivers.
Who would this hurt? Apple. Very slowly though. Most people running hackintoshes are opinion leaders wrt tech stuff, and when they switch to Linux or Windows, Apple will lose more mind share.
Yeah but a good bunch of the pre-trashcan Mac Pros are still in service. And those do take PCI Express cards. Apparently NVidia felt it's a market worth serving, since you can still get Sierra drivers from them for anything before the 1xxx series.
Okay, the drivers are labeled "for Quadro", but they support basically everything.
I work in a video production facility. We hate the trash cans. 3rd party chassis for them all fall short of an "all in one solution". These are our last Macs for production. We'll be on Windows boxes by this time next year. As a lifelong Apple user (Starting with the Apple II) this makes me sad, but Apple has done this to themselves.
Pro was always a stepchild at Apple. Steve Jobs never stopped by NAB for the Final Cut Pro press events. And more than a decade ago I started seeing middle managers being "promoted" from Pro Apps to other divisions like iTunes.
The hard truth is that we pro folks aren't that lucrative. Pro users probably sit at the bottom of a smiling curve, with high-volume consumer products on one side and high-revenue enterprise on the other. To a company like Apple, pro users represent the worst of both worlds.
That's why you also see "media storage" companies like G-Technology, who introduced pro products (like the late G-Speed) only to abandon that market for high-volume, low-touch consumer products like the LaCie Rugged.
I want a new MBP with an nVidia GTX 1080 as much as the next guy, but I'm not holding my breath.
Let's not forget that long before Apple exited the "Pro" market, companies like SGI went out of business. So did all the companies that used to make graphics hardware targeted to pros. They were overwhelmed by ATI and NVIDIA's repurposed gaming GPUs.
Yep. Tricked-out gaming rigs catching up with the heavy-iron graphics workstations was hugely disruptive. Low-overhead boutiques could suddenly do the same work as high-overhead facilities. But then you blinked, and the same work was happening in no-overhead places like The Director's Living Room.
Apple should license a single boutique manufacturer to make high quality workstations. It's clearly not a big enough market to interest Apple, but with the right agreement in place they could come up with a satisfactory solution.
Signed
- A Mac Pro owner that is seriously considering getting something else.
Apple is the Teflon company. It seems like no matter how many subpar (yet overpriced) products they put on the market, their reputation and public perception never suffers.
To this day, I have a negative opinion of Microsoft for their questionable practices, and god knows they've been lambasted in the past for their general attitude (though it feels like this is starting to change). Apple, on the other hand, can totally afford to rehash their products from a few years ago, sit on a stash of gold, disappoint designers and programmers alike for years, and leave macOS on a side track, and never has to face any kind of serious backlash.
It really says a lot about their marketing genius.
It also says a lot about the composition of their market: Apple is likely not super concerned about snubbing professionals because they've moved on to bigger fish.
I seriously doubt that it simply has not occurred to anyone at Apple that not providing higher end GPUs and other features that various professions rely on will mean losing their business. Especially since the internet has been screaming that at them for a good while now.
Have you thought that maybe the tech pundits are just wrong? If they say Apple gear is "subpar" and yet it still sells like gangbusters, that tells me that the products were probably not "subpar" at all. Critics can be wrong, but audiences are never wrong.
I moved to OS X for Pro Tools circa 2002, when Windows support for it was just awful. When I build my next studio machine, I think I'm moving back.
I need a machine that is large, powerful, and expandable. The Mac Pro isn't this.
I don't like Windows, but the only programs I'm going to use are Pro Tools and Ableton Live. Just like on my gaming computer, which essentially just runs Steam (and games), I can deal with it.
For live sets, I'll keep using a Macbook Pro, because if one breaks I can buy another in any city in 5 minutes and I will know it will work almost immediately for what I need.
Did exactly this in December for CUDA based 3D rendering.
Been a Mac user for 15 years. It was hard to make the jump, but I came to the realisation that Apple just doesn't care about my work, and therefore my money, anymore.
For those complaining that all PCs have flaws, the great thing about PCs is that you can build your own. All the parts for my new Skylake Xeon arrived this week. Can't wait to see what it can do! I'd been waiting for the Samsung 960 Pro 512GB which finally arrived. The 960 Pro is my boot drive. A 960 Evo 1TB is my data drive. No more spinning glass platters for me. Build your own and you get to make your own compromises.
I agree on the 'has' comments. The only thing that he got wrong, IMO, is that Alexa is going to be a success. If personal assistants are to take off it's going to be inside the phone (the most personal of all devices), not some speaker in the living room.
Phones are much more available (always with you) and personal (sometimes you don't want to share stuff with your whole household).
After thinking a lot about buying an iMac for development (I've been using a MacBook Air for five years now), I just decided to forget about it and consider other platforms like the ASUS VivoPC X: it's good-looking enough, has incomparable horsepower, and is far less expensive. The only thing I'm sure I'm going to miss is macOS.
If Apple reinvents the IDE for use with touch on iOS, they might be able to drop their laptops altogether for development, and maybe market some kind of cloud cube that people who want more grunt could purchase, rather than being limited to what can fit in a 2mm-thick laptop.
Yup, we do. OS X user for 11 years, recently switched to two laptops: one Windows 10 laptop for video and photography work, one Linux laptop for the rest.
I was under the impression from friends who work in Hollywood and television that not much video editing was done on Macs, anyway. That Avid software running on proprietary/custom hardware was standard. But this was five or six years ago. Maybe people had been switching to Macs in more recent times? (I remember a similar discussion came up when Apple consumerized Final Cut, and was told that pros don't use it anyway.)
> due to the inherent bandwidth limits that Thunderbolt has as compared to the bus speeds of these GPU cards
[citation needed]
There probably are applications that need every single GB/s of the x16 connector, but just saying "bandwidth limitations" isn't sufficient; see also the common SLI setup, which switches to x8/x8 if two cards are installed.
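To put a toy number on when the link width actually matters, here's a back-of-the-envelope estimate (nominal PCIe 3.0 rates and uncompressed frames are both assumptions, so this only shows the shape of the problem):

    # Time to move one uncompressed 4K RGBA frame across the link.
    frame_bytes = 3840 * 2160 * 4           # ~33 MB per frame
    for lanes in (16, 8, 4):
        gb_per_s = lanes * 7.88 / 8         # ~7.88 Gb/s usable per lane
        ms = frame_bytes / (gb_per_s * 1e9) * 1e3
        print(f"x{lanes:>2}: {gb_per_s:5.2f} GB/s -> {ms:.1f} ms per frame")

A workload that moves a frame or two per render won't notice x4; one that streams textures constantly will.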
ITT: People pretending the world's wealthiest company isn't building a 64 bit ARM desktop powerhouse.
Why would they continue dumping money into refreshing Intel-based systems when Intel screwed them on 32GB of RAM with the LPDDR4 thing on the new rMBP?
Can you imagine how annoyed the Apple people are that they can't sell you a $499 32GB RAM upgrade along with your $1299 1500GB SSD upgrade on your $4299 computer?
Apple's future is ARM and to expect any more powerhouse Intel systems from them was folly even a year ago.