Am I the only one that sees OS X as the biggest reason to switch to Mac? I mean Windows is good, but nowhere near as good as OS X. And please, don't tell me Ubuntu or other Linux flavors. They look good (and are good if you are programming on them) but the UX is still lacking a lot. (Never mind the confusion of the different flavors, packaging systems, and configurations.) Also, god forbid you have a problem (especially a hardware problem) and then try to debug it. Good luck searching online for a resolution.
I never had success with Linux. My MacBook Pro has had its share of problems (WiFi issues that later resolved with a system update) but it's nothing like my experience trying to install Linux and battling driver issues.
Anyone figuring out the Linux/Laptop problem is re-inventing the Macbook Pro/OS X.
Here are things I'd pay an extra $1,000 over the current MacBook Pro for:
- Thinner/Lighter
- Longer Battery Life (5+ hours)
- 32/64GB RAM
For OS X:
- Less clutter (i.e. remove bundled apps like Siri and let the user decide what to install).
- Native Package Manager
That's about it. I'll be buying the new MacBook Pro in a month either way. But if Apple releases something like the above, I'm more than happy to drop $5-8k USD on it.
If macOS had up-to-date OpenGL support, Nvidia made drivers that supported new cards on it, and it wasn't locked into Mac hardware - that would be tremendous.
On the other hand, if Windows had a proper shell and cli tools, like cygwin with zsh, but native and not Ubuntu layer inside - that would be tremendous.
If Linux, any desktop variant (Fedora my poison), had Adobe's support for their DCC apps and great battery management for laptops - that would be tremendous.
If Windows and Linux had the above + Preview from MacOS - that would be tremendous.
> On the other hand, if Windows had a proper shell and cli tools, like cygwin with zsh, but native and not Ubuntu layer inside - that would be tremendous.
It is PowerShell, and it really is. Until recently I thought Windows had poor CLI support; then I discovered PowerShell, and now I favor it even more than bash.
Am I crazy? Possibly, but PowerShell is truly a gem in CLI history. It is a thoughtfully crafted product regarding what "command-line interface" should look like.
> It is a thoughtfully crafted product regarding what "command-line interface" should look like.
How so?
I've never appreciated what's actually good about powershell... you pass objects around? So...? Is that a thing that's useful?
If you want to just automate a task, having intermediate objects that are serializable (eg. strings) so you can `foo ... > blah` and inspect the value of blah before continuing (`cat blah | command2...`) has always seemed far more tangibly useful.
Having methods on an object you can invoke like a REPL for the OS sounds like a good idea, but I've never actually found it useful. It's like the Python REPL: useful for prototyping and doing stuff after you've imported the 50 packages and set up all of the environment, but once you open a new instance, you've got to spend the time doing that before you can actually do any work; and it's useless for scripting.
...but, powershell gets a lot of love from people; so what do you actually find it useful for?
Honestly curious, I've only ever touched it briefly and then swapped over to other things.
> If you want to just automate a task, having intermediate objects that are serializable (eg. strings) so you can `foo ... > blah` and inspect the value of blah before continuing (`cat blah | command2...`) has always seemed far more tangibly useful.
String parsing is the bane of my command line scripting experience. Even when targeting "identical" environments, all it takes is one changed installation default altering the output of one of my many commands for my scripts to break - usually in some non-obvious way that requires a good hour to get to the bottom of, rework, and fix. To prevent such changes from forcing me to rewrite entire scripts every time, I try to centralize such text parsing and munging in one place, "deserializing" those strings once and feeding them to the rest of the system. Shipping around this deserialized state in command line scripting languages can be awkward enough at times to warrant rewriting the entire thing in a proper programming language. Inter-operating between your new program and your existing scripts will, of course, require even more text parsing.
Don't get me wrong, sometimes munging text is your least horrible option. Powershell still lets you do that.
But Powershell's objects also let you, with great frequency, skip the "try to 'deserialize' text that was really formatted for humans and isn't versioned, can be ambiguous, and otherwise was never written with machine consumption in mind" step. If I feel the need for a 'proper' programming language for parts of my script, I can write C# modules and use them from powershell without writing a bunch of text (de)serialization code on either end. This singlehandedly eliminates entire swaths of the most brittle, opaque, and otherwise obnoxious code to ever grace my scripts.
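To make that concrete, here's a throwaway sketch of the kind of pipeline I mean (nothing but built-in cmdlets; the 500MB cutoff is arbitrary):

    # Every stage hands real objects downstream; no grep/awk of human-formatted text anywhere.
    Get-Process |
        Where-Object { $_.WorkingSet64 -gt 500MB } |
        Sort-Object WorkingSet64 -Descending |
        Select-Object ProcessName, Id, @{ Name = 'WorkingSetMB'; Expression = { [math]::Round($_.WorkingSet64 / 1MB) } }

No quoting rules to memorize, no column positions to count, and it keeps working even if a future version adds or reorders columns in the default table output.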
I've been using shells on Linux VMs and Macs for years now and I've probably written less than 50 functions, and the number of times I've typed sed or awk is probably lower than 100. I pull out the real scripting languages for real jobs. I only use shells for quick things.
Sure, I could write a Python script right now that would read me the last lines of a log file on a remote server. Or, I could just type something like this and hit enter: ssh user@server "tail /path/to/log"
I think what GP is trying to say is that PS is in an awkward position between the two. It has a deeper understanding than bash/zsh/whatever, but it also requires more typing. Yes, PS will fix issues like the ones you have described, but typing in long names (at least for me) defeats the purpose of using a shell in the first place. I don't want to type in "Get-Item" or whatever a million times, nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
> I pull out the real scripting languages for real jobs. I only use shells for quick things.
It's my experience that the latter eventually morphs into the former "without question".
And go figure, the build server doesn't have python installed. Or only has python 2. Or only python 3. Ditto for a coworker - this being game development, a lot of those coworkers aren't programmers, and won't be able to debug "hey python is missing" on their own - sucking up IT and developer time.
> It has a deeper understanding than bash/zsh/whatever, but it also requires more typing.
Aliases, tab completion... you're not wrong, but I've not found it an issue in practice. In fact, rather the opposite: I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using. This is perhaps because I'll script anything that gets annoying. I don't spend a huge amount of time doing bespoke commands in a shell, though.
> "Get-Item"
gi
EDIT: "Get-Alias" (or gal) will share a lot of shorthands. TIL %{...} is just using % as an alias, and that ?{...} is another option.
> nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
> It's my experience that the latter eventually morphs into the former "without question".
I suppose we do vastly different things with our shells. Looking through my history, it's mostly things like "cd", "ls", "vi", "make", etc., and my longest bash script that's stood the test of time is 12 lines long, with the most complex part of it being an if statement in a string (trust me, there's a reason for that). I've run much, much longer shell scripts, but I almost never write a shell script longer than 20 lines.
> And go figure, the build server doesn't have python installed [...] this being game development, a lot of those coworkers aren't programmers
AHHHHH ok we definitely do work in very different atmospheres! I suppose in instances where "coworkers aren't programmers, and won't be able to debug", I would just write a Python script and use PyInstaller so they could just double-click on a .exe
But if I'm on someone else's computer and they don't have Python or anything like it, then I would honestly just install Python. But I definitely see how you or anyone else would object to this, and I can totally understand the view that it's much better to use PS in this instance.
> Aliases, tab completion... you're not wrong
You're right, there are aliases and tab completion on PS, just like on bash/zsh, but I have to remember both "gi" and "Get-Item". Sure, I would use something like "gi" all the time, but whenever I look up something and see a StackOverflow answer that says "Get-Item", I have to know what that means, which means I have to memorize both the long and the short versions of a lot of things. On Linux shells, I feel like I only memorize a short thing like "cat". Sure, I also have to know what it does, but the same applies to PS.
> I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using
The letter flags part is a fair criticism. But don't all shells suffer from that? It's the cost of writing quickly. I could Google "what is gi" but instead I choose to Google "what does set -E do?"
As for the part about "reading documentation to decode bash/zsh scripts", I think that this discussion sums up why what you're saying is true for PS as well: https://news.ycombinator.com/item?id=14034414
> I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds
I was working on two programs. One would do stuff and print JSON to stdout, and the other would take JSON from stdin and process it. I had a Linux VM running inside Windows. From my VM, I ran something like `program1 > file.json` and then I ran `cat file.json | program2`. This way I could inspect the JSON file at any time in case something went wrong in one of the two, independent programs. Everything was working just fine.
Then I stopped writing code and testing it in my VM. I decided to go the Windows route, update my code outside of my VM, and then run my code in PS. I ran `program1 > file.json` and it worked like a charm. Then I ran `cat file.json | program2` or whatever you run in PS (it's been a while) - but it didn't work. So I assumed it was my fault. Time to debug. I looked at `file.json` line-by-line, and it was just fine, so program1 was fine. I looked at program2 line-by-line, and it was just fine, so program2 was fine. I went to my VM and ran `program1 | program2` and everything worked fine.
How was it possible that my code worked just fine in Linux, but not in Windows? It turns out that when I ran `program1 > file.json` in PS, it fucked up my JSON file in a way that was almost undetectable. I ran `program1` in PS, selected the output, copy-pasted it into a text editor, and saved the file as file.json. Then I could run `cat file.json | program2` or whatever from Windows and it worked like a charm.
To this day I am not sure what happened. Also, program2 supports a file name as an argument which it will then open and read, so some of the commands I listed may be slightly different from what I actually typed, but the gist of it working perfectly in bash on Linux but not on PS was enough to destroy me. Perhaps the issue was something about encoding? Sorry if what I'm saying does not seem very concrete. Here are some links that demonstrate (possibly different) issues people have using redirection:
> But if I'm on someone else's computer and they don't have Python or anything like it, then I would honestly just install Python.
The problem is scaling this to many coworkers. At some point it becomes "wait for I.T. to get around to automating the install across the fleet" or make your scripts install python, ninja-like. But it sounds like you're more able to rely on python, so that probably makes more sense for you (if only so you don't have to rewrite the same script for non-Windows boxes.)
> Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds
It sounds bad enough I can totally get where you're coming from. Heck, it's basically the exact same place I'm coming from with the "strings shot my dog" quip ;)
> Perhaps the issue was something about encoding?
Something to do with e.g. UTF-8 BOMs or line endings (\r\n vs \n) would be top of my paranoia list. I'd break out a hex editor or binary diffing tool (I've used 010 Editor a couple times for this) if you find yourself in the same situation again. Understanding exactly when I have a single string with newlines vs when I have an array of strings with implicit newlines when joined isn't something I've got my head perfectly wrapped around yet in powershell, and could be another possible cause.
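If you do end up back there, the first thing I'd try (just a guess at the cause, with program1/program2 standing in for your actual programs) is pinning the encoding instead of trusting whatever `>` picks:

    # In Windows PowerShell 5.1, '>' redirection writes UTF-16LE ("Unicode") with a BOM,
    # which a consumer expecting plain UTF-8 JSON can choke on in hard-to-spot ways.
    program1 | Out-File -FilePath file.json -Encoding ascii
    # On PowerShell 6+ there's also -Encoding utf8NoBOM, which sidesteps the BOM entirely.
    Get-Content file.json -Raw | program2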
> Sorry if what I'm saying does not seem very concrete.
You're offering what you know, and I appreciate it :). Sorry for the short reply (I need to be somewhere...)
I'm using a large corpus of PowerShell scripts, mostly written by enthusiastic Microsoft consultants, and even when I know exactly what they do I cannot stand the mysterious imports, the sequences of script invocations and bare statements that leave the session with the desired invisible state (and, conversely, closing and reopening the session after every major command just in case), the automagical option handling, and so on.
1. It's amazing because Windows-only admins (or predominantly Windows admins with almost no Linux experience) have never seen anything like it before. Until PowerShell, the state of the art was VB scripting or batch files, both of which are (objectively) garbage. Regardless of how long the rest of us have been working with shell scripts, Python scripting, etc., Windows users have never had the opportunity to do similar things with similar tools which are included with the OS.
2. It's amazing because it does a lot of great things that even bash scripting can't do. The idea of passing around structured data can be super handy for a lot of common tasks. For example, on Linux I have to use 'ip addr list' to get the interfaces, grep to get the relevant lines, and awk to extract just the addresses, and only then do I have a list of IP addresses. It's a huge stupid hassle that I have to go through every single time I want to write a script that takes advantage of IP addresses.
Making everything a string makes sense when it's 1970 and you want everything to be compatible, but when basically none of the tools I use on a day-to-day basis provide the option for easily machine-parse-able output, it ends up very frustrating. The (theoretical?) promise of Powershell is that all output is machine-parse-able.
The benefit of passing objects around is that you could do things like "get me a list of network interfaces | filter by interfaces which are up | which have IP addresses | just show me the IP addresses". The few examples I've seen make it feel like your shell is some sort of half-bash/half-SQL system where you can filter, process, and loop over objects.
I can't count how many shell scripts I've had to write which parse output to get the list of data I want, then go back over that same output again to do actual work on it. You can hack a lot of stuff together with ugly hacks; getting all the interfaces on a MacOS machine with IPs except loopback? Maybe 'ifconfig | egrep "^[a-z]|inet[^6]" | grep -B1 'inet' | grep '^[a-z]' | grep -v lo' would do it. In most cases. Probably there's a better way to do it, but if you just want to get something written then you can hack it in like this, or loop over 'ifconfig -lu' (which, on my machine, shows 13 'up' network interfaces), etc.
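For comparison, the object version of that chain on Windows looks roughly like this (the cmdlets are the built-in NetAdapter/NetTCPIP ones; the exact filter for "up, non-loopback" is my own guess at what you'd want):

    # Interfaces that are up, then just their IPv4 addresses - no regexes involved.
    $up = Get-NetAdapter | Where-Object { $_.Status -eq 'Up' }
    Get-NetIPAddress -AddressFamily IPv4 |
        Where-Object { $_.InterfaceIndex -in $up.InterfaceIndex } |
        Select-Object InterfaceAlias, IPAddress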
I've been using PS for quite a bit of AWS automation lately and I have to say that I don't like it. Sure, you can pass objects around, but I've happened upon more than one cmdlet that does the wrong thing with the incoming object. In one case I was passing a "String" object to a cmdlet that accepts strings and it didn't know what to do with the object so it generated a fairly obtuse error. I had to manually cast the String object to a string. Grrr...
Another thing that bothers me is the lack of single line composability. In most Unix shells you can pipe things around with abandon; it's not pretty but it works. On more than one occasion, while working with PS, I've had to create a cmdlet because there is no way (or I don't know how) to store intermediate values between cmdlets. One example was processing things in a loop. I had to store the current value in a variable and then process that variable in another line. I know someone here will give a solution, but I looked for an hour before giving up and creating a script file.
On top of everything else, the cmdlets from Microsoft have differing switches for the same thing. One command might use -ServerName while another command will use -ComputerName. So, basically, you end up looking everything up before you can use it. I know Bash isn't much better, but at least I can expect that the tools are separate and not really designed to work together. I was expecting more consistency from PowerShell.
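EDIT: for the loop case above, it turns out -PipelineVariable (a common parameter since PowerShell 4) would have covered it without a separate script file. A generic, non-AWS sketch of the idea:

    # 'file' stays bound to the upstream FileInfo while the later stages run per line.
    Get-ChildItem *.log -PipelineVariable file |
        Get-Content |
        Where-Object { $_ -match 'ERROR' } |
        ForEach-Object { "$($file.Name): $_" }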
But why is it that the PowerShell terminal emulator is (graphically) even worse than a tty? I've worked on ttys that were easier on the eyes than Windows terminal emulators.
Windows is supposed to have better font support than Linux.
Also, if someone here has an answer: Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
> Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
Windows' way of exposing program executables is the holy Registry. It's called 'Application Registration'[1] and was introduced to reduce the need to modify the system-wide PATH variable. (They thought it was a bad idea to modify a system variable so frequently, and I partially agree.)
You can find registered applications in `HKLM\Software\Microsoft\Windows\CurrentVersion\App Paths`. Very few programs use that feature, which is unfortunate, but popular applications like Chrome and Firefox register themselves in it. That's why you can invoke `chrome` in the 'Run' dialog.
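If you're curious what's registered on your own box, here's a quick sketch for listing it from PowerShell (both the machine-wide and per-user hives):

    $appPaths = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths',
                'HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths'
    Get-ChildItem $appPaths -ErrorAction SilentlyContinue |
        ForEach-Object { '{0} -> {1}' -f $_.PSChildName, $_.GetValue('') }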
Edit: Another bit of context: at the time App Paths was added, modifying PATH meant editing AUTOEXEC.BAT manually, which was painful. Not only that, but also PATH had a length limitation of 128 characters. You can find more details on Raymond Chen's blog, as useful as always.[2]
> Not only that, but also PATH had a length limitation of 128 characters. You can find more details on Raymond Chen's blog, as useful as always.[2]
I should note there are still length limits - in practice you'll run into issues with as few as 2047 characters:
Debugging this is really annoying, as one of my coworkers found out when one too many applications decided to add multiple paths to PATH (for example, nVidia CodeWorks has added no less than 8 subdirectories of C:\NVPACK\ to PATH to support Android development - for gradle, ant, jdk, ndk, and the android SDK's support, build-tools, platform-tools, and regular tools.)
Said coworker ended up spending some time using directory junctions to shorten the paths in PATH to the point where his dev environment was useful again.
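The junction trick itself is a one-liner, something like this (paths are made up for illustration; PowerShell 5+, or mklink /J from cmd does the same thing):

    # Short junction pointing at the deep vendor directory; put the short path in PATH instead.
    New-Item -ItemType Junction -Path 'C:\adb' -Target 'C:\NVPACK\android-sdk-windows\platform-tools'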
Hm, interesting. I suppose this is what http://scoop.sh should be using? (The per-user setting, "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\App Paths").
Oddly it appears the python2.7 installer uses this (global setting), but not the python3.x one (It would seem that python2 could register python.exe (as it currently does), and python3 could register python3.exe (as it currently does not)).
It certainly doesn't seem to make much sense for python3 to have an option to add itself to the path environment variable, and an option to change the path length limit - but apparently not an option to use this "modern" way of registering itself? (Unless, python2 and python3 installers, when fighting it out, default to only registering python2... which makes sense, but is painful).
But based on a windows hyper-v vm with only python3 installed, it looks like python3 does not use this setting.
It is not only ugly but also slow. It's one of the pain points of using PowerShell. It seems that Microsoft didn't care much about the emulator until recently. Thankfully things are changing: it was improved a bit in the Windows 10 Anniversary Update, and Microsoft has promised to improve it further. I'm optimistic about it.
> Also, if someone here has an answer: Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
This gets to the real issue of what makes PowerShell so horrible. It's not that anyone loves bash scripts; it's that there are a ton of great utilities that bash scripts tie together. Windows doesn't have this.
Maybe it's great for Windows, but it's not great for working on Unix machines, which many of us do.
I'd be more open to Windows if trying to manage Unix boxes from it wasn't like trying to build a ship in a bottle. I always feel like I have one hand tied behind my back trying to do my job in Windows.
For simple web browsing and office work and such, it's fine.
I stated that badly. PowerShell is a great shell! What I want is a shell plus the GNU chain, from coreutils onwards, within Windows. Babun takes me close, but not close enough. Namely, it's slow as hell, 64-bit isn't really there, and it's Cygwin. I'm one of those "runs Vim, writes their own Makefiles, uses GCC, and doesn't like Cygwin" guys.
I see. I'm also kind of a "runs Vim and writes Makefiles" guy, but I'm more hopeful about the Ubuntu layer becoming more seamlessly integrated with the native system than about waiting for Cygwin to improve. After all, WSL is official, and Microsoft seems to be putting a lot of effort into it.
As a side note, like you said, Cygwin is slow. I once measured how much time compiling things took on both Cygwin/MSYS2 and WSL. `./configure` was 3 times faster on WSL, and `make` was 2 times faster. I assume the reason is that WSL's process management is lighter. This is another reason I'm looking forward to seeing the improvements to WSL's native integration.
Great question. I use, all the time, applications which need full speed and full GPU support; some of them are only on Linux, some only on Windows, and there's some overlap where most from both are on macOS. A simple VM doesn't cut it; I've tried.
What about the other way around, Windows host + *nix VMs? I run FreeBSD in VirtualBox, forwarding apps to VcXsrv via PuTTY. Works very well. I even made a tray icon script to launch that setup https://github.com/myfreeweb/xvmmgr/blob/master/xvmmgr.psm1 :)
> I discovered PowerShell and now I favor it even more than bash
Same here. Might be biased because I never completely mastered bash as I don't use it that much. But after the first bit of the learning curve is over the rest just seems to go automatically with seemingly way less searching the internet: things are just easier to discover and figure out by yourself, and that also makes it easier to remember them. Plus you can visually debug it. Also some of the bash things I'm addicted to (autojump and fzf) have some pretty good clones for PS, namely ZLocation and PSFzf, those are real timesavers for navigation/history search for me.
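Both live on the PowerShell Gallery, so pulling them in is roughly:

    Install-Module ZLocation, PSFzf -Scope CurrentUser
    Import-Module ZLocation, PSFzf
    # PSFzf also wants the fzf binary itself somewhere on PATH.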
PowerShell may be good (I wouldn't know), but the terminal emulator (cmd) is absolute garbage. Even in Windows 10, you still have to edit the registry just to use a decent font, there isn't a reasonable way to change the colors, etc. I honestly can't tell what has changed in cmd since Windows NT.
Sure, there has not been much progress with terminal emulators in the past couple decades, but in Windows there has literally been none at all.
PowerShell is very serviceable, but it has weird points of failure. Especially with some FOSS projects that seem to think cmd is as far as they'll go for Windows support, and then they completely choke on PowerShell.
I think if I liked the rest of the .Net toolkit more, I'd be more enthused about it. It makes things livable on Windows, which is better than the dark days of Vista/7.
But PowerShell is an all-or-nothing proposition. At my job, the majority use OSX. I use Linux. Since we all use bash, there's very little friction here. I can't "just use powershell" in this scenario at all.
That does help, and is welcome. But convincing a team to switch something as fundamental as their shell is an uphill battle. I'm sure PowerShell is great (I've never used it), but shells always struck me as the kind of thing where "good enough is good enough".
"if Windows had a proper shell and cli tools, like cygwin with zsh, but native and not Ubuntu layer inside"
With my understanding of the Windows architecture, such as it is, there isn't a great distinction between "native" and "Ubuntu layer" because the "native" is really already a "'native' layer". If they continue to polish the Linux support, it will basically become as native as the core of Windows already is.
Agree on the OpenGL support. I really wish Apple would just update OpenGL to the newest version so I wouldn't have to change platforms completely in the near future.
Then again, developing for Apple currently means more compatibility with old hardware and drivers, but OpenGL 4.3 (which Apple doesn't support) includes Compute Shaders, something I would really like to explore, but can't as Apple supports only 4.2 with a limited amount of extensions.
Depends on your computational problem. For image, video or audio processing where the Apple SDK is superior to most others it might be the best deal, for others maybe not.
For some tasks, the Apple code is orders of magnitude faster so it's not a problem. The ability of the Mac Pro to smash through H.264 video with GPU acceleration is pretty much unparalleled. Compared to FFMPEG it's not even close. When it comes to general compute, though, it's easily outclassed.
It's all about benchmarks and cost/benefit analysis.
I've seen code that works on the iOS 20-100x faster than a desktop equivalent because the iOS version is done using Metal and the desktop one is barely vectorized CPU code. In some edge cases a "server farm" of iPads might outperform an equivalent spend on Xeon-based servers.
Why limit myself only to Apple platforms? Apple is moving towards a more closed walled garden all the time, and I don't particularly like that.
Also, OS X users are a relatively small group compared to Windows users, so if you're making, for example, applications for VR, limiting yourself to the Mac is a fatal move.
It's sad though, as I really love OS X as a development environment.
It takes a while and it's really boring, but you can configure the scaling for at least the major graphical toolkits (Qt 4/5, whatever, and GTK 2/3). Java apps are yet another separate thing to configure.
My number one complaint about OSX is also window management. I've never found anything comparable to the way Windows 7 let you use the left/right arrows to line up windows on half the screen. That is the only thing I miss about Windows, but it's a pretty big one. Also, when dis/connecting monitors on OSX, windows often get lost.
ps- if anyone has any suggestions or recommendations, I'd love to hear them!
Not sure what specifically you're talking about here, but if you wanna line up two windows side-by-side you can long click on the green "full screen" traffic light button in the top left of an app's window (e.g. Word), and choose which half of the screen you want it to maximise in. Then do the same for another running app in the other half and it'll split the screen between two apps at full size.
Okay, I just long clicked (5 seconds) on the green button and all it did is maximize the window. Tried long click + drag, didn't work. What am I missing?
I was in the same boat and after some Googling I found that you need to have "Displays have separate Spaces" checked in the Mission Control preferences. For some reason it requires you to log-out and back in after activating it, but afterwards I can indeed use split view as described.
I don't remember changing this setting, but it's always possible I did years ago. I have never been a big user of Spaces and somehow missed out knowing Split View was ever a thing, but it's actually pretty neat.
Works for me, long click, it resizes to left half horizontally, then when I release it fills vertically. On the right side is a collection of my other windows, I tap one and it fills the right side.
I second the Spectacle recommendation, but it is frustrating that you have to download an app to do something that should be built into the window manager.
Yeah, there have been many, many iterations of these apps, by different developers: SizeUp + Cinch, Divvy, Spectacle.
While I think that Apple should have just copied Microsoft's approach (i.e. what Cinch does) the wealth of options for window management means that everybody can find something that they like.
I've found Spectacle leaves a bit of padding space under where the dock folds out - potentially useful, but unwanted in my case, and I wasn't able to find a setting to change that behaviour.
I use BetterSnapTool instead, which seems to have a better collection of settings and solves my (admittedly minor) gripe with Spectacle.
I absolutely love moom. I can't imagine going back to using macOS without it. It's amazing to me that this kind of functionality isn't built in. I use moom entirely via keyboard to easily move/resize windows onto a set of customizable "grids." With moom I spend almost zero time/effort fighting with window placement and almost always feel that windows are exactly where I want them. It's a great feeling!
Best tool ever. The custom touchpad gestures are great, too. You can do stuff like three-finger swipe up/down to switch to previous/next tab, and rotate right/left to open/close tabs.
I hardly ever use the touch gestures but BTT will let you drive everything from the keyboard. I can put windows almost anywhere at will now.
I know I'm being selfish here but it would be nice if Apple covered this sort of stuff instead of concentrating on emoji and Siri. Both of which I have no real use for.
There is Divvy, which is an interesting approach. There is Hammerspoon if you fancy Lua hacking and Phoenix if you fancy JavaScript. Hammerspoon can do more.
There is a great app called Hyperdock that does exactly what you want with window snapping. It works so perfectly that I sometimes forget that it's not a native feature of OS X.
Not sure about your window size problems with external monitors; I find that OS X handles this amazingly well.
Not sure if it's just my experience, but I've found Hammerspoon to be extremely demanding on my older 2012 MacBook Air. I find single purpose apps somehow work better.
But I do love the amazing level of configuration and scripting that Hammerspoon allows you to do.
Ubuntu has these similar features in whatever default window manager thing it uses (I don't know the difference between window manager, desktop, etc for Linux). It also has "geographical desktop management," i.e., in windows and mac your separate desktops are to the left and right only, on ubuntu they can be up, down, left, or right. Might be trivial to some, but to me, having a "physical space" metaphor for my workflow is tremendously helpful.
Also you can drag left/right to snap to one half of a screen.
Anyway I tried to replicate this behavior on Mac with spectacle. It's meh. If you fuck up and fullscreen something and then try to half screen it, it seems to break horribly.
I've been using Amethyst and kwm for the past 4 years and am very happy with proper tiling in OS X. There have been some minor inconveniences but overall it's great.
I don't really find this to be a valid complaint. There are so many free tools out there that handle window management for macOS. I personally use Spectacle. Here is a list of 20+ window managers: https://www.slant.co/topics/526/~best-window-manager-for-mac
I don't understand how OSX and Windows lack a drag modifier key. Like, in most Linux WMs you can hold Ctrl and click and drag anywhere in a window to move it.
No one has mentioned it yet, but I have used Slate for years. It's a programmable window manager. You can program the Windows 7 functionality you described with a few lines of config, if you wish, and that's just the tip of the iceberg.
Hyperdock. Yes, it's a paid app, but it allows for what you want, along with my favorite gesture: swipe up/scroll up while hovering over an application's title bar to take it full screen. There may be free options out there, but I've had Hyperdock for years. It also gives you dock previews when you hover, like Win7.
Divvy is my go-to window manager on macOS. I've got it set up to arrange windows on each edge as well as a "communication" window in the upper left corner of the screen for Slack, etc...
This is one of many features I much appreciate in Mac OS X. Hot corners are another powerful feature that makes window managing and file moving a bliss. I find macOS much stickier than iOS. I can go without my iPad but not without my Mac.
It's hidden, looks like an afterthought ux-wise, and is inferior to the interactive bread-crumb style that the windows explorer uses. But better than nothing, thanks.
>and is inferior to the interactive bread-crumb style that the windows explorer uses
Actually, the Finder has interactive breadcrumb-style paths that aren't hidden at all (just not on by default):
View -> Show Path Bar
It also has a "path dropdown" with all the directories up to the current path, shown if you command-click on the current folder's icon+name on the top-center of the Finder.
"Show Path Bar" isn't interactive, but it's worth noting that the Command-Click trick lets you navigate to any of the folders along the path, not just display them. (Which I'm sure you know, but just in case readers don't.)
I've never found macOS's window management to be that bad, but to be fair I've been using Moom for a decade or so, and lately have been using the "split full screen app" trick a lot. (Someone else mentioned the long press on the green "full screen" dot for that, but you can also do it just by making an app full screen, going to Mission Control--which I do with a four-finger swipe--and dragging a second app on top of the full screen one.) That's not as useful for 27" monitors--in most cases I prefer to actually have untiled windows I can rearrange and resize with the pointer--but it's terrific for laptop screens.
>"Show Path Bar" isn't interactive, but it's worth noting that the Command-Click trick lets you navigate to any of the folders along the path, not just display them.
Not sure what "isn't interactive" means, but the OS X "path bar" lets you do the exact same thing (as you describe for the command-click on the folder icon): by clicking on any folder along the path you can navigate to it.
Note that it takes a double-click for that though. Perhaps you were only single-clicking?
Just hold alt/option when copying something and the Copy option will change to Copy path. The path can be pasted in the Go Menu (or with Shift-Command-G)
But Finder lets you drag any folder into an Open or Save dialog to navigate the dialog to that folder. That's something that continually frustrates me in Windows. As near as I can tell, the only way to move a Windows open dialog to point to a folder that's open in Explorer is to copy and paste the URL (which means throwing away whatever's in my clipboard).
What is so horrible about it? I have the exact same reaction every time I have to use Microsoft Windows. Window management is one of the strong points of macOS.
> A network drive hangs? Good luck with Finder..., the whole system halts.
I've never encountered this problem on either AFP or SMB shares.
Fullscreen mode loses focus constantly with 2 monitors, SMB shares are very slow especially with large amounts of files to be listed. Finder crashes a lot dealing with network filesystems. The search on Finder is extremely bad and doesn't hold a candle to Windows 10 search.
I am typing this on a Macbook Pro so I am not some Windows fan. OS X has fallen way behind in its desktop incarnation. The only reason I use OS X is due to its Unix shell for development. I don't think anyone can honestly say that as a GUI OS X is better than recent Windows.
> Window management is one of the strongpoints of MacOS.
Can you give an example of this? I regularly switch between macos, windows and ubuntu and I have a hard time coming up with much positive to say about macos window management (unless a third-party tool is used).
I never understood the whole obsession with running applications fullscreen. The whole point of a multitasking OS is to run multiple apps at the same time. If you need to run your apps fullscreen you probably need a bigger monitor.
For the other perspective - it actually boggles my mind when I see coworkers with one desktop open with 3-4 windows kinda mushed all around, different sizes, etc. Like, you paid a lot of money for that good display, why are you not using it?! How can you stand having to scroll all over a window to see something, when you could just fullscreen it? Is it that they need to context switch quickly? Why not just use multiple desktops, with all fullscreen or at least fully-half-screened apps?
I never understood this type of workflow and when I was in school it was everywhere, it's like apple was pushing for people to have little windows strewn out over an otherwise gorgeous display.
I am using it, that's the whole point of having multiple windows open. Each window shows whatever is relevant about that context, and it takes up no more space than necessary so I can have as many sources of information visible as I want.
Also, it's nice for spatial awareness. I feel a little bit lost when a window takes over the entire screen and blanks everything else out.
Regarding spatial awareness: because I have the same window in a given "desktop" all the time, my spatial awareness is locational. If I'm in my IDE, I know it's 2 desktops to the right to get to my terminals. From there, 3 to the left to get to my company chat. Etc.
It's even better in ubuntu because desktop switching isn't just left/right, it's up down. So I can really think geographically. I guess that's just the kind of brain I have - for example, if I want to look for tickets for a movie I want to see, I search "movie theater" in google maps and click the theater in the location I want to go to, then navigate through ticket buying etc. shrug who knows man
The common use-case is laptop vs. desktop. If you have a 34in. WQHD display, full-screening one app is going to be silly; if you're stuck with the built-in display on an MBP when you're at the cafe, it's much more practical.
command-tabbing between apps instead of windows is just wrong imo. You constantly have to keep in mind if the window you're switching to is part of the same app or a different app.
The way command-tabbing brings all an application's windows forward always bugged me too. Fortunately, there is LiteSwitch (http://sysbeep.com) that lets you switch to the frontmost window only.
While I'm at it, I'll also plug https://manytricks.com/witch which I find invaluable for switching between windows and just got an update.
It bothers me more that Windows Explorer doesn't have tabs natively. I use them all the time in Finder, and it's way more convenient to be able to right-click on a directory and open a new terminal tab in that directory than having to go to the title bar and type in cmd on Windows.
This is true, but better touch tool, moom, divvy, etc all leapfrog windows or ubuntu window management.
The network drive problem I have not run into with my NAS, so I don't know how to help you there.
I'm especially partial to sitting there with a 4+k screen with nothing open, opening two finder windows to copy a file, and having them spawn on top of each other as tiny little 500px squares in the middle of a vast ocean of unused screen space.
I was a MacOS user from System 7 through to Yosemite. I now use Ubuntu 99% of the time (I still have a Mac that I casually use). IMO Ubuntu is now superior to OSX in almost every way. For me, Ubuntu really does "just work", and gives me a vastly better package manager, and the ability to use the i3 window manager, which has fundamentally changed how I use my computer.
I am specifically saying "Ubuntu" and not "Linux" because I don't care that it's a bloated distro, I don't care about tweaking things just right. I don't care about the freedom and flexibility that "hardcore" distros give you. I just want to turn on my computer and get to work, and Ubuntu gives me exactly that.
What about stable APIs for development? I only ask coming from a C++ background. I see a lot of fragmentation in Linuxland, with encouragement to use Vala, every other GUI tool from Red Hat being written in Python, etc., so it doesn't encourage me back. This might be a bit harsh, as I recall using Qt with joy on Linux, but the big push appears to be for other languages.
Your experience may be different but I'd be interested in hearing it.
This might sound silly, but for me it's Network Link Conditioner for simulating different network conditions. The Linux equivalents are much harder to use.
Either you're trying to start an argument that I want no part of, or you're asking the wrong person. I don't have much experience with any linux desktops (I've just done server stuff mostly), so I have no idea. Hence why I'm asking.
I love OSX but never understood why people pay so much for those apple displays. we use them at work and they suuuuuck. super heavy, can't adjust height, can't put it in portrait mode which is .. I don't know useful 40% of the time, glares like a mofo, and cost 2x-3x similar quality displays from not Apple. Is the look of them that desirable?
I finally switched to same-resolution Dell units on my desk and am really happy about that.
For a long time, Apple displays were the only displays that also provided a proper laptop dock. Not just a cheap USB-to-XYZ-multiport-adapter with the limits those bring, but a real dock connected via PCI-in-a-wire Thunderbolt, making the ports on the back of the monitor just as good as ports on the machine itself. Combined with the monitor's MagSafe adapter and a MacBook of some sort, you had a beautiful setup: the only wires you needed to switch from laptop mode to desktop mode were the monitor's MagSafe and thunderbolt cords. Just keep all your desk hardware plugged into your monitor, and save the macbook's power adapter for when you actually needed to charge on the go.
There's also something to be said about the consistency of the old Apple displays. I've seen dozens at this point and none had noticeable backlight issues, and they look good right out of the box instead of requiring the user to switch off 15 gimmick settings to get a proper picture.
They had their issues but they did several things right.
The glossiness of them is highly desirable for anyone doing visual work: Photography, video editing, UX design, graphic design, etc. Colors and contrast pop a lot more.
It's a personal preference, however, so we let people at work make their own choices.
Apple displays have long been very high quality with accurate colors. They also usually come with a built in hub and ability to power a laptop, which is very handy. It allows the monitor to essentially be a dock.
>I love OSX but never understood why people pay so much for those apple displays. we use them at work and they suuuuuck. super heavy, can't adjust height, can't put it in portrait mode which is .. I don't know useful 40% of the time, glares like a mofo, and cost 2x-3x similar quality displays from not Apple. Is the look of them that desirable?
No, it's more that the glare means no bloody anti-glare coating (and hence more sharpness), the colors (saturation etc.) were often reviewed and measured as best-in-class, and the same goes for viewing angles, brightness, etc.
Oh, and portrait mode, while nice, is at best a niche use.
To be fair, when I bought my Apple Thunderbolt Display in 2012, I compared it to other monitors available at the time, and when you compared not just the display quality but the fact that it had an integrated Thunderbolt dock, it really wasn't too overpriced. (IIRC, Dell's closest equivalent was $949.)
The problem is that Apple almost never lowers their prices unless they come out with a "new improved" model of something. So, as a given product continues to be sold without an update or a price drop, the more steadily outrageous its price point seems. To the point of the original linked article, this is a serious problem for the Mac Pro. It was expensive at introduction, but it's been downright absurd for the last two years. This is an issue across most of the Mac product line currently, though.
I don't use Apple displays, although I use Apple hardware. I can justify spending four digits if it makes my life easier; I don't see how an Apple display can do that. I found Dell/HP IPS displays do the job for my charting needs.
No, as a pro, my reasoning is they are great for photo/video work, and even affordable compared to competitive solutions (I mean at the time, they don't make them anymore, but the 5K iMac screens are excellent).
Also, ColorSync works much better on Apple monitors. I have two monitors on my desk, a Dell P2815Q and an Apple Thunderbolt Display. The Apple has a default ColorSync profile that looks perfect; the Dell I got close, but I can't get it quite right. Colors change slightly when I drag a window from one monitor to the other.
Most users don't realize how good Apple monitors are, and how much better they make your work day.
I just don't understand scrimping on something you will spend all day staring at. Getting something "almost as good" for half the price is a terrible deal.
Er, are they good? I have a gorgeous Dell IPS 4k monitor at work and (I know, it's very silly) I actually get a tiny thrill whenever I pop my IDE open in it. It's just so darn pretty.
Sitting in front of a Dell 4k monitor, and having used a good 4k monitor, I would not call Dell's offering good.
I have never used an Apple monitor, but I have used several other 4k monitors and they are great. My Dell monitor is just fraught with compatibility issues and needs special software to work with win 7. I use Ubuntu and it barely works there, my coworkers have to treat theirs like special snowflakes.
I think I communicated that poorly. I only need to use win 7 every 3 months when I change my domain password. I don't need a 4k monitor for that, so I go without.
On Ubuntu the monitor needed much finagling to get working right, unlike my AOC or Asus 4k monitors which both worked when hotplugged using HDMI.
My coworkers, who aren't all devs, have more work to do in windows and they needed the special software.
How odd, maybe I'm just quite lucky then. To be fair I'm not doing any sort of dual-monitor setup. I just forgo my laptop screen entirely (close the lid) because I've found the one 4k to be plenty. Maybe if I started getting more complicated than that, things would get weird.
Most of my workmates have them. They look like glossy mirrors, and several complain about it. When you can see your reflection in them, plus all the room lighting, that's a terrible quality for a general purpose display. Meanwhile I have a much cheaper matte Dell monitor which is much more fit for purpose, and I don't suffer from the glare and reflections.
also, they run really warm... I don't use my cinema display nearly as often as I would like because I can't tolerate the heat coming off of it for extended periods of time.
oh yeah, our whole floor heats up so much they gave every person a futuristic Dyson fan to go with the display :)
this can't be good for energy consumption either
oh yes this is another issue. if they break they can't be re-ordered so now IT is stocking them for replacements. add another X to the price per person :)
Thinkpads work out of the box for every regular feature (I had to write a shell script to make my X1 Yoga handle folding it into tablet mode -- ~15 LOC).
A ThinkPad + Ubuntu will hit all your needs, I think. Lenovo preinstalls Ubuntu on some of their ThinkPads in some cases (AFAICT, for large enterprises). The ThinkPad line, IIRC, tends to use hardware that is well supported on Linux. You can check the certification list to be sure: https://certification.ubuntu.com/certification/make/Lenovo/
I was on the mac a long time, but eventually switched to Linux and am much happier.
EDIT: The ThinkPad P series has 16/32 GB of RAM; dunno about battery/size/weight, as there are a few models and you'll have to figure out what your preferences are in trade-offs.
I have a P50. It is a heavy laptop, but it has a mobile NVIDIA GPU which can easily outclass an Intel Iris graphics processor.
The only fronts where this laptop falls extremely short are battery life and weight. The P50 is really good if you need the power, and I usually get around 5 hours of battery life without Optimus, because it interferes with my workflow under Linux.
Same. Linux on a P50 with 64GB ECC ram, Xeon. Battery life isn't hot, but I'll suffer this compromise for the power and flexibility it gives otherwise.
I currently have a thinkpad with ubuntu in it (not preinstalled should it matter), and unfortunately I'm considering switching the other way around. There are just too many things with this combo that don't work the way they should, like webcam and microphone not working, public wifi with redirection not redirecting etc. I'm completely fine paying some 50% premium to get the support and customer care of a major corporation.
Most people don't have 5-8k to drop. The main problem with your specs is thinner/lighter is diametrically opposed to longer battery life and more RAM which consumes more energy.
In 2015 the MacBook Pro had a 99.5 watt-hour battery (100 watt-hours is the limit you can take on airplanes). Now it has 76 watt-hours.
So you could get approximately 30% more battery life had they not prioritized thin and light.
Everyone benefits from thinner and lighter. You only benefit from more battery life if you reach the end of your capacity.
For phones, it's an easy product decision, thinner/lighter is always better. You give the most benefit to the most users, and those who really need more battery life can get a battery case and pay the weight/thickness costs alone.
For a MacBook, it's a closer decision, but I think the fact that, in this case, they still reached the same battery life as the older laptops made the decision reasonable. Remember that it wasn't that long ago that a 7 hour laptop battery life was extraordinary; 10 hours should be plenty for most users. And there are also battery packs you can get if it isn't.
I heard that Apple wasn't able to get a custom fitted 85 watt-hour battery ready in time for the release, and I expect they will refresh the MBP lineup with it when it's ready, giving an 11-12 hour battery life.
I was lucky enough to go through all stages of old-enough-to-have-a-laptop school (high school and college) during whatever we should call the period in which laptop batteries rapidly evolved. So I started out as a freshman when laptops were hunks of solid IBM plastic chugging along at, like, 4 hours, maybe? And came out of college with Surface Pros and Macs humming at 8 hours. For me, it wasn't until they were pushing 6-8 hours (of ACTUAL usage) that laptops made sense as education devices - I'm at school for 6-8 hours, and I can squeeze in maybe 30 minutes to an hour of charge time if I want to be stuck in one place for that stretch of time (as opposed to, I dunno, walking around, working out, whatever). So there was an actual threshold of battery time that made the things viable for me.
I thought battery life on my MacBook was great particularly after AppNap was introduced and the aggressive shaming of power-hungry apps via the battery menu.
Then I build some C++ under Xcode and battery life is not so good...
Probably great for web browsing and casual use as used by 98% of users though.
Just because that's what you want does not mean it's what everyone wants. I, for one, would love a solid inch-thick brick of a phone with a replaceable battery, and a 2-inch-thick laptop with sturdy replaceable parts.
I know one is discouraged from questioning whether a commenter read the article, but I'll point out that Gruber is on about the Mac Pro, which is a desktop machine. I doubt the new one will have "longer battery life".
> most of their pro users use MacBooks and most of the rest use iMacs — and that they have big plans in store for the pro segment of both of those product lines
I think this makes talk about the Macbook Pro on topic.
Let's not mince words. Ubuntu and Linux OSes in general are TRASH as far as user experience is concerned.
Lately I even tried elementaryOS, and it's worse than Ubuntu. They keep saying how it's not a copy of OS X, and it evidently isn't as far as user experience is concerned, but on top of that they're obviously inspired by a design that's now completely outdated. At least Ubuntu is looking ahead and thinking of touch interfaces.
Ubuntu is genuinely the only somewhat passable option for people who don't know, nor should know, what a process or a thread is, or even how many cores are in their CPU. Ubuntu has a somewhat consistent UI but still suffers from all kinds of major bloopers. I mean, what the fuck. It's 2017, and it still doesn't save the last window size & position in most apps. It drives me mad. Some do, some don't, so it ends up worse than not supporting it at all.
I got fed up with Windows and Ubuntu so I bought a five year old Mac Mini. Sierra looks amazing, and it runs silky smooth. I don't AAA game and this will most likely serve me very well for web development. Came with a big SSD drive too.
It's kinda sad nobody can compete with Apple. But if anybody will I don't think it's the "free software" world.
Maybe it is the attitude that makes it "TRASH." It seems you have only tried Unity and elementaryOS? Right now I use Cinnamon, which is similar enough to Windows to not be "TRASH" imho.
Also here is a Linux joke for you:
If you don't like certain things, just fork it and do your own thing.
Cinnamon is good. Until you try to install a modern nvidia driver. At least, that was the case 6 months ago.
Dual monitors should be plug & play. I shouldn't have to add a new repository to apt-get, I shouldn't have to choose between 15 Nouveau drivers and 15 potentially system-breaking nvidia drivers. It should "just work".
Dual monitors, and just display output in general... this is 2017, this is very basic, expected functionality. It doesn't matter how complicated it is to implement - the user doesn't give a shit, they just want two monitors.
Free software offers a spectrum of quality and innovation, and you've picked the worst of it.
I use a desktop (i3), package manager (nix), editor (Emacs), programming language (GHC Haskell), file system (ZFS), bidirectional sync (unison) and security (gnupg) that are collectively far more innovative, powerful and stable than anything Apple have ever produced.
Of course, if you are specifically looking for consumer tech, ease of use and support, then open source probably isn't for you.
>Of course, if you are specifically looking for consumer tech, ease of use and support, then open source probably isn't for you.
If you can look past your smugness a bit, why? Why is it that if I want "ease of use", open source isn't for me? Do you not see the problem here? I can't see how you can argue your favorite projects are more "innovative & stable than anything Apple have ever produced", but in the same breath say that open source isn't for someone who wants ease of use. What exactly does "stable" mean to you?
Ubuntu are trying for ease-of-use and it's a noble goal. But if you are going to judge them only on ease-of-use, then it is difficult for them to compete with the resources of Apple, who are the richest company on the planet.
My point was that there are quality open-source projects out there, after you appeared to assert otherwise. But ease-of-use is not something developers/startups seem to be interested in spending time on.
If you want me to qualify stable, then let's compare Apple's bidirectional iCloud sync to Unison, or Time Machine to ZFS snapshots.
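To make the ZFS half of that comparison concrete, here is a minimal sketch (pool and dataset names are made up):

  zfs snapshot tank/home@before-upgrade    # instant, atomic, and practically free
  zfs rollback tank/home@before-upgrade    # step back to it if something goes wrong
  zfs send tank/home@before-upgrade | ssh backuphost zfs recv backup/home    # replicate it to another box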
I used to enjoy tinkering with computers and spending all my time on Linux trying to get things working. I didn't edit any code, just spent days getting ndiswrapper working, reconfiguring my desktop after an upgrade, refinding my partitions after an LVM upgrade decided to forget them, adjusting to the steady removal of configurability in GNOME, adjusting to the deprecation of things I used every day (Konqueror has gone! Use Dolphin! It didn't do half of what Konqueror did), faffing around with busted graphics and failed sleep/resume, adjusting to the "new" way of window management that decided 30+ years of windowing paradigm was "distracting" because someone's mum who had never used a computer before found it easier in a use-case test, etc. etc. etc.
I eventually got fed up with all of this and went to OSX with Windows alongside, after 15 years of Linux use, and that's from the RedHat 5.0 and 6.2 days. No, not RHEL: RedHat.
The "ease of use" argument is sad, and precisely what some forget when developing software - it's there to be easily used, else nobody will use it. The computer is there to work for YOU, not YOU work for it (ie, spend hours fighting with it).
You only have to look at Windows 8 to see that "ease of use" was abandoned on the Start menu and see what a mess that was.
Linux doesn't have to be that inconvenient. You can now buy machines preinstalled with Ubuntu LTS. If you want to keep upgrading to the latest and greatest, yes it can be a rough ride.
Don't listen to him, there are open source UIs that are easy to use. Gnome, KDE, and XFCE all behave pretty darn well and are pretty stable.
i3wm, the window manager the post above is talking about, is incredibly complicated, but provides efficiency and a sense of accomplishment when learned. That reward from learning something complicated is where the smugness of most open source enthusiasts comes from. Don't look too much into it.
Any perceived smugness on my part was a response to baiting from the parent such as "It's kinda sad nobody can compete with Apple".
Open-source can compete on many fronts and offers many other advantages (i.e. freedom), but on a pure ease-of-use assessment, I do not agree with you that Gnome or KDE could sway the parent, if Ubuntu completely failed to do so.
When you use a proprietary OS, you rely entirely on its creators to create a system that does what you want. When something in Windows or OS X is not what you want (or is broken), you can't do anything about it. When something in a free OS is not what you want, you always have the option to use something else.
Free is about transparency. Not snooping on my files, not doing machine-learning mumbo jumbo on my habits, not surreptitiously nudging me in the direction they want: that is a huge part of it.
Free is a promise, a promise that the software does what I say it does. If it doesn't, my deficiencies are there for all the world to see.
Free is also about not being an asshole. It is about accepting the fact that, just because the users use my software, I don't get to control their lives.
Specifically? People typically point to the telemetry and forced updates, but I've managed to disable both, using what were admittedly much-too-difficult procedures or third-party software. It's annoying, but not that annoying.
I really like Finder and Spotlight, but not enough to be tied, via licensing, to any specific hardware.
Less cluttering--I can uninstall apps on windows
Native package manager-- I guess Windows has Chocolatey and I use npm for dev work.
* HiDPI support in all apps, whether they are aware of it or not.
* Built-in PDF editing and creation from all printable content.
* POSIX scripting and CLI.
* Very clean and consistent configuration system (defaults). Reset an app to factory? Delete one plist file and potentially an app support folder (see the sketch at the end of this comment). Got a new Mac? You could even boot it up from the old hard disk. Good luck doing that with the Windows registry.
* Touchpad support.
* Systemwide fulltext search with indexing and complex search terms. Somehow MS still hasn't caught up with 10.4 Tiger it seems to me.
* Superb discoverability of power user features with in-app help system and hotkeys displayed in the menu.
* A consistent menu system.
* Powerful and system wide screenshots.
* Very good screen calibration out of the box
* Cmd-C / V work everywhere, including the Terminal.
* Very good terminal with good color schemes, tabs, unicode and even emoji support.
On the other hand Windows has:
* The best keyboard-only UI (although ribbons were a big step backwards in that regard - very hard to discover now)
* The best graphics drivers
* The best Office version (although Google Docs has mostly replaced the need for me)
* Windows-P, I really like that menu
* The Windows 10 task manager, pretty neat.
* Pen and Touchscreen support.
Overall MacOS beats it hands down for me when it comes to productivity.
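The "reset an app to factory" sketch promised above (TextEdit purely as an example; sandboxed apps keep their data under ~/Library/Containers instead):

  defaults delete com.apple.TextEdit                 # clears the app's plist via the preferences system
  rm -rf ~/Library/Application\ Support/TextEdit     # only if the app keeps a support folder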
I think most people who haven't used an Apple laptop extensively are not aware that the touchpad actually works, and you do not need to carry an external mouse to use the device comfortably.
Apple touchpads work so well that I don't use a mouse at all anymore. I have a Bluetooth touchpad with my external keyboard, and I even bought an old Fingerworks touchpad to use with my PC.
The power of gestures is incredible. Managing macOS windows without a touchpad is awful. With a touchpad it is the best.
We have those at my work, and we also have an iMac with a magic trackpad, and several of us have macbook pros.
It's a huge improvement, but it's maybe 75% of the way there. And the points at which it's NOT there are very noticeable and annoying. There are still times when the trackpad just gets totally confused and you can't move the cursor for 1-2 seconds. Mehhhhhh
I have had a work Mac for the last two years. I haven't found Apple touchpads to be this revolutionary change that'd make me like touchpads that others have and still carry a wireless mouse around. I'm not the only one in my office.
I literally don't know what I'd do if I didn't have the capability of Automator combined with completely-customizable keyboard shortcuts everywhere. It's seriously a power user's dream.
> The best keyboard-only UI (although ribbons were a big step backwards in that regard - very hard to discover now)
Use the Alt key.
For example, open Word and press Alt to show the keyboard commands. If you want the keyboard commands for the Home tab, press H as shown.
Alternatively, to open the Ribbon and show the keyboard commands for the Home tab, press Alt-H.
So, if you want to center some selected text in Word, press Alt, H, then AC
If you want to insert an image, press Alt, N, P and so on.
The Ribbon makes Office programs much easier to use, so you probably won't want to learn many of these key sequences. However, the ones you already know will almost certainly work.
For the record, I much prefer Windows 10. However, the fact is that Apple doesn't sell any of the hardware I use. It doesn't make a proper tower desktop and it doesn't make a small rotating-screen laptop that doubles as a touch tablet.
Even if I was willing to compromise on hardware, less-functional Apple products would cost 2x to 4x more.
As I wrote, I still consider the keyboard UI on Windows the best, because it's AFAIK the only widely supported desktop OS with full keyboard control of the GUI. But the Ribbons are not for me - it's a menu system that's also trying to be a context-aware palette, which makes it worse than either a classic menu (easy to skim through and find what you want, especially on MacOS) or a palette (can be placed wherever it's the most useful, i.e. allowing the shortest mouse travel).
Please note that I still do consider Windows Office the best version nevertheless, but for different reasons.
The Ribbon is a much better UI because it takes up less room, makes more features more accessible, and provides much better discoverability. To appreciate the pros and cons, I'd suggest going through the full account of the development [1], though there is a simpler intro/index [2] that links to a good video [3].
Of course, there's also personal taste, and you are perfectly entitled to prefer whichever menu system you like. However, the Ribbon won a decade ago, so at this stage, it would probably be more useful to learn how to make better use of it. My Alt tip is just one example.
Windows has WSL, which gets better in the new Creators Update.
>* Touchpad support.
I don't use laptops, but I've heard good things about the touchpad in the Surface line, and Dell XPS.
>* Superb discoverability of power user features with in-app help system and hotkeys displayed in the menu.
Windows has had hotkeys in menus as long as I can remember. I know they were there in 3.1. If you press Alt in Explorer you'll get overlays with hotkeys over the buttons and menus.
>* A consistent menu system.
I'm guessing you mean that the ribbon is inconsistent. Most programs use the regular menus, and the ribbon is just a glorified toolbar with tabs. I don't see the big deal.
>* Powerful and system wide screenshots.
Windows 10 has Win+Print screen to save fullscreen screenshots as a file. For more control, there's the snipping tool that's been included for years now.
>* Cmd-C / V work everywhere, including the Terminal.
Ctrl+C/V works in Windows terminal too.
>* Very good terminal with good color schemes, tabs, unicode and even emoji support.
I usually use ConEmu. I just tested, it does support emoji, but I don't see the point. I just tried "mkdir " (edit: seems like HN eats my emoji, but that's supposed to be a directory with an emoji in the name), and it worked as expected. If I use the built in terminal in VSCode it even looks nice, with colors, but it's just two blank rectangles in both cmd and powershell.
>>* Cmd-C / V work everywhere, including the Terminal.
> Ctrl+C/V works in Windows terminal too.
If you mean Command Prompt by "Windows Terminal", this is not the case (if not, I'd really like to know what "Windows Terminal" is - I use Command Prompt for DOS/Windows things and Kitty for *nix related things). At least I have had to turn it on explicitly in the Command Prompt options (quick edit mode), and it's one of the first things I do on a Windows machine after setup.
The Alt key in Windows does nowhere near what the Help on Macs does. You're looking for a command, or forgot where it was in the menu, or want some documentation -> open up Help, type a query in the unified search and you're presented with docs as well as commands. Highlighting a command shows you the full path in the menu by opening it up, and now you can even see the hotkey. That's what I mean by discoverability. Every programmer who uses GUIs should have a look at how that works, I consider that mandatory homework.
> screenshots
Obviously windows has screenshots, but you overlooked the word powerful. MacOS has all the features of the snipping tool right there on system-wide hotkeys, no need to open up an app first - including delayed shots and area selections. It's not a big deal, but it saves enough time that it's a total no brainer for me to provide screenshots for whatever question someone has (even when it's just a distraction from my actual task), while on Windows it takes a crucial 10-15 secs longer to do the same per shot and would disrupt my workflow.
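For anyone counting keystrokes, these are the hotkeys in question, plus the CLI fallback for the delayed case (as I understand them):

  Cmd-Shift-3                    # full screen, straight to a file on the Desktop
  Cmd-Shift-4                    # drag-select an area
  Cmd-Shift-4 then Space         # capture a single window
  screencapture -T 5 shot.png    # Terminal equivalent with a 5-second delay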
> WSL / Terminal
I do acknowledge that things are getting better there and this is a great development - if/when it gets there I'll consider windows among my primary PC choices again.
Btw. it's telling that Windows-only users always overlook my point about the registry in these discussions. I use all three desktops and I can tell you, not having a central registry in an OS is a huge productivity win. With windows I spend days every 2 years getting a fresh state again while on Mac I can just copy over the file system (using automated tools that support thunderbolt cables, copy half a TB in 30min and are again built in) and start working after a coffee break. Yes there's imaging in Windows, but then you have to regularly keep those up to date and in the end you spend even more time if you only manage a handful of PCs.
NTFS supports compression, hard links, streams, transactions, quotas, and has a master file table (or multiples of them) which enables tools such as Voidtool's Everything utility to find files in nanoseconds. APFS does not support any of this.
The MFT is very helpful in that it is on the disk itself, as opposed to Apple's solution of getting a separate utility to index the disk (Spotlight) and generating a giant database file on the filesystem in your hidden Spotlight-V100 directory. Spotlight is on the filesystem, not IN the filesystem.
The NTFS page on Wikipedia lists all of the wonderful features of NTFS. Also Windows Internals 6 details some in wondrous detail.
In any case, APFS is far better than HFS+ (which has to flip all metadata's endianness as it is stored in big-endian format). It also has single-threaded access to this metadata, from what I recall. John Siracusa's review 6 years ago of OSX 10.7 Lion detailed the poor state of HFS+: https://arstechnica.com/apple/2011/07/mac-os-x-10-7/12/
> * Very good terminal with good color schemes, tabs, unicode and even emoji support.
To be fair, the built-in Terminal.app sucks; you have to get iTerm2, which is a 3rd-party app, but at least it's not another $40 replacement for an app that shouldn't suck by default (the same cannot be said for Finder and its replacements).
The iTerm2 features page[1] has a pretty good run-down of all the extra stuff it can do. Personally, I'm a big fan of the split panes, and I use iTerm2 full screen with two panes.
That said, almost identical behaviour can be had in fullscreen with two Terminal windows sharing a full screen desktop space.
I also enjoy having my terminal be a slightly-transparent black rectangular slate, with no title bars, corner curving, etc., but that's purely preference and has no impact on functionality.
I think you touched on a big win for macOS in my book:
> Specifically? People typically point to the telemetry and forced updates, but I've managed to disable both, using what were admittedly much-too-difficult procedures or third-party software. It's annoying, but not that annoying.
Before I switched to macOS about 8 years ago I was used to the mindset of "oh this doesn't work the way I want but I'm smart enough to figure out how to fix it" and I took great pride in being able to fix my PC no matter what happened. I'd dig into the registry, I'd futz with inf files and drivers. I didn't mind it too much, it wasn't "that annoying". And then I switched to a MBP my freshman year of college, mainly because it meant I could use Windows/Linux/macOS, and I loved how solid they felt and had people around me rave about the hardware and longevity of the machines themselves. I played with macOS and found that, after getting used to it, it was a joy to work with. It took a little longer for me to realize that I wasn't spending all my time making sure my computer kept working; it just worked on its own. The OS that I had belittled and mocked for years for being "a toy" or "dumbed down" actually was insanely powerful under the hood, extremely intuitive, and looked beautiful. That last point may sound stupid, I know I used to think it was, but it's a big deal. You are going to be staring at this for 8 hours+ a day. Trust me, it is way more enjoyable to look at something pretty than something not. When it comes down to it, for me macOS is built on a rock solid core and makes switching between it and the linux servers I work on a breeze, with beautiful apps and a beautiful UI, all of which JustWorks (tm). For that I am more than willing to pay the MacTax (tm).
Although Apple have been known to change things for change's sake, I find you can sit down at a Mac running Mac OS 10.2 or 10.12 and find your way around either with ease.
Of course, under the hood they've rewritten everything like DNS and actually using the hosts file etc. but from a usability perspective you are right that you end up fighting with the OS less (so I have found).
I dunno, I've done all kinds of stuff to my windows 10 machine to keep Candy Crush Soda from reinstalling itself. Yet every few weeks/months that King garbage ends up on my start menu.
I use windows 10 because I'm a .NET developer and a gamer. If I could use OSX on my desktop instead, and have access to the same steam games - I can't think of a reason I would stay.
> Less cluttering--I can uninstall apps on windows
On OSX you typically don't need to 'uninstall' just drag to the trash. Yeah, some apps leave some garbage behind - but the same happens on windows when you 'uninstall'
> Native package manager-- I guess Windows has Chocolatey and I use npm for dev work.
I think the biggest detriment on this point for windows is that the command line interface on windows is not friendly. I recently worked on a project where half the team didn't know powershell, and the other half really loved powershell. We also used some stuff from the node ecosystem. We had scripts that would only run in cmd.exe, powershell scripts, and scripts that only worked in bash. In the end http://cmder.net/ saved my butt, since I could have all three shells open.
> I dunno, I've done all kinds of stuff to my windows 10 machine to keep Candy Crush Soda from reinstalling itself. Yet every few weeks/months that King garbage ends up on my start menu.
Another anecdote, I right clicked and clicked "unpin from start" and never saw it again.
Some people don't like having junk they are not using on their computers, even if they 'never see it again', it's all about being in control of your own computer, (as much as possible, that is).
It's still there though, isn't it? Granted you don't see it in the Start Menu, but it remained installed on your system. God knows if it's not having its own process at launch, doing similarly nasty stuff?
It isn't. You could easily check that in Task Manager or Process Explorer and/or Glasswire or whatever.
Most of these things take up a trivial amount of disk space -- some of them are just placeholders -- so they're not worth the time taken to worry about them.
You never asked for Solitaire or Freecell either. And you never asked for Notepad or Calculator or whatever. Did you have mini-hysterics about Solitaire being bundled, and did you waste hours worrying about it or removing it?
Candy Crush Saga was actually one of the benefits of Windows 10 that Microsoft promoted, for two reasons: (1) Candy Crush was enormously popular, and (2) it provides practice in touch operations, just as Solitaire got people used to using a mouse.
This is idiocy. Solitaire and Freecell are originally not video games, and nobody is making money off selling them. Candy Crush Saga, on the other hand...well, I don't think I need to explain to you who benefits from having it preinstalled on Windows.
This is idiocy. Freecell was originally sold as part of a games pack for Windows, and Candy Crush Saga is a highly-rated game that's otherwise available from the Windows Store. Thousands of people have downloaded it.
The Good Times video in Windows 95 also sold Edie Brickell a lot of CDs, but it would be equally stupid to criticize that. In both cases, Microsoft is providing something entertaining that demonstrates some benefit of the operating system.
I don't give a shit if it's highly rated. The Chainsmokers are also highly rated and completely critically empty as well, just like Candy Crush. To compare the two is lunacy, it's the difference between Windows Media Player coming with Bach sample music versus Nickelback. Thousands of people downloading something is no excuse for advertising it and forcing it down the throats of probably millions more paying customers. It sets a bad precedent for Microsoft either way because they're supposed to be the company that supports Freedom Zero where Apple doesn't, and yet increasingly your use of their software is contingent on accepting ugly and invasive advertising. This is also why "you can turn it off" is a non-criticism. "You can turn it off" is an acceptable response to Siri being included with macOS, I shouldn't even have to deal with advertisements in the first place.
If you don't see this as part of a bigger trend I don't think I can help you do so.
I guess you think it's clever to imitate me or something? Anyways you clearly have nothing more to contribute. Microsoft wanted to give away Windows 10 "for free" but realized they'd be screwed money wise, so they loaded it up with advertising and telemetric crap to try and compensate. It's really that simple. I'm sorry I bothered engaging in the first place, and I certainly won't continue if you just imitate me with more inane bullshit in your reply.
Note the "think" in my statement. I have no evidence. I just can't fathom Microsoft actually thinking pre-installing Candy Crush on Windows 10 provides a net positive value to their customer, more than the company King Digital. With its psychological scammy IAP scheme. Or maybe Microsoft is just in on the IAP scam. Either way, its the first thing I removed when I installed Windows 10.
Microsoft's Brandon LeBlanc certainly promoted it as a feature:
"Solitaire. Hearts. Minesweeper. These are games that have been played millions of times over the years in Windows. And they are coming back in Windows 10. If you’re a Windows Insider, you can check out a preview of the new Microsoft Solitaire Collection that’s included in the latest build of the Windows 10 Insider Preview (Build 10074). In addition to these games, we’re also working with partners to bring some of their great games to Windows 10 too. And we’re excited to be able to announce today that King will bring their game, Candy Crush Saga, to Windows 10. Candy Crush Saga will be automatically installed for customers that upgrade to or download Windows 10 during the launch! Over time, other popular and awesome King game titles will be available for Windows 10. Ever since Candy Crush Saga arrived for Windows Phone, I’ve spent countless hours of fun matching candies. I’m really looking to playing Candy Crush Saga and King’s other game titles on Windows 10."
For you, maybe. For me, I have zero intention of going through the hassle when I can just use Linux instead. Any hassles here are easily learned and generally don't _require_ the use of third party stuff doing black magic under the hood.
I'm not following here. Apple has great power management. Sure, it's closed, but you can actually rely on it today. Put linux on it and all of a sudden you have power management issues.
Windows is pretty bad on high DPI screens, and even third party tools don't help. There's inconsistency everywhere, and most older software is just blurry.
Chocolatey is nowhere near as good as e.g. pacman.
You mentioned disabling telemetry, but the fact that the OS you paid for spies on you and shows ads is a big turn off.
It's terrible to script, you cannot make use of the billion unix command line utilities (and there is no equivalent eco system either), you need to use the mouse too much, and it's awful to program with in any but a small number of sanctioned programming languages.
What language are you wanting to program in that is awful on Windows? And regarding your other points, Windows scripting story has come a long way in the last few years.
Your concerns sound like they come from many years ago and that maybe you haven't taken a look lately.
What baffles me is that it is 2017 and Windows comes with less than it did decades ago. At least prior to Win2000 there was QBasic and edit.com, but now starting to work on a Windows machine means downloading and installing basic tools.
Need to SSH/SFTP? Download.
Need to edit code? Download.
Python? Download.
Perl? Download.
A reasonable terminal? Download.
Yes, I might have to go to the Mac App Store and get command line tools for C, C++, Objective-C, etc. It is unfortunate that XQuartz is no longer installed by default, granted. (PowerShell is very nice to work with though, I give MS full credit for that.)
I could live with having to download python, perl, etc., but someone please explain why having an absolutely fundamental ability to edit text and securely connect to work and transfer data to/from remote machines is something that any OS should be shipping without?
> starting to work on a Windows machine means downloading and installing basic tools
While that is true, it's a bit of a moot point imo. I have yet to see an OS which had everything I needed for any actual work installed out of the box. As such, the way to deal with this is have a script download/install everything for you (and eventually copy or link all configuration files). Using Powershell getting any of the examples you mention is basically a one-liner just like for other package management tools, something like `openssh, miniconda, strawberryperl, conemu | Install-Package`. Or you can go more of a DSC way with Powershell Dsc.
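Expanded a little, the idea is something like this (package IDs purely as examples, and assuming a provider such as Chocolatey is registered with PackageManagement):

  'openssh', 'conemu', 'strawberryperl' |
      ForEach-Object { Install-Package -Name $_ -Force }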
Windows does have a C# compiler by default. It is located in the `Windows\Microsoft.NET` folder. Not very discoverable, though.
And starting with Windows 10 a metro app called 'Code Writer' seems to be installed by default for coding. (At least it's there in my installation.) I didn't try it though.
I'm glad. MS has made a lot of good programming tools. I really like Visual Studio and was a huge fan of QuickBasic 4.5 back in the day. Turbo Pascal was the killer app that first enticed me to the PC world.
It still befuddles me why some of this basic functionality cannot be a standard part of Windows. Not every Mac user uses emacs, or even knows how to open the Terminal app, but the fact that basic tools are supported means that there is just a baseline level of infrastructure to work with.
> but someone please explain why having an absolutely fundamental ability to edit text and securely connect to work and transfer data to/from remote machines is something that any OS should be shipping without?
The ability to edit text _does_ ship on windows - notepad. (not that I'd recommend it but it does exist). Also, OSX doesn't ship with anything better.
FTP isn't a fundamental requirement for everyone, I don't have an FTP client installed on my machine, and don't have any intention of installing one.
Notepad??? OS X has shipped with both emacs and vim since Day 1. I'm sorry, but you seem quite unfamiliar with Macs.
You are correct, I would never suggest anyone have an FTP client; I have that service turned off on every machine I administer. SSH / SFTP are fundamental tools however.
They are provided in WSL, but the Windows Subsystem for Linux is a separate install atop Windows, and the installation process is currently very complex and lengthy.
After going to the Windows store and upgrading to Windows Pro. That's another advantage of OS X- no need to target various OS revisions- the baseline OS IS the OS.
Am I going mad? Windows has a command line ftp tool, I used it this morning. Is it just a Windows 10 thing? Because it was also on my netbook running 8.1 out of the box.
Apologies, I forgot that it ships with emacs and vim. So you use the vanilla installations of emacs and vim for your editing? Or do you download a more recent version and configure plugins? Notepad is a perfectly adequate editor if you need to quickly change a config value, but I wouldn't consider it a usable day to day tool. Nor would I consider a vanilla vim install with no customisation a usable day to day tool.
> I would never suggest anyone have an FTP client; I have that service turned off on every machine I administer. SSH / SFTP are fundamental tools however.
Pedantry at its finest.
I have neither ssh nor sftp on my workstation and I have no need for either.
I've tried to do work on windows 10 Pro on a brand new 8 core Ryzen machine with 16 GB of RAM, SSD storage, and an RX480, and working in windows at the command line is 100x (no exaggeration) slower than Linux.
I'm pretty sure you didn't try PowerShell. It is at least as good as bash, if not better.
> you need to use the mouse too much
I find this more problematic on macOS. I was forced to use a mouse much more on macOS. It doesn't even let me answer a yes/no dialog using only the keyboard. On Windows nearly every item has a shortcut, sometimes even better than GNOME (but worse than KDE, IMO).
Powershell might have some neat features, but the verbosity of the commands (don't bring up shortcuts here either - you need the full commands as that's what everyone else is using) just drives me batty. I find that they don't lend themselves well to mnemonics or easy memorization.
Who needs ls when you have Get-ChildItem?
Who needs grep -r 'pattern' when you have Select-String -Path c:\ -Pattern pattern?
Now that I think about it, Powershell has a conceptual similarity to Applescript. A proprietary, verbose, and hard-to-discover English-like syntax belying a great amount of power over the target platform.
Microsoft might have gone a little far on the whole "code is meant to be read first" thing, but the verbosity does make it immediately obvious what every line in your history does.
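A rough illustration of what that verbosity buys you when you read the line back later (the path and pattern are made up):

  # recursive grep, the long-winded but self-documenting way
  Get-ChildItem -Path C:\logs -Recurse -Filter *.log |
      Select-String -Pattern 'ERROR' |
      Select-Object -Property Path, LineNumber, Line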
For me one of the joys of using macOS is when you realize adding option or shift to a shortcut does a similar but more specialized thing that is more useful.
In Windows the shortcuts have always seemed less logical to me, though the newer Windows key ones are an improvement.
I find it hard to take powershell seriously, considering last time I tried using it for anything, I just wanted to quickly download something over http. Looked for something like 'wget <URL>' online, the only results I found was something like multiple verbose lines of instantiating an http client and calling methods on it. That's fine for a programming language, completely horrible for an everyday shell, from the perspective of someone who generally only uses Firefox and a terminal emulator in Linux.
After some googling just now, it seems like there's now an Invoke-WebRequest command, but that too looks like a hell of a lot of typing compared to Linux shells.
There's also the fact that lots of stuff in windows wasn't designed to be accessible through the cli. You can't, for example, make a powershell script to toggle an audio output device on and off, as that's only available through the GUI.
> the only results I found was something like multiple verbose lines of instantiating an http client and calling methods on it.
I blame the documentation. Not the official documentation (they're not excellent, but OK), but various outdated resources residing in many blogs and sites, including Stack Overflow. I mean, what's wrong with
iwr REMOTE_URL -o LOCAL_FILENAME
? That this command wasn't searchable for you is unfortunate, possibly because PowerShell has come a long way since its introduction in 2006 and it is hard to remove the outdated resources online. But for those who know what features/commands are available in PowerShell, finding them isn't particularly difficult.
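For anyone who lands here from a search: iwr is the built-in alias for Invoke-WebRequest, so the unabbreviated form is simply

  Invoke-WebRequest -Uri 'https://example.com/file.zip' -OutFile 'file.zip'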
> There's also the fact that lots of stuff in windows wasn't designed to be accessible through the cli. You can't, for example, make a powershell script to toggle an audio output device on and off
This is just one example. On the other hand, I found nearly every thing I had done using GUI could be replaced by a few lines of PowerShell code. Actually automating my day-to-day GUI operations was my way to learn PowerShell, and mostly it worked great. Microsoft is adding tons of commands each release to expose more system functionalities. There are exceptions of course, but "lots of stuff" is a bit exaggerated.
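A typical example of the kind of GUI chore that turns into a few lines (the path and the 30-day cutoff are just for illustration):

  # clean out downloads older than 30 days - the sort of thing I used to do by hand
  Get-ChildItem "$env:USERPROFILE\Downloads" -File |
      Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
      Remove-Item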
You're not weird. PowerShell does have barriers to entry and that's why I also had used Cygwin on Windows until recently. And it might not change because it is a very different monster than POSIX. It is unfortunate for sure, but still I think it's worth learning especially if you use Windows frequently.
Even if you don't, learning it is a fun experience. It's like learning Haskell just to feel another way to program, even though you're not using it in practice.
I recently installed Windows 10 as a secondary os on my laptop for music making.
The biggest annoyance was configuring the system to not get in my way: never ever put something in front of the app I'm working in, forcing a change of context; don't use up all of the bandwidth when downloading updates in the background, etc. This was much less of a hassle the last time I used OSX. Apple might give you fewer options for configuration, but at least out of the box it is/was much less disruptive.
Overall I like the look and feel of windows 10, and some of the problems were caused by third party software. I do think a lot of the defaults actually make sense for the mass market, but they should be much more transparent and easier to change.
For example, I expected disabling Cortana during installation would, well, actually disable it.
I find my workflow in "bash for windows" horribly slow. My prompt command (that does git/virtualenv/etc checks) takes 4 seconds to run on windows and 0.04 seconds to run on the same machine dual-booted to ubuntu, and about the same time on a 2015 Macbook pro running MacOS. Everything else in bash for windows is similarly slower than bash on ubuntu. I haven't found a native windows implementation of bash that's anywhere as responsive as Linux or MacOS. I actually find Windows to be a better window manager than MacOS at this point, but I find myself most effective under Ubuntu and i3.
Just the fact that I can download some new software and still see a 2001 era file dialog or such (because it was made with an older, but still supported ancient UI lib), is enough to put me off windows for life...
Personally, rather than thinner/lighter I'd pay for it to be a bit bigger but capable of letting me open it up and replace RAM and the SSD as was possible up to the 2013 ones.
The resell market is good enough that you can sell your laptop and buy one with the upgrades you want for around the price the upgrade would have cost.
That's a workaround, not a solution. Not being able to replace memory is absurd. But being able to replace a hard drive (the part most likely to fail) is just evil.
No it doesn't. Lighter makes it easier to manipulate. Thinner makes it more difficult to hold and manipulate. Did you ever try to "manipulate" a sheet of paper without a surface to set it on?
I'm genuinely curious - why do you need 64 GB of RAM on a laptop? Are there industries where this is a necessity? At that point, wouldn't you be better off having remote machines?
I see this question often, and I totally don't get it. I do use 64GB for work, and in light of this news that no new Mac Pro will arrive any time soon, I'm considering bumping that to 128GB[1]. But I don't do high-frequency corporate hegemony work, or video editing, or any of that — I'm just a programmer.
Moreover, you could delete IDEA and Xcode and my 5 browsers and 7 text editors and my 30 terminal windows and git client and all the other work-related stuff open right now, and I'd still easily use 64GB just fucking around.
I wonder: how is it that people don't use 64GB of RAM? Do they reboot their machines every week? Do they fastidiously quit applications even though they'll probably use the app again within a few days? Are they just all like, "modern memory-swapping technology is so awesome compared to 1990s System 7 'Virtual Memory' that I love to watch it work, even though things run an order of magnitude slower in many critical sections"?
I really don't get it. Terabytes of RAM? Yeah, that might be hard to make use of today. But 64GB is definitely not too much, not for me and probably not for you, or even for your mom.
I don't have 64GB in my laptop, but only because I can't and have to settle for 16GB (unless I switch to a different OS, which is on balance a worse tradeoff currently, and yeah yeah I'm rooting for Linux but come on, one can only maintain hope for a couple decades and that mark is fast approaching...)
Cheap RAM is one of the things that keeps hope alive in this increasingly degenerate era.
> Do they fastidiously quit applications even though they'll probably use the app again within a few days?
s/days/minutes/
It's an old habit of mine that stems from spending most of my pre-adult life with old equipment, and using others' equipment with <1gb ram running Norton antivirus.... I'm so glad those days are over, but I haven't completely recovered. That being said, I do occasionally just leave everything running for as long as I can stand to. My system can never tell the difference.
As for using less than 64gb of ram... Ha! I have never had more than 8gb of ram! I have never needed more than 8gb of ram! 8gb is a lot of memory! 64gb? Are you kidding me? I've considered several times over the past few years getting another 8gb (for 16gb total), and ended up realizing I would never use it.
Don't browsers generally use more RAM when more is available? I only have 12 GB on my laptop and I've never really noticed anything. It's not like I'm using all 50 tabs and 20 applications at once, and the split second that it takes to reload the app state/data into RAM if it got bumped doesn't bother me.
Yes -- anything related to data science. Or, like in my case, algorithmic trading.
I used to keep AWS GPU nodes for that, but those are expensive, and it is infinitely more convenient to keep your data close to your workspace, instead of constantly uploading/downloading work batches.
A better option is to just have your own server at your house on a static IP address that you connect to via your laptop. That way you can have as much space as you want and it's always available to you.
Remote machines can't match the local machine in terms of latency. Especially when I'm "remote" from America somewhere in the middle of nowhere where the best connection I can get is a 3G.
But back to your 32/64Gb question. I need it to run my Docker containers. I do trading and stuff.
I use Photoshop a ton and currently I'm convinced it doesn't even TRY to use the ram. I open it up and it just seems to use about 2GB and happily chug away on swap rather than taking a bite out of my 32GB
I have some resource intensive VM's I run, but I also need to be mobile enough to go to different client sites. I could potentially do some of that stuff in the cloud but then the monthly cost starts to get high for just development.
I agree, the cloud is not expensive. £30/month gets you a beast of a machine with OVH that's always online with masses of bandwidth. Connectivity issues from the client are the main consideration.
Personally I want the Apple hardware, but would love to run 3 operating systems as I've never used Apple OS before. But am in love with the Macbook Air concept, seems the new Macbook Pros make the Macbook Air unnecessary. I mean they're insanely thin! The display, oh my god... not sure about those keys though.
I bought this Samsung Chromebook 2 to test out a 13.3" laptop size and the keys were close to being "counter sunk" into the frame (not really but they were noticeably low) and seeing one of the new Macbook Pro's in person at a store I was like "WTF is up with those keys."
Yeah, I love the software optimization that gives the long battery life on Apple's behalf, but the whole "gaming on Windows, and software like CAD/Solidworks on Windows" thing, I don't know. As I said, I'd run three operating systems, as Linux is my primary OS to develop on. I'm not looking forward to learning Apple OS; at least Visual Studio Code is on there.
I'm kind of dumb though in some ways, I keep thinking "If I have a computer like a Macbook Air I can develop on the go" but I'm most productive on my desk, two monitor, desktop. But I want to get setup for a moving/traveling digital nomad lifestyle so one device and barely any possessions would be great, particularly in the event of theft my device is outdated/protected enough that it will destroy itself and I can replace it relatively easily... but that's far from my current situation in life as a mere peasant.
This autobiography brought to you by schizophrenia, you gotta love it.
This is contradictory too, can I afford a $5,000 license haha. But that's something I'd like to be able to use in the future as I'm also interested in mechanical engineering/prototyping.
I mean visual representations something like SketchUp is great, free easy to use. My friend has a 3D printer and can produce STL files. I've used SolidWorks before. Yeah I don't know, I'm just rambling excuse me.
I've heard of Warez.
I'm also aware of free options like FreeCAD which I loaded on Ubuntu, pretty cool.
I primarily buy Apple for the hardware, secondarily for OSX. I used Windows for years and years, there's nothing about it I can't get used to. If MacBooks weren't so great, I'd be using ThinkBooks, they're not terrible. But MacBooks are so much better that it isn't even a close consideration.
That might be the lock-in talking, Apple would have to screw the MacBook line over as badly as they did the Mac Pro line to get me to switch.
I'm sure you've heard it before... but the XPS line from Dell is a solid choice for a Windows laptop. Just be sure to buy a new one. I recommend against going the Dell Outlet refurb route to save money.
The price differential doesn't really concern me. I've tried using different laptops, I always come back to Apple. I wouldn't call it insanely overpriced, it's just higher priced than everything else if you just went by specs. Being that the quality is superior, I do not begrudge them their profits. It would be a sad day if Apple ever stopped making hardware.
How? Just about everything is better. The biggest difference between Apple laptops and everybody else's is the trackpad. Nobody else makes one like Apple. You don't know what a pleasure it is to not have to think about the trackpad while I'm working. That alone lifts the build quality of Apple laptops to a level above ordinary Windows laptops.
The keyboards are excellent, better than the norm. I consider ThinkPad keyboards to be outstanding, slightly superior to Apple. But Apple keyboards are still way better than the average laptop, the second best laptop keyboard. I haven't used the latest Lenovo ThinkPads and I've read that their laptops have degraded over the years. So maybe Apple's is now the best. All I know is that it does its job without any fuss, and that's very important to me. The backlighting could get dimmer in low-light conditions, but I understand that Apple fixed that in the newest model.
Moving on, the batteries are outstanding. Apple manages to get everything right. Charging works well, though the chargers themselves are sub-par due to Apple's ill-considered decision not to use strain relief. I never have to worry about whether I'm charging my laptop too much or not enough; Apple builds all those decisions into the circuitry of the charging system. It's one more thing, like the trackpad, that I don't have to worry about when I use Apple and that I always miss when I start using other kinds of laptops.
There's the screen. I'm sure there are similar-quality screens out there, but Apple's is outstanding. With Flux, I can use it in lighting conditions ranging from very dark all the way to just shy of direct sunlight on a bright day.
There's the solidity of the aluminum construction. I wouldn't exactly say I'm careless with my laptops, but I don't use a case and I bring them to the bar. If I close the lid, it's practically impervious to spills. I've relied on this more than once. I've spilled liquid on it with the lid open, all that needed to be replaced was the keyboard and trackpad, the mainboard wasn't damaged.
Finally there's Apple's support ecosystem. I've never not left the Apple Store satisfied. Their reps are helpful and knowledgeable in a way that you really miss when you stray outside the ecosystem. One time I didn't want to wait for Apple after I spilled water on it on a Sunday, so I took it to a Micro Center that was listed on Apple's website. The difference in professionalism was night and day. Micro Center made me fill out paper forms and mis-transcribed my phone number, so I didn't get any notifications.
Literally everything about Apple's laptops is a cut above in terms of quality, and some things, like the trackpad and support, are spectacularly so. Other companies can get close to Apple on a few things, but only Apple can consistently do everything right. You're always going to be missing something if you go elsewhere. Apple hardware looks overpriced compared to a run-of-the-mill machine, but when you look at the high end of the laptop market, prices all look very similar. When you actually compare apples to Apples (see what I did there?) - similar-quality laptops like, say, the ThinkPad X1 Carbon - Apple comes out only slightly more expensive. I consider the premium very much worth it. I can see how a more price-sensitive customer could find it very expensive, but to me that's like comparing Toyotas and Hondas to Mercedes and BMWs.
I agree macOS is better. It does have issues and bugs, but in my experience, there are far fewer annoyances with macOS than with Windows. It's a little hard to point things out, but Windows does get in the way many a time while using it. It's also the overall quality of applications that run on Windows that makes for a poorer experience. Third party apps on macOS tend to be a lot better and nicer.
If there's one part where I like Windows and Windows applications a lot more than macOS, it's in the support of keyboard shortcuts (like all the Alt+ or Ctrl+ combinations). The apps as well as OS on macOS severely lack in keyboard shortcut support and depend more on a mouse or trackpad. I know I can define my own shortcuts for application/system menu items in macOS easily, but that's a big chore to do.
All the praise of macOS aside, I avoid Apple's own apps for anything where I need longer term availability. Take iWork for example. I never use it for anything that's not a throwaway project. Apple could, at any point in time, just decide that it's not worth it, junk support for all the files you've created (in its proprietary format) and then start afresh. So I use LibreOffice for all my longer term spreadsheet needs. Or Thunderbird for mail (and so on). The shelf life of Apple's own applications and their proprietary formats and cryptic file organization systems are relatively much shorter compared to FOSS offerings or even Microsoft's offerings. Apple's consumer side iLife apps also have hit some people hard in the past with data corruption and data loss (IIRC, iPhoto was notorious for that). Since I don't upgrade my hardware every few years, these factors hit me harder on the Mac side.
Linux is a lot more time consuming to manage for me (even though I'm fairly tech savvy). But a good combination of UI (looks, readability, fonts), usability along with perfect hardware support would be a great thing to have.
> a good combination of UI (looks, readability, fonts)
In my experience, Linux has this. Especially fonts. IMHO, fonts in Linux are slightly better than OS X, and worlds ahead of Windows. As far as UI, it depends what you want. There are a lot of options, and some of them look fantastic, some of them are very usable, and some fall into both categories.
> And please, don't tell me Ubuntu or other linux flavors.
Well, you seem to have isolated yourself from counterargument. I was given a 2016 Macbook Pro from work. My normal dev workstation is Ubuntu 16.10. I do truly prefer Ubuntu to OSX. I like the customisation, and even the default look and feel of Ubuntu just works better for me.
Also, as a developer doing mostly Haskell and Python, derping with Elm, rendering documents with Pandoc + LaTeX, all of my tools are perfectly at home on Linux. Everything works with OSX too, but it is usually a bit easier to get things working on Linux. I prefer apt to brew. I don't know what else there is to say, but here is a counterargument: if OSX and Ubuntu were both for-pay products and cost the same, I would pick Ubuntu.
The UX on Linux is not lagging. I'm talking about Fedora and GNOME. I have had Mac-toting people specifically ask me about my cool UX and what DE I was using.
Oh, and the XPS is better looking and not hype-driven like the touchbar MacBook.
I really suggest you give the Fedora livecd a try.
OS X (sorry MacOS) has problems, but dear god as soon as I try to use Windows or something else, I'm reminded how good the Mac is. The last 3 or so major revisions haven't offered much of anything I personally care about, though.
The first two points are, but not really the RAM. It's going to be an ounce at most. If they go from two 8GB sticks to two 16GB sticks, it wouldn't change size/weight at all.
The reason Apple gave is the truth, LPDDR3 only supports 32GB. You would need DDR4 for 64GB but Intel only supports regular DDR4 not LPDDR4. While DDR4 uses less power when active, LP saves significant amounts of power during standby and they would need bigger batteries to compensate.
I disagree. I moved from a MBP to a notebook with Arch and i3. The UX of a tiling window manager such as i3 is in my opinion much better and more efficient than the floating wm on OSX. There is no real tiling wm for OSX (don't mention divvy here, it's a nice tool but far away from a real tiling wm).
> I mean Windows is good, but nowhere good as OS X.
The vast majority of the world disagrees. The only reason anybody uses OS X is because it's Unix. Other than that, the UI is atrocious and severely lacking. It doesn't even come close to the robust utility that Windows offers.
Same here. I used windows for many years but the second I used macOS I was won over and that piece of software will keep me there for a very long time. I don't often (not never) run into problems with the hardware. Hell I barely ever notice it. But the software is really important.
That's a reasonable case for an Apple laptop. The Mac Pro is targeted at different use cases and competes against different products on computational power. Signal-processing workloads like audio and video editing that rely on Fourier transforms and parallel array processing benefit from big power-hungry CPUs and GPUs; gobs of RAM; and large, fast persistent storage arrays.
Someone who is processing lots of video or audio for money may spend most of their week at a desk in one or two apps and the faster the throughput of their machines the easier it is to meet client deadlines. And hardware starts to matter a lot and throwing hardware at the problem is often a good idea...the logic of rendering farms is the same as server farms.
> Am I the only one that sees OS X as the biggest reason to switch to Mac?
Not any more. Mac OS looks and feels very antiquated compared to Windows today.
As a developer, Windows has caught up to MacOS for the shells and beats it in pretty much all the other areas (UI of the OS, keyboard shortcuts everywhere, great file manager(s), much more productivity tools available, etc...).
And of course, Windows laptops are anywhere between 1/3rd to half the price of the equivalent Mac laptop.
These days, I use both Mac and Windows to develop but once I can no longer use my Mac Book Pro, I won't be getting another Apple laptop, it's going to be Windows all the way for a few years until Apple catches up again, if they ever do.
1. I can't figure out how to easily open up the bash/ubuntu thing where I need it. I'd love to just have a dedicated "github stuff" folder that it opens into by default (and can edit). I always seem to put stuff where bash ubuntu isn't allowed to touch, or I'm manually CDing around until I find that weird place where the C drive is. I mean yea I can google it, but I'd have to do it every time.
2. Pardon my french but it's fuckugly. The colors, the fonts, lack of transparency. Copying/pasting/etc all suck. How can I make this less sucky?
I actually have a few different setups on my various Windows computers, just to test things. And they are all pretty much equivalent, the only difference is the console I use (Cmder, msysgit, etc...).
The main takeaways:
- I share all my dot files (.bash_profile, etc...) between Windows and Mac OS. They are in a Google Drive folder and whenever I move to a new machine, I just copy them all verbatim and they work right away.
- Bash, git and ssh work out of the box on these Windows/UNIX shells.
I agree with the colors, fonts and the copy/paste interaction on Windows. Not great, but tolerable. I want to experiment with more consoles since there are so many to choose from on Windows.
Oh and Cmder has transparency and you can configure copy/paste to be by line instead of by block, at least.
I use ConEmu in windows as my terminal of choice. There are options to add an "open command prompt here" entry to the registry for directories/folders so you can right-click in Explorer. You can also configure git-bash as your default. Copy/paste also gets better; with the windows terminal, you can go into the config and choose the quick paste option (I forget what it's called on the Mac at work).
Mostly, I use VS Code and open that from where I am in Explorer. It has its own terminal that opens in the current directory, which I change to use git-bash -l, adding in the git prompt script(s). Overall it works very well, imho better than bash for windows (ubuntu userspace).
1. I put a cmd link in my taskbar, which opens to my Repos folder, and then launches bash (which will open in that folder under /mnt/c). Or just do `ln -s /mnt/c/My/Repo ~/GithubStuff` and have a symlink from Ubuntu land into your Windows drive. (One more option below point 2.)
2. Colors are better supported in the CU coming out in 3 weeks, and transparency + copy/paste can be setup via the file menu (right click on the logo in the command prompt title bar and go to Properties).
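On point 1, another low-tech option: if you just want bash to drop you into that folder every time, append a cd to your .bashrc inside the Ubuntu environment (the path is a placeholder, adjust to taste):

  echo 'cd /mnt/c/Users/<you>/GithubStuff' >> ~/.bashrc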
I am commenting as someone who uses both Linux, (at home) and a MBP, (at work) on a daily basis:
> And please, don't tell me Ubuntu or other linux flavors. They look good (and are good if you are programming on them) but the UX is still lacking a lot.
There certainly are some specific areas, like 4K support, which are still being worked on, (and is in fact worse on Windows), but other than that, I cannot see what macOS offers over a modern GNOME desktop, UX wise, can you offer specifics?
On my MBP, I always feel constrained, doing any heavier compilation slows it to a crawl and sends the temps to the very edge of what the CPU can handle, there's not even a point in having a CPU that can turbo up to 4GHz, since I never, ever, saw it happen under macOS, Finder is a joke, (Path Finder is good, but you can get there with built-in File Managers on Linux), Xcode is a joke of an IDE, stuff like syntax highlighting and autocompletion is randomly gone every couple of minutes etc.
My main problem with macOS and the hardware it runs on, is that even on a top quad-core model, you don't really feel like 'this thing has power to spare', the last thing I want to feel when working on a computer that costs over 2k.
> god forbid you have a problem (especially a hardware problem) and then try to debug it. Good luck searching online for a resolution.
I don't think that is really true anymore, places like the Arch wiki and forums are a sure way to get almost anything resolved. On the other hand, try having a hardware issue, (personally experienced WiFi drops, dead pixels and GPU glitches), on macOS, nobody even tries to resolve it themselves, you're told just ship it back to Apple.
> Never mind the confusion of the different flavors, packaging systems, and configurations
Not really a problem in practice, (but a very tired talking point), you just stick to your distro's 'ecosystem' and be done with it.
> Anyone figuring out the Linux/Laptop problem is re-inventing the Macbook Pro/OS X.
There are surely people doing just that, but I don't think that's the majority.
Linux with GNOME just 'clicks' better with my workflow, the system feels a lot snappier, the package manager is in charge of every update and systemd is powerful and easy to use, so I always have a very clear picture of the services running on my system and their health.
Additionally, my Linux laptop is approx. 3x as powerful as the MBP for less money.
The only thing I really wish my Linux laptop had was the MBP's superior build quality, but things like the XPS line are catching up fast, so there's hope...
Once Linux is set up without hardware compatibility issues, it's substantially better than OS X for programming, in my experience: the software Just Works more, compared to brew or whatever. I know that's ironic, but I've used both of them over years. I'm sure other people have contrasting experiences in some other sub-ecosystem of software development.
No, I used Mac then Windows in the PowerPC days, then came back to macOS on the switch back to Intel. I still need to use Windows for work sometimes and I still think some things about it are better (file management), but in so many ways macOS is way better. If I needed to build a high-end video/gaming workstation now I would buy a PC, but I would still have a Mac laptop to do everything else. Let's see what Apple comes up with, but my guess is that this is the reality going forward.
I would say no to a native package manager. I do not want to make Mac OS Linux. I would like to keep Mac OS as Mac OS. It's like recommending putting Quartz on Linux.
I don't think the built-in apps count as clutter. If you don't use them, what's cluttering about them? I'd rather have them than not.
Things Mac OS is missing compared to Windows:
- COM
- Stealthy updates
There may be more.
> They look good (and are good if you are programming on them) but the UX is still lacking a lot. (Never mind the confusion of the different flavors, packaging systems, and configurations).
How can plaintext configs be confusing for a competent software engineer?
On Linux, you can make your OS work for you just the way you like it.
Good luck changing anything significant in Apple's walled garden
Plain-text configuration comes in a million flavours. The differences in syntax go from large to subtle, it's easy to make mistakes, and they're usually hard to debug. Documentation of both syntax and semantics varies wildly in quality. If only there were a convention...
Nowadays I'm on a Xubuntu box, and I've managed to have zero system-wide configuration. I've made a metapackage that depends on the software I use---over the years I've whittled that down to a couple dozen packages---and installs a package repo for itself. Until recently I was running a FreeBSD box where all the desktop setup was my own (VTWM, dunst, many little programs; I ran Arch for a couple of years before that). That's nice, but it becomes baggage quickly: so many points of failure, and only me to maintain it. Now that I'm back on (X)Ubuntu, all I have to configure is my shell, git, mercurial, and my emacs.d. It isn't even close to how pleasurable it was to run BSD or Arch, but the minutiae are tiresome. I'd rather keep configuration as small as possible and focus on my actual work and indispensable tools (Emacs, VCS, shell, in descending order) instead.
The usual Unix level of customisation is good on a server, but for daily use it's hard to sustain. And the configuration files, with their lack of conventions, make it harder (one has to know tens of dialects and languages).
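For anyone curious, the metapackage trick above can be done on Debian/Ubuntu with the equivs package; a rough sketch, with the package name and dependency list made up for illustration:

```
# rough sketch of a personal metapackage on Debian/Ubuntu (names are examples)
sudo apt-get install equivs
equivs-control my-desktop        # writes a template control file named my-desktop
# edit my-desktop: set "Package: my-desktop" and a Depends: line listing your software,
# e.g.  Depends: git, mercurial, emacs, zsh
equivs-build my-desktop          # builds my-desktop_*_all.deb
sudo dpkg -i my-desktop_*_all.deb
sudo apt-get -f install          # pulls in the listed dependencies
```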
You can very easily buy a MacBook with a "normal" US keyboard, even in France. Personally I gave up the french layout several years ago. It took me maybe one week to get comfortable with the US-International input to type accents when writing in French, and that was it, never looked back.
The US-International input is very simple: to do an à you do ` then a, to do a ç you do ' then c. This input mode is however quite annoying when writing code, so I switch between US and US-International input mode depending on what I'm doing.
> But Apple not having an open package management ecosystem
It's not from Apple, but Mac OS has MacPorts and Homebrew for this, just like Windows has Chocolatey.
Unity makes the Linux desktop usable now, but I still wouldn't call it slick. Even if it leaned more on KDE instead of GNOME, it still wouldn't feel right compared to either Mac OS or Windows.
Going on a tangent, I still wish Apple would release a screenless iMac. Not as many people want or need a screenless MacBook Pro (the Mac Mini) - the 16 GB RAM limit is annoying and Thunderbolt storage is lacking compared to plain SATA - and not as many people can justify or afford a modern Mac Pro. If Apple wanted to see the market demand for that, they should try tracking sales of used old-gen Mac Pros on eBay.
Not sure why you still feel that Apple needs "an open package ecosystem" when you're already happy with homebrew. Apple computers have a lot of problems, but that isn't one of them
People really need to start understanding that there's a hard choice to make between thin/light, battery life, and power. You can't have all three in the same package. The laws of physics prohibit it.
I love Mac OS but it gets regular updates and with each update it's more demanding. The hardware needs to keep up. Try running iOS 7 on a 3rd-gen iPhone.
I am using a 2010 MacBook Pro 13" (2,4 GHz) as my everyday development machine with macOS Sierra and Xcode. I maxed out the RAM to 8 GB and replaced the HD with an SSD 3 years ago.
The screen keeps feeling smaller every year, but I can not complain about the performance of this nearly 7 year old computer.
Still running a 2012 non-retina MacBook Pro. I shoved 16GB RAM in as soon as I bought it (it officially only supports 8GB).
But it was a real dog with a hard disk. Shoved in a 1TB SSD and all is good (and that's a SATA3 not NVME). Should continue working for another 5 years no problem.
I really dislike this argument. If you are not doing any serious work, then yes, Apple products are a luxury. If you are doing serious work, then you can afford Apple products easily. Let's say you spend $10k on your 2-year updated hardware. That comes at roughly $14/day. That's about how much I pay for coffee daily (cafes + Nespresso capsules)
Edit: And that doesn't include the re-selling price. I do give my old hardware to my mother, so I take it out of equation.
It's only $14/day if you work 365 days a year. If you work a more typical 253 days/year, then it's $20/day, which could be half an hour at a developer's hourly rate.
Does that $10K hardware save you 30 minutes/day over what you'd see with $3,000 worth of hardware (or keeping your $10,000 hardware for another year)?
For some, the answer is clearly yes, but for others, maybe not.
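To make the back-of-envelope above explicit (same $10k-over-2-years assumption):

```
# back-of-envelope for the $10k-over-2-years figure discussed above
echo "scale=2; 10000 / (2 * 365)" | bc   # = 13.69, i.e. roughly $14/day counting every day
echo "scale=2; 10000 / (2 * 253)" | bc   # = 19.76, i.e. roughly $20 per working day
```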
I'd really like to know what you do to make a 2 year Macbook Pro upgrade cycle worth the cash outlay. Performance updates have been really incremental between releases for the last while.
I do some serious machine learning and DSP development (except for deep learning, but an upgrade is not going to help with that) on my early 2013 rMBP and I feel bad asking for an upgrade although I could probably get one, because there is really nothing wrong with my machine. Maybe when a 32GB version swings around...
And if you do, that does not mean every other Linux distribution is bad. Try [kxl]Ubuntu. (Not normal Ubuntu unless you honestly think you would like unity. Most of us do not.)
Not worth the trouble. It's too much hassle with the updates and all that crap. I have people calling me about their hackintoshes and I tell them to call someone else.
I'm really, really surprised you said thinner/lighter. I have a few years old MacBook Pro and it's already plenty thin and light for a "Pro" laptop.
A lot of the other things you list are in direct conflict with thinner/lighter, and Apple's obsession with thinness is probably why you don't have them.
I'm with you on this. The bloatware in modern operating systems, even once you take away the PC-vendor garbage, is appalling.
I feel like they could keep things the way they are, and offer an AOSP-like vanilla version for people who generally know what they're doing (Google does a pretty good job with limiting it).
I'd pay hundreds of dollars more for this software option (I've considered hiring someone to do it on a new PC, but I have had bad experiences with PC repair shops).
I bought my spouse a new PC and literally had to spend HOURS removing software and decoupling McAfee from Windows. An i5-based system was crippled out of the box while it downloaded updates and software from the Windows Store that I didn't even want to begin with.
OSX is markedly better, but I don't need Siri, iCloud, Chess, iLife, DVD Player, Photo Booth... the list goes on.
You can turn Siri and iCloud off at first boot-up, and the others are just apps that sit in /Applications and don't really take up much space, and won't bother you if you don't use them.
(The only exception might be Garage Band with its huge sound files, but you can simply find and delete those.)
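If you want to see how much space that content actually takes before deleting anything, something like this works; the paths are from memory, so double-check them on your system:

```
# hedged sketch: where GarageBand's big sound libraries usually live (verify first)
du -sh "/Library/Application Support/GarageBand" "/Library/Audio/Apple Loops" 2>/dev/null
# if you never use GarageBand, removing the app and its content frees several GB:
# sudo rm -rf "/Applications/GarageBand.app" "/Library/Application Support/GarageBand"
```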
When you get a new Windows computer, the first thing to do is give it a clean install using an image direct from MS (remove all the partitions), which gives you a nice clean start. Don't even try to uninstall the crap the OEMs have installed; there will always be stuff you've missed.
OS X is a bloated version of BSD. Why run that when you can use Linux for free? Install the software you use and nothing else. Your package manager will keep all of your software up to date for you. You won't be running an absurd antivirus program all the time...
I understand completely. GIMP has never been very good (though quite usable in many cases). GTK (Gimp ToolKit) is a nice library, though. Krita is much better, but is very focused on painting.
At least you can run a real OS whenever you aren't using that specific software. Any reasonable linux distro will use <10gb including all the software you really use, and have a nice automated installer to shrink your windows/mac partition and install in the empty space.
Sure it's not ideal to have to reboot, but with solid state drives, rebooting isn't very much hassle anymore.
Wow, what a torrent of negativity on this thread, I didn't see a single positive comment reading through the top level.
This is excellent news, I'm chugging along on a 2010 Mac Pro and was very disappointed when the Apple Displays were cancelled last year. They are a staple of the lineup and always look gorgeous compared to what's on the market. I will definitely be buying the new Mac Pro and 2 displays to go with it. There is absolutely no way I will ever use Windows and having to downscale to an iMac or, worse, a MacBook Pro, when my Mac Pro is finally too old was filling me with dread. By the sounds of things, they are working on making it modular and expandable, also very good news as I like to keep my workhorse computer for a long time.
Overall very excited to see what Apple announce next year!
Much as I am keen to see the Mac Pro replacement, it's not unnecessarily negative to point out the absurdity of the fact that in December 2017, Apple will be selling, at a premium price point no less, an _Ivy Bridge_ processor. And charging _twelve hundred_ dollars for 48GB of memory.
The current one is Ivy Bridge. I don't see a minor speed bump prior to a major revision including updating their custom motherboard design from LGA 1155 to LGA 1150. Especially considering their neglect thus far.
Yep, great news indeed! For me it's not so much about the Mac Pro specifically, but that Apple is concerned about pro users and is willing to talk about future plans.
Deciding on using Macs for your work is not only about the devices themselves -- you're buying into the whole Apple ecosystem. That's not something that's easy to change in a heartbeat according to which vendor currently has the best offering. And the more concerned you are about the future of Apple's products for professionals, the more tempting it is to look long and hard at Windows or Linux, even if Apple's current devices fill your requirements just fine.
> ... and was very disappointed when the Apple Displays were cancelled last year. They are a staple of the lineup and always look gorgeous compared to what's on the market.
Have you looked at what's on the market? There are many great displays out there now, all far better than Apple's offerings. Apple's displays were good for about a year before the competition surpassed them.
> There is absolutely no way I will ever use Windows and having to downscale to an iMac or, worse, a MacBook Pro, when my Mac Pro is finally too old was filling me with dread.
There is a third choice: Linux. I've been using it as my desktop for 18 years now, and it really is wonderful. With a tiling window manager (no GNOME, no KDE, just X11) the UI gets out of my way and lets me work.
Yeah, I do kinda sorta wish that there were lighter or prettier laptops around, but on Linux I can get work done: on macOS or Windows my productivity is severely hampered.
Yep, that's where I'd go if forced to. I've seen some pretty nice screenshots from the newest window managers of late. I value design and ease of use quite a bit though and the number of complaints about things not working or tools not being as good (think Adobe CS) on Linux puts me off.
A modern Fedora/Arch system is indeed very pretty and usable, and if you pick decent hardware (i.e. not the trash from Best Buy; spend a similar amount as you would on a Mac Pro if you expect equivalent results), you won't generally have problems.
I'm sorry but you should be at the very least impatient with them, as a customer. This thing doesn't even come out until 2018 and you're using the 2010.
Look, I use Apple products all day, damn near every computing device I have is made by them. It's just astounding how they've let this languish. Very bad.
The whole PC industry is slowing down. I have a gaming rig built around a 2600K, a chip from 2011, and it's perfectly adequate. There's no urgent reason to upgrade to a newer chip like the 7700K just to get a maybe 50% speed boost. The only reason I'm even considering it is because of wanting to bump to a 1080 for VR.
The days of performance doubling every 18 months are completely over. Now it's more about power efficiency and more cores rather than peak single-thread performance.
This. We still have 8 months left. How hard can it be to design a tower? I guess Apple has some interesting trick up their sleeve, but come on, how long could it take? Just throw a few more engineers at it!
The trash can was a neat design. The problem is that Apple came up with a design that was wrong for the market that wants a Mac Pro. This is a market that wants to upgrade and expand their computers. They want to add some of the largest-capacity hard drives they can find. They'll put in an expansion card or two. The GPU that would be fine for me until I bought a new computer? They'll want to replace it next year.
I don't think the tower will be that hard to design. I think the issue will be in making a new motherboard. Or do you think, they'll just use one of Intel's designs?
Excellent news indeed. I was well underway compiling a hackintosh component list, dreading building a possibly less stable system. I'm really curious what price points these new Macs are going to end up at, and whether multiple CPU/GPU vendors are supported (Nvidia/AMD).
I remember building a hackintosh about 6 years ago, has much changed since then? It was a lot of fun and a lot more challenging than a regular Windows PC build.
> Wow, what a torrent of negativity on this thread, I didn't see a single positive comment reading through the top level.
This is a thing many comments nowadays capitalise on. First, I'd rather read criticism than praise, to see for myself whether all the good in something is outweighed by the criticism or not. Second, your statement is false, as there are as many positive comments as negative ones. If the negative ones are higher up, that means people upvote them, i.e. agree with those negative comments, which is a right they're entitled to.
For me -- and, I suspect, many others -- this is too little too late.
Just a few months ago, I spent somewhere around $4500 (all-in) putting together a new workstation. It runs Linux instead of OS X and this has led to me using my (4-year-old) ThinkPad more than my (18-month-old) MacBook Pro (when I'm "on the go"). I actually plan on selling the MBP; I just haven't gotten around to it yet.
I'm sure this is great news -- and long-awaited -- to many people... but some of us got tired of waiting.
Same here. I was a huge Mac fan, switched during the TiBook era a good 14 years ago, and used them throughout my professional career (designer, 3D, animation, creative tech), but after waiting and waiting for a pro refresh with Nvidia I bit the bullet and built a PC in December.
I would have been willing to spend several thousand £ on a Mac Pro that met my needs. Instead my needs ended up being met by a PC with a GTX 1080 that cost less than the baseline iMac, and within 15 minutes of 3D rendering using CUDA and Octane Render I asked myself why I didn't do this years ago. My wallet thanks me.
Now I don't really see the point in going back to Apple, feels like they've only just decided to care about pro users again and this was probably a decision that came out of a meeting where it was 50/50 if the line was killed or rebooted.
Especially since the whole virtual escape key thing– being a default layout vim user for 20+ years– I'd love to follow suit, but I need Adobe Creative Suite. The tools available on Linux aren't even in the ballpark. Getting OS X running in a VM has always been fussy and frustrating, and if I wanted to work with Windows I'd just use a Windows laptop. On top of that, the Apple Magic Trackpad is just great. Fussing with extracted Boot Camp drivers to get it to work on a virtualized Windows so I can run some essential software? No thanks. I need quick access to a comfortable, familiar replacement environment if something goes wrong with my regular computer; I can't afford to futz around with that stuff.
> Especially since the whole virtual escape key thing– being a default layout vim user for 20+ years
I really don't understand how there can be vim users that haven't remapped escape.
Like, if you care about ergonomics enough to complain about making a rarely used key into a touch key, how can you not care enough to remap that key if you use it more often?
And Mac OS even has built-in support to remap caps-lock to escape, and has had for a long time I think.
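There's also a command-line route on newer macOS versions via hidutil; as far as I know the mapping doesn't survive a reboot unless you wrap it in a launch agent:

```
# hedged sketch: map Caps Lock (usage 0x39) to Escape (0x29) on newer macOS;
# to my knowledge this resets on reboot unless run again from a launch agent
hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x700000029}]}'
```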
Funny side-note: I've broken the keycap for my escape key (on a TypeMatrix keyboard).. yet I haven't bothered to try to fix it. It's just not worth the trouble.
That is impressive. Since before xmodmap, on a device that didn't either have the Ctrl key positioned so that "^[" was easier, or have the Esc key closer to the home row?
I have large hands, so I 'like' vim escape key being mapped to the actual escape key. I also have 20+ years of muscle memory backing it up. I also do a fair amount of setting up servers or virtual environments and like to have my editor already set up on every machine I touch. I also use caps lock for writing SQL. So yes, there are people who haven't remapped it 'yet' and as long as it's still there, never will.
I've been using the escape key for vim since I started work as a programmer 21 years ago. That's a long time and I really don't want to change it. I know it might seem silly to you, but I'd switch to Linux on an XPS before I'd upgrade to a Mac with no escape key. (Not that I have to, plenty of Macs available without the touchbar)
FWIW, I've been using the touchbar MBP for ~6 months now and I haven't had any issues hitting ESC. It definitely feels different, but it works just fine. If anything my problem is that I hit ESC too easily now. I used to rest my hand above the ESC key when web browsing but that translated to resting it above tilde on the keyboard, which meant touching the ESC area. I've had to train myself to not do this.
I use that method too. When I found out it was available I ditched my experiments with jj, jk, etc. This and finding out how many users just use / for most of their higher-level navigation needs has helped me quite a bit.
It's never too late for me to look at potentially better alternatives. I would never work myself into a situation where switching to an alternative major OS would be extremely hard. But sure, depending on what you work with, that can of course happen -- issues like software unavailable on said target platform. Disregarding that though, I prefer not to close doors by free will.
Too late maybe (at least for your personal upgrade cycle), but I don't see how a complete re-think using a modular design, pro monitors, and up-to-date components can be too little. What more were you hoping for?
Don't be so dramatic. There's a massive silent majority who will not leave OSX. The new MacBook Pros are literally some of the best machines Apple has ever produced, and a renewed focus toward pro users for the 2017 refresh (with 32gb of RAM) and a refresh of the Mac Pro will easily re-secure Apple's preeminent position in this space.
Same for me. Built my first PC in ages, 8 core Ryzen, 32 Gigs of RAM, NVME SSD, 1060 GTX, all below 2000$. Running Ubuntu on it for now, works very well.
When you start getting into Hackintoshes or running non-OSX operating systems on Macs (aside from Windows and Boot Camp), things can get iffy in some situations. That list is pretty nasty, though I'm fairly surprised someone got the damned Touch Bar working for Linux.
But if there's one good thing at play, it's that there's a fairly active community that tries to tackle these problems.
I've had very good luck with my conversion on my old MBP so far. I have a late 2009 kicking around the house, basically a web browsing computer or backup machine if I leave my newer model in the office, etc.
Since I couldn't get the latest macOS on it, I figured I'd just wipe it and install a Linux distro - all told I was maybe an hour into the process and had Ubuntu 16.10 running like a champ. The built-in Ethernet was a life saver because I had to replace the WiFi drivers, and there were a few power settings I tweaked.
A few weeks post-install I've tweaked touchpad settings and various other things; aside from the WiFi driver swap, everything worked nearly straight out of the gate and was certainly functional immediately post-install.
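For anyone attempting the same thing: the WiFi in these older MacBooks is typically a Broadcom chip, and on Ubuntu the fix is usually just the proprietary driver, installed over the wired connection. A rough sketch, assuming a Broadcom card (check first):

```
# hedged sketch: typical WiFi fix on an older MBP under Ubuntu, assuming a Broadcom chip
lspci | grep -i network                     # confirm it's actually a Broadcom card
sudo apt-get update
sudo apt-get install bcmwl-kernel-source    # proprietary Broadcom STA driver
# some chipsets want the open b43 firmware instead:
# sudo apt-get install firmware-b43-installer
```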
Known limitations that affect me are pretty much limited to Thunderbolt hotswap (you've got to boot with the TB hardware plugged in), lid close/sleep being pretty finicky (I feel like this has always been the case with every Linux laptop I've ever run), and the iSight drivers for the built-in webcam having some issues, which prevents some smart screen-dimming functionality. If I actually took the time to patch the drivers this sounds resolvable, but I really just don't care.
Battery life is tough to gauge because it's so old and I didn't benchmark it pre-Linux but I'm getting about 2-2.5 hours out of it while writing code. Obviously not great, but I doubt I was getting that much more w/ OS X. This machine was my daily workhorse for a couple of years, it's got plenty of cycles on the battery. With auto screen dimming or GPU optimizations, this could likely get improved.
Really, the only thing I'm finding now is that I'm not a huge fan of the default Ubuntu 16.10 desktop and I'll probably start hopping distros since it's been a while since I was running a Linux desktop as a main OS.
I would say that from a completely subjective standpoint it feels every bit as snappy in standard use as OS X did and it will likely keep this machine quite useful for as long as it holds together.
For Mac users switching to Linux, I would highly recommend Ubuntu Studio, since it comes preconfigured for multimedia production and has a ton of applications on board that would make your average macOS user do a double-take and rethink the position that "there is no creative software on Linux". Because, my friends, there is a plethora of creative apps for Linux, and Ubuntu Studio demonstrates that in spades...
They tend to run worse than equivalent ThinkPads though, mostly due to the proprietary nature of the power-saving APIs, so you lose battery life (and quiet).
Transaction costs (both the explicit "we take a percentage" kind, and the hidden ones like the risk of being ripped off) are pretty significant, so no, not really.
That's an essential insight for anybody that ever wants to run a business, and it is why sales people get paid the big bucks and purchasers are only as good as the discounts they get.
That's included in the spread I mentioned as "reduced price". Unless you're saying that the expected value of selling it is literally lower than throwing it in the trash, my point stands.
I'm confused about the reduced price. Not everything is about money when we're talking about sums that aren't large. Selling something like an electronic device can be a huge hassle for some people.
On second thought, I guess if you can just sell it to some service, or say, Best Buy, then there's a point there for the reduced price (to a high degree :p).
I use Mac OS because I like a reliable and functional workstation. I switched from Windows to OS X (at the time) specifically because of reliability. The biggest challenge then was application availability.
We have some big Linux advocates in our company, so I've also tried switching to Linux twice before going to OS X, a couple of years apart. The claim from them was always the same, Linux is ready for the Desktop. The problem is that it's been "ready for the desktop" for ten years. I can't imagine ever going to Linux over Mac OS.
I never thought I'd be in this category, but I am. I went from a 100% Mac environment to my home computer being a windows PC I built, and the major workhorse for my lab being a linux workstation running Scientific Linux.
I've still got my Macbook Pro, so we'll see how things develop, but I've already started the process of migrating away, and if we're looking at 2018 it'll be pretty far along by that time.
a pro machine that demanded placement on your desk, not under your desk
... and that's how Apple lost the professionals. The desk is a clean space for a huge monitor / keyboard / mouse, and MY work. It's good to make a workstation that looks nice, but it's ten times as good to make one that's flexible and powerful. The only people who want that workstation on the desktop are the designers at Apple, and stroking your own ego isn't on the path to making a great product.
Every great designer knows that form follows function.
I wouldn't take that too literally, firstly those are Gruber's words and secondly, there isn't anything stopping you putting the Mac Pro on the floor so long as the cables reach.
Apple themselves admitted as much. They focused too much on making something beautiful and small. Neither of those things matter to a pro, especially for a desktop.
And not only that, even Apple with its vast resources cannot keep the machine up to date because the components are so tied to the case design.
A nicely designed item is unrelated to where it should be placed. The OP was specifically saying that Apple lost the Pro market because the unit "demanded" to be put on the desk which is absurd since the Mac Pro can be placed anywhere that the cables allow it to.
I think the OP's point is that the Mac Pro is not a nicely designed item. Design is all about tradeoffs. In this case, Apple had to decide whether to prioritize form factor and appearance or upgradability and performance. They chose the former, which turned out to be the wrong design for the target market.
The feature that allows a current Mac Pro to be placed on a desk comes at the price of more important features like being able to easily upgrade and expand a Mac Pro. Could be wrong but I feel that's one thing that OP wanted to express.
The trade-offs made give it a small footprint, and I've seen many cheese graters on desks and they were fine. Initially I assumed OP meant the same thing you highlight, but after the talk of keeping a pro's desk clear I assumed they were being more literal. If the issue is that Apple made a small computer that compromised on its core requirements, then that's fine, but to say that the issue stems from making a computer to sit on a desk alone is absurd.
Couldn't agree more, I mean hell I put my G5 (cheese grater) behind my monitor (on my desk) back in the day _just_ so I could more easily dust my machine. Pretty? No way. Ergonomic? Yes.
My issue with the current Mac Pro is that, yeah, cute packaging, but I need a second tower/accessory/etc. when I only needed one before, so I immediately get new clutter and cables all because someone had a "cute" design for the main computer.
It's not important where I can put it - the act of designing something that should be placed on a desk prevents them from designing what I want. A Ferrari is a desirable sports car and a terrible cargo van.
You're refusing to accept the simple reality that professionals couldn't figure out how to put a computer on the floor. And that's how Apple lost an entire market!
A machine the size of the trashcan Mac Pro would be awesome as an upgradable Mac Mini type device. i7, internal drive bays, RAM slots. I'd love a desktop sculpture like the Trashcan, but not with the useless dual GPUs and thermal design and everything.
Great point, and the entire Mac lineup needs to be more flexible and configurable to meet customers's needs:
- I want to configure a laptop with a 1TB spinning hard disc, so that I get a lot of space without paying $3000. I can get such a Dell for $1500 with a UHD screen. I'd prioritise disk space over weight and battery life.
- I want to configure the Mac Mini with a 6TB spinning desktop hard disc, rather than a portable one, which is costlier and has limited capacity. To reduce footprint, make the machine vertical. I also want to be able to configure the Mac Mini to drive a 5k monitor via Thunderbolt 3, and 32GB memory. All of this should be upgradeable. It's a shame that the Mac Mini isn't. It doesn't even have the "thin and light" excuse laptops have.
None of this is sexy. It won't make tech reviewers go ooh and aah. It won't earn a place in the MoMA. But it will be more useful to customers.
The first thing I did when I ditched mac was to get a pc motherboard/ram/cpu/psu and bolt it out of the way under my desk. More room on my desk is an upgrade. Nobody really makes enclosures designed to be out of the way.
I'd be okay with putting the desktop on my desktop if the form factor was tight enough. Clearly that's what they were trying to do with their current Pro model, but at the expense of it being competitive. Just sayin', there's plenty of space in the corner behind my primary display.
I don't think that's necessarily Apple's fault. Desktops/towers (much to my chagrin) are largely a thing of a bygone era. No matter where I go, everyone is on a laptop, and that's what allows the employee to be mobile.
Best Apple-related news I have heard in a long time. Of course it's outrageous that they didn't listen to their customers earlier and react quicker, but it is great that they finally do, and also that they are not shy about admitting, via this interview, that they changed their course.
Based on how long a product takes to bring to market in a large company, it might well have been the public reaction to the MBP release last autumn which woke them up. Just today's spec bump of the Can takes something like 6 months of preparation and planning. And it would fit with the true renewal being about 1 year in the future from today.
It might be too late for some, but I am so glad this is happening. Apple can make great hardware if they are trying, and this sounds like they are trying again, so I am very curious what they can create.
When I wrote my comment I was under the impression that there were slightly improved graphics chips involved which would have meant at least organizing the supply chain and producing enough stock in advance, but in the meantime I read that it is just a price bump - which would cut down the necessary time to just a few weeks for writing the new price lists :).
Gave up on the Mac Pro years ago. I replaced my unkillable 2008 Mac Pro with a lovely Fractal Design R5 case and a stonking 5930K overclocked at 4.5GHz on a 'posh' ASUS mobo with 64GB of DDR.
It's a 'hackintosh', sure enough, but it's fantastic. First time in 30 years I don't own a mac, that's telling. Their fault, too.
I ran a Hackintosh for about 1.5 years. I just recently went back to Windows 10 after the integrated Linux sub-system came out with the update. Most of the software/tools I use run in both OS X and Windows, so it wasn't a huge deal breaker for me.
One thing I didn't like about the Hackintosh was maintaining it during updates, having to wait for the right NVidia web drivers, and then remembering how to update all the stuff specifically.
What I did like was that I had a computer that I built for $1100 that scored about 16,000 on the Geekbench, without overclocking.
Note: I still use a Mac Book Air 11" because it's the best laptop I've owned. I think the trackpad and sleep/wake functionality that just works is the best ever, and makes it really convenient. Obviously, these two killer features for me matter a lot less on a desktop.
Second Note: I built mine with an aluminum Rosewill Legacy U3-S – Silver/Aluminum Case that was both compact, and really good-looking.
[Edit: fixed typo in the computer cost, $110 -> $1100]
Is this just because the newer cards are so new, or will there never be drivers released anymore? That's a bummer. And makes me feel slightly more confident on going back to Windows from Hackintosh. It likely wouldn't affect me for a few more years, since this 970-based, Intel gen 5, desktop that I built up a couple years ago, still has plenty of longevity left in it.
I'm on #3 hackintosh, and I agree, they're great. I'm using it for audio production primarily. 0 problems, and I have a lot of peripherals. The one place these fall short is graphics card support. Newer 1070's and 1080's simply do not work (unless there's some very recent developments).
I built my Hackintosh on the 970 platform, right when it came out. It was frustrating early on, waiting for the supported drivers, but eventually they came. What didn't go away though was having to install custom drivers with each upgrade/patch of OS X.
Have a Fractal Define Mini C myself, definitely desk worthy even as someone who is very fussy about product design. Has a utilitarian Richard Sapper vibe.
I really wish more PC component companies would focus on high-quality, good-looking components without the gamer branding.
I've been building hacks (as the community calls them) since 2010.
If you get a golden build and get it working it's almost like a real mac. It's possible to get everything working 100% but it's time consuming and even in the best case scenario macOS updates will probably be a pain.
Another aspect to consider is noise. If you come from the Mac world you are probably spoiled by low-noise or silent computers. PCs are very noisy and you will probably have to spend time on research and money on expensive silent parts to solve this.
Finally, a common problem with hackintoshes is wifi. I've tried everything, believe me. If you intend on using Wifi your best bet is buying the same chip Apple uses with an adapter. Many manufacturers offer their Wifi drivers for macOS but those are usually finicky or outdated.
Macs (maybe with the exception of a Mac Pro) are only silent at idle, not in full load. By my definition of 'silent' at least. If you can't hear the system while gaming because you use headphones, sorry, it's not silent.
If you just randomly build a beige box hackintosh, it will be noisy. If you spend some quality time on silentpcreview, you can get one that is silent even under load. I know, I have one.
Still, I'd very much like to get a decent Mac Pro instead. Devil's in the details, let's wait and see. My hackintosh isn't becoming obsolete any time soon.
I briefly ran a hackintosh when my Biostar motherboard died and I was frustrated with how slow my old MacBook Air was running. Pretty simple process: replaced it with a Gigabyte board, downloaded an installer image from my Mac, and used the TonyMac tools to set it up. Unfortunately I wasn't able to get Xcode to work reliably. Everything else seemed to work, but Xcode was a crashfest.
I'm not sure it's the hackintosh... now and then I hear a lot of complaints about a particular version of xcode even on native macs. I don't use it, but I have work colleagues who do.
I was running the same version of Xcode on the same version of Mac OS on a 4-5 year old Macbook Air and it worked fine there, just slow.
I've certainly had Xcode crash once in a while, but this was something else. Tried reinstalling it and no change. I don't think I even got it to finish a build.
The Fractal R5 is a HUGE case, and it's as good as my Mac Pro 2008, really. Lighter too! Very, VERY well designed, excellent filters, and I run three 140mm fans in it so it doesn't make a noise; the fans are permanently just over the stalling speed. Really very good cases.
Their utilities (MultiBeast, UniBeast, KextBeast...) are well thought out and easy to use, but it's worth noting that Tonymacx86 doesn't have any plans to open-source them -- you have to trust him (no one seems to know his real name, mostly for legal reasons related to Apple, I think) to execute binaries with root rights on your fresh OS, or go the Vanilla (and harder) way.
I have a fully working hackintosh. Just take someone's confirmed build and buy the exact same parts. I lucked out in that every piece of hardware (besides the wifi card) worked out of the box. It runs just like a Mac, better actually, and I use it all the time.
The truth is that the best hackintosh is Niresh but since it's even more illegal than Unibeast and such it's banned from the tonymacx86 forum and /r/hackintosh/ both.
Been running hackintoshes since 2007 or so. Mostly smooth sailing. Off the top of my head:
- It's all about making it easy for yourself. In short, motherboard and GPU choice change the experience from 'almost effortless' to 'never gonna work'. Get a $20 USB external soundcard to get rid of any audio configuration quagmires. Get the most compatible, widely used motherboard and Nvidia GPU, so you can get support on Tonymac should you need it. Stick to wired Ethernet to avoid meddling with bluetooth and wi-fi if you can. Everything else (PSU, CPU, RAM, hard drives and case) isn't an issue.
- The tradeoff is time invested into initially understanding how it all fits vs. money saved and increased knowledge of how MacOS ticks (a good thing regardless, if you're a power user). How much time depends on how much of a PC tinkerer you are already. If you already built PCs and tried getting Linux distros going it's gonna be second nature.
- You'll still need to check out Tonymac when a point update comes out for tips and warnings. The easiest solution is to install the fully up to date next-to-last MacOS version (install El Capitan 10.11.6 now that we are in the Sierra cycle, for example) and keep it going until you're forced to upgrade. Staying a generation behind, both in hardware and software, is the safest strategy. Hackintosh and being on the bleeding edge don't really mix.
In fairness I've been running Sierra just fine on my Hackintosh, though I did wait until 10.12.1. If I recall correctly the first version had issues with the Nvidia Web Drivers (some strange bugs) but they've long since been resolved.
I would agree with the overall sentiment though and add that even on a legitimate Mac it's usually a good idea to hold back a bit on updating. I work with audio a lot and it's common for audio units (and Pro Tools if you use that) to break initially.
Audio user here too. I just updated to 10.12.4 yesterday on a test install and it went without a hitch, but only because I checked Tonymac and updated Clover beforehand, which I wouldn't have done otherwise. And I'm on a old GTX 650 Ti that doesn't need the Web Drivers. Meanwhile my main 10.9.5 work install keeps gloriously chugging along.
Yes I had a previous hackintosh running for a few years in parallel with the Mac Pro. It's actually quite simple but it requires a bit more patience, especially during updates. It's mostly a case these days of doing the update, rebooting in 'safe' mode to install the binary third party nvidia driver, then reboot again and voila.
Also, some of the original setup requires a bit of tinkering with drivers to start with (especially if you use 'recent' hardware) but it's really not more complicated than installing linux these days (and then fiddle a bit with the settings).
It was the updates, and the time they took, and all the details to remember that eventually led me off of running a Hackintosh after 1.5 years. That and Windows 10 is more than good enough for me, especially as a desktop.
Fellow hackintosh builder here. It's totally doable. If you're able to run Linux and able to understand concepts like kernel modules, you'll be perfectly fine.
To get solid bluetooth support I bought a card from http://www.osxwifi.com/ , which makes peripherals like the magic trackpad work smoothly.
For me it was just too much work to keep it up to date. But then I don't have the big power requirements, but really need/want a separate monitor. These days I probably could get away with a Mini.
I generally recommend that people building hackintoshes get an external USB / Firewire / Thunderbolt soundcard. That way you're pretty much guaranteed to get official macOS drivers that'll work without issues.
They likely didn't have a choice, especially as they released a speed bump today. The Mac Pro had become a bit of a joke but the iMac and MBP range was powerful enough (just enough) that the pro community would grumble more than complain.
After the critical reception of the new MacBook Pro range amongst the Apple community, especially pro users, everyone was questioning Apple's commitment to pro users, and the Mac as a whole.
I'm glad to see Apple doing this, but I can't help thinking this was totally reactive and pre-emptive damage control. If Apple had just released the Mac Pro speed bump, there would be even more of an outcry that Apple has given up on anything more than incremental changes.
The pessimist in me thinks that Apple simply had no idea that there was this sort of demand for pro Macs. Phil and Craig mentioned the iMac numerous times, as though to say "hey pros, there is a great Mac you can use", so they can still claim that they do make great, high-performance pro devices.
My guess is "next year" means "Holidays 2018", and my guess is that Apple has only recently started work on this. Apple hasn't been about modular design or expandability for a long time. With the rise in adoption of VR, there has been even more discussion about upgrading graphics cards (remember Oculus comments about the Mac?), and the age of the CPUs is another criticism. I just can't imagine Apple starting work on this some time ago.
To be honest the age of the CPUs isn't too bad. I mean, they're old but the CPUs they have are still perfectly acceptable and competitive. The issue is that the GPUs just aren't worth bothering with at all compared to what's out now.
They talk in the article about being backed into a thermal corner and that's easy to believe. They simply didn't know that GPU technology was going to accelerate in advances in the way it did, and particularly not in single-GPU workloads.
Most likely the current gen speed bump and next gen redesign were both green-lit at the same time, so however long ago they started on the current update.
Well, all the current update is, is them dropping the price of configurations they already offered and killing off the previous base model. It's very much a matter of flicking a switch.
> The pessimist in me thinks that Apple simply had no idea that there was this sort of demand for pro Macs.
I've heard again and again stories about power users that are editing videos (4K these days) or doing other content creation (3D animation, etc) which can really benefit from high-end hardware.
I'm thinking it was more an issue they couldn't figure out a high-end, upgradable Mac Pro which kept their beefy profit margins, and didn't become a support nightmare at a price point that was somewhat reasonable (i.e. less than $10K USD).
Probably, but I've read Gruber's piece twice and I couldn't help but feel as if the talk was held by a bunch of young geeks coming from a Kickstarter project.
Maybe that's Apple's new approach to PR, but it sounds like they tried to downplay the impact of bad management that ultimately led to a failure that must have cost them millions.
Perhaps they're realising that they're potentially losing a lot of people, and are a ways away from shipping. Much better to announce some vaporware and encourage people to stick around, especially if they plan to deliver. Before now, people had little hope about either the Pros or the MacBooks.
They wasted an entire hardware iteration on the Touch Bar that hardly anyone cares about. That's what put them in this position. If anything is un-Apple about it, it's that they so badly understood the desires of their customers.
Releasing the TouchBar laptops last November, which apparently have seen a 20% yoy sales increase, screwed up their Mac Pro desktop line released 3 years ago. I don't even...
Truth. I had $1,500 earmarked for a new macbook pro before they launched. Switched to Linux for all my development work shortly after. Which, I should have done earlier tbh.
I was ready to pay $6k+ for a maxed-out MacBook Pro with an external GPU and whatever other USB-C accessories they were going to sell. The lack of a RAM upgrade caught me off guard - really, no RAM upgrade in like 6 years? Really, it's USB-C-only and they have no USB-C accessories? SMH
I've been reading up on various eGPU enclosures. I'll probably do it the next time I get some side-gig money, but the hacky-ness of it makes me nervous. If Apple were to make an official one, it'd be a day-1 purchase for me. I could see this type of product possibly fitting into a new Mac Pro hardware line, but I'm probably just dreaming.
To be honest I feel like the touchbar MBP gets a bit of undeserved hate. I do think the touchbar is a total gimmick admittedly but what else did people really want? More RAM, sure, but that'd mean lower battery life.
I was admittedly disappointed at first but then realised there isn't really anything else that they could've done to interest me anyway. I'm genuinely very happy with my existing 2014 MacBook Pro. I just also think it's telling that I as someone who's spent far too much money on Apple hardware also uses a hackintosh desktop.
In my opinion, Apple started losing its way when it started caring more about "fashion", which is not necessarily the same as (and it's often the opposite of) good design.
I don't think they had an issue with design so much as with having a single team as the bottleneck for the entire product lineup. They really need the equivalent of Intel's tick-tock strategy, where the designers can go full speed on the cool new manufacturing techniques while a second team just takes last year's industrial designs and ships updates with current-spec components, so they stop drifting so far behind the market every time the design team isn't ready to ship a major change.
The hardware has mostly stalled out. A great CPU from 3 years ago is still more powerful than what most people are buying today. RAM and SSD space is plentiful enough to be a non-issue for most people. Battery progress is steady but very slow. So what's left to base a new laptop on? The GPU?
Storage? Nearly everyone I know who has an MBP is at 100% disk usage all the time. I understand Apple makes a load on the SSD upgrades, but seriously, increasing the base storage would be enough for a bit.
They could go the desktop CPU route, although that's likely not an option for Apple since that would require them to either use extreme throttling or beef up the cooling (meaning louder fans and probably a thicker case)
I think Gruber addresses this pretty well: they already took the silent approach, but it was becoming unsustainable with their pro market, whose needs were not being met. If they just did a small hardware bump, they'd be projecting that they'd given up on the pro and they'd lose a lot of loyalty and goodwill. People can only wait so long so this info helps to keep them interested and bring the term "Apple" back into the vocabulary of any pro considering a future purchase.
You are not wrong, but the article is clear that they were hamstrung by their silly trashcan design, and they are moving back to something more modular and updateable like the old Mac Pros, which should help avoid stagnation.
Apple said the entire point of the redesign is to make the machine modular, so that it can be upgraded down the line, specifically addressing this concern. I guess it's up to each person to determine for themselves whether to believe it.
My Mac Mini's RAM says otherwise. Pretty sure that, for the vast majority of its history, the same goes for the iMac. And, obviously, the Mac Pro, even in its current iteration. What other product lines are you thinking of, aside from laptops?
The iMac only allowed RAM changing from what I know and it only had two slots. Changing the hard drive was a surgery that very few people are going to do. Apple is now in the habit of making the RAM and SSD part of the main board in the name of space saving.
OWC (https://www.macsales.com) made it pretty painless to upgrade an iMac's hard drive to an SSD. They sell a lot of kit for various machines. Just saying.
Companies and industries make purchasing decisions that lock them in for several years, so Apple probably finally got their head out of their ass as they could see people heading in another direction.
Once you've spent five years transitioning to, say, a Windows setup, there's a significant additional cost to going with Apple, and it's not like Apple has felt like a deluxe experience lately.
This is the thing. A lot of companies have started switching away from the Mac, and now that they're switching away, I can't see that this is going to be enough to ever tempt them back unless Apple does something really amazing.
I genuinely love macOS and I'm happy with Apple's offerings on the whole, but I think there's plenty of people who just don't care and are happy with whatever works. If you're using say, Pro Tools or Avid Media Composer or whatever software you use to produce content, odds are it works just fine on Windows too. I think many are going to try Windows, realise this and then simply not bother switching back.
But it's like when you're on the freeway and see a billboard for your favorite restaurant 40 miles away. It gives you the motivation to not settle for something else.
Very. I think it speaks to the fear they have over losing this market and the incredibly bad reaction they'd have if they'd simply released the weakly updated Pro without further explanation.
Adding to my comment, I find it very strange that they announced almost nothing tangible (the bump in the specs of the Mac Pro is not available yet on their web site, though I imagine it will be soon).
Why even bother with the speed bump? These guys have been neglecting the Mac Pro for 4 years and are working their asses off on a new version, and they still spend time updating the legacy one?
Not sure why this was downvoted as it's completely accurate. Apple invited a well-known, Apple fanboy to an exclusive meeting with high level execs (feeling special yet?) to break this news - pretty much the best case scenario to break disappointing news.
He wasn't the only one invited, as he pointed out. However, he's one of the go-to sources for Apple punditry, so Apple knew that as soon as the announcement was made, tech people would be visiting his site to find out what he thought, and they'd be smart to give him a head start at digesting the news and asking them questions directly, instead of rhetorically.
"Religious fanatics" is a dumb phrase. Gruber has and can be very critical of apple and everyone. OP was just unhappy that Gruber doesn't share the same opinions as they do.
Great. Now they need to reintroduce a MBP with top of the line specs and without the stupid touchbar.
Snark aside, I have a mid 2010 mac pro at home that is still going strong (due to upgrades, SSD, more RAM.) However, I would like to get a new GPU for the machine, but I'm not about to spend $500 on a better, but ancient GPU that's compatible with a 7 year old machine. I've been wondering what my upgrade path would be. No way in hell I'm buying a trashcan mac without any upgradability. And I'm certainly not going to buy a MBP with the touchbar (I need a real escape key) and under powered specs.
I'm really hoping this is true, otherwise in the next year or two I'm going to be seriously considering building a PC like I used to and deal with Windows 10. The rest of my family uses apple and it makes support for their devices easier being on the same platform, but I need better performance for photo editing and audio production.
Speak for yourself. The Touch Bar is actually very nice and definitely speeds up some tasks once you get used to it and customize. I certainly don't want to go back to a keyboard without a Touch Bar now.
As for a "top of the line" MBP, the 2016 15" is already the fastest MBP ever overall, even if you count the few benchmarks which put it slightly behind 2015 models in one or two metrics.
The damning benchmarks seem to be related to AMD vs. Nvidia. I don't know much about that but I'd put it down to optimization issues in a few apps. The 2016 15" definitely plays all games better than any MacBook before it.
> What struck me about this is that Apple was framing a discussion in which the big news — the whole point, really — was their pre-announcing a “completely rethought” next-generation Mac Pro by emphasizing that most of their pro users use MacBooks and most of the rest use iMacs — and that they have big plans in store for the pro segment of both of those product lines. It’s exactly what I would have expected Apple to say if they were breaking the news that the Mac Pro was going away: We’re dropping the Mac Pro because its time has come and gone — all but a small percentage of our pro users have their needs met by MacBook Pros and high-end iMacs.
It is also exactly what you would think Apple would say if they know this is going to be an extremely expensive device. Much more so than the current Mac Pro.
The current Pro can be configured to cost a cool $7,200. I am sure the 'completely rethought' model won't be cheap, but I wouldn't expect it to cost much more than the current line. Which is still too much for most of us to buy for home use (I am running a 2010 Pro with a hex-core Xeon, raided SSDs and an nVidia 750 Ti which cost me about $1K to put together, and it benchmarks evenly with the current Pro), but the market for the Pro has tended to be institutions, or audio / video / photo professionals who could write off the purchase or charge it to their clients.
This. Can we remember that Apple is making a box with fans and a motherboard? It takes them five years because they have to be inspired with some original concept that marries form and function, and then deliver something novel and daring -- not just a new computer, but a new kind of computer. This is not in customers' interest, and they are right to be upset.
It was sort of, but you also had pretty direct access to 3 PCI Express slots, 4 pretty standard drive bays (with pretty enclosures), and 4 or 8 memory slots (depending on 1 or 2 CPUs).
I'm just guessing by modular design they're talking about something much more complex and unique.
If I could upvote you 100 times I would. I just want an updated 2008 Pro. It doesn't even have to be the absolute fastest CPUs out there. It just needs to be reliable and fast and ideally quiet too.
I admit I only skimmed this (my only wish for Apple is for them to release a set of build tools for Linux that allows building ios apps, and (more importantly, really) support the installation and purchase of OS X for running on non-apple hardware, and in VMs -- I don't think either of those will happen) -- however -- I caught this:
> We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do.
That's silly. They could take their previous generation Mac Pro chassis, stuff it with a dual Xeon board, 128gigs of RAM, a pair of SSDs and a pair of Nvidia 1080s - and after some nominal quality testing/driver tweaking sell it at their regular ridiculous mark-up.
Those old cases are so convenient, I've considered buying a used one just for a regular PC workstation build. Easy to get to the internals, nice airflow. Roomy. Looks perfectly fine.
Granted, it took Adobe about 10 years to switch from the old Classic APIs on the Mac to Cocoa, so don't expect CS on Ubuntu before 2050 if they start working on it today...
I honestly don't care much about the Pro (since even with the speed bump it's still not that much better value for money) or the iMac (which is nonsensical for me given the form factor and my need for multiple displays) but am glad to hear about the mini still being alive, especially because my old Mini has just hit seven years.
Like many others, I put together a decent "tiny" Hackintosh (https://taoofmac.com/space/blog/2016/12/17/1840), but ended up converting it to a Linux workstation because it turned out it was more useful to me as a VM host.
So I'm still using my ancient mini as an instant-on desktop (can't quite beat Apple's BT keyboards, really), and am looking forward to upgrading it - I just hope Apple realizes that it serves "semi-pro" uses well enough to gift it with at least as much CPU and RAM as the current MacBook Pro range...
>Mac sales were up in 2016, once again outpacing the PC industry as a whole, and the new MacBook Pros are a hit, with sales up “about 20 percent” year over year.
Not really, what are people supposed to do? Move their entire ecosystem to Windows?
Just because it sells doesn't mean it's a great product. Ballmer's reign is testament to that, so I'm getting pretty tired of (non-shareholder) people pointing at spreadsheet numbers as proof that Tim Cook is doing a good job.
>Just because it sells doesn't mean it's a great product.
How about if the people it makes those record sales to love it?
"According to Brand Keys’ 2017 Customer Loyalty Engagement Index, Apple delivers a best in class user experience across every single category in which it competes, from smartphones to music streaming."
That's true for people who want 32 GB of RAM, a Nvidia GPU, a touchscreen etc. - I'm sure they haven't bought the MBP 2016.
But what about people who are fine with the specs and just don't like the details? You really need to have bought an MBP to know whether you like the Touch Bar and the flat/loud keyboard, or how well the battery works for you (given that the benchmarks are all over the place).
I'd love to see the percentage of returned machines instead.
But the whole point of a lot of the reaction pieces was that it was bad enough to finally shove developers off it and onto... whatever non-Apple laptop. (I don't really know much about PCs these days)
One can configure an HP Z840 workstation with 2TB of RAM and 44 cores. The vast majority of developers can and do work with much less. The question is, what are the true power users using?
If they are all developing with Xcode, I bet they also have a server farm for distributed builds. That said, why would OS X devs need more than 16GB of RAM? It's not like they are going to be running virtual machines/Docker images. Xcode isn't _that_ much of a memory hog.
Which industry are you talking about? I have not seen a single workstation with more than 1-2 GPUs. You are talking about 8 * ~250W common pro GPUs, around 2 kW, which is going to be an absolute nightmare to cool quietly in a normal workstation case.
I am absolutely curious as to what you're talking about because I have never seen this before, please correct me because I want to learn more.
Apple is talking about prosumers, not the rendering firms that need such massive GPU workloads. They'd be better off using server farms to render that much work. No one is going to work next to a noisy workstation whose fans have to dissipate 2,000 watts of heat at full load.
I found that comment bizarre; someone correct me if I'm wrong, but as far as I've seen in my own experience and from other people in my field (3D CG), multi-GPU setups seem to scale ridiculously well.
Seemed more of a case that no one wants to actually code for OpenCL on AMD...
Depends on the tools. RedShift for instance told us right after 10xx Nvidia cards came out that if we have lots of vertices (> 1 or 10 million - I don't remember exactly), you benefit from more RAM so get a Titan X. If you want to render anything with less, get more GPUs since you can't share RAM anyways, but the number of cores will help you.
Pleasantly surprised that the Mac Pro is still alive and getting a remake. Not only are they not killing it, they are also pre-announcing it. Both actions seem very un-Apple.
My guess is that the stale Mac Pro, the lack of an iMac update, and the MacBook Pro with the Touch Bar had many pro users worried and starting to freak out, and Apple felt it needed to do something fast because it has nothing to show in the short term. I also wonder why the Touch Bar wasn't questioned in the interview, and whether they think it was a mistake. Touch ID is great; the Touch Bar is not.
In five years' time, by 2022, we are very likely to get 7nm from Intel and 5nm from TSMC. There is no reason why, within the same thermal budget, we can't fit a 16-core CPU, a GPU equal to or faster than today's top GPU, 128GB of RAM, and a PCIe 4.0 SSD. That iMac would be faster than many of the Mac Pros sold today and would likely cover the majority of pro uses.
That is why I am surprised Apple continues to support the Mac Pro. It is highly likely that Mac Pro sales will continue to shrink.
I hope Apple looks at rack usage as one of the factors in its design. I see the Mac Pro in server racks as one way to greatly increase its sales.
> For examples of the type of software that the current Mac Pro isn’t well-suited for, Federighi mentioned VR: “Those can be in VR, those can be in certain kinds of high end cinema production tasks where most of the software out there that’s been written to target those doesn’t know how to balance itself well across multiple GPUs, but can scale across a single large GPU.”
Interesting that VR gets a mention there. Federighi chooses his words carefully, and they're aware of the rumours surrounding Apple and VR.
Well noticed. It seems Apple finally realized that their lines are not in step with VR requirements, both for content production and for use. I'm not sure whether that's because they are working on an "iVR" of their own; if so, the MacBook Pro needs some good care as well: more memory, a better GPU and a modular update strategy would be welcome.
Doesn't necessarily mean Apple is working on VR, he could simply be referring to people making VR content (like video games) on a Mac where the current Mac Pro isn't up to the task.
I'm not sure Craig meant to say that. Tim Cook has stated he's not crazy about VR but thought there were quite a few applications with AR. Perhaps that's what he really meant?
The fact that the new Mac Pros will not be out till next year makes it sound like they've only started working on it quite recently, perhaps in response to the furore over their treatment of their professional users.
While people purchasing in bulk for enterprise contexts may have a different situation, on the individual level, I am financially bothered by the idea of a pro machine I cannot maintain, or am locked out of.
I hope the situation is not that the market of pro individuals have moved to the point of just throwing the whole machine away on an upgrade.
The Gruber article hints that Apple might have user maintainability/upgradability on the priority list, but I wonder what that means.
Given Apple's uncertain "Pro" roadmap, I've been on a beefy Windows workstation for a few years for some digital hobbies, but it occurs to me that I'll happily take a look at jumping back. The transition to Windows got me in the habit of relying most heavily on crossplatform software, so it would be a smooth transition back.
Unfortunately, I'm anticipating something with an outrageous design premium and so hyperspecced at every turn that it won't make much sense for me. It's a shame: shouldn't the company that can make a $300 iPad be able to make a dream $2,500 workhorse for the amateur musician/videographer? I've been reading these kinds of wishful rants since the 90s, so I know not to expect much.
I know the reference, but I don't understand what is meant by referring to it as "the last hope" and what/why something else is an alternative to that.
That a Linux based system is an alternative for a powerful Unixy system?
> That a Linux based system is an alternative for a powerful Unixy system?
As I noted elsethread, it really is. I used to use Macs exclusively, and I'll never willingly leave Linux now: it really is that much better. The problem is that Windows & macOS are too constrained by their installed base: they don't have the freedom to be really revolutionary in their UIs, nor can they afford to support deep customisation (GNOME & KDE have similar problems, but one needn't use either GNOME or KDE to use Linux). Linux, meanwhile, offers a user true freedom: I can use a tiling WM, I can write code and bind keys to do anything I want. A Linux box running StumpWM and emacs is the closest thing the modern world offers to a Lisp machine, and it's awesome.
The Mac Pro was a problem when it went away from the cheese grater tower. It was a good clean accessible tower that looked good.
I do always need a macOS machine around to build iOS apps but the current Mac Pro is too expensive and limited for what it is, non customizable or upgradable really. I also want a pro machine that isn't an iMac, which really is what you have to settle for now, because I want a separate screen that I don't have to toss or I can donate when the iMac dies in 3-4 years.
A major problem is that Apple doesn't even make their cinema displays anymore, they were once the best screens and beautiful. LG is their monitor seller now. Why?
Apple is just missing the Pro users who like to customize their machines and keep them modern. I have since moved back to custom PCs for my main power/pro machines and just do iOS builds on the Mac Pro 2012 now. The worst part is that the next couple of versions of macOS might not even run on that Pro, because they force-EOL the hardware in the OS, not because of a lack of power (there hasn't been much progression there at all), but just to EOL hardware when there isn't even a good new machine to move to. I also feel a little disappointed in the new MacBook Pros. Some people I know are back on PC and just bought Mac Minis to compile their iOS apps.
macOS really is a great Unix-backed OS and the best-looking one for development. Macs became so useful around 2006, when they went Intel and the software started building up around that and the new web tech (canvas, WebKit/WebGL, Khronos funding) that revolutionized things. Unity on Mac pulled me into the Apple world again; for a time Unity was Mac only. Great things were happening after 2006, including the iPhone pushing development to more Macs. Apple has squandered that. They now seem apt to kill all of that and go totally proprietary, with machines that are one sealed block. That isn't going to attract pro users or developers like it did a decade ago. They are losing their developers and their pro creative/video/interactive users, and that should be scary to them. Pro users are saying to Apple, "we'll believe it when we see it; for now we'll be over here".
I think it is funny they are scrapping the entire line because of GPU thermals.
A big chunk of their pro market just wants a pile of RAM and CPU cores. They could offer that now with an Intel integrated GPU for $1-3K. Also, it wouldn't surprise me if Intel GPUs can already drive 5K, so you'd be able to actually plug it into the nice LG Mac monitor (unlike the thing they are shipping). If not, they could sell a high-end but single-GPU config, which is still a waste of a GPU, but at least it could drive current Apple-approved monitors.
Also, delaying the entire line 12+ months for a heatsink is madness. Surely they could slap together a water cooler or something for a single high end gpu config.
They admit that there are thermal limitations, but it's not really in the context of GPUs so much as the entire package. What they do admit specifically about GPUs is that their entire prediction for the market was wrong, that multi GPU systems are not as important as the highest end single card on the market, and these cards simply can't fit in their current design dimensions.
In other words, VRAM size and maximum single-card speed are the driving factors in GPU adoption. Which makes sense, because you only get the large increases in GPU speed if you can fit everything in memory, and bus throughput is not yet high enough to ignore the cost of copying data to VRAM.
This is a tacit admission that Pro users are increasingly _not_ concerned with RAM capacity or CPU speed, but with GPU power.
The other mistake made was their assumption that Thunderbolt would allow nearly unlimited peripheral expansion, which turned out to be a complete bust.
Neither of these things is solvable without completely changing the dimensions/specs of the Pro.
Serious question: what room is there for improvement and innovation in displays for the Mac Pro's target users, aside from price reduction?
I can see how photographers want faithful color reproduction with a wide gamut, good image consistency across the whole display, good resolution, and a decent size (maybe 32" tops?)
But AFAIK, monitors with those qualities are already available. The only thing I can think of that would make them better suited for professional work is to bring their prices down. (Because even pros have a budget.)
There currently isn't a single external display with a screen as good as the current iMac (that is, 5k resolution, P3 color). It's a sure bet that the next Apple display will match that.
The last Thunderbolt Display had USBs, ethernet, additional thunderbolt, even Firewire, and I use all of them. I can scarcely find a third-party Thunderbolt (not Displayport, Thunderbolt) display that has more than a couple USB ports, let alone the rest.
Believe me, I'd love to settle for a cheaper Dell, but as soon as you throw in a serious Thunderbolt dock (they start at $200), you're spending nearly as much, and now you've got some clunky lunchbox on top of your desk to boot.
I don't see any signs that the market won't repeat this pattern.
There doesn't seem to be a monitor that offers a large display (27-32"), 5K resolution, and USB-C ports with enough power to charge a 15" MacBook Pro while using it like a "Pro". The 5K UltraFine comes just short of this, but I also worry about its build quality. Other nice-to-haves would be two Thunderbolt 3 inputs so you could, for example, use two displays through daisy-chaining. Maybe G-Sync and HDR for gamers and game developers.
I wish they'd revisit the first ever flat-screen iMac, the one that pivoted every which way and could easily be positioned at any height. I think now they've realised the design of the actual computer is pretty low down on pro-desktop purchasers' priorities, they'll go all-in on the display instead. Imagine the very best display Apple could make — maybe like the above with an almost bezel-free front, 5K, etc. — with an old-school Mac Pro tower full of expansion possibilities. It would be a dream pro machine.
Some people want a monitor that is endorsed by Apple, carries Apple's warranty, that Apple will support if something goes wrong, and that, very importantly for some folks, looks like an Apple product and fits in with the design aesthetic of the rest of their Apple devices.
Somehow, when they released it, it was cheaper than Dell's 5K monitor. And somehow they've sold so many of them that they are the world's most profitable PC business. Imagine what they'd do if they ever started competing.
838 points later and this post is only the 2nd reply directly to the OP? And much of the commentary to date is about command line tools on Windows? Wow. Blasphemy.
OK, I'll bite. I'll talk about the Mac Pro news.
First, I'll state the obvious and applaud Apple for conducting a "Mac State of the Union" with a Congress of the Apple press. Bravo.
Looking at the details, I think we should be careful when assessing the use of iMacs, laptops, and Mini's by the pro user base. Specifically, were these folks running to the pro merits of these other devices or running away from the lack of a viable Mac Pro option? I think the latter.
I'm a Mac Pro user (2009 5.1) and devotee. I also own a top-line 5K iMac. I'd much rather do heavy computing on the Pro. For one thing, even something as simple as playing a 1080p video on the iMac wreaks thermal hell and sends the fans into overdrive. My oldish (and well-liked) Macbook Pro did similar things.
But what I truly like about the Pro is its easy expandability. In fact, I wish that it had more slots! Mine are always full (and I still lack a fast flash disk card).
So I am very, very psyched about the announcements. I am hopeful that the 2018 offerings are not outrageously priced.
Finally, as an aside, those who just can't wait until 2018 have an option. Very decent Mac Pro towers can be had used in the $1K-$2K price bracket. Maybe less if you mine Craigslist. If I were in the market for a non-laptop Mac, that's what I'd do.
Note that I am not trying to criticize non-Pro Macs. I just wanted to contrast them with the Pro, hopefully highlighting the latter's merits.
Assuming you can run another OS than OS X (meaning that you have some Linux, Windows or BSD mileage and are confident about administrating such a machine), then look around you on the refurb market, get your hands dirty and you can build a _great_ workstation at prices that are much less than Apple's. I have done so and I do not regret it at all.
For those who can run another OS besides macOS, then Apple hardware has rarely been the best option, and never the cheapest. The point for many, like me, is the combination of both the hardware and software.
I'm looking forward to being able to land on moons in Elite Dangerous Horizons in VR on a Mac at some point.
In the meantime I got a Zotac EN1060, a machine the size of a Mac Mini that runs any game I throw at it at 60 frames per second, which makes booting into Windows tolerable.
Graphic design and coding work remains in the realm of a 2014 Macbook Pro.
If Apple seems intent on making things right with its high-end desktop consumers, I wonder what that means for those of us pining for the return of the esc key on the MacBook Pro.
Fans of the Mac Pro have been waiting four years for this announcement, and are likely to be waiting another one or two for any real movement on the problem.
The Touch Bar seems like a pretty apparent failure in that even most of the "until death" apologists can't or won't defend it, but Apple's not going to give up this quickly. I think you'll get your wish, but not soon.
>> Fans of the Mac Pro have been waiting four years for this announcement, and are likely to be waiting another one or two for any real movement on the problem.
The troublesome thing with this article is that even if Apple put out a Mac Pro with the latest and greatest guts in the old cheese grater chassis today, pros would be extremely happy. FWIW, I still think the cheese grater chassis is a great design.
Sure, design matters, but the livelihood of pros depends on their ability to get sh!t done. Just giving them access to "less pretty" hardware that does what they need today is better than making them wait another year or so.
Of all the Macs, the Mac Pro is probably the easiest to design. What people basically need are modular-PCs that can run OSX on the latest hardware and gives the user choice with respect to GPU manufacturer. There's less of a need to make it the smallest possible computer, because smaller makes it harder to do upgrades.
I'm sure it's been pointed out to you before, but there's a system-level option to map the Caps Lock key to Escape. It took me about a day to adjust, never looked back.
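If you'd rather script it than click through System Preferences, something along these lines should work from the terminal on Sierra-era macOS. This is a sketch, not gospel: the hex values are the standard HID usage IDs for Caps Lock and Escape, but double-check them against Apple's key-remapping tech note before trusting them.

    # Remap Caps Lock -> Escape until the next reboot (macOS Sierra and later);
    # wrap this in a LaunchAgent if you want it to survive restarts.
    hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x700000029}]}'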
> The Osborne effect is a term referring to the unintended consequences of a company announcing a future product, unaware of the risks involved or when the timing is misjudged, which ends up having a negative impact on the sales of the current product. This is often the case when a product is announced too long before its actual availability. This has the immediate effect of customers canceling or deferring orders for the current product, knowing that it will soon be obsolete, and any unexpected delays often means the new product comes to be perceived as vaporware, damaging the company's credibility and profitability.
This is completely irrelevant to this situation. Their Mac Pro sales have surely already reduced to a trickle, and they will know that they will not sell a huge amount of the updated bridging model. The iMac update has been expected for a while so they're not changing much mentioning that's coming later this year either.
It's not like killing the current Mac Pro sales is exactly going to put Apple in trouble - they're raking in money from the iPhone and MacBook (Pro) lines.
This is a carefully thought out plan from Apple for sure, unlike Osborne's blunder.
I would guess that the only people buying Mac Pros at this point are people who need to for infrastructure compatibility reasons, and/or who need something more powerful than a MBP but absolutely can't go with an iMac for whatever reason. These are mostly people who probably would have been buying more even without the bump. So the bump is just a nice bonus and tip of the hat from Apple to hang in there.
Example of what I'm talking about? Check out this Mac Pro server rack setup:
I don't think they care about this. Mac Pro sales aren't large enough to be important to the company's bottom line. It's far more important for them to keep pro customers from abandoning the Mac altogether.
That was a relevant comment! Given that at least part of their formerly loyal customers are considering migrating to other platforms, this would mean they're trying to slow the migration, which is another aspect not covered by the Osborne effect you cited. Time is running against Apple. They are playing roulette.
Not sure why it would need to be smaller, the Cheesegrater was exactly what I'd expect a Pro level machine to look like and I've never been an apple user, I just love that case.
It's a pro level desktop, size (within reason) constraints aren't really an issue.
I agree. Bringing back the old design would immediately solve a lot of problems, but Apple won't do that. Even with a new redesign, I assume the system will still be designed to accommodate standard-sized GPUs and other add-in cards, unless Apple is going to use custom packaging, which is doubtful. That's why I guessed that they could potentially make the new Mac Pro smaller and have it sit on a desk instead of the floor, etc. Instead of the current cylindrical Mac Pro, it would go back to something box-shaped.
I wouldn't mind a slightly smaller cheese grater, truth be told. It's a heavy beast of a machine and takes up a fair bit of space, although it doesn't really move much and I don't notice it too much sitting next to me. Top specs and upgradability are my top feature requests.
I think the reason to keep the Mac Pro above all else is to have a halo product for their Mac OS line. Yeah they don't expect large sales but it should be the best that Apple can do with the technology, a preview of stuff to filter down with appropriate early adopter tax.
Apple should not lose focus of their roots. They need to remain committed to desktops and laptops. With Ryzen there are now tons more options.
While the iPhone and iPad businesses are solid, they can be cyclical and are nowhere near as dependable as the loyalty of their core faithful Mac users, who have stuck with them for decades.
Few professionals are going to muck about with a hackintosh and I mean no disrespect to those who muck about. I have done it myself, and its great as a curiosity and to learn but at a point you just need to get things done and have proper seamless hardware support for all the peripherals for pro level work.
Interesting - buried in there was the bit of strategy on the part of Apple: expecting software to take advantage of parallel GPUs, and that not really panning out in the high-end film production apps.
Brand loyalty is silly. Avoiding it does mean I need to avoid platform lock-in, but it is completely worth it. I buy the best machine available to me at the time. When it was time to replace my MacBook Pro 2011, it wasn't a machine from Apple. I bought a Surface Pro 3 and figured I'd switch back to Mac when Apple sorted out whatever their problem is.
Vapour Race 2017: Next Gen MacBook Pro or Surface Pro 5.
It is indeed silly. My own mentality is that I'm "Post-OS". I'm at the point that I could use whatever OS I want for my work stuff, including Windows. They all hit the bar of "good enough".
I happen to use Windows today, because it runs the most apps that I want to run (in particular, some Steam stuff not available on Linux or Mac), but with the help of dual boot, I could just as easily be on Linux or Mac (all other things being equal, I'd prefer single booting).
Personally, I left Mac after a bad experience with the 2011 MBP and not liking the design direction of Apple hardware, but I still consider Apple hardware (for the most part) to be the gold standard in build quality. Macs may not suit my use case, but thankfully there are plenty of other fish in the sea.
I think it is great, all the usual Windows items apply.
Specific to the Surface Pro 3:
- Use the SP4 keyboard; it is much better and is completely compatible.
- The dock is expensive and worth it: two DisplayPort connectors, so I have the Surface display, a 4K monitor and a 1K monitor attached and working at the same time, plus wired Gigabit Ethernet.
- The magnetic connector for the dock/power is super convenient, and the power adapter has USB charging.
- I use the new Surface Ergonomic keyboard and a Logitech MX Master mouse at home, and the Surface foldable mouse on the road (the best portable mouse I have used). I keep the Surface on my desk so I can use it with my pen; I just flip the keyboard underneath it.
- On the road, the MicroSD slot provides all the extra storage I need (movies on the plane), and it has a real USB 3 port. Battery life is around 5 hours, better if you are only watching movies or reading.
- It runs VMs effortlessly with the i7 processor.
Biggest problem is power: sometimes it sleeps and won't wake up. It happens far more often than it should, but I haven't heard the same complaint from people with Surface Pro 4s.
I recently got the Surface Book, and it's pretty fantastic. Both the keyboard and touchpad are great. I like the keyboard even better than my late 2012 rMBP's keyboard.
I have a SP3, and it's nice, but I didn't like using the type cover. I much prefer being able to keep it in my lap, with the screen suspended from the keyboard. I'm sure it works great with a dock, but I haven't used one.
This is an interesting PR approach and a very non-Apple process. They took time to talk to their top journalist connections in order to halt the "Apple doesn't care about pros anymore" articles. I applaud the additional transparency; smart. What I am not convinced about is that they are building the right solutions. How did their product teams miss this gap several years back? If Apple saw "developers" as a growing user base, why did the new MacBook launch with a touch strip? They should know which device (laptop vs. desktop) developers prefer. Did the MacBook and Mac Pro leadership change? Was it a personnel issue?
Overall an interesting situation I would love to better understand. If all revenue and growth projections are positive, why appease a vocal minority? I assume they do believe their "early adopter developer creative types" are a vocal minority that can sway a large consumer base's brand perception.
I am really curious what Apple feel constitutes "pro" apps. "Software development" is mentioned, but the general sentiment from Hacker News and developers I have talked to seems to be that Apple do not understand their "Pro" audience, at least when it comes to the Macbook Pro (granted, it's a different beast than Mac Pro).
> Apple’s research shows that 15 percent of all Mac users use at least one “pro” app frequently. These are apps for things like music creation, video editing, graphic design, and software development.
> Schiller, on Apple’s own pro apps: “I just want to reiterate our strong commitment there, as well. Both with Final Cut Pro 10 and Logic 10, there are teams on those software products that are completely dedicated to delivering great pro software to our customers.
> seems to be that Apple do not understand their "Pro" audience
As a few people have noticed (Recall Arment and possibly even Gruber pointing this out) for a while now almost all their Pro user footage used in advertising and announcements has either been photographers, directors or people sketching down ideas on iPad Pros.
Kinda seems to be ignoring the intersections of creativity and computing that the original Mac pioneered.
Maybe they'll update Aperture (one can only hope for good, not-so-expensive photo editing software that allows one to put pictures into folders and projects...)
Working in a small-ish company, my job responsibilities are relatively fluid, with major shifts coming every few years. First, it was a shift away from Windows to Mac to develop PHP. Recently, I've off-loaded most of my graphic design work, which means I'm less reliant on the Adobe Creative Suite. So, my next shift might just be to Fedora or Ubuntu.
Not that I'm not satisfied with Mac OS or Apple hardware. But, they refresh too infrequently. I love the mac mini, my late-2012 is still chugging along nicely. But, I was able to expand the memory and swap the HDD for a SSD.
I don't need the retina iMac; I'd rather put the money into more memory and faster storage. For the same reason I'm not a customer for the Pro; I don't need the Xeon or enhanced GPU. So, where do I fit into Apple's product line?
After Apple launched the MBP's back in October, I moved to Linux for all my development work (web included). If you're not reliant on Adobe as much anymore, it's an easy transition.
I installed Gimp for basic photo editing (cropping, basic toning), and it's really not too bad once you get used to it. I have a long history with Adobe products too.
Unless you're actually allergic to 5K displays, probably the high end iMac. Yes they're expensive, but those industry-leading profit margins and high ASPs have to come from somewhere.
> One of the good things, hopefully, with Apple through the years has been a willingness to say when something isn’t quite what we wanted it to be, didn’t live up to expectations, to not be afraid to admit it and look for the next answer.
Yep. What a silly thing for them to say as long as the Touch Bar exists.
But seriously, Apple has a history much closer to the opposite of that. Never admit fault. If something really isn't working, pretend that it is right up until a replacement is launched. Just like PowerPC to Intel. Just like the Mac Pro until today. Just like the Apple Watch and Touch Bar (admittedly these last two are a little more speculative on my part, but I think it'll happen).
When the iMac was introduced, I worked at an Apple dealer. I can still remember one client who used to do video production work on an iMac. There may have been others but I distinctly remember one.
I've given up on OSX except as an iOS dev platform.
I'm using chromebook hardware (high end, but $200-500) for daily use most of the time, for reasons which will be clear in a while. (I still use iOS for mobile, though.)
For high-end computing, I just got an Acer Predator 17X "gaming" laptop (it was a toss up between that and a Dell PWS 7720). $2850, 32GB/512GB SSD/1TB, 17" 4K screen, GTX 1080, great keyboard, external mouse, 1-3h of battery life under hard use. Add 2 more NVMe SSDs and 32GB RAM (64GB total), with Linux/Win10 dualboot. It's pretty amazing. The alternative Dell PWS was about $4500 for a similar config and a tiny bit better in some ways.
I understand the consumer market for, say pre-built Dell desktops with an i3 you get your mom for surfing the web.
What graphics professional isn't assembling their own rig though? $3k is a LOT of money to spend on a desktop.
I'm old enough to remember dedicated SGI workstations, and a decade ago I felt OSX had some advantages in graphics, I can't think of any of those that exist today, at least to justify such a huge markup on parts I could buy myself.
I'm genuinely curious; I feel like I'm missing out. For people who would consider paying $3k for a computer made out of easily obtained components, why is it worth it to you?
My wife has a photography business. Photoshop and Lightroom are essential tools for her.
Right now she uses Windows 7 Pro. We can't stay on it long-term because of security (and eventually driver) issues.
Migrating to Windows 10 (non-enterprise) isn't an option because the forced updates are an unacceptable risk to downtime, especially at certain points in her business calendar (e.g., highschool yearbook photo season).
We may end up migrating to a new Mac Pro, but it won't be because the hardware is awesome. Truth be told, it will likely be overkill for her needs, and definitely overpriced.
We'd migrate her business to a Mac Pro because OS X doesn't have Windows 10's problems, and because we can probably recover from a hardware failure quickly due to the system's expected modularity.
I'm not here to sing Windows' praises, and Win10 has had its share of annoyances for me, but I've just set it (non-enterprise) to prompt me for updates, so there are no forced updates. If that is the big feature that has you prepared to spend a hefty premium, you might look into it further.
> Photoshop and Lightroom are essential tools
Those are in the cloud now, and performance is mostly based on your GPU.
>Truth be told, it will likely be overkill for her needs, and definitely overpriced.
>We'd migrate her business to a Mac Pro because OS X doesn't have Windows 10's problems
This is insightful, thank you.
>because we can probably recover from a hardware failure quickly due to the system's expected modularity
Every hand-built PC will be just as modular or more, though. If that's your concern, look at, say, the Mac Minis with the RAM soldered in; I love Apple's industrial design but generally don't like how hard it is to upgrade the hardware.
There's one number to call, and they'll repair any part of the machine up to and including swapping out the whole thing. Anyone buying a machine that expensive they really need would(and should) be looking at 24 hour turn around on site service anyways.
And the only way to really get that is to buy a whole prebuilt machine from $BIGCOMPANY
"The second would be to bite the bullet and tell the world what your plans are, even though it’s your decades-long tradition — a fundamental part of the company’s culture — to let actual shipping products, not promises of future products, tell your story."
Interestingly enough, they also broke secrecy in 2013 when they gave a sneak peek of the new MacPro at WWDC, which would only ship later that year. They probably only announce products when they're ready to ship to not hurt sales of the existing models, but I'm guessing that won't make much of a difference with the current MacPro.
So Apple is completely rethinking the Mac Pro for a 2018 release 5 years after their last complete rethink of the Mac Pro? Does this mean it'll be another 5 years of no upgrades while they completely rethink it once more?
This is the problem: Apple has just lost user trust with this one.
All they ever needed to do was offer something like the pre-cylinder Mac Pro: something PC-like. A case with replaceable parts and lots of expansion slots. That's it.
Have any Mac Pro users long since moved on to Windows (or even Linux) or gone down the hackintosh route? Who is going to trust the Mac Pro at this point?
Every company comes out with dud products once in a while. Remember the G4 cube? Apple's had a bad run when it comes to desktop machines, but you've put your finger on exactly what they need to do next: pre-cylinder Mac Pro revamp. They've done exactly that before, I don't see why they wouldn't again. Of course, if the 2018 Mac Pro turns out not to fit this particular vision, I'd guess it's pretty much game over for all but the most niche of their pro desktop customers.
This is excellent news. Even if you don't end up using one, I think it's motivating if a powerful upgradable Mac Pro exists. Something to aspire to.
Depending on your area you might at some point have to do lengthy computations or you start working with deep neural nets and then want to have Nvidia hardware.
I think it's great if you can stay on your favorite platform and have everything on one system.
Also: continuous integration for iOS. If these machines will be reasonably priced it will result in faster test build times for many engineers out there.
So I'm very happy about this unexpected announcement.
Wow. I don't know what the time scales are when companies like Dell or Zotac spit out one of their bespoke desktop towers, but "not this year" for something they obviously already worked on for a while seems long.
Obviously the trash can was a huge design effort, and I get the feeling they want to be just as revolutionary this time if they're spending so much time on it, even though they are obviously in a hurry.
Shouldn't they just be making a new 2 socket cheese grater tower? As simple as possible? The USP of the Mac is Mac OS, not that it uses a custom power supply.
As the article points out, Apple felt the need to make this announcement even though the new Mac Pro won't be released until next year, because they don't want more pro users to abandon the platform.
That's a pretty sorry state of affairs. I don't know if there are good statistics about this, but I wouldn't be surprised if quite a few users have already abandoned Macs for Windows. Microsoft is making a big push to attract creative professionals, who are heavy Mac users as far as I know.
My impossible dream is that Apple would simply release a microATX motherboard containing all of the "Mac goodness" and bundled with a license for macOS.
I would settle for a barebones machine with a basic CPU, onboard graphics, and 4-8GB of RAM, but with the ability to add in whatever CPU(s) and GPU I wanted, for under $2,000. Preferably around the price of the base iMac.
Really hanging out for the nvidia Pascal drivers for my almost perfect pci passthrough VM system. I can run GPU accelerated Windows and Linux apps at (near) native speeds without rebooting, if Pascal drivers drop I'm off to the races with MacOS as well.
Can you go into a bit more detail? You're running Windows and Linux on Mac in a VM with graphics acc. at almost native speeds? Or on Linux running Windows, and possibly MacOS when (big IF) Nvidia releases Pascal drivers? If it's on Linux running Windows - I'm all ears.
Recent Linux kernels have added the ability to pass through PCI devices to QEMU guests.
I run NixOS with Windows 10, Windows 7, and Ubuntu guests. Only one guest can be run at a time. I have a second GPU with a second screen that gets passed directly to the VM. I use numactl and nice to surrender half my CPU cores to the VM, and static huge pages to hand over a chunk of memory. The VM also gets passed an entire dedicated SSD for application storage. I believe libvirt can do most of this automatically now. I pass in a separate Bluetooth controller for the keyboard and mouse.
Performance-wise you're missing some CPU power and RAM, but otherwise it runs at full speed. Applications run flawlessly; it's my dream development machine. I don't game much, but I get the same FPS from the VM as bare-metal Windows on the same machine, as CPU and RAM are not my bottleneck.
If I had a supported GPU or the Pascal drivers dropped, I could get macOS running and it'd be a one-stop shop.
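For anyone wondering what this looks like concretely, here is a rough sketch of the moving parts. This is not my exact config: the PCI addresses, vendor/device IDs, core counts and disk paths below are placeholders you would replace with your own (lspci -nn and /dev/disk/by-id are your friends).

    # Kernel command line: enable the IOMMU and bind the guest GPU to vfio-pci
    # at boot (the vendor:device IDs below are examples).
    #   intel_iommu=on vfio-pci.ids=10de:1b80,10de:10f0

    # Pin the guest to one NUMA node, back its memory with hugepages, and hand
    # it the second GPU (01:00.0 plus its HDMI audio function 01:00.1), a whole
    # SSD, and a USB Bluetooth controller for keyboard/mouse.
    numactl --cpunodebind=0 --membind=0 \
      qemu-system-x86_64 \
        -enable-kvm -machine q35,accel=kvm -cpu host \
        -smp 4 -m 16G -mem-path /dev/hugepages \
        -device vfio-pci,host=01:00.0,multifunction=on \
        -device vfio-pci,host=01:00.1 \
        -drive file=/dev/disk/by-id/ata-guest-ssd,format=raw,if=virtio \
        -usb -device usb-host,hostbus=3,hostaddr=2

In practice most people let libvirt manage the VM definition instead of a raw QEMU command line, with the hugepages and passed-through devices declared in the domain XML.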
Thanks! This sounds almost exactly like something I need. Why a separate keyboard and mouse, though? The only difference I would prefer is to retain the same keyboard/mouse/tablet and monitors (three of them) and to be able to switch between the two easily, preferably with copy and paste working between them. The alternative I'm pursuing right now is a KVM with Synergy, but I have limited desk-side space (for machines); I'd rather have one machine with 20 cores and four graphics cards than two machines.
To get the performance benefits the VM can't be run with a spice / vnc interface, so there is no virtual screen or virtual input. You have to pass the VM its GPU and some USB ports and can only see/interact through those and the network bridge.
My keyboard and mouse can pair to multiple different machines at the same time and switch with a button press so I just do that.
Synergy also works fine, you just need to temporarily pass a real keyboard in to set it up.
There is also a newish feature in QEMU that allows the physical devices to be swapped back and forth by pressing both Ctrl keys [1]. It works reasonably well with a few quirks (extra mouse buttons don't work in the guest, for example).
The same thing applies to the screen: I surrender an entire monitor to whatever VM is running, but you could use a KVM switch.
It's a little work to get set up and tuned but it is 100% a dual boot killer. The next time I reconfigure this machine, I won't be installing a native Windows partition, I simply don't need it anymore.
This is why monopolistic companies suck. It literally does not matter what Apple does. They're the only vendor of MacOS and Mac hardware. So what if you hate their ancient Mac Pro line? If you're tied to their ecosystem you have to pay up. And because you pay up they can claim their way "won". But your original demand for a good is still unmet. You've just found a way to settle for what the monopolist will grant you. That should really piss people off.
They have the world's top engineers, designers, and dollars, and they use reporters to send out an apology. They can't even do it directly to their users and fans. Reading the content, it feels like damage-control PR.
Apple has lost its way. Their support has become horrible if not arrogant. Their updates keep bricking devices more often than not. Their hardware fails more than it used to.
If they thought the Mac Pro was a mistake, how is the MacBook Pro a success?
Can't innovate anymore, my arse, you bet you can't!
"Overall, the split between notebooks and desktops in Mac sales is roughly 80/20. (Personally, I’m a little surprised desktops account for even 20 percent of sales. I would have guessed 85/15, and wouldn’t have been surprised to hear 90/10.)"
Is this because some companies are running rendering farms on a whole bunch of Mac Pros?
Curious how many of y'all have seen Casey Neistat's workbench explainer vids and his comments about the Mac Pro...
In so many words he uses it as a data management interface for his many TBs of external hard drives and that's about it. This struck me pretty hard in framing my opinion of what the Mac Pro really is to those who have it.
I bought a MacBook Pro back in the summer of 2015. It is still running strong despite the coffee I spilled on it.
I am hoping for a serious memory upgrade whenever they get around to it.
I am curious what others are using for external storage? I did not splurge on a big internal drive, but I am finding any video work really consumes a ton of space.
This might be too little, too late. They should do what they did with the Intel transition; start selling PCs which can run Mac OS X as an interim measure.
Why might they not do this?
a) makes them look bad
b) well... is there a CPU architecture transition coming up? Last time there was this performance block, it was time to move from PowerPC.
This is a combination of vaguely encouraging, and vaguely maddening.
Perhaps I'm just a Luddite, but what I really want from my desktop Macs is, basically, what I already have in my 2008 Mac Pro and my very ancient Mac Mini, just updated, because those machines won't last forever.
The 2008 Mac Pro has the giant aluminum case. It has four drive bays. That machine has been an absolute warhorse for me. It's pretty much been running every day since 2008. I've produced a lot of video clips and multi-track songs using Logic. The only things I haven't liked about it have been (1) Snow Leopard was more reliable for audio, on this box, than later releases, (2) Apple has gradually walked away from things I wanted to do with the server subsystem, like maintain a usable current version of Apache in the OS distribution, and (3) the box is quite loud for use in a recording studio.
With the Mac Mini I am actually planning to buy some newer refurbished Mac Minis with SSDs. These machines are almost perfect for use in recording situations.
Apple is so obsessed with design -- the tin can design of the modern Pro, whatever "modular" design they are cooking up with the future Pro design -- that it doesn't sound like they will consider that the old Pro had an industrial design that was almost ideal, with the exception in my view of the noise level. A honking big case with a lot of thermal mass and big fans and a lot of room for hot memory and drives is in fact perfect. It is beautiful to me because it is so simple and reliable. It doesn't need to be tiny. Another option would be a rackable version. And that's it. That's all I need from a high-end computer.
For the low-end utility machine like the Mini I want it to be small and _silent_.
The iMac probably has enough CPU but I don't think it is quiet or expandable enough and I need 4-16 terabytes of storage right in the box and an easy way to back it up _to physical drives in my own house_, not the cloud.
I'm still resentful of the thousands of hours of work I put in to Aperture projects. Tens of thousands of photos, many with a lot of delicate editing, which don't even render on the screen correctly anymore. I'm still resentful of all the projects I had built in iMovie which don't work in later versions because of the features that Apple jettisoned. If Apple has a solution for the two big needs I've got -- the small ultra-quiet media "capture" capability (for audio), and a big honking _simple_ Pro for editing and production, _and_ it appears they are serious about maintaining Logic, then I'll stay with them and probably buy more Macs. If Logic goes, I'm gone. (Mac user and on-and-off developer since 1985... Apple user since 1977...)
It feels to me as if their thunderbolt experiment with the Mac Pro was entirely the wrong way round. What I suspect they'll do with the next round of iterations is go strong on thunderbolt with the iMac; at least 4 ports instead of 2. The iMac has a bigger market, so that's how they'll drive adoption. The Mac Pro, meanwhile, won't be so reliant on thunderbolt, although it will still support a good number of ports - why not? - still probably 6.
> We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do.
So perhaps starting in 2016, or 2015, or 2014 would have been a good idea?
I always assumed that upgrading the Mac Pro was going to be done by daisy-chaining them with Thunderbolt connections. Guess I was wrong, but I'd love 24 cores and 6 graphics cards on my desk even if the price was crazy.
If Apple makes upgrading components easier for themselves that will likely mean users and resellers can mod them more easily as well. I wonder how the 'next year' comment fits into the CPU and GPU roadmaps.
It'd be pretty awesome if this came w/ Ryzen. Official ECC support ought to be a thing by then. Given that they already have a relationship with AMD for graphics cards it doesn't seem impossible.
Apple should go beyond trying to woo back their "power users" and design something to impress non-Mac users so that they’ll want to buy the new Mac Pro just so they can run Windows or Linux.
"The Mac is a $25 billion business for Apple annually, and according to the company there are 100 million people in the active Mac user base worldwide."
It is really interesting to read the articles from the other people who were there (links in the daring fireball article) and see the little differences in what each of them put in or left out.
Many people have already moved to Windows, Linux, and hackintosh. If Apple wants to win them back it will need to offer something really good in specs, features, and specially price.
Too little too late. It's telling that they sell so few Mac Pros that they can ignore the Osborne effect of soft announcing a new model at least one year away in favor of some fluffy PR.
It's strange they don't understand that doing nothing but keeping the Pro up to date spec-wise would've signaled mild disregard already. Keeping it frozen for four years is a clear, prolonged, 'we don't care about you' message to pros.
And everyone could've told them the trashcan design made no sense, and in fact did when it debuted. No need to wait four years for that. Probably Ive really really liked it and nobody managed to stop him.
Apple has “great” new iMacs in the pipeline, slated for release “this year”, including configurations specifically targeted at large segments of the pro market.
This just fills me with dread. It just seems like Apple continually doesn't get it and the "iMac is a pro computer" is the canary in the coal mine. We already switched to Adobe for all video editing after the Final Cut Pro fiasco, and if this crud keeps up I can see some serious pressure to just move to Windows boxes. I'll hate it, but I'll understand.
Lots of different types of 'pro' user; there was mention in the post of xcode developers being either the largest market, or the largest growing one. As a developer who currently uses a 2014 MBP (mainly docked), I am very, very likely to be in the market for either a new iMac later this year, or a Mac Pro next year. I'm glad Apple is, finally, looking to address this issue.
I can see the MBP (only portability option, I use one myself) or the Mac Pro, but the iMac has turned into such an unexpandable machine, it troubles me that anything Pro is associated with it. It serves too many masters at this point. I hope the new Mac Pro is good.
Too late, I moved away from Apple. My current setup is Windows as desktop and I log in to linux for dev via X11. I am still testing the setup, but so far it has been a very positive experience. I'll share my complete setup on my blog when I consider it stable.
I have read quite a few confusing things about new MBPs. I'll probably stay put for another year before bailing out of the Apple ecosystem (aka distortion field).
My brother has one - it's quite a nice machine and the touch bar is way cooler than I thought it would be (whether it will be really useful is probably yet to be seen over the long term).
He's mostly using it for video editing (4K/5K) and it screams along in Final Cut Pro X.
I would hope it has the standard meaning, i.e., you can buy one now in one configuration, and change or upgrade single components of it to achieve a different configuration in the future.
What an incredibly unprofessional move; this would never have happened under Steve Jobs.
Apple has been known for having the patience to wait to introduce things until they got them right. I guess this really is the Scullyfication of Apple: desperate moves rather than just putting your heads down and coming out with something new when you have it.
If they were really serious they would make an open letter or they would write out to their Mac Pro users personally. That way their "openness" would actually make sense and they would show they really cared.
Pushing this to a little elite group of bloggers is just an amateurish move on Apple's part.
Well now we're talking about subjective opinion. To me, press releases and corporate double-speak are petulant. Actual engineers speaking frankly to journalists, to me, is the height of professionalism.
All this is subjective. When Jobs wrote an open letter about not supporting Flash, that wasn't a press release; that was him taking responsibility and addressing something head-on.
Making it the engineers' job to address a leadership issue is exactly the wrong kind of frankness.
But maybe it's just a matter of them realizing that they have to announce something because they know releasing when it's ready means next year... and last year was already late.
The company has changed in a lot of ways since 2011, some in a bad way, some in good. Times are changing too. Get on with it.
They don't have to do anything. They can just come out with things when they are ready. The Pro line is such a specific use case that they won't be missing that many people, and even if they lose some, those users will come back if what Apple comes out with really is as great as they claim.
If they really wanted to put out the words that they are working on something they should hint at it instead.
Tl;dr: a theatrical announcement of a minor spec bump, followed by an undescribed future replacement product sometime in the next 18 months, for a widely disliked and stale product, makes the front page of HN.
The current Mac Pro, as we’ve said a few times, was constrained thermally and it restricted our ability to upgrade it.
So, when creating the 2013 Mac Pro, they knew they had to make a choice: either an upgradable system or a thermally constrained one that might look cool on the desk.
What kind of market research suggested that Mac Pro customers wanted the latter? Or were they expecting to capture some new kind of pro customer base that only buys sleek desktop cylinders?
> So, when creating the 2013 Mac Pro, they knew they had to make a choice: either an upgradable system or a thermally constrained one that might look cool on the desk.
No, that's not what they said at all. They said the choice they made was dual GPUs because they thought that was the road everyone was going to take. That turned out to be incorrect and the cylinder suffered because of that decision. They weren't talking about field upgradability.
The question stands regardless of who is doing the upgrading: why did Apple design a thermally constrained system for the pro market? What was the point?
My experience with the 2013 Mac Pro is dismal. They suffer from a widespread GPU overheating problem that leads to crashes on heavy workloads. It's probably the worst Mac ever.
Three years is far too much time by any modern measure, and that's an unacceptable excuse. Lack of vision is a more acceptable excuse.
I'd take the Mac Mini form factor and make a whole range of pluggable computers from AppleTV-size for $99 to MacPro-size where you can buy as many as you want and use them for the simplest (IoT) or complex tasks (Video editing) while adding more cores, gpus, ram or anything you want is as easy as opening the hatch and installing them.
Vision, not excuses, and looks like Tim Cook doesn't have it.
What Apple says is coming: new Mac Pro and branded external display sometime in the future. Updated, prosumer level iMacs coming this year.
What I think will happen:
2017 - Apple releases updated prosumer iMacs
2018 - Apple releases external monitor, also perfect for use with MacBook Pros. Claims to still be working on Mac Pro
Late 2019 - Apple updates prosumer iMac again, says Mac Pro isn't needed anymore, weathers the twitter/blog rage of the single digit % of Mac Pro users that hung on from 2017
The world would be better served by Apple getting out of the hardware business and just selling OSX, allowing it to run on any x86 architecture without all of the Hackintosh bull.
I don't understand who these people are that are spending a 50% premium on components that are two to three years old. I use a MBP regularly, but only because my work buys them.
They did the licensing thing in the mid-90s. Mac clones were a thing.
To license macOS means to support infinite hardware variations, and to rely on OEMs not to suck. It is an impossible technical problem for a company whose whole ethos revolves around integration of h/w and s/w.
To license macOS means to support infinite hardware variations, and to rely on OEMs not to suck.
Kind of.
Apple had to approve any clone designs. It wasn't as willy-nilly as the PC market. At the time Apple allowed cloning, Apple itself had a large number of varying models to support.
Apple's big problem was that the clones were faster and cheaper. For example, the Umax C500 was available up to 240 Mhz while the PowerMac 4400 was only available up to 200 Mhz and the Umax machine was $400 less expensive and that was on the low end.
Power Computing's clones were high end and either matched or outperformed their Apple equivalents for less money. The PowerTower 200e was released the same day as the PowerMac 9500/200 and it was $1300 less expensive.
In the really high end, the Daystar Genesis MP smoked everything in Apple's product lineup.
The cloners were bad for Apple but only because Apple couldn't compete with them. Developing both the hardware and the software was too expensive for them to not be able to make all of the profit on every sale.
The Daystar Genesis MP generated a lot of "holy cow" moments.
Plus, we are in a different world these days. Intel hardware, due to Intel's keeping the bus proprietary and getting rid of the chipset makers, is a lot more generic than back in the day. NeXT and a lot of open source projects today have a list of hardware you can use. The list for today is a lot shorter. I think a lot of companies would thrive on making good macOS machines.
I really think Apple should just stop making any PC other than the MacBook and iMac lines. That's where their heart is anyway. License the OS for $250 a pop and sell for the same. If they are that concerned about what happened before, limit the sold macOS to Xeon cpus only. That will get rid of all the portables that might reduce their MacBook sales.
Mac OS already runs on lots of different hardware. Just look at the list of compatible hardware for building a Hackintosh. Graphics cards aside, there's a lot of compatible components.
Actually it would serve Apple very well. Most of Apple's profit these days comes from iOS devices, and more macOS in the hands of users would definitely lead to more iOS adoption.
Also, systems would be available to suit power users - Apple currently offers exactly zero in that space. Power users influence many buying decisions beyond their own systems. It would be good if macOS could be seriously used for hard-core science and engineering.
Apple's Mac business is the most profitable PC business in the world, they make more profits from Macs than the rest of the PC makers in the world. They are 7% worldwide by units, about 17% by revenues (ASP around $1200 vs. industry $500) and margins of 15% vs. industry 2-3%.
It's a $25B business that's steadily taken significant market share from Windows/Linux over the last decade.
No one said they should "give this up". If Apple's hardware is good enough to outshine whatever competition, it will continue to sell. Plus, they could easily restrict who they license the OS to - perhaps only to a few high quality OEMs and end users who'd just build a Hackintosh instead anyhow.
I'm certainly not advocating giving macOS away or selling it cheaply either. It's worth pointing out that every 2,000,000 units of $500 macOS would be $1 billion as well - which could quickly come close to the margin on that $25B.