> Apple indicated that this issue required a substantial amount of changes on their side, and that they will not backport the fix to 10.9.x and older.
What? So all OS X boxes are simply broken, privileges-wise, if they're not on 10.10?
Apple's model customer is one who upgrades often. If you want solid support for old products, stick with Microsoft, and accept that their products can be clunkier because of deliberate choices to maintain backwards-compatibility.
To be fair, OS X updates are free and usually run well even on 5+ year old hardware. OS X has kinda gone the way of Chrome, with most users on the newest version.
Yep. JWZ used to say that Linux is only free if you don't value your time.
OS X has many merits on this front, but since 10.3 I've found that the first few dot releases of OS X have the same caveats often enough (both in terms of bugs/hazards and in terms of gratuitous UI "progress") that it usually seems better to wait past .x.4/5.
Now there's this, of course.
Since the problem showed up in 2011, maybe I should go back to Snow Leopard; I can't think of a single valuable change to OS X since then...
> JWZ used to say that Linux is only free if you don't value your time.
My reply to that has always been that Windows is only $300 if you don't value your time. (Preserving archaic cost of Windows to match JWZ quote.)
The implication that the one system is free and time-consuming and the other moderately priced but not time-consuming is entirely false. At the time, for many requirements, configuring and administering a Windows machine would have been more time-consuming than for Linux. It also glosses over the fact that you can't horizontally scale your $-per-cpu/server solution once you do have the configuration worked out, without going back to the well for more money.
This amusing and glib quote also deliberately glosses over the "freedom" part of free software, which is essential. For example, I wonder if Google would exist today if not for a free unix workalike.
>The implication that the one system is free and time-consuming and the other moderately in price but not time-consuming is entirely false.
Haven't found that to be the case. And I'm an old UNIX hand, having used SunOS, HP-UX et al. in the early nineties and Linux from around 1998. My two observations:
1) Linux, even the most modern distro like Ubuntu, is a huge time sink for anyone who's not just content to browse the web and check mail in a system set up for them.
2) More users than we think are in that category, i.e. not content with just browsing (besides the proverbial "grandparents", perhaps).
You got a video from your wedding you need to edit? Might or might not work. You got a hobby that requires you to use a specific peripheral (like a soundcard) or software? Do you feel lucky? You got a new phone you need to sync? Prepare for an adventure...
1. "I need to do X in Linux." Web search for software.
2. Try to install it. Find out I need Gnome 3.44 and I'm not running Gnome at all.
3. Find something else. This one doesn't need Gnome.
4. I try to install it. Wait, I need a newer version of a library.
5. I grab the latest version of that library. Wait, idiot, that version is TOO NEW!
6. So I find the exact version and install it and oh son of a cow now everything is broken.
And at each step there's a different sub-community telling me I'm a complete idiot. ("Why don't you use Gnome?" "Why don't you use package manager X?" "Why the fuck are you using that package manager???!")
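For what it's worth, on apt-based distros the version trap in steps 4-6 can at least be tamed by asking the package manager which versions it knows about and pinning one explicitly. A rough sketch, where libfoo and the version string are made-up placeholders:

    # See which versions of the library apt knows about
    apt-cache policy libfoo
    # Install a specific version instead of the newest
    sudo apt-get install libfoo=1.2.3-1
    # Hold it so a later upgrade doesn't silently bump it again
    sudo apt-mark hold libfoo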
Same thing when people say if you're not the customer, you're the product. That is a logical fallacy because you are still a product even if you pay for the software.
The point is that when you include your time, you're comparing ~$10k products whose purchase price is a rounding error; not that Linux is inferior because it's free.
Linux desktop is and has been second rate for a long long time. Turns out unless there are corporate sponsors paying for development, open source software sucks. The OSS community is pretty much in denial about that.
That's funny, because I haven't managed to find a desktop OS that is up to par with Linux for me. Preferences may vary, but second rate in your book is the only viable contender in mine. It also turns out that Linux does in fact have corporate sponsors, like Oracle, Red Hat, etc., so I'm not sure I follow. Is closed source software top notch without corporate sponsorship?
The sad truth is that most software sucks, regardless of the license, there's just a lower barrier to entry, use, and discoverability with OSS so it's more visible. At least with OSS we can fix the issues instead of just accepting the suckage.
After a year, I find the only advantages of OSX are the browser touchpad-gesture experience and window edge handling. I haven't tried any distro since switching to the MBP; Wayland might have a better chance of replicating the experience. Hoping for the camera driver to be done.
In Linux everything else is better: fuller implementations of base tools, customizability, no crap like /var/folders..
This is entirely subjective; it is not a wise comment to make because it is better FOR YOU, not for everyone. For me, I liked the configurability in Linux desktop land but there comes a time when you want to sit in your office and get stuff done instead of spending all the time moving the furniture around and decorating; OSX doesn't let you do any decorating or move the furniture, so you're forced to do work (to some extent).
I daily use Linux, yet my day job is to write software on OSX and Windows, so I get to use every system every day. They're all much of a muchness.
No, it's saying both are great, but Linux's constant desktop change and rewriting of underlying systems that breaks upgrades between releases isn't for me at the moment (as a desktop system). It may be great for someone else though.
I used to find Linux desktop second rate a few years ago but adopting it again recently I've found Gnome 3 better than OSX in many ways, at the very least comparable. Plus Arch Linux has been incredibly stable whereas in the past it also used to break often.
Funny, I'm watching a talk about issues with software quality on free software right now. Perhaps some members of the community are in denial, but not all of them.
> JWZ used to say that Linux is only free if you don't value your time.
Used to say? How long ago was that?
Ubuntu and many other distros are incredibly easy and effortless to use. I know devs using apple products who spend more time mucking about with brew and other tools trying to accomplish things that are very easy to do on the most popular linux distros.
I prefer and run Linux. However, troubleshooting Linux issues, even on Ubuntu, is a pain in the ass. The fact that Google's PageRank favors older, stable pages in search results is a factor. A forum post from 2004 is not the best source of information on dual monitors (and xrandr). 2007 instructions for changing the default boot will be based around GRUB. I shouldn't adjust ~/.bash_login in 14.10 despite what the internet says; I should use the configuration tool.
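The dual-monitor case, at least, has gotten reasonably sane once you find the current docs. A minimal xrandr sketch, assuming your outputs happen to be named eDP1 and HDMI1 (check "xrandr -q" for the real names on your box):

    # List connected outputs and their supported modes
    xrandr -q
    # Place the external monitor to the right of the laptop panel
    xrandr --output HDMI1 --auto --right-of eDP1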
My point is that Ubuntu is complex [as is Windows] and when something doesn't run quite right, the right answer is usually hard to find and hard to recognize because it will be embedded in a culture of highly technical cross referencing. The tradeoff for going down the rabbit hole is that Linux is really powerful and flexible.
I'm loving me some Xmonad this week, but I had to root around in xkb and xmodmap, write a little Haskell, and read about how to switch layout engines in Ubuntu [and then translate that into xfce-based Ubuntu Studio]. It's non-trivial and requires reading stuff on the Arch Linux Wiki. Ubuntu isn't really self-contained in the way Windows is.
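For the curious, the layout-switching part boils down to a couple of commands once you know they exist. A sketch, assuming an X session and the standard keycode for Caps Lock (yours may differ):

    # Switch the keyboard layout for the current session
    setxkbmap -layout us -variant dvorak
    # Remap Caps Lock to Escape, a common tiling-WM tweak
    xmodmap -e "clear lock" -e "keycode 66 = Escape"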
That's exactly the sort of situation I am talking about. When there is a problem with some piece of Ubuntu's ecosystem, the answer is "It's not Ubuntu's fault, RTFM for <other software>."
Knowing that information about Ubuntu goes out of date, and that changing the search criteria helps, [1] doesn't make Ubuntu less complex and [2] doesn't make it as user-friendly as Windows for a broad audience.
I agree with the UI "progress". Lion and Mountain Lion were the biggest changes for me; Mavericks and Yosemite are alright to my eyes (although they took some getting used to).
I have a USB disk with Snow Leopard on it that I can boot on my work's 2008 Mac Pro but that kernel panics on my 2012 MacBook Pro due to the i7 "from the future" according to Snow Leopard. It's a pity because it'd make a really great test system (I would like to check that my software runs on 10.6 OK).
Does anyone know of a way of booting Snow Leopard from disk inside a VM without many kernel kext kerfuffles? Parallels tells me that I stink if I try and install OSX inside it (I have the regular DVD, not the server edition)
Not knowing how to operate a server, I have to assume it's harder and you have to know everything it does and the CLI to use it. Not the same for mint or any other pre-packaged distros.
The largest server farms on earth are all run by administrators who are extremely well paid for their time. Linux isn't free for the companies who hire them.
"Linux isn't free for the companies who hire them"
Actually, Windows isn't free for the companies who hire them. And you need to hire a lot more Windows admins than Linux admins if you want to keep your server farm running. But Windows admins are cheaper than Linux/Unix admins, so the phb can crow about that.
With XP this was valid, but from Windows 7 onwards you don't need to do it anymore unless you actually want to play a game - the majority of users would never need to at all.
FWIW, 10.10.0 for me was perfect. 10.10.1 broke my wifi. Anytime the computer woke up from sleep, I'd have to reset the wifi card so it could find my access point.
After 10.10.2 came out, I got my second-ever full computer lock-up. I've had the same OS image since 10.5 (migrated and upgraded multiple times, obviously) and this was the second time my computer required a hard reboot. And this was while watching a video in the Steam client. I was in awe. A video forcing me to do a full reboot? Brought back memories of Windows ME.
While "upgrading" to 10.10.3, my computer failed to reboot itself. It just sat there, black screen, with a little spinner, for about 2 hours. Hard reboot, everything is back to normal. For now.
I fully expect 10.10.4 to actually destroy my hard drive at this point, as well as cause my monitor to spontaneously get 30-40 dead pixels.
Sucky part is, I feel I have no alternative.
Windows is out of the question after seeing what a factory OEM image comes with nowadays. I'm not giving them money and spending 2 days formatting/reinstalling/seeking out drivers on slow Taiwanese servers just to make a half-usable computer. And then, after all that, spend another 2 days installing various adware infested shitware to get a fricking PDF viewer. And don't get me started on getting a half-decent dev environment going. Ugh.
Linux is almost there. It's a crap shoot for me, unless it's running in a VM. I never know which kernel update will fix my trackpad/break my sound/wifi and which will fix the wifi, break suspend, fix sound, but break xrandr or some other archaic XWindows-related technology.
>Windows is out of the question after seeing what a factory OEM image comes with nowadays.
Fair play. Research the manufacturer's policy if you're buying pre-built. If you build your own desktops, this is not an issue.
>I'm not giving them money and spending 2 days formatting/reinstalling/seeking out drivers on slow Taiwanese servers just to make a half-usable computer.
When's the last time you actually installed drivers on a fresh Windows install? I will concede this used to be the case, but nowadays the whole process is much more streamlined. Microsoft has made progress on supporting a lot of hardware drivers via Windows Update. Intel also has a great driver update tool that will scan your system and make recommendations.
>And then, after all that, spend another 2 days installing various adware infested shitware to get a fricking PDF viewer.
2 days is rather exaggerated. Most browsers display PDF. If you absolutely need to read a PDF, Foxit Reader is free, as in beer. Yes, you'll have to uncheck some boxes to avoid their bundled software. Big deal.
>And don't get me started on getting a half-decent dev environment going. Ugh.
Yeah, Windows isn't great for a lot of developers' needs. You really could have just left your argument at this rather than spouting off the rest of that FUD.
I use Linux, OS X, and Windows all on a regular basis. They have their strengths and weaknesses, like any other products.
>If you absolutely need to read a PDF, Foxit Reader is free, as in beer.
You're actually proving his point: on Windows, something as basic as reading PDFs (yes, you "absolutely" need that in 2015) requires specific knowledge of obscure names like "Foxit Reader".
I'm not even going into annotating that PDF or adding your signature to it then.
The same goes for: an office suite, a file manager with decent previewing + smart search capabilities, something that quickly resizes/converts your pictures, great photo editing/cataloging software, video editing software, music editing software, an e-mail client, a comprehensive development environment (1).
All of the above, which covers almost all areas of what people actually use computers for, comes as standard on OS X.
(1) Pages/Numbers/Keynote, Finder+Spotlight, Preview (also your answer for annotating PDFs, btw), iPhoto/Photos, iMovie, GarageBand, Mail, Unix CLI/Xcode
> Yes, you'll have to uncheck some boxes to avoid their bundled software. Big deal.
Hate to say it in a troll-ish way, but: everyone but lifelong Windows users has higher standards than this.
In fairness to Windows (and I can't believe I'm defending it because I happen to hate the platform!), bundling software is exactly what got Microsoft into trouble in previous years (Internet Explorer, MSN Messenger, Windows Defender, etc).
Hopefully the package manager in Windows 10 will offer up a happy medium between expected software being easily available, and 3rd parties not being pushed out of the marketplace.
> In fairness to Windows (and I can't believe I'm defending it because I happen to hate the platform!), bundling software is exactly what got Microsoft into trouble in previous years (Internet Explorer, MSN Messenger, Windows Defender, etc).
Except that bundling itself is not what got Microsoft in trouble. Having a monopoly, and then taking a coordinated series of actions relating to conditions of sale, bundling, and other steps to extend that monopoly into other markets and to eliminate threats to the monopoly -- with evidence, including fairly explicit documentary evidence from senior decision-makers at Microsoft -- that that was the intent of the action is what got Microsoft in trouble.
They got in trouble under anti-monopoly laws, not anti-bundling laws.
It's the same thing. If you don't have a monopoly then it's a feature, if you do have a monopoly then it's anti competitive.
I do agree that Microsoft engaged in a whole boatload of other, downright despicable business practices as well. But in the MSN Messenger and Defender cases it was purely about shipping their own products preinstalled in Windows when competitors weren't included.
I guess you're right that my wording was perhaps a little too vague though. But my point was that when it comes to software bundling, Microsoft are damned if they do and damned if they don't.
Or maybe it requires specific knowledge of obscure names like Adobe Reader. Or did I miss something, and Adobe started charging for Reader once there were double-digit numbers of free alternatives, including those built into the browsers?
Heck, do Chrome or Firefox actually register with the system to handle PDFs? Wouldn't surprise me if they did.
"When's the last time you actually installed drivers on a fresh Windows install?"
This Christmas. Building a new desktop for the first time in years. Bought high quality everything: Asus MB, Intel quad-core CPU, Corsair memory, etc. A fun and smooth experience.
I also bought Win 8.1 Pro, set up the machine for dual boot, and then proceeded to install Win followed by Xubuntu. Gave me a good perspective on the state of the OSes.
Win install took much longer, and did not detect the AMD Radeon R9 card, the Intel Eth interface on the MB, or a lot of the peripherals. Using the supplied driver CD from Asus solved most of these, but not the GPU. To this day Win complains about some peripheral that it can't detect.
Xubuntu install was faster and smoother, and automatically detected all peripherals including Eth and GPU. And SW install using Software Center and Synaptic just works.
"Just works" used to be the perceived advantage of Win, while Linux meant issues with drivers and fiddling with downloads from weird sites and manual builds.
It seems the world has flipped, mainly due to Linux progressing while Win really hasn't. Apart from the Metro/Modern interface, that is, which sits alongside the classic one so that you (read: I) end up being shuffled between them.
> When's the last time you actually installed drivers on a fresh Windows install?
On this very box, I bought an 8.1 Pro license, and had to install drivers for my GPU (ATI R9), along with various other tidbits which I've now forgotten. Oh yeah, my USB sound card.
Granted, I needed to install the closed source drivers on the Linux side, but it was easier (enable non-free, aptitude install ...) -- and the soundcard worked out of the box.
At least the network card worked this time around on Windows (at least as far as I remember). So the only real hassle was that I needed a Windows install from which to build the USB install stick. Thankfully I still dual-boot my laptop, so I could make the install image there.
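For anyone wondering what "enable non-free, aptitude install ..." expands to: roughly this, assuming Debian and the fglrx package that carried AMD's proprietary driver at the time (and that your sources lines end in plain "main"):

    # Add the contrib and non-free components to the package sources
    sudo sed -i 's/main$/main contrib non-free/' /etc/apt/sources.list
    sudo apt-get update
    # Pull in the proprietary AMD driver from non-free
    sudo aptitude install fglrx-driver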
http://www.nliteos.com helps massively when installing all the Java / Flash / browsers / anti-malware / other stuff you need on Windows. The Taiwanese driver thing is still real; Windows may ship drivers and have some via Windows Update, but they probably won't be the newest (or even the newest certified).
Don't forget about the 4 hours, 10 reboots, 3 blue screens, and reinstalling your video card drivers twice of windows updates!
>Fair play. Research the manufacturer's policy if you're buying pre-built. If you build your own desktops, this is not an issue.
I can build my own desktop, but I can't build my own laptop. I use laptops exclusively for work.
>When's the last time you actually installed drivers on a fresh Windows install?
Less than a year ago, on my girlfriend's VAIO laptop, actually. Windows 7.
>Microsoft has made progress on supporting a lot of hardware drivers via Windows Update. Intel also has a great driver update tool that will scan your system and make recommendations.
Yes, it mostly works, and I'm aware of it, which is why I boldly told my girlfriend it'd take me "an hour or so" to "fix" her laptop issues. Famous last words. And I wasn't exaggerating about "slow Taiwanese servers"; I'm talking less than 100 KB/sec transfer speeds. Welcome to 1996.
>2 days is rather exaggerated. Most browsers display PDF. If you absolutely need to read a PDF, Foxit Reader is free, as in beer. Yes, you'll have to uncheck some boxes to avoid their bundled software. Big deal.
It is a big deal, to me. I don't like being treated like an idiot, being on the look-out for hidden toolbars and spyware every time I double-click "setup.exe". Am I capable of doing it? Yes. Do I want to bother, for a fricking PDF viewer or a java runtime? Absolutely no way, never again. BTW, 19/20 people seem to just leave the default check-boxes checked, since they don't understand what they're being asked and just want a fricking PDF viewer.
And Foxit IMO is a PITA. Comes with its own auto-updater too. I have never seen a PDF viewer needing so many automatic updates, so often; big ones, too.
Why can OS X come with a nice, bundled, PDF viewer, and the capability to export to PDF from everywhere, and in Windows-land, I'm forced to install either spyware or shit-ware? Why? Why is Windows still stuck in 1997 in this regard?
>Yeah, windows isn't great for a lot of developers' needs. You really could have just left your argument at this rather than spouting off the rest of that FUD.
FUD? I deal with Windows issues from relatives on a weekly basis. What FUD? Am I imagining all of these issues? Spyware- and adware-infested boxes that slow to a crawl as soon as the poor sap has more than 2 Chrome windows open, because the OEM decided that 2 GB is enough for Windows 7 and also installed their own "Media Center Experience", which starts up on boot and is written in .NET WPF, eating whatever RAM there's left? That one wasn't on the VAIO, it was on her previous HP laptop, but the point still stands.
Yeah, maybe for a power user who built their own computer from scratch and did a clean install and carefully vetted every package as it was installed, sure, yeah nice experience. I have a Windows gaming box that is super lean and mean and it took a lot of effort to get it to this state.
But I still find myself scratching my head in disbelief every time a windows user tells me they need a new computer cause their existing one "just got real slow lately" even though they never touched the OS. And I know why it's slow too, I know before I even open it, dreading what I'm going to see in the notification area.
P.S. She's a very happy OS X user now, FWIW. One HP and one VAIO laptop later and another Microsoft ex-customer never coming back.
None of your complaints are about Windows. They're all, almost universally, about OEM and third-party shovelware. You want to avoid OEMs trashing your computer? Don't pretend this is about a lack of choice, or about Microsoft being stuck in the past. It's about you being ignorant of your options, or ignorant of the ecosystem. Buy a Signature Series machine from Microsoft.[1] No crapware, no bundled "features" or "trials". You can also go buy rebranded-Clevo-hardware built by one of their partnered system builders. These machines also aren't full of crapware, if you pick a decent system builder. Vizio, by the way, also refuse to fill their laptops with crapware. On the other hand, anyone who owns a music streaming service is going to bundle their music streaming service.
As for PDFs, Windows 8 and up come with a PDF viewer since, y'know, two and a half years ago when it was released. Did you miss that? Granted the default PDF viewer is a Modern App, but it works, and it works reasonably well. I use it regularly to dock a PDF on half of my monitor while I work on something else on the other side. Office, by the way, will read and write PDFs just fine, if you're after a more solid office-like product to do it. In case you're about to complain that Office is too pricey, LibreOffice will do the same. If an office-suite is not your speed, Sumatra PDF exists, and is free, open, has no bundling, and is available in a portable flavour.
Again, this is not Microsoft's fault. It is yours. No one is stuck in 1997 but you. You know what it requires to not be stuck in 1997? Any sort of intellectual curiosity at all.
> FUD? I deal with Windows issues from relatives on a weekly basis. What FUD? Am I imagining all of these issues? Spyware and adware infested boxes
See, you're doing it again. You're not dealing with "Windows issues", you're dealing with issues on Windows boxes, caused by non-Windows problems. These problems involve things like "my computer illiterate family really believed there was a Nigerian prince sending them email," they're not problems that Microsoft has introduced, and they're not problems that Microsoft CAN solve. Stupid people do stupid things, and one of those things is double-click PORNVIAGRA.EXE.
This annoys me so much because you are essentially arguing that problems caused by refusing to read are problems caused by operating systems, and this really isn't the case. This isn't something Apple has done a better job with either; it's just that Apple's marketshare wasn't high enough for the malware vendors to bother with them. Notice how reports of Apple-specific shitware have increased in the past couple of years with their marketshare? This should probably point out to you that it's not something that's solved by changing OS; it's solved by teaching your friends and family to be more tech-literate. You are arguing that the problem of pedestrians being hit by cars when they cross without looking will be solved if we just pad the bumpers of the cars enough. You want to find a technical solution to a human issue, which computers really aren't up to solving yet.
Don't get me wrong, there are stopgaps. There are services like http://unchecky.com/ which just uncheck all the checkboxes for you. But now you have checkboxes unchecked that you really need checked, and stuff doesn't get installed properly. There's still some reading required.
The point here, though, is that you're not an idiot. Your family aren't idiots. You're just all too lazy for your own good.
(As an aside, there's also a valuable lesson in your post: VAIOs are among the worst OEM offerings around. Don't buy VAIO.)
I wish I could upvote this comment a million times. I see the same kind of ignorance-of-their-options (note I specifically say ignorance-of-their-options; I'm not saying anyone is generally ignorant) reflected when developers complain that it's hard to get under the hood of Windows or when developers complain about the tooling on Windows. You'd at least hope that this segment of the user-base would have more intellectual curiosity sometimes.
From the software developer angle, Windows is a FANTASTIC system when you need to get down to the nuts and bolts. People really need to educate themselves about the registry and the various ways the OS can be tweaked; as well as the various tools available (at a bare minimum those from Sysinternals).
Some other examples: the extensible kernel-level instrumentation that you get from ETW; the excellent crash and hang analysis tools provided by the likes of WinDbg (supported by the excellent debugging support in the Windows kernel -- yes, you can enable a keyboard shortcut that will memory-dump your machine if it's hung, and then it's a few commands to see what crashed); the command-line environment provided by PowerShell (I haven't used it much but really need to get into it more). Not to mention Application Verifier.
I could keep going on and on but it'd probably cloud my point if I haven't done so already :)
Not saying it's perfect or the be-all-and-end-all either; I also use all three major OSs, though primarily develop .NET software on Windows. Just frustrating that many of the complaints can be boiled down to "Windows doesn't work just like Linux/Unix does".
I agree with the poster above about usability, but I disagree about ease of development.
This is partially because Linux is the dominant platform for Python developers, but developing/using Python libraries on Windows is really difficult. Many people simply overlook it.
The default command prompt is really, really bad. There are alternatives, but none of them come close to the GNOME shell. The batch scripting language is terrible compared to Bash. PowerShell is way too verbose and API-heavy. You can use JavaScript, but then you are using JavaScript... Then there are stability issues. I have left my home Linux computer on and online for months sometimes and never had any issues. Try the same thing on Windows... it will crash within the first week for sure.
The things Windows is great at are office productivity software, gaming, design, and some other GPU-intensive tasks.
Desktop Linux is great at automation scripts, prototyping apps, temporary services for others (ssh and copy files), and virus-free internet browsing.
One thing I will say is that the Linux ecosystem has Windows beat hands-down when it comes to installing third-party apps and updating them; 'apt-get' is fantastic. OSX confuses me a little in that regard: some apps you copy to the Applications folder; others require an installer to run; for the ones that require an installer, it's often not entirely clear how you uninstall them, etc. I always feel like OSX is a little inconsistent in that way. The Mac App Store is nice, and unified updating through the App Store is also nice.
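For anyone who hasn't used it, the whole install-and-update story on a Debian-based system really is just a couple of commands (vlc here is just a stand-in for any packaged app):

    # Refresh the package index, then install an app plus its dependencies
    sudo apt-get update && sudo apt-get install vlc
    # Later, one command updates everything installed, not just vlc
    sudo apt-get upgrade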
Stability-wise my machines have actually been very good. I don't reboot much at all and mainly put the machine into suspend instead of turning it off. My first instinct/fear with a spontaneous BSOD now is that my RAM or HDD or something has gone bad.
Agree with you about development with things such as Python (though in that specific case I remember the Python Tools for Visual Studio being okay), or for that matter any new and upcoming language. My first thought when I want to try something like that out is to drop into Linux. It's just better supported for that kind of thing.
> This annoys me so much because you are essentially arguing that problems caused by refusing to read are problems caused by operating systems, and this really isn't the case. This isn't something Apple has done a better job with either, it's just that the marketshare for Apple wasn't high enough for the malware vendors to bother with them.
If you have to carefully uncheck a bunch of options to avoid killing your computer with malware, then yes, the operating system is at fault. These sort of dark design patterns shouldn't exist, and the OS should be designed so that it's harder to implement shady things like this.
Yes, there is Mac malware out there, but Mac apps tend to have no installer at all, which is where most of the malware and crapware come from. It's harder to install some OS daemon when "installing" is just copying a file into your Applications folder. Mac apps almost never ask for permissions so I think users are more wary when the OS is asking weird stuff.
And malware is practically unheard of in the Linux/Free Software world. I trust "apt-get install" implicitly when dealing with the main Debian archives.
> See, you're doing it again. You're not dealing with "Windows issues", you're dealing with issues on Windows boxes, caused by non-Windows problems.
That's a reasonable distinction, I suppose. How about this: The Windows software community sucks. The product may be fine if you know your way around the horrible alleys where malware and crapware lie in wait, but for normal people, the whole external environment is designed to screw you and mangle your computer.
Windows itself may be great, but the whole ecosystem is a wretched hive of scum and villainy.
I mostly agree, but there's still the fact that, unlike on Windows where I can get binaries directly from the first-party source, I have to trust a randomer's PPA (or build from source, which is sometimes a PITA and a no-go for mortals) if I want to install the newest version of LibreOffice or whatever.
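For context, the PPA dance being described looks like this; a sketch using the LibreOffice PPA as the example of a third-party archive you're choosing to trust:

    # Add the third-party archive and its signing key
    sudo add-apt-repository ppa:libreoffice/ppa
    sudo apt-get update
    # Install the newer LibreOffice the distro doesn't ship yet
    sudo apt-get install libreoffice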
All of the GP's complaints are about Windows. You are attempting to divorce "this PDF viewer" from "Microsoft Windows" while the GP's complaints are about the laptop running Microsoft Windows with a bunch of Windows software installed on it.
I love how you point at "unchecky" as a great service when it is a piece of software you use to deal with problems that shouldn't exist in other software.
How many Windows users wailed and gnashed their teeth over iTunes insisting on installing QuickTime? Is that because Windows users are lazy, or because they didn't want QuickTime installed just to use iTunes?
"You are all too lazy for your own good," is classic victim blaming.
Is it perhaps possible that Sony, Dell, Toshiba et al have poisoned the well?
I was an avid Windows user for two years, from 2010 to 2012, and I had fun with Windows. Surprisingly, I did not have the "Windows sucks" feeling back then, because I hadn't started using Linux full time; now I am a full-time Linux user.
I don't use Windows on my laptop, but I don't "hate" it. I totally concur with your point about people hating the platform for the wrong reasons; historically the marketshare of Windows has been high, and that is the reason for the malware stuff.
What irritates me more than people hating Windows for no reason is that they don't realize they are not using the platform properly. For example, when I tell fellow developers to switch to or use Linux for development, they "stick" to Windows even when they can't change the brightness of the screen! Yes, some stupid driver error. They say they are "comfortable" with Windows, but ironically they can't even give folder permissions to users, this basic thing!
I feel people are too resistant to change, and they want to blame something/someone, and that is the problem with the hate that spreads like the flu over the Internet.
> None of your complaints are about Windows. They're all, almost universally, about OEM and third-party shovelware.
You can't separate Windows from its ecosystem; it's allegedly the choice of vendors and the availability of commodity hardware that makes Windows at all appealing for most computer users. And the Windows ecosystem sucks; it's full of weeds and parasites. Windows can't be examined in the abstract: if 99 out of 100 randomly chosen configurations in the marketplace are just terrible, then that's just Windows. It's the incredible leeway that Microsoft gives IHVs that makes Windows as popular, and at the same time as terrible, as it is. That's what characterizes Windows. As an operating system, it isn't half bad (its architecture is quite good, actually). As a platform, it's terrible.
> See, you're doing it again. You're not dealing with "Windows issues", you're dealing with issues on Windows boxes, caused by non-Windows problems
What exactly is it that you're arguing? It seems like you and the parent are largely in agreement and you're trying to generate friction.
I would contend that you're right that it's absolutely true that with enough research and with constant vigilance, you can bring a retail Windows system into an acceptable configuration, have all of your hardware enumerating properly, and avoid malware.
I would contend that the parent is right that maintaining a Windows system is just barely worth it if it's your own machine and is a hopeless time-sink if you're doing it for free for anyone else, like family.
Are you two really just arguing past each other?
> The point here, though, is that you're not an idiot. Your family aren't idiots. You're just all too lazy for your own good.
Please see the posting guidelines. Personal attacks are not acceptable.
My Dad got himself a brand new laptop (i7, 8gb ram, ssd, etc). And a brand new printer HP OfficeJet (big black shiny machine with all the bells and whistles).
He couldn't print his stuff for work. He called me up. After over an hour I gave up. Windows 8 couldn't figure out which drivers to use. The CDs that came with the printer didn't help either. I mean, the installation went smoothly, but then nothing was printed. "Check your cables", etc.; messages like we are still in 1996.
I don't have time for this BS. Drove back home, got my macbook pro. Came back to my Dad's home and did THIS:
1. connect printer to the mac
2. driver installs after clicking 'OK' or 'install', don't even remember at this point
3. print
Just get Ubuntu Certified Hardware. They have over 500 models of laptops they certify. I have never had a kernel update break wifi. I don't know about all the other stuff, but most distributions store old kernels and allow you to boot back into the old kernel pretty easily. Don't know if this is an option on OS X.
Hell, I run Arch Linux on a number of laptops and workstations, and the only time I've had a kernel upgrade break anything was the wifi on my wife's old netbook some time around 3.12, and that was Broadcom's fault. So I downgraded to 3.11 (didn't even need network access for that), and a few months later on 3.14 it was working again.
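The downgrade really is that painless on Arch, as long as the old package is still in pacman's cache. A sketch (the exact filename will differ on your machine):

    # Reinstall the previous kernel straight from the local package cache
    sudo pacman -U /var/cache/pacman/pkg/linux-3.11.6-1-x86_64.pkg.tar.xz
    # Then keep pacman -Syu from immediately upgrading it back,
    # by adding this line to /etc/pacman.conf:
    #   IgnorePkg = linux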
Most of the upgrade pain I've had has been caused either by the migration to systemd or by the decision to unify /usr/bin, /bin, /usr/sbin and /sbin into one folder.
One problem with Ubuntu's hardware certification is that it's always for a specific Ubuntu version. I use Ubuntu and would really like it if they would pick at least a few common laptops and commit to ensuring that a certain number of future Ubuntu updates will continue to work perfectly on those laptops. It would help a lot in knowing what to buy.
> Windows is out of the question after seeing what a factory OEM image comes with nowadays. I'm not giving them money and spending 2 days formatting/reinstalling/seeking out drivers on slow Taiwanese servers just to make a half-usable computer. And then, after all that, spend another 2 days installing various adware infested shitware to get a fricking PDF viewer.
Don't know how many of these are at, say, Costco. Which is where many people shop.
I'm never buying a Windows laptop, so it doesn't affect me, but I'm still the resident "family IT specialist". And I know, I just know I'm going to be asked lots of questions I don't have the patience for, as these poor bastards wonder why their computers are slow, all because the shitty OEM wanted to make a few cents off pre-installed shitware.
It doesn't affect me directly, I don't touch Windows laptops unless I'm wearing gloves, but the principle of the thing pisses me off. You're being treated without respect, basically.
It's like a car manufacturer selling you a car with a "feature" that takes away 50 HP to play a jingle every time you turn on your headlights, all because they got paid $7.34 by the jingle manufacturer.
> It's like a car manufacturer selling you a car with a "feature" that takes away 50 HP to play a jingle every time you turn on your headlights, all because they got paid $7.34 by the jingle manufacturer.
Not that bad, but my car does come with what amounts to an advertisement for Sirius Radio built in.
:(
I was also offered a TON of extra add-ons when I bought my car. I'd say it was roughly equivalent to the process of setting up a new low-end PC in terms of the number of ad pitches I got.
On the PC side, this is all typically mitigated by not buying the cheapest PC possible. In my experience (and this is not universally true!) the higher end SKUs are more customer centric, since at that point the customer is the one paying the entire cost of the device!
(See also, Nexus devices versus carrier branded phone models!)
>Linux is almost there. It's a crap shoot for me, unless it's running in a VM. I never know which kernel update will fix my trackpad/break my sound/wifi and which will fix the wifi, break suspend, fix sound, but break xrandr or some other archaic XWindows-related technology.
Apple could easily write fully functional Linux drivers for their hardware if they wanted to, but they choose not to.
Instead of OEM Windows image, use this tool to get rid of the crapware (malware in some cases):
http://windows.microsoft.com/en-us/windows-8/create-reset-re...
Windows Update is terrible. OS X has that beat. But Windows is ~5 years mainstream support, ~ 10 years extended. With OS X, you don't know. It might be 18 months, it might be 3 years.
> Windows is out of the question after seeing what a factory OEM image comes with nowadays. I'm not giving them money and spending 2 days formatting/reinstalling/seeking out drivers on slow Taiwanese servers just to make a half-usable computer.
"Dude, get a Dell!"[1] Seriously, one of the selling points of a Dell system is rock solid components, and a centralized place to find drivers (at least for the business line of systems). Keep in mind, Dell sells hundreds of thousands (or more?) systems to corporate America, all with support contracts. Everything they can streamline and make more stable and reliable is money in their pocket. I mean, they wrote their own RAID drivers for the AMI and LSI Megaraid chipsets which they resold under the PERC brand, and in my experience, they were better than the manufacturer drivers. That takes dedication.
FWIW, my wifi had similar problems on 10.9. Not every time, but occasionally when it goes to sleep, it doesn't wake up and you have to reset the whole TCP/IP stack to make it work. And lockups after wakeup requiring hard boot are not unheard of either. So it may not be exclusive for 10.10.
My wife and I have identical 2013 13" MBAs. Mine is on Mavericks. Hers, which was bought a little later, is on Yosemite. So, as close as possible, we've got identical HW with different software.
- My battery lasts longer, maybe 2-3 hours more per charge. I nearly never see < 50%, she's hitting 10% at the end of the day. (Note that both are good enough, but they're way different)
- Mail search sucks on Yosemite. Today, we were searching for subjects that I know are there because I'm looking right at them, and nothing is coming up.
- Safari does something funky with process-per-tab that manages to orphan processes that take a bunch of CPU/memory.
- Time Machine is really touchy about backing up to the server over wifi. The last few times, it's had to make a new backup on the Yosemite machine. Mine misses a backup every day or so, but that's just an hourly miss, not a full rebuild.
We're considering blowing away her machine and pushing it back to Mavericks with Time Machine/fresh install.
Random bugginess. When I came into the office and plugged my machine into the TB display (I have one at home, so it should be no issue), I got nothing but black. I could get a login prompt, and even closing the lid kept the backlight on. The only solution was a hard reboot.
Oh true. I just realized I have the same issue. I connect my MacBook to my Hengedock, which has 3 monitors attached. If I disconnect my MacBook, hook up my TV via HDMI, and then plug it back into the Hengedock, the 3rd monitor on that HDMI port is just black. I always have to restart after watching TV and re-plugging my monitors to get it to work.
I have all kinds of problems with Bluetooth, specifically with pairing or connecting devices. Hibernation is all kinds of wonky, too. If it's been asleep for a few hours, it will hibernate when I plug it into AC power without resuming it first. (I really wish there was a way to tell it to hibernate if it's been suspended for some user-settable amount of time like I can with Windows, but that's not a problem specific to Mac OS X 10.10.)
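For what it's worth, pmset may already cover the delayed-hibernate wish, at least on Macs that support standby mode; a sketch:

    # Show current power settings, including standby and standbydelay
    pmset -g
    # Enter standby (hibernate-like, RAM powered down) after 3 hours asleep
    sudo pmset -a standby 1 standbydelay 10800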
Generally much slower response, Final Cut breaking, trouble with hanging apps needing force quits, and it goes on. Minor by themselves; scary when it all seemed to happen after each person upgraded.
Maybe I've waited long enough before moving. Especially now that this vuln is known about.
I have a lot of music production software on my Mac, which doesn't all necessarily get updated to work with the latest OS X right away. Upgrading the operating system tends to be a maze of determining what works and what doesn't, possibly including paying for updates of software besides the OS.
I generally put off doing a major Mac OS update for as long as possible.
It's even worse when every update makes you wonder if your FireWire interface and control software will ever work again, even though you're running on THE SAME HARDWARE as before; the OS update typically breaks it.
<citation needed> on the "run well even on 5+ years old hardware" bit. Sure, the OS will boot and work, but that doesn't mean things will work well.
My late-2011 13-inch MacBook Pro became unbearably slow after doing a clean upgrade to Mavericks. Only upgrading the memory to 8GB (which was not supported by Apple for that model) fixed the issue. Yosemite works, but looks like crap on anything without a Retina display. Newer versions of iOS have a habit of making older phones sluggish compared with the same phone model running an older version of iOS.
I'm not expecting Microsoft-level backwards compatibility, but as someone that has a somewhat complex development environment having to get everything to work on a new version of the OS can take up to a week of tweaking.
I'm currently running the latest version of Yosemite. I have a 2010 MacBook Pro 15 inch with eight gigs of RAM and an SSD.
I've had little bugs and fiddly bugs with Safari and a few other things, but I really think it has worked fine for years and years on the machine. It noticeably improved when they added RAM compression (was that Mavericks?).
I don't find the graphics a problem at all, although they do look better on my Retina iMac. I got used to them pretty quickly.
I've been happy with most OS updates as they sped Safari up or fixed small bugs in it. Photos for OS X is a MASSIVE speed improvement over iPhoto.
I know some people run into pretty catastrophic bugs from time to time; I haven't had that experience personally. The idea that the OS doesn't run well on five-year-old hardware is bunk. I have no need to replace my MacBook Pro. The only thing it's not good at is 3D (and it was never very good at that).
If you have enough RAM and replace the spinning hard drive, old Macs still feel fantastic with a recent OS.
If you're on a 5400rpm drive with 2GB I'm sure it's painful. But I know even 4GB machines fare very well. SSDs just make an insanely big difference.
Did you upgrade to an SSD? I have a 2012 MBP non-retina with a slow Toshiba drive in it (it seems it is entirely coincidental whether you'll get a Toshiba or Samsung in your new MacBook!) and I am considering upgrading to an SSD. I use BootCamp (periodic Windows gaming) and have a bunch of large VMs within OSX, so I would be interested in how the upgrade went and where to look, how to clone my disk etc. etc.
Yes. It makes all the difference in the world, far more than the additional RAM.
Upgrade was an easy drive pull and replace, and I just restored from Time Machine to get all my data back. No fuss.
But I don't have a windows partition or anything else special. I imagine it will be a bit more work for you since you'll need to backup the Windows partition and restore it.
I was tempted to do that, but to be honest the battery of that computer is kind of shot (it only lasts about 3 hours nowadays) so I didn't think it was worth spending the money. Might as well sell that one and spend a few hundred more to get something with a retina display and a factory SSD :)
What hardware have they dropped support for without a technical reason? To my knowledge, the only Intel Macs they've dropped support for are ones with 32-bit processors, 32-bit firmware (requiring a 32-bit kernel and drivers even if the processor is 64-bit), or really old GPUs that can't support recent versions of OpenGL.
To my knowledge no Linux distro ships install media supporting 32-bit firmware and a 64-bit kernel+CPU, but it's definitely possible to build a 32-bit GRUB EFI binary and have it load and execute a 64-bit kernel on a 64-bit CPU. I've done it on such a Mac. And there are a number of tablets built this way also (unfortunately).
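A rough sketch of what building that 32-bit GRUB image looks like, assuming a GRUB built with the i386-efi target and an EFI system partition mounted at /mnt/esp (the grub.cfg path is a placeholder):

    # Build a standalone 32-bit EFI GRUB that can then load a 64-bit kernel
    grub-mkstandalone -O i386-efi -o bootia32.efi \
        --modules="part_gpt fat" "boot/grub/grub.cfg=grub.cfg"
    # Drop it where 32-bit firmware looks for a fallback bootloader
    cp bootia32.efi /mnt/esp/EFI/BOOT/BOOTIA32.EFI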
Intel only contributed the code for that a year ago, so while we now know it can be done, there's no particular reason to believe that it's easy to get working properly. The messy mix of EFI 1.x and UEFI 2.x that Apple uses could make things even trickier.
iMac (Mid 2007 or newer)
MacBook (Late 2008 Aluminum, or Early 2009 or newer)
MacBook Pro (Mid/Late 2007 or newer)
MacBook Air (Late 2008 or newer)
Mac mini (Early 2009 or newer)
Mac Pro (Early 2008 or newer)
Xserve (Early 2009)
I am not sure why you were downvoted. I'd think it is in Apple's vested interest to not treat the older hardware well, so you buy a new one. I am not talking about really old hardware. Example: I have seen my iPhone 5S get substantially slower after the update to iOS 8+.
Maybe that's true of most personal users, but those of us in larger IT-managed organizations don't always have the luxury of updating on day one (case in point: I am on 10.9.5 right now).
At my last office we had older Macs, a mix of laptops and desktops. They all ran slow, with one user's Mac Pro using all 4GB of RAM on a fresh boot with nothing loaded. These folks were normal users, not power users. When a new OSX release would come out, a few people would upgrade, have a ton of problems, and warn everyone else.
I ran linux myself, and was occasionally called over to help with an OSX problem, and I could never understand how my colleagues could stand to work on such slow computers. In my experience, older macs do not usually run well on newer OSX.
I'm currently running 10.10.3 on a Mac Pro 1,1, at almost 9 years of use. Of course 10.10 isn't supported by Apple, but it works just fine, and with the combination of an SSD and cheap RAM, I imagine I can get another 3-4 years out of this desktop barring a hardware failure that pushes me to a new system.
Don't worry, my company uses plenty of Macs, and they're plenty upset. As does the U.S. government. I wouldn't count on this "oh, we're just not going to backport the update" sticking around. Unless they want to lose all of their government and commercial sales.
Alupis, I support federal government scientists (biologists, chemists, entomologists, etc.), and many of them use Macs. Naturally, there are more Windows PCs than there are Macs, but there are plenty of Macs in our labs.
> If you want solid support for old products, stick with Microsoft, and accept that their products can be clunkier because of deliberate choices to maintain backwards-compatibility.
It's not only about support for "old" products, it's also about having a roadmap to begin with. It used to be common wisdom that Apple releases major updates for the current version of OS X, and security fixes for the last two versions. Well, that common wisdom is outdated now.
Also, backwards compatibility and security fixes are not really the same, and I'd say that the correlation between backwards compatibility and software quality is completely overrated. I know I'd rather trust my life to Windows 7 (the epitome of conservative OS design) than to OS X 10.10 or iOS 8 (speaking as an iOS developer).
In raw numbers, 45% of OS X users are now vulnerable (and probably don't even know about it), and Apple doesn't care. That's an insane number. And I thought not patching the ~10% iPhone 4 in the wild was dangerous!
> Apple's model customer is one who upgrades often.
I don't know any tech company that doesn't have that idea of a model customer :)
To their credit, I can run the latest on a pretty old Mac Mini at home. It's not as fast as it used to be, but it is supported long after the form factor of the Mini has undergone big changes. I don't consider myself being squeezed to upgrade.
I booted up one of those G3 candy-colored clamshell iBooks from the early 2000s recently. Not only did it start flawlessly, it found and connected to the WiFi, and actually loaded Google in Internet Explorer for Mac.
You basically need VMs for every version of OS X you want to support, and need to keep copies of the old OS X SDKs since Apple seems not to test that newer ones will continue to work despite superficially supporting choosing a target version.
Most developers don't bother, as Apple has successfully made it quite difficult.
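The build-side half of that, for reference: a sketch assuming you've kept the old 10.6 SDK installed inside your Xcode bundle (Apple stops shipping old SDKs, so you have to hoard them yourself):

    # See which SDKs this Xcode installation still knows about
    xcodebuild -showsdks
    # Build against the retained 10.6 SDK while also targeting 10.6 at runtime
    xcodebuild -sdk macosx10.6 MACOSX_DEPLOYMENT_TARGET=10.6 build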
...and it says the admin framework fix only applies to 10.10 through 10.10.2, which was his point -- it doesn't fix 10.8 through 10.9 like many of the other fixes in the security update do.
They'll just think "oh god, I have viruses!" and call whoever is the family's "computer whiz". The very concept of "this system is not secure" is something that the ordinary user does not understand.
People understand if a system is insecure, and now apparently all non-Yosemite Macs are insecure. Folks will grok that. Whether or not they find out about it, well that's up to you and I, and everyone else on this website.
I didn't mean it in a cynical way. From my observation, "it has viruses" is the best model that is self-consistent and yet doesn't require investing a lot of time in comprehending the workings of a computer. Non-tech people reach it naturally.
I agree it's up to us to proactively tell our parents, friends and colleagues who may be vulnerable about this problem.
I have to go with the idea that MOST people have no idea about security with their computers. Also, most Apple people still say "no bugs, no crashes, and no viruses", even when I point out every time something crashes on their Mac.
After getting more and more tired of Apple's generally obnoxious behavior over the last few years I'm wondering which straw will be the last for me before doing exactly that. I'm pretty seriously considering wiping OSX off my MBP and running some variety of Linux on it at this point, though regrettably I might still be forced to consider them for future hardware purchases on account of the fact that I don't think I've ever encountered a non-Apple laptop that seemed worthwhile (in terms of general construction and build quality).
In danger of being accused of flogging deceased equines, I can but point at the T42p ThinkPad I'm typing this message on as an example of a device which fits that premise: solid hardware and a generally well thought-out design. It is 11 years old, but sports a screen which comes remarkably close to the more recent Apple offerings. It also runs the latest software flawlessly, as long as that latest software is a recent Linux distribution. It has a keyboard which Apple-adepts can only dream about. Ports aplenty; two batteries or drives or whatnot, no problem; ~8 hours on a single battery charge when using the oddly shaped 'extended' battery.
An additional bonus is that you can get these things for free if you know where to look.
Of course, being 11 years old it also has its drawbacks. It won't fit more than 2GB of RAM. It is based around a single-core Pentium M which generally performs fine but shows its age mostly when meeting Javascript-hobbled web-related things.
BUT... and this is one of the main reasons why I use older hardware... if you develop software on this machine, and it works fine on this machine, it'll run circles around the stuff your neighbour made on his latest FlitzBang Fruitmachine. It'll cause but a blip on the CPU meter where his (or her, your choice) result maxes it out. It'll use memory like it was rationed, not like it was on sale.
Of course you can not develop software for the dark side on this older hardware. A small loss, in my opinion.
I'm working on a year-long government project now to migrate a small bunch of websites from Windows Server 2003 to 2008 R2 (because extended support for 2003 ends this summer). Yes, we're upgrading to a 6 year old operating system.
To what though? All the other offerings don't seem at the same state, unless Windows 10 can rescue us all?
I notice that Windows 10 has had new bits shoehorned in (a new Settings screen!) whilst still keeping ALL the old stuff there (MMC as written in 1998? Control Panel is still there despite this new Settings page, duplication much? None of the icons match, etc. etc.). What a hodgepodge.
Interestingly, the release notes for the 2015-004 patch that includes the fix specifically mention it is also available for Mavericks and Mountain Lion.
> Impact: A process may gain admin privileges without properly authenticating
Description: An issue existed when checking XPC entitlements. This issue was addressed with improved entitlement checking.
I see, so a security update with the same version number does not contain the same patches on all the OS X versions it rolls out to; that's... peculiar.
I can't really believe Apple will actually leave this unpatched, as opposed to just saying it won't at this time. The impact of this exploit and the number of affected systems is way too big, they really can't let this sit in OS X versions that were brand new only one or two years ago, that would be insane. With all the resources they have a statement like 'the impact of the changes would be too large' is quite ridiculous.
My guess is that they will patch it in a later update, but haven't finished it yet. Maybe they are even hoping for a few more people to upgrade to Yosemite before they release it. I would be willing to bet that they don't leave a gaping hole like this sitting indefinitely.
I agree with this assessment. The initial fix of the exploit may be easy, but I would bet that the compatibility testing and fixes associated with things that will break because of the patch is what is holding it up.
It would be an interesting exercise to try to figure out what made it easier to patch in 10.10. Is it because it is an active code base, or because something was refactored between 10.8/10.9 and 10.10 that eliminated the need to take advantage of this undocumented capability--for instance, something in the System Preferences getting refactored?
The title is a little generous with "hidden"; the exploit revolves around the API and framework used to power parts of the control panel, and its authorization scheme being broken.
I do think it's too bad that setuid binaries don't have additional restrictions, like being 100% code-signed or being run in a sandbox-exec[1] based on that signing.
Code signing is not interesting. If normal people can get their code signed then so can the attacker and if they can't then it locks you out of your own computer. The only thing code signing is really good for is to verify that an update to a program the machine's owner installed was signed by the same author as the original program, and unfortunately it's rarely implemented that way.
But that's not really what you're asking for anyway. You can assign granular permissions to binaries regardless of code signing.
The real problem is there are so many things that aren't literally root but are effectively equivalent because if you can do them then you have easy privilege escalation. That's why the Windows security model is so silly. An administrator is not allowed to 'su' to another user without the user's password but any reasonable subset of the administrator's privileges can be used to silently cause any user to execute arbitrary code, e.g. by setting their login script or putting executables in the All Users startup folder or just loading a kernel driver that will let them do whatever they want.
> That's why the Windows security model is so silly. An administrator is not allowed to 'su' to another user without the user's password but any reasonable subset of the administrator's privileges can be used to silently cause any user to execute arbitrary code
That's like saying kernel memory protection is silly when the user is an admin because he could just as well load a driver into the system to wreak whatever havoc he wants.
Memory protection is not silly. Not being able to bypass memory protection (and therefore prohibiting all manner of debugging tools) is silly, and that is the appropriate analogy.
If a user correctly has permission to do a thing via a series of ugly hacks then there is no reason not to just let the user do the thing directly.
...if their activities are discovered. And then the attacker just gets another certificate under a different name, or steals somebody else's certificate. Ignoring, of course, that revocation is totally broken.
It cannot simultaneously be easy for anyone to get a certificate and hard for an attacker to get a certificate.
With physical access, one has been able to create admin accounts for as long as I can remember.
- Start up the Mac whilst holding down ⌘-S. This boots the Mac into Single-User Mode and provides a method of interacting with OS X via the command-line, with full root privileges.
- Then check the filesystem to ensure there are no problems: "/sbin/fsck -fy"
- Then mount the filesystem for it to be accessible: "/sbin/mount -uw /"
- Now remove this file so OS X will re-run Setup Assistant: "rm /var/db/.AppleSetupDone"
Now just restart, and enjoy the cool introduction animation as you create your admin account.
Local privilege escalation is always bad because it means you're one malware payload or RCE away from being rooted and conscripted into someone's botnet (or worse). This isn't just a physical access concern.
What about without local privilege escalation? Is there no way for a malware payload or RCE to turn your computer into a botnet without root privileges?
There certainly is, but rooting a box lets you ensure that your stuff stays in place. Once a box is rooted, its owner can never really be sure they have it clean without wiping and rebuilding it.
Even easier is running 'resetpassword' from Terminal in Recovery mode (boot with ⌘-R). This gets you a nice GUI tool where you can reset any account passwords.
But yes, this is not possible with a firmware password or with disk encryption (FileVault) enabled.
Full disk encryption only protects you from passive snooping. If someone has physical access between two of your subsequent uses, no amount of any type of encryption will save you. Except maybe some entangled-quantum-bit collapsing mechanism. Maybe.
Think hardware keyloggers, fake MBRs, &c.
OP's trick won't work, but that's an "implementation detail;" there are plenty others that will.
EDIT: to clarify; that's not what you said, it's just a common enough misconception that it's worth being explicit about, here.
Of course keyloggers etc. are a problem. But that is a different story. A bug which can be exploited just by grabbing any device might have a larger impact on the vendor's reputation.
A keylogger, fake SMC, or whatever is much more dangerous for a single person, because the attacker knows what he wants on the specific device.
That's not very interesting. You can do the same on a Linux box by passing init=/bin/sh in the boot loader, and probably the same on Windows (boot a WinPE CD or a Linux live CD with NTFS-3G).
It's even worse than it seems if you talk about it as 'anybody'.
Most OSX boxes are probably single-user devices. But you do not normally run as root/wheel; you need sudo (sometimes through a nice GUI) for software to get root privs.
It's not 'somebody' as if another person were logged into their own account. It's that malware running as you can now get root, to further compromise your system, without needing a sudo password.
Just to clarify, this is _not_ a remote vulnerability. If it was a means to create a remote shell, then it would be. An attacker would need to first find some way to gain remote access and then could use this bug to gain root privilege.
There's a huge difference between physical access vulnerabilities (which are basically impossible to prevent) and local privesc vulnerabilities (which can be exploited in software).
For instance, this vulnerability could be packed into a phishing-mail executable, giving the remote attacker root access if the user falls for the trap, no?
It could be packaged into any executable that someone might think about downloading off the Internet for some reason.
So it's really, really not good. Apple needs to fix this soon, or else every OSX machine out there is going to start being targeted for misuse. This is a really powerful security bug.
Isn't the idea though that with physical access, the game is already over anyway? If an intruder has physical access to your machine they will eventually be able to get to anything they want.
If someone really wants to protect their data, they have to count physical access as a possibility and rely on encryption and/or remote wiping - the operating system login isn't going to do much anyway.
Yep, physical access is total access. However, this trick falls under cool-at-school-tech-labs, I'd hope enterprise systems would do something to prevent this kind of low-level shenanigans.
School tech labs are modelled after the enterprise networks, the only difference is usually a bit more competent IT staff. I'd expect most of the school-lab tricks to be directly transferable to enterprise.
What malware requires physical access? This is a local privilege escalation to root. It's only local because you need an account on the machine to make it work, but it can be bootstrapped to a remote exploit.
About two or three months ago I stupidly changed the password on my only account to a password I promptly forgot. I was losing it when I realized what I had done. To make matters worse, I made this change a day before our office was scheduled to move to automated backups via Time Machine.
Luckily, after some digging I came across this fix and was back into my computer, albeit a little shaken up by the back door.
If you're paranoid enough to think someone you don't trust could have physical access to your Mac, it is possible to prevent this by setting up a firmware password, though.
The first Mac I ever got, the IT department forgot to give me admin access, so I couldn't install any software. Having never used a Mac before, it took a grand total of about 15 minutes of Googling to figure out how to boot into single user mode and give myself admin access.
I really hate all the desktop IPC bullshit. IPC frameworks are pure fucking evil. COM, D-Bus, XPC, everything SUCKS.
If you want completely separate programs on one machine to talk, use UNIX domain sockets (with something like ZeroMQ or HTTP), FIFOs (named pipes), anything that you can chmod and chown, not a daemon that reinvents access control, badly.
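A minimal sketch of what I mean, in Python (the socket path is made up, and this assumes Linux-style enforcement of socket-file permissions on connect()):

import os
import socket
import stat

SOCK = "/tmp/myservice.sock"  # hypothetical path

# Server side: bind, then let the filesystem do the access control.
if os.path.exists(SOCK):
    os.remove(SOCK)
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(SOCK)
os.chmod(SOCK, stat.S_IRUSR | stat.S_IWUSR)  # 0600: owner (and root) only
srv.listen(1)
# A client running as another user fails at connect() with EACCES;
# no hand-rolled entitlement checks for anyone to get wrong.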
Well, this is a very trivial "exploit" of XPC. The real problem is that such an API even exists at all. I mean, a method to create an arbitrary file in an arbitrary location with arbitrary attributes? That is completely braindead. That is clearly equivalent to root privileges, and therefore such an API shouldn't exist - sudo should be used instead, or other OS X methods of explicitly gaining root privileges.
This method wouldn't pass the most cursory of security checks. It's clear that it was never actually reviewed, and that the original programmer was relying on the proprietary nature of the OS X system for security. Something like this would never happen with Linux IPC; reviewing the methods published on D-Bus by services that run as root by default is a basic check that any sane distro will do. Something as straightforwardly exploitable as this (literally just call the method while running as an admin user!) would never go into Debian.
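To make the "equivalent to root" claim concrete, here is a sketch (Python; the target path is made up, and the XPC call itself is deliberately omitted) of the one step the kernel normally refuses to a regular user:

import os
import shutil

target = "/tmp/rootsh"          # hypothetical destination
shutil.copy("/bin/sh", target)  # any user may copy a shell binary
try:
    os.chown(target, 0, 0)      # hand it to root...
    os.chmod(target, 0o4755)    # ...and set the setuid-root bit
except PermissionError:
    # A normal process is stopped right here by the kernel. A privileged
    # service performing these two calls on request is handing out root.
    print("chown to root denied, as expected")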
Indeed, it wouldn't. It's just another typical example of someone who doesn't understand why a complicated IPC system exists, and thus blames all possible problems on its mere existence.
It's like users who encounter a random problem in a program, and then blame it on the fact that the program is written in C++, simply because they do not like C++.
While this is a significant vulnerability, I don't think the article is correct when it calls it a 'backdoor.' The term backdoor typically implies something that was intentionally left to allow illicit access, and while this is a significant bug, I don't see anything to indicate that's the case here.
This is a hole that exists because an Apple-written application needed a method to gain elevated access. This was done through unpublished APIs which, when used by another application in a similar way, also resulted in elevated access.
So, this was clearly intentional, because it's used by Apple directly. And it allows illicit access, because any program can use it to gain access.
I'm not so sure. Unless I'm missing something, he doesn't demonstrate that this 'backdoor' is in use. It looks like they were using an escalation backdoor in `systemsetup`, but quickly patched a fix after 10.8.5. He just found a way around it.
Now, the fact that 'it takes too much effort' to backport would suggest that it was still in use. I don't see any other evidence, though. I'd be interested if someone found it!
Admin framework analysis revealed use of "createFileWithContents". The function in which this use occurs is not named in the analysis.
An error message in the initial proof attempt led to "authenticateUsingAuthorization". Back to systemsetup to determine how to use "authenticateUsingAuthorization". (This is where I ended up mentally relinking the issue back to systemsetup.)
So, I concede that it is not stated where within the Admin framework this "createFileWithContents" method is invoked. However, I also agree that if that function were not used, it would be simple to remove it and the issue would be fixed.
There's no way to tell for sure if something like this is intentional or not. Also, waiting to fix it until a researcher makes it public may have been intentional.
> There's no way to tell for sure if something like this is intentional or not.
Right, so the simplest explanation is that it's an unintentional bug. The absence of evidence that it was unintentional isn't evidence that it was intentional. If there were some magic string or default password or something, that'd be an obvious backdoor, but to me this looks like a pretty typical privesc bug, albeit in an undocumented API.
It was found through a chain of events that started with looking at the patch related to another privilege escalation vulnerability, one which no one seems to be claiming was a backdoor.
> Also, waiting to fix it until a researcher makes it public may have been intentional.
I'm guessing it's more likely that the researcher waited to go public until it was fixed.
This is evidenced both by the fact that it was patched alongside other vulns (i.e. not a rushed-out one-off patch) and by the article's disclosure timeline, which shows 'Full disclosure' occurring 04/09, while the 'Release of OS X 10.10.3' occurred 04/08. This is a pretty typical disclosure timeline; they couldn't begin to fix it until it was tracked as a bug, after all.
A function named "createFileWithContents" is clearly intentional, though its creator probably just hadn't realised the security implication, making this a negligent, rather than malicious, backdoor, not unlike default router passwords.
OT, but I have to say that the number of Apple apologists in these comments is mind-blowing. HN readers of all people should be the ones urging Apple to issue a fix for a very serious bug such as this one. Yet many comments here are saying that people should just upgrade; while that might solve the problem for some, there are those who can't upgrade machines at will.
Yeah, this is scandalous. I've heard enough things about Yosemite bugs that I held off upgrading, especially as I want to avoid software compatibility issues.
I hate Apple's new yearly release cycle. There's not enough time to stabilize and improve the OS.
This update just rolled out, but it indicates the admin fixes are only for Yosemite:
Is your mac a server or a multi-user desktop where some users intentionally don't have admin access? If so, you're in trouble. If not, this bug probably doesn't really matter to you.
For a single-user desktop machine, realistically, user/root privilege separation doesn't matter, because all your important data is in your home directory and not protected by root privileges anyhow. The root-protected stuff is actually the stuff that's easiest to replace, because it's just a bunch of software that you can re-install. Viruses don't need to infect your software to stay resident; they can just as easily register a login hook without root permissions.
(That said, some people will violently disagree with me on this.)
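For example, here's a sketch (Python; the label and payload path are made up) of one well-known user-level persistence mechanism, a per-user LaunchAgent, which installs with no privileges at all and runs at every login:

import os
import plistlib

agent = {
    "Label": "com.example.persist",        # hypothetical label
    "ProgramArguments": ["/tmp/payload"],  # hypothetical payload
    "RunAtLoad": True,
}
path = os.path.expanduser(
    "~/Library/LaunchAgents/com.example.persist.plist")
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "wb") as f:
    plistlib.dump(agent, f)  # no sudo, no prompt, survives reboots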
>Yet many comments here are saying that people should just upgrade while it might solve the problem for some, there are ones who can't upgrade machines at will.
It's dumbfounding how people here are simply shrugging this off and posting "So what? Just upgrade. Simples" type comments. This isn't acceptable. I know many people in creative industries alone who can't just upgrade immediately any time something comes out, as they have to wait for their products to support the newer version. Similarly, there are many who are using Macs at work whose corporate policies won't let them simply update immediately.
Then there are many who deliberately don't upgrade right away, waiting while bugs and other issues in the new version are found and resolved.
I have 3 Macs. One of them is an older iMac. I can't put enough RAM in it to run Mavericks (which is a memory hog). It is a perfectly good computer; there is no reason to upgrade from Lion.
The next one is a 3 year old Mac Mini. I upgraded to Yosemite and it performs TERRIBLY. It has 16G of RAM. I've tried everything. Yosemite just sucks. The machine is now much slower than the 6 or 7 year old iMac with only 4G of RAM. If it were practical to back out to Mavericks, I would.
The last one is a 2 year old Macbook. I rely on it for my job. There is no way I'm going to risk the crappy Yosemite performance to upgrade. I'll live with the security flaw.
Here is what I now know about Apple: They will not support you for the lifetime of the product. They consistently release buggy code these days. Unless you are willing to shell out more money for new hardware every time they come out with something new, you will be stuck "as-is (with bugs)" until you get rid of it.
No more Apple anything for me. I'm thinking I'll switch the Mac Mini to Ubuntu if I can get it to work there. I'll use the iMac until it doesn't work any more, and when my Macbook is ready for an upgrade (in another year or two), I'll be trading in that still-Mavericks based system for a Linux laptop or maybe I'll go back to Windows.
"I can't put enough RAM in it to run Mavericks (which is a memory hog)."
I'd upgrade to Yosemite today with the latest updates; they just freed up 4 GB of RAM for me. Sadly, if I had known all of this beforehand I wouldn't have gone out of my way to buy some old Mac Pro where I could upgrade the memory to 32 GB or more, so I just wasted about $1500 on the whole setup.
Then again, like you, I transitioned my work to an Ubuntu box; but even for my personal stuff my Mac was just really slow unless you get an SSD, which is time-consuming to install in an iMac.
>Apple's model customer is one who upgrades often. If you want solid support for old products, stick with Microsoft, and accept that their products can be clunkier because of deliberate choices to maintain backwards-compatibility.
This is nothing close to "So what? Just upgrade." He's saying that from Apple's perspective the best customer is one who upgrades their machine on every upgrade cycle, which explains why they don't care about previous-generation OSes. And how can you say this comment is Apple-apologetic when he goes on to say that Microsoft does a better job maintaining backwards-compatibility?
Hey, at least there weren't any of the 'Well, I don't care if anyone sees all my boring details anyway' variety. At least people are beginning to understand the most basic implications of loss of privacy.
> Apple's model customer is one who upgrades often. If you want solid support for old products, stick with Microsoft, and accept that their products can be clunkier because of deliberate choices to maintain backwards-compatibility.
> To be fair, OS X updates are free and usually run well even on 5+ years old hardware. OS X has kinda gone the way of Chrome, with most users on the newest version.
> While this is a significant vulnerability, I don't think the article is correct when it calls it a 'backdoor.' The term backdoor typically implies something that was intentionally left to allow illicit access, and while this is a significant bug, I don't see anything to indicate that's the case here.
> Title is a little generous about "hidden", the exploit revolves around API & Framework used to power the parts of the control panel, and its authorization scheme being broken.
> Smells like an oversight to me. Some new developer got assigned to implement or tweak the SSH enabling switch (or whatever), and this was their solution, which never got reviewed.
> [Referring to Snow Leopard, now not secure] Still pretty much the best OSX.
> Among other things upgrading (to 10.10.3) will do, it'll fix the issue in the article.
So, it's my comment you quoted questioning whether this can be called a backdoor.
I don't intend that to be apologetic towards Apple. I called it a 'significant vulnerability', but at the end of the day, it's a privilege escalation like those that have come before and will likely continue to be found occasionally, regardless of OS. I don't see what's apologetic about acknowledging a significant vulnerability while questioning whether it should be called a backdoor.
If you want to talk about Apple's response - I find it concerning that they aren't backporting the fix.
Most of those are people giving fairly reasonable benefit of the doubt, no apologism. But I suspect you're probably about as biased against Apple as the people you're assuming are biased towards Apple.
Oh, you're really off base... I am MUCH MORE biased against Apple. You know those Linux users who hate on Microsoft? I am the Linux user who HATES Apple, ever since I was first lied to by Apple in 1983 ("Color Mac coming in the next year", which didn't arrive until March 1987, and "Color Mac will destroy the Amiga 1000" in 1985).
I have never seen Apple as being honest or making products for me as a nerd. So I live mostly in Linux nowadays.
A biased person isn't credible? What did I say that was factually wrong, as opposed to personal opinion? I don't like Apple products (locked down, no fun for a hacker, and expensive).
I have no credibility because I don't like a company? What about the opposite: people who like the company?
Credible people are always upfront with their biases. I never trust anyone that says they are neutral.
Of course not. You're admitting, even proud, that you aren't going to look at things fairly, and your complaints are therefore unhelpful, naive, and very likely wrong to boot. People who simply like Apple are not lacking credibility, though a fanboy would be, for similar but opposite reasons. Credible people are aware of their biases and strive to overcome them. When you embrace your biases like you have, you're just offering some bullshit with a disclaimer that it's bullshit. At least you have honesty going for you.
Well, the answer to that is: when someone says something, you call for proof. You don't throw out the baby with the bathwater. That's what I do with David Pogue and Walt Mossberg. But I usually get my news from tech press that has been blackballed by Apple.
At least you're honest about it. Most who feel similarly will coat it in a thin veneer of pretend equal-handedness and categorically deny any emotional investment.
> To be fair, OS X updates are free and usually run well even on 5+ years old hardware. OS X has kinda gone the way of Chrome, with most users on the newest version.
>Apple's model customer is one who upgrades often. If you want solid support for old products, stick with Microsoft, and accept that their products can be clunkier because of deliberate choices to maintain backwards-compatibility.
He was rebutting the idea that regular upgrades are somehow a revenue source for Apple, since they are free and run on some previous-generation hardware. How is that apologetic?
It sounds apologetic to him because he's more interested in tribalism than business or technology. It's not unlike much religious fundamentalism: the pursuit of truth takes a back seat to the pursuit of feeling like you're right about things, especially if it means you get to hate a caricature of some other group of people and blame some of your personal disappointments on them.
A similar force seems to drive politics, or at least motivates disturbingly huge blocs of voters. :(
Wait, saying that 5 years of hardware support is too short, and objecting to yearly OS updates, makes me a religious anti-Apple zealot engaged in tribalism?
That I don't like Apple's business practices, hard-to-hack devices, and technology philosophy doesn't mean I am tribal.
How is only 5 years of hardware support good business sense? I have much older hardware doing very productive things, and my non-profit couldn't afford to upgrade every desktop every five years and update every desktop's OS every year.
I have to agree. I have an older MacBook that runs pretty slow if I upgrade to anything above Snow Leopard. But I do understand that support for older devices has to stop at some point.
Smells like an oversight to me. Some new developer got assigned to implement or tweak the SSH enabling switch (or whatever), and this was their solution, which never got reviewed.
What do you call something that grants root access without authentication, but wasn't intended to let arbitrary people or programs use it?
"Backdoor" isn't quite right, since that implies that the intent was to allow unauthorized use.
"Security vulnerability" isn't quite right either, since that usually implies getting code to exhibit some sort of behavior it was never supposed to have.
I can't think of any other term. Of the two, "backdoor" seems closer. Maybe "unintentional backdoor"?
> What do you call something that grants root access without authentication, but wasn't intended to let arbitrary people or programs use it?
Local privilege escalation. In a huge number of established LPEs, the exploit leverages a weakness in checking who makes a call that allows legitimate privilege escalation. This is a legitimate privilege escalation mechanism (sshd binds to port 22, among other tasks) that can be exploited through a weakness in checking who is making the request and whether it should be granted.
Unintentional backdoor is better, wouldn't be my choice though.
To me the term backdoor implies malevolence and a purposeful decision to allow you to remotely access someone's system without their permission later.
This seems more like a mistake, although a pretty big one. It seems like it was designed as a small escape hatch to make some of Apple's scripts cleaner, and it wasn't locked down to the degree that it should've been.
It's a big security hole, but I'm not sure backdoor fits.
There is no jargon for "obscure, deliberately implemented security hole which supports system functionality", because nobody does that.
(For values of 'nobody' which are not members of a set that includes such as Apple, obviously.)
Jargon and slang are needed for shortening the names of everyday things that people do, use or encounter.
If Apple has started a trend, we might need a new term, like "root kludge". A deliberate solution with negative attributes is a kludge (that much we have slang for, because we encounter such things with reasonable frequency). This type of kludge gets us root. So ...
Wait, if Apple starts a trend with this, then it will be cool, and have some name that begins with 'i'.
How about "iHole".
"An iHole was discovered in my wi-fi router's firmware".
I guess. It's a backdoor in the sense that you can intentionally bypass privilege checks by calling a certain API with a particular argument. But I don't think it's a backdoor in the sense that the developers wanted a secret way to gain root on an arbitrary machine.
As someone who loathes Yosemite, I was hoping to find another fix for this. However, when I run the exploit code in my console it already returns a "You need an administrator password" prompt. Huh? I'm running 10.9.5.
Run the python script at the end of the article as a non-admin/non-root user. The script allows you to create a copy of a binary in, say, /usr/bin with the "set user ID execution" bit set and owned by root! This should not be possible without root privileges.
This is pretty bad, and trivially exploitable. Basically anyone that has non-root access to your computer can trivially become root and take over the system.
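A quick way to check what the script leaves behind (a sketch; the filename is whatever you told the script to create):

import os
import stat

st = os.stat("/usr/bin/rootpipe_copy")   # hypothetical file from the script
print(st.st_uid == 0)                    # owned by root?
print(bool(st.st_mode & stat.S_ISUID))   # setuid bit set?
# If both are True, anyone who runs that file runs it with root's privileges.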
When Apple refuses to fix something that's in the public's interest, the best way to get Apple's attention is to make this a rootpipe-gate by sending tips to all Apple-focused media outlets like AppleInsider, MacRumors, etc. Talk about the power of the press.
Null pointer dereferences do crash in Objective-C. Sending messages to nil is generally not nonsense in Objective-C, though. It has well-defined semantics and is useful in many cases. It isn't a "Don't crash when faced with nonsense" policy. I'm not sure the benefits are enough to justify its propensity to trip people up, but it's not like it's just some kind of crash prevention.
That this is not considered "nonsense" in Objective-C is part of the issue: having "no receiver" do nothing is well-defined behavior with a long tradition going back at least to Smalltalk. People use this to write shorter code, on purpose, achieving an effect similar to Haskell's Maybe monad.
(Note: I, myself, do not ever rely on this functionality, and even disagree with it, but the context in which people use it is important to understand when judging it; and with that context, I consider my own objections mostly due to my biases and not something that I would argue is "correct".)
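A toy model of the semantics in Python (this is not how the Obj-C runtime works internally, just what the behavior feels like):

class Nil:
    # Every "message" sent to Nil answers Nil again, so chained calls
    # on "nothing" never crash -- similar in spirit to Maybe.
    def __getattr__(self, name):
        return lambda *args, **kwargs: self

nil = Nil()
result = nil.account().balance()  # no error; result is still the nil object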
To be fair, calling a virtual method on a null receiver is undefined behavior (not necessarily a segfault!) in C++ too. The compiler is free to replicate Objective-C's behavior if it wants to.
(Of course, no compilers actually do precisely that, but it is possible for compilers to delete virtual method calls entirely if it can prove the receiver had to be null...)
It's an optimization. Virtual methods that don't dereference 'this' (by not accessing any non-static member variables, for example) don't have to check whether 'this' is null. For example,
class Example {
public:
    static bool do_it;
    int member = 0;

    virtual void do_it_if() {
        if (!do_it) return;  // early out: 'this' is never dereferenced
        member = 1;          // this path does dereference 'this'
    }
};

bool Example::do_it = false;  // definition of the static member

void test1() {
    Example::do_it = false;
    static_cast<Example *>(nullptr)->do_it_if();  // returns early: no crash
}

void test2() {
    Example::do_it = true;
    static_cast<Example *>(nullptr)->do_it_if();  // null dereference: crash
}
In the above code, only test2 crashes (clang 6.0). If crashing were defined behavior, you would need an extra check in the code generated for do_it_if.
EDIT:
Also, it's worth explaining why test1 works even though do_it_if is virtual. Since no methods override do_it_if, the virtual method lookup can be elided and you don't have to dereference 'this' to find the vtable. If do_it_if had been overridden and the runtime type were unknown, you would need to dereference 'this' to look up do_it_if in the vtable, which would segfault in both tests.
C and C++ need to be usable in systems without memory protection, so they don't guarantee any faults for operations like method calls. (There are some standard-guaranteed exceptions, for example std::bad_alloc, but those are only on relatively expensive operations.)
Compilers then take advantage of these undefined behaviors to optimize even on systems that do have memory protection.
I'm kind of surprised there's not a means (to my knowledge, at least) to crash/throw an exception upon sending a message to nil for people who want to better ensure their code doesn't have lurking issues like this. I suppose it'd have to be smart enough to filter out system frameworks to be useful, but I'd imagine that to be do-able.
I have a feeling it wouldn't work well in practice if you made the handler raise an exception. As the author notes, you're not the only one sending messages, and Cocoa will probably crash your program with a nil message send sooner or later. But you might be able to use it to find out where it's happening, at least.
There was, in the form of the -fno-nil-receivers compiler flag. It looks like clang doesn't support this flag. I'm guessing nobody used it. You'd have to be really careful when turning it on, since you're effectively using a language that's similar but not quite identical to Objective-C, and which will crash on lots of legal and sensible Objective-C code.
To make a scheme like that work, you'd essentially need to check before every single method invocation in your entire program because the issue — such as it is — is with method invocations being made against nil references, not with nil parameters being passed in as arguments to method calls.
I was imagining something more like a modified version of the runtime's objc_msgSend() C function (which Obj-C method invocations get compiled into) that guarded against that, but you do make me realize that such checks could conceivably be automatically tacked on at build time.
Anyway, I'm not entirely sure how useful such a thing would be, but I am curious how often my code (unintentionally) makes calls against nil and I've just never noticed because it doesn't cause harm.
Ah, I see. I think that at some point you have to trust something. The only true way is to never trust any of your inputs. This is, after all, what the JVM does in order to kindly throw NullPointerExceptions on null dereferences. But realistically that is extremely time-consuming and has little payoff in most cases. I typically do not trust arguments, but do trust the results of functions to be what the documentation says they will be.
Static analysis could potentially go a long way here to expose values that could be potentially "infected" with nil. (And, for all I know, the JVM JIT could do exactly this to skip checks on provably non-null values.)
The Obj-C method call [foo doSomething] compiles to the equivalent of objc_msgSend(foo, "doSomething"); (for the pedantic: I'm intentionally simplifying and leaving out selector references).
The first thing objc_msgSend does is check if foo is nil and, if so, returns 0/nil/the all-zero bit pattern/whatever. It can be extremely convenient, but it can also be extremely inconvenient.
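In Python-flavored pseudocode, the rule objc_msgSend applies looks roughly like this (a sketch of the semantics, not the real dispatcher):

def msg_send(receiver, selector, *args):
    if receiver is None:                   # the nil check, on every send
        return None                        # 0 / nil / all-zero result
    return getattr(receiver, selector)(*args)

print(msg_send(None, "upper"))   # None, silently
print(msg_send("abc", "upper"))  # 'ABC'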
Very informative, thanks. I am a C++ developer and recently had to fix a bug in an OSX app written in Obj-C, whose syntax looks insane to me. This only goes to reaffirm my belief that it's insane.
Anyway, I'll carry on reading my ObjC book as obviously the language has uses, although there's a Swift book in the post. Which do I read first though, eh?
Calling any method ("sending any message" in Objective-C terminology) on a nil receiver does nothing and returns the zero value of the method's result type (a value of the right size with all bits zeroed).
In Objective-C, 'nil' is a special object (internally represented by a null pointer). When the runtime is instructed to invoke any method on nil, the method does nothing and returns nil (rather than crashing). I think this is an artifact of Smalltalk behavior.
Pretty sure they did that as root, so they could get past that and explore how the program worked when being run as a non-root user. The final exploit doesn't depend on this program, though; it uses the same RPC interfaces this program does, from a separate program, so the patching was just part of their exploration, not part of the exploit.
Just make a copy of the binary that's editable by the user. Or copy its code into a new program. There's nothing special about the specific root-owned systemsetup binary. (It's not setuid.)
Sometimes these things are easier to explain if you look at it like an attacker:
If you're a regular user of an OSX system before 10.10.3, and you want to become root (admin user), you can use this exploit to become root.
If you're an author of malicious OSX software, and a user without admin permissions installs your malware, you can use this exploit to make your malware run as root.
Once you have root, you can do anything on a system.
If the system you're attacking is 10.10.3 or later, you won't be able to use this exploit because Apple has a fix in 10.10.3.
Apple will not make the fix available to anyone on 10.9 or earlier.
So who wants to sell a patch for the now-unsupported users of OS X 10.9, 10.8, ...? It could be as simple as changing the permissions on System Preferences so that you must enter your password before running it (so that it always runs as root), then patching out the backdoor.
I also don't think this indicates malicious intent. It's as likely that some collection of junior engineers who were tasked with something other than system security, simply invented something that was both insecure and obscure. This happens all the time. It takes a lot of extra time and code to engineer security and it's historically been one of the first things that product managers and senior management will toss out the door when the schedule inevitably becomes the gating factor. Hey, they fixed it. And you know, some Day One bugs are simply very very hard to fix when you find out years later that there's all this other code that depends on it...
It does explicitly say that Security Update 2015-004 only updates Admin Framework on 10.10+ (it does not require 10.10.3) to fix CVE-2015-1130 : Emil Kvarnhammar at TrueSec
https://support.apple.com/en-us/HT204659
Quite a few other security related things are fixed with this update that do apply to older versions of OS X.
I get that, but surely you have to provide a reasonable amount of time for patches to be installed. Perhaps a 2-part blog entry, where he gives general details (to instill urgency) and then release exploit code a month later?
If he published because Apple just patched, then I agree. If he published because he said he was going to disclose it at a certain time, then I think it's Apple's fault for dragging their feet. 6 months is an eternity.
I recently got a mac mini that shipped with Yosemite.
It has the nice feature of completely killing the WiFi interface when you attach a USB hub; googling around, it seems to be a Yosemite bug. I have been told that using bluetooth devices (such as the Apple mouse) can also trigger the same behaviour.
So I'd say I'm sorry that the mac shipped with Yosemite, except that at least I received the fix for the privilege escalation bug.
It isn't just Yosemite... I had exactly this behavior with Mavericks on a new MacBook last year. Fired up a Bluetooth mouse, and the next time I opened the lid, wireless was completely gone.
I've managed to get it working after a fashion, although not reliably, and have come to the conclusion that Apple just don't do wireless well.
I had this bug plaguing me for a while. My WiFi connection would drop every fifteen minutes or so, requiring me to turn off the WiFi and then turn it on again. I was able to fix it by doing a clean install of Yosemite (from a bootable drive).
My personal computer is a late 2008 MBP, and Yosemite runs fine on that. It's been upgraded to 8 gigs of ram, and that's enough to smoothly do dev work in vagrant environments.
If you're seriously concerned, why not use an external drive of some sort and install Yosemite to it? (Bonus points: clone your internal disk to it first, and upgrade that.) Boot it up, test out your critical functions, and then update your main volume if it works out for you.
You need to do the bonus-points version of that recommendation. Running an OS off an external drive will just make everything feel sluggish (unless you have a Thunderbolt enclosure and a spare SSD lying around).
That would depend on which interface and drive technologies you use. A SSD connected via Thunderbolt would likely be faster than an internal hard drive.
I upgraded. I haven't had any problems that I wasn't already having in Mavericks (Anyone else have kernel panics when sleeping with a thunderbolt monitor attached?).
I haven't had any issues with that, and I have two thunderbolt monitors attached in daisy chain mode.
The only kernel panics I've experienced so far have been Parallels Desktop bugs that crashed the system. I'm regularly stressing my Macbook Pro to the max running vagrant images through Parallels Desktop with full integration tests running inside, and this tends to be pretty effective at finding strange bugs in Parallels Desktop.
I also have a problem with a Dual-link DVI adapter. If I sleep with it connected, at some point it wakes up and "kernel_task" takes up 100% CPU (all 8 cores) until I put it to sleep and wake it up again. Sucks coming down in the morning and finding my Macbook burning a hole in the desk.
Yes, but that security update doesn't include this patch. Apple confusingly lists the security updates in 10.10.3 and Security Update 2015-004 together, but not all items apply to 10.9. The very first item listed on the page is this Admin Framework bug, and it's only available for v10.10 to v10.10.2.
Xcode 6.3 is out now and requires 10.10. I also waited as long as I could, so now I have a machine that's been installing Yosemite for the last 18 hours...
It's a known issue related to having many nonstandard files in /usr/local or something like that, e.g. if you are using brew. Google for it, or just wait patiently; it should finish at some point.
OT, but does anyone know how OSX bluetooth keyboards work? I thought I had BT disabled, and a prompt popped up (while I was on hotel wi-fi) asking me to enter the pairing code for (not my) keyboard.