I haven't quit it, but the problems, annoyances, surprises, seeming ineptitude, and creeping iOSification of OS X that the author describes sure do resonate.
Every new major release of OS X is a day or week spent disabling things, shutting down Spotlight again, trying to restore things back to the way they were instead of the way some Designer with a capital D thinks they should be, for no other reason than, "Beauty."
I just dread the idea of moving to Linux again. I don't want to tinker that much. But I am worried sick that OS X is dying, in the sense that it's becoming a platform to deliver people to Apple's (and partners') cloud services and sharing services and that's it. Screw all of that.
One major shot across the bow was the loss of "Save As..." and the change to "Duplicate". WTF, Apple? I now have to do 10 extra steps just to Save As.
It feels like Apple is abandoning its longtime users, the master users, the users who've climbed the pyramid, who've achieved a lot of game levels. It's just going after that huge base of newbies and midlevel people who don't notice or complain about all the changes that really, truly are not improvements. They're just changes. That's the problem in a nutshell: OS X changes because there's new management that wants to put its stamp on things, regardless of whether it improves the productivity of the user or not.
Why do people update OS X? Just curious. If it works how you like, why update it? Security flaws are probably the main reason, but isn't there a way to get those without acquiescing to an OS redesign?
Because you have to: OSX doesn't have much of a culture of backwards compatibility, and every update tends to pressure developers into adopting the latest, greatest thing. Maintaining old software is just not a cultural value.
Once the developers move, the users pretty much have to. I have an iMac from 2009. I had two OSes on it. Windows 7? Everything still works there. Runs fast, new software is great, etc. OSX 10.6.8? DEAD. It's basically useless. I guess it's nice that Apple offers free upgrades, except that they mysteriously make a system that used to be lightning fast extremely slow, even though other OSes seem to run just fine.
It works perfectly. I have zero complaints or issues. It's no problem at all.
I use chrome and terminal for most of my work, but I do run itunes and vmware fusion and run some apps that give me a tiled interface and focus-follows-mouse.
No problems at all. I couldn't be happier.
I am sure it would all go to hell if I got a current model and tried to run SL on it ...
I upgraded SL -> Mavericks a few months ago when it stopped receiving security updates.
It's slightly worse (I don't like mission control, messages is useless to me) but not as bad as Lion (i.e. you'll just need to disable the stupid scroll behaviour, but performance is good).
On the plus side, you get the option to use new software developed for "10.7+" (heroku's db client, atom editor, swift etc)
Apple has a pretty vicious hardware/software upgrade treadmill.
I resisted updating 10.4 for years; IMO that was the high water mark for OS X, everything has basically been downhill from there. If I could still run 10.4 plus bugfixes and security updates, with modern software, I would.
But that's not possible. They push out new versions of the OS, along with new versions of development tools, which produce software that's not backwards-compatible past a certain point, such that eventually you can't run new software without installing major (0.1) updates. Apple's own products are the worst for this, but eventually you lose 3rd party apps as well.
Even if you resist the demands of new software, you'll eventually get forced to upgrade via hardware. Each generation of Apple hardware has a minimum OS version, keeping you from going back too far. For instance, Mac Pro "quad core" and 8-core systems won't run OS 10.4; Nehalem-based machines won't run 10.6. And Apple has purposely killed off its compatibility layers, dropping first the Classic environment and more recently Rosetta, in order to introduce barriers to running old software.
Classic and Rosetta were dropped during the moves to x86 and 64-bit. Given that almost all software was upgraded, I can see why Apple didn't think the effort was worth it. And for all we know, it could've been technically impossible.
Not sure why you're getting downvoted. I stuck with 10.6 until Linux. Took a look at anything past Lion... "nope!". I despised 10.8 enough to actually downgrade my 2012 MBP to 10.6.
But as for "why update" - up until 10.6, there were performance improvements as well as useful features (subjective -I know- fine).
As for "why update nowadays" - well, 10.6 isn't really supported nowadays. (cough Java 7 cough)
Both were Apple's choice. Apple's Java 6 relied on proprietary interfaces that were not part of the contribution to OpenJDK. Oracle could not have even recreated the Java 6 MacOS port. The other big reason was that new MacOS versions were only x64 and going to the trouble of making a 32 bit port for only obsolete OSes was a non-starter.
The reason--the only reason--I'm seriously considering upgrading from 10.6 is that new programs increasingly don't support it anymore. I download a simple little helper app and it says "The program requires OS X 10.8 or higher."
My best guess is that programmers build with the latest libraries, and the latest libraries require the latest OS version. If the dev is running on the latest version, it never crosses his mind to do otherwise.
I did the same thing recently, and I've hated it since. 10.7 removed Rosetta, broke "Save As", and annoyified the Save dialog.
But there were just too many pieces of software that wouldn't run, and unpatched security holes didn't seem like a good idea either.
To add insult to injury, Apple doesn't allow you to virtualize non-server versions of 10.6. You can, thankfully, hack VMWare Workstation and keep running your previous machine's image that way, but it's shitty that you have to jump through those hoops. It seems geared specifically towards keeping people from continuing to use their old apps.
I'd like to keep 10.6 support in the open source app I help out with but there are so many improvements in 10.7 and 10.8 from a developer point of view that the latest release will be the last to support 10.6 (and 32-bit macs). By improvements I mean genuine time saving features like base localisation, not "ooh shiny dot syntax!".
Until a version is EOL'd, sure, and that's what I've been doing. But 10.6 was EOL'd in late 2013, and by early 2014 at the latest it had unpatched major flaws, so I had to upgrade to 10.9.
So Windows has become more sane than OS X in terms of security updates?
EDIT: Windows 7 was released in 2009 and is still receiving security updates. 10.6 was released in 2009 and has already been EOL'd. Seeing as people still want to use both of these products, but one group is being forced to stop, that's why I'm saying OS X is taking a less sane stance than Windows.
OS X upgrades are free, more like Windows Service Packs in a way. Microsoft also drops support for pre-SP versions of Windows after a time. The clock is ticking for Windows 8.0; you have to upgrade to 8.1.
I worked on the full screen feature in 10.7 through 10.9!
Full screen in 10.7 and 10.8 did render secondary displays useless. But it was a new feature in 10.7, so there was nothing to reverse. You can't take Safari, Mail, etc. full screen in 10.6.
Perhaps there was an app you used that switched from a custom full screen implementation to the system one, and so regressed on secondary displays?
VLC, iTerm (& Terminal? I think), Chrome, Firefox, ... Most things I used had some sort of full-screen mode that was then "hijacked" by 10.7's native full-screen mode. I say "hijacked" because even versions that were pre-Lion would somehow end up using the native "feature".
The thing to reverse would have been the "switch to new workspace when full-screening" - or at the very least make it optional. Certainly not respond "that's a feature" and close as "working as expected" when thousands complain.
I failed to see the value in that feature even with a single screen. An action that used to happen instantaneously now took 1-2 seconds and a dizzying sliding animation. (Many a flow was lost to toggling full screen by accident - whereas previously you could toggle and toggle back immediately without losing your mental state - surely you appreciate that as a developer?)
Ok, I get it: you don't like how the system full screen integrates with workspaces, and you were peeved when other apps adopted it in place of their own implementations. Of course Apple did not "hijack" anything: full screen support has always been strictly opt-in. But apps would feel pressure to adopt the system implementation.
Full screen was an effort to make OS X more usable on small displays - recall that the 11" MacBook Air had just shipped. It didn't make much sense for media playback apps to adopt the system full screen mode, especially as it was in 10.7-8.
I would have loved to enable a system mode where full screen windows could coexist in the same workspace as unrelated windows, but this would have been a new feature, not something we could have achieved by reversing anything. And eventually Apple did enable a new mode, which was what shipped in 10.9.
Oh, and if you filed a bug, then whoever closed it as "working as expected" made a mistake. There was a (heavily duped) bug tracking the uselessness of secondary displays in FS, and it was closed when 10.9 shipped. I may even have been the one to close it, I don't remember.
> Of course Apple did not "hijack" anything: full screen support has always been strictly opt-in. But apps would feel pressure to adopt the system implementation.
This doesn't make much sense, does it? Obviously apps adopt the system implementation. The problem the OP is talking about is that the system implementation became pretty weird, broke some apps that used to work just fine (especially lots of xquartz ones), and thought that the best use for your extra monitors was to just display a dark gray pattern.
Personally, I learned to live with it (sigh, uncheck "displays have separate spaces"), but it does seem like a good example of Apple shoving a half-baked idea out the door.
QuickTime and iTunes. I used to watch TV or a movie on one of my displays while coding on another. Instead I got to watch linen on one screen while watching a movie on the other (a double whammy, because even showing pure black on the other screen would have been better; at least it wouldn't distract from the video).
I can't specifically recall, but I think the same was true of full screening video content from Safari (perhaps technically a "plugin feature", but maybe html5 video was around?), but QuickTime was 100% an apple regression.
I have no idea how the full-screen mode is supposed to work or what it is supposed to be good for; I just know that every now and then I fat-finger something and accidentally invoke it, at which point whatever single window I happened to be using zooms to take over one of my monitors, while the other monitor turns grey and useless. I don't understand how this could ever be a useful feature.
I took it personally because my vendor condescendingly told me that the fact that I can't use my environment (multi-monitor) the way I wanted (full-screening applications) was actually a feature. Apple was telling me to STFU or GTFO.
All you can do is seek your own joy. Expecting others to do that for you is setting expectations. Not sure why I got the downvotes, but I'm happy you got free of OSX. I'm still hanging in there...for now.
It worked exactly the same as before; Apple only added a full screen mode, which didn't work in an ideal way. But guess what: you don't have to use it. Just maximise your windows normally or use one of the many free tools to do it via keystrokes.
But yeah, fuck Apple for offering free OS upgrades that are completely optional.
A service pack doesn't completely redesign how people interact with Windows, though. But OS X "service packs" tend to do just that. In fact, that was the author's frustration.
Seeing as OS X service packs also come out very frequently, whereas Windows are very rare, I'm not sure the two are comparable.
I should probably bow out now. I just thought it was interesting.
I think the principal attribute that determines how long support lasts is the cost of getting to the next supported version, not anything to do with feature differences. Apple made the decision to give away upgrades precisely so they could move people off old versions, I suspect.
They'd be like Service Packs if Service Packs broke backwards compatibility and changed core functionality of the OS, which they typically don't. They are not analogous.
Desktop Linux is mostly install and work. I have not done any tinkering for a long time.
But the Yosemite upgrade on my laptop brought back memories of early Ubuntu distributions. I had to disable a few settings to get some performance back, and change a few UI settings to get a decent look. It feels like a transitional OS. Then again, Ubuntu on that machine had a few touchpad issues; otherwise it would have been my default OS on the laptop.
It is...until it isn't. Nine out of ten Ubuntu or Mint installs will go off without a hitch, with no weird issues or regressions, and a warm, friendly, comfortable development environment welcomes you. Then there's that one time you install it on your laptop so you have a to-go environment that matches your workstation, and BOOM! your wifi isn't recognized (what year is it again??) or your sound card sputters (damn you PulseAudio!), or your hybrid graphics screws the pooch. Hell, I built this workstation I'm typing on with GNU/Linux and BSD compatibility first in my mind, and I still had issues with some hardware right off the bat. Nothing that can't be fixed with some fiddling, but it's annoying as hell.
Yes, all of the above issues can be fixed, just like the issues you dealt with in OS X. It's a computer, after all; Garbage In (Apple/Linux/BSD developers), Garbage Out. And don't get me started on Windows 8.x; it's finally becoming usable daily, but there are a million reasons I chose to stick with 7 for Windows-specific work, and wait it out until 10 ships.
Apple broke the cardinal rule: If it isn't broken, stop fixing it! They want to innovate and improve and conquer the world, fine; but they need to remember that they had the best OS X release with Snow Leopard (and in my personal opinion, that was the best desktop OS period). In their rush to wow the masses, they broke their OS for those of us who use it to be productive and creative.
At this stage, I feel that a good old fashioned, stable OS like FreeBSD or Slackware Linux or Debian is the best choice for a solid 'nix workstation, something you can get real work done on. But ever since Lion was released, I would rather use Windows Vista on a Core Solo machine than OS X on any Mac.
I dual boot windows and ubuntu. They both have their good points and their bad points.
> Nine out of ten Ubuntu or Mint installs will go off without a hitch
Even when the installs go off without a hitch, there are always lingering pain-in-the-ass problems, and they're all related to buggy drivers, bad UI tools, and dependency hell. Every time I think about making my own distro, I come back to those three big problems and think: the last two are fixable. The drivers... not so much.
Wifi issues, and sometimes sound, used to be common for me; I don't have them now. Actually, I run into these issues when doing a fresh installation of Win7. Many times I've had to use Ubuntu to find Windows drivers!
My MacBook Pro Ubuntu installation went without any problems, but touchpad clicks weren't as precise as in OS X. I'm used to OS X's behaviour, so I wasn't able to stick with Ubuntu. Maybe I should tinker a bit.
On this particular machine I'm using, some distros have excellent sound out of the box and some require snoop=0 added to the snd_hda_intel options in /etc/modprobe.d/ to avoid stuttering audio. I didn't discover this bug until after I'd bought the motherboard and tried various distros; in my research it never came up.
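For anyone hitting the same stutter, the whole fix is a one-line module option; this is how I set it (the file name is arbitrary, the option name belongs to the driver):

    # /etc/modprobe.d/snd_hda_intel.conf
    # Tell the Intel HDA driver not to use snooping; cures stuttering on some boards
    options snd-hda-intel snoop=0

A reboot, or reloading the snd_hda_intel module, is needed before it takes effect.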
And on a completely different note, I have a Dell Latitude laptop from around 2000 or so, a Pentium III machine, that has no built in networking unless you count the dial-up modem. I picked up a random CardBus wifi card for $5, and OpenBSD recognized and configured it flawlessly.
It really all comes down to whether the OS developers have access to the hardware the rest of us use. If, for example, your wifi doesn't work, it's not likely to until enough bug reports are submitted that the kernel hackers responsible for wifi drivers get their hands on the hardware and write or improve a driver. It's the same for those trackpad issues you had; someone somewhere has to debug that.
This is why it's a good idea to research your hardware if you intend to run anything other than Windows or OS X. And even with research, it's rarely 100% working out of the box. That's the tradeoff for having what (in my opinion) is a very productive and comfortable working environment.
Sorry, a bit of hyperbole on my part. I was alluding to the "Vista Capable" days, when machines barely able to run XP were touted by their manufacturers as "Vista Capable", even though they knew very well that the new OS had much higher requirements than XP. Core Solo CPUs were particularly bad at running any Windows OS, and were featured in many of those machines.
My point being, even a kludgy setup like that was more tolerable than OS X 10.7 and up (again, hyperbole, but close enough to the truth in my case).
I thought it was obvious from my clarification: Lion was the Vista of the Mac world. Major instability, much slower than SL, broken hardware drivers in my case, forced obsolescence (why drop support for perfectly capable Core Duo and even some Core 2 Duo machines?), massive changes to file saving...it would be faster to list what wasn't screwed up.
Jesus Christ. Okay, in short, ML fixed some but not all of Lion's issues, and Mavericks fixed a few more but introduced a lot of annoying "features" that still broke my workflow. Yosemite was another huge regression, almost as bad as Lion.
If you want more specific than that, you'll have to find someone who has spent more time than I have on those OS releases. I've tried each one and have yet to see anything better than Snow Leopard; if you don't like that answer, too bad. I'm done.
You know, I started getting the feeling I was being trolled early on, but I didn't dare try calling it out here. Trolls seem to be dealt with by the staff but regular users who call them out get spanked too.
I always wonder where people get these quirks... And I suspect mostly that's Ubuntu's fault, which is a shitty OS to begin with. It's the only distro where I had problems with my graphics card (on an Aspire 5050, several years ago) and with the audio output.
I've been on archlinux, on testing, for Christ's sake, and it's been at least a year since I had an issue of any kind.
Debian was also fine for me, but the software on stable was too old; I did have installation issues with it, however, on other people's laptops.
I've tried Debian, Arch, and several flavors of Ubuntu (like #!). The quirks aren't in the OS, they're in the shitty drivers, which are common to all Linux-based OSes. Just because you don't personally see an issue doesn't mean others don't. The inconsistency across hardware is a large reason Apple forces people to use their OS only on approved hardware.
Ironically, Ubuntu was the only distro that didn't present audio stuttering for me out of the box. I usually run Slackware and it was present there, which led me to research it until I found a fix. I chalked it up to Ubuntu staying on the bleeding edge even with their LTS releases, along with having the largest user base by far (more users = more people with this bug = more incentive to fix it).
Either way, it was an easy fix for Slackware and the other half dozen distros I've tried on this machine, and I've found generally that Slackware is the best fit for someone who doesn't mind getting their hands dirty occasionally, in exchange for stability and lack of dependency issues.
I don't know how it is on Macs (probably worse), but on my Lenovo Y50 I've had a lot of issues getting my nVidia card to work properly. First I tried Arch Linux, which I had been using on my previous machine for about 2 years now, and no matter what I did I couldn't get my nVidia card to work properly, even with the nvidia-beta package from the AUR.
So then I tried Ubuntu, thinking that would be the "easy" distro. It installed alright, but the nVidia card was still giving me trouble and the official Ubuntu nVidia packages weren't new enough (it's a GTX 860M). So I added a PPA with a newer version of the driver and that seemed to get it working at least, but I still had massive screen tearing issues, and for some reason the nVidia settings app had far fewer settings than the Windows version I have on the other partition, and it didn't even have an option to enable vsync, which should resolve the tearing. After trawling through mountains of nVidia documentation and forum posts, including, no joke, the EBNF for their config file, I still can't get the damn thing to work.
I'd like to have a real shell and be able to play games without rebooting, but the polish on desktop Linux still isn't quite there yet. And yes, I know this particular problem isn't the fault of Linux or any of the other software in Ubuntu, but it's a problem with the ecosystem and it's one that's preventing me from using Linux as much as I'd like to right now.
Apparently I had tried messing with some other environment variables earlier, but this one didn't do it either. It shouldn't be this hard to tell a GPU to slow down when it draws things.
That was one of the things I tried, and somehow it still didn't do it. Setting it in the individual games' settings didn't do it either. I'll try one more time though.
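If it helps, the two knobs I'd rule out with the proprietary driver are below. These are standard NVIDIA options rather than anything 860M-specific, so no promises they behave on Optimus hardware:

    # Ask the driver to sync GL apps to vblank (per-session environment variable)
    export __GL_SYNC_TO_VBLANK=1
    # Or force the driver's composition pipeline, which often eliminates tearing
    nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceCompositionPipeline = On }"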
Is this a complaint about the Preview app? Because obviously all other programs you might run in OSX make their own decisions about their functionality. Personally, I always use the menubar search feature via QuickSilver or the default Help>Search shortcut, CMD+?, so I never even noticed that those features are now hidden behind the Option key.
The Duplicate/Save As thing is present in any app that's using the system document-based app frameworks and has opted-in to autosave... you'll see it in most of Apple's apps (like TextEdit, Preview, or Pages) and in some third-party apps (off the top of my head, I know Pixelmator does it, and there are others).
iA Writer is another to go down this route. To be honest, I hadn't really noticed since most of my time is spent in Chrome, Sublime Text, Spotify/iTunes, Cyberduck, and Mail. None of these - even the last - have adopted the new api/framework.
Thank you for this, I had no idea. Further, I have no idea how I would have found out about this if not for stumbling across it randomly online, as the menubar is simply not a place where I expect options to be changeable, and there's nothing in the menu to suggest that holding Option will modify it.
You can switch to Linux, but unfortunately it's been degrading along similar lines.
Big, wrong ideas are destructive, and the two biggest, wrongest ones right now seem to be: "desktops are just out-of-date tablets", and "the only good affordance is a dead affordance".
Affordances are what make computers humane, and there's a world of difference between how phones are used (mostly social), tablets are used (mostly media consumption), and how desktops are used (mostly productivity).
The visionaries in charge of all three of Windows/OSX/Linux are of similar mind about the "big ideas", and it's caused a lot of grief for productive people.
I disagree regarding the "tablet" point - while the major DEs take some UI inspiration from tablets, in general they're still very much desktop-oriented. People often complain about Unity for being "too much like a tablet", but actually try using it on a tablet and it's not well-suited at all in its current form.
You might want to look at KDE and XFCE though - both are sticking more to traditional Linux desktop ideas than Gnome and Unity.
I don't think we actually disagree. The tablet/desktop convergence is them trying to sit between two barstools - true that they're still mostly toward desktop, but they aimed for both and hit neither.
And agreed on your second point, using XFCE right now.
In UI terms, it's any sort of visual clue for the user to do something. A button might be drawn as a "raised" thing inviting you to press it down; a window corner that has larger and visually obvious "drag handles" where you can grab it to resize the window; that sort of thing.
I'm with you on moving back to Linux: having sufficient experience with both, I'm loath to move back because of frustrations with both the hardware and the software environment. I do have hope that things will get better on that side of the fence, but OSX would have to become pretty terrible for me to do that anytime soon (though they seem to really be pushing their luck lately).
I have a new OS X MBP. 16 GB RAM, NVIDIA Something (650?), SSD, Yosemite, i7. I have an older i7 Sager, 16 GB RAM, Nvidia 550m, HDD.
The Sager has been a terrible computer for years due to its WIFI support. It was crappy in Windows and worse in Linux. I actually looked forward to getting the MBP. Recently, however, I found a guy's open-source driver for the Realtek WIFI. It's more stable than the actual official driver, and his blog post about how to install it was great. Now I've got two computers to compare.
OSX is apparently terrible at memory management. Running Eclipse and about 10 tabs in FF can cause the system to swap, and on the SSD I loathe the very concept. Some of my comments on HN about Light Table stem from the fact that LT uses 0.5 GB of memory when everything is said and done (node helpers, etc). LT ran great on OS X, but OS X would swap, and running multiple VirtualBox instances made it worse. I tend to run about 260-350 MB of swap, if not more, with just a few small programs.
Cut to Ubuntu. Once I got used to Unity, I liked Ubuntu. The OS is smooth, usable, and well supported, both at a community level and from a system-update perspective. I can run a 2 GB Arango VM and 3 Hadoop VMs at once, all nicely networked to each other via host-only networking. FF with the same 10-ish tabs, a lein REPL, and Counterclockwise (Eclipse IDE for Clojure) still only uses 76 KB of swap.
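(If anyone wants to check the same numbers on their own machines, both systems expose them in the terminal:

    # OS X: total/used/free swap
    sysctl vm.swapusage
    # Ubuntu: memory and swap summary in human-readable units
    free -h

That's where my figures above come from.)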
Now, aside from following a few steps to get the WIFI working, I've not really done much Ubuntu customization. I haven't had to. I installed it, it worked, I worked.
OS X had some nice ideas, but, IMHO, Linux caught up. The terminals available on Linux are better than the default terminal on OS X, and more memory efficient than Console2. Unity works well. I actually catch my muscle memory trying to work with OS X the way I do with Unity.
I will grant that the OS X laptop is light years ahead of my Linux box's battery. Even when the battery was new, the Linux laptop was lucky to get 2.5 hours. The OS X laptop gets 5 hrs or so under my daily load.
Interestingly, the only laptop of the 5 or so I've owned to have a dead pixel is the MBP. Under white backgrounds it's easy to miss. On dark backgrounds I'm annoyed.
I don't have personal experience, since my MBP has an SSD and things page fairly quickly. It will affect the total life expectancy of the drive, though, and with the new models of MBP that's a real problem: the drive and its logic board have to be replaced, which would require me to pay a "Genius" to do it. Ideally I could just pop a few screws and be done.
SSD lifespan concerns are overdone. The Tech Report took 100 TB of constant writing to get a TLC SSD to start showing errors, and you're not likely to do that. However, by avoiding swap, you're in all likelihood causing extra reads, which also degrade performance. I advise saving your worry for other aspects, such as reliability during unexpected power loss.
I'll let it swap (since I can't figure out how to shut off swap). I just do any heavy lifting on my Sager. A nice side effect is that I have to make sure my deployment flow works.
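For the record, the only way I know of to turn paging off on OS X is unloading the dynamic pager. I haven't tried it on Yosemite, it's unsupported, and the machine can lock up when RAM runs out, so treat this as a curiosity:

    # Unsupported: stop OS X paging to disk, then remove the existing swapfiles
    sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    sudo rm /private/var/vm/swapfile*
    # To undo: launchctl load -w the same plist and reboot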
> I now have to do 10 extra steps just to Save As.
I feel your pain. Fortunately you can have Save As back: System Preferences -> Keyboard -> Shortcuts -> App Shortcuts. Add a shortcut called "Save As..." with Command-Shift-S. Magically, Duplicate is gone.
Great suggestion, but I could not get "Save As…" (associated with Command-Shift-S) to appear in the File menu of the Preview app (where I'd like to see it), after following your guidance. I have 10.7.5. I don't care if Duplicate is there or not; I would like to add Save As as you described.
Sorry, you also need to assign Command-Option-Shift-S to Duplicate. Then Duplicate will magically be hidden behind Save As... in the way that Save As... was previously hidden behind Duplicate. [That is, when you hold the option key down, Duplicate changed to Save As...]. Otherwise they'll be shown together in the menu.
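If you'd rather script this than click through System Preferences, the same mapping can be written with defaults. One caveat: the menu title has to match the app's exactly (some apps use three periods, others the single ellipsis character), and apps need a restart to pick it up:

    # Cmd-Shift-S -> "Save As..." everywhere; move Duplicate to Cmd-Opt-Shift-S
    # (@ = Command, $ = Shift, ~ = Option in NSUserKeyEquivalents)
    defaults write -g NSUserKeyEquivalents -dict-add 'Save As...' '@$s'
    defaults write -g NSUserKeyEquivalents -dict-add 'Duplicate' '@~$s'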
>That's the problem in a nutshell: OS X changes because there's new management that wants to put its stamp on things, regardless of whether it improves the productivity of the user or not.
I'm not an OS X user, but that's a solid observation that at least partly accounts for the issues, based on my experience with other OSes and software. I've seen the same thing happen from the inside, at large companies where I've worked, and also from the outside, many times, as a user of various software packages and OSes. Of course there are other reasons too.
I was just talking to a non-tech [1] friend yesterday who was complaining about the OS problems he was having (Ubuntu, in his case), and who said, essentially, that things are too difficult - upgrades screwing things up, etc. etc. My reply to him: the state of the art in software (in general) is still below what it should be - or words to that effect.
[1] Of course, part of the reason for his problems is that he is not tech-savvy (though he actually is more so than the average layman), but that raises the question of why software in general cannot be more user-friendly and easier to use. Difficult question, I know, since the field is so complex, and compounded by the existence of so many different versions of hardware, operating systems, and applications, all (mis)interacting with each other. Reminds me of my erstwhile system engineer days: customer has a problem (in a specific app situation) with this model of that computer that we sell and support? check the OS version: is it Ver. x.y.z? ah, that's not compatible with BIOS (or motherboard) Ver. a.b.c - only when using that particular RDBMS / compiler / whatever; upgrade the BIOS (or motherboard or OS or problematic software) to Ver. p.q.r ...... :) (which point was sometimes learnt after system / hardware engineers had spent a lot of time trying to solve the problem on their own, before escalating it to head office)
It was good fun and learning, but frustrating too at times, and must have been for some customers too ...
I agree completely. And believe it or not, I went back to Windows. Sort of; I'm testing the waters. My big issue is iTunes. Yeah, I drank the kool-aid and now have an iTunes library of about 100 GB. Despite how much I hate what iTunes has become, at least it runs on Windows.
Also, it's easier to "turn features off" in Windows than OSX these days. I'm sad about it though. But after 14 years on Mac I'm done. It was the upgrade to 10.10 that finally pissed me off enough to leave.
By the way, anyone know of a good terminal app for Windows? Not too crazy about PowerShell. I have been using Git Bash and that's decent so far.
If by having an "iTunes" library you mean it has some old DRMd 128kbps AAC files, I believe they'll let you download non-DRMd 256kbps versions which will then play in most decent media players. AAC achieves transparency at lower bitrates than MP3, and compatibility really isn't much of a problem these days. Windows users seem to like foobar2000, MusicBee, or dBPowerAmp so you can move on from iTunes.
Apple removed the DRM from new music sales. All existing music sold before then is still DRM-locked. When Apple discontinued DRM, they made users spend $0.30 per song to unlock them (seriously). Now, it appears you can spend $25 for iTunes Match, ensure the affected songs are available, delete the local DRM-infected ones, and it will download proper ones to replace them.
iTunes Match brings its own issues to the party. I've had songs that I own, with 'explicit content', that I've hand-ripped, converted into radio friendly abominations that are, frankly, unlistenable. Maybe that's just Apple's 'family-friendly' values shining through; that might be appropriate for some, but I don't have a Big Brother in real life, and I don't want one messing with my music collection either!
http://www.foobar2000.org/ is a fantastic piece of software. Extremely efficient/powerful/customizable (and closed-source). I sorely miss it under Linux, Rhythmbox/Quodlibet/Amarok/Clementine don't even come close.
WMP actually isn't that bad these days, though all I use it for is ripping CDs (yep, those shiny plastic things for you young'uns) since it's great at that. For playback in Windows, I've stuck with Winamp through the years as I haven't found anything that works the way it does without a bunch of features I don't need. I tried Foobar2000 and couldn't get into it, and MediaMonkey just seemed like overkill.
By the way, a great site for finding apps in specific genres is alternativeto.net, which has a whole list of alternatives to iTunes.
I moved to Linux (Ubuntu) and migrated a HUGE iTunes library along with it. I now use Clementine as my music management / player. I believe there's a way to import an iTunes library, but the iTunes metadata is all in one big XML file. So I wrote a script that loads all my metadata into Clementine's sqlite database.
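Mine is throwaway code, but the shape of it is roughly this. A sketch, not the actual script; "iTunes Music Library.xml" is the standard iTunes export, while the path to Clementine's database and its table/column names are the parts you'd need to verify against your own system:

    #!/usr/bin/env python3
    # Sketch: read play counts/ratings out of the iTunes library XML
    # so they can be pushed into Clementine's sqlite database.
    import plistlib
    import sqlite3

    with open("iTunes Music Library.xml", "rb") as f:
        library = plistlib.load(f)

    conn = sqlite3.connect("clementine.db")  # path is an assumption; find yours first

    for track in library["Tracks"].values():
        # iTunes omits keys like "Play Count" when they're zero, hence the defaults
        name = track.get("Name", "")
        artist = track.get("Artist", "")
        playcount = track.get("Play Count", 0)
        rating = track.get("Rating", 0)
        # Here you'd UPDATE Clementine's songs table, matching on title+artist
        # or on the decoded file path from track.get("Location").
        print(name, artist, playcount, rating)

    conn.close()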
I use Cygwin on a regular basis. It's basically indispensable, because it's the only thing that does what it does. But I get the strong impression that it was designed to punish Windows users. Just off the top of my head:
* It has no idea about Windows file system conventions. IIRC, the default install directory is "C:\Cygwin".
* The "default" install is super bare-bones, and missing a lot of pretty basic Unix utils. This would be a minor complaint, except:
* The installer--the bit that you download--is also the package manager. This means that if you install it and then delete the installer like a responsible, space-conserving user, then the first time you realize you don't have, I dunno, rsync, you have to download the installer again. And then, unless you want to manually navigate to your Downloads folder every time you want to install a damn package, you have to find a place for it to live on your drive and set up a start-menu shortcut for it. These things are why we invented installers, so why doesn't the installer do them?
* And, as you may have guessed from that last bit, there is no way to install packages from within the Cygwin terminal. In fact, you have to close all open terminals every time you want to run the package manager. You can imagine how much fun that is. The excuse for this is "such a program would need full access to all of Cygwin's POSIX functionality. That is, however, difficult to provide in a Cygwin-free environment, such as exists on first installation." In other words, they can't provide a proper package manager because then they couldn't make the installer and the package manager the same program which they shouldn't be fucking doing anyway.
* Every time you run the installer/package manager, you have to click "Okay" seven times (yes, I counted) to confirm a load of options that you will almost certainly never change after first install.
This ran way longer than I intended. Apparently I am fussed. :-/ The thing that really gets me is that this isn't even a case of Unix grognards not knowing or caring about Windows standards. There is no modern OS where installing an application off of the root directory is acceptable. I don't get it.
I think Cygwin is trying to be an operating system on top of another operating system. Windows is in C:\Windows, so there is some logic behind Cygwin being in C:\Cygwin. It's not ideal, but I wouldn't call it altogether illogical.
The package management is indeed a mess. The installer should literally install ONLY the bare-bones system plus some kind of terminal app, like PuTTY (because CMD is a pain to use), and then open the terminal to continue installation from there.
I look in on the Cygwin users' mailing list every now and again and I have a lot of respect for the devs and how they run their project, finding a usable middle ground between Windows and Linux (Cygwin at its core is a DLL that emulates POSIX and Linux-specific system calls).
The installer being the package manager threw me when I first started using Cygwin. However, when I researched the issue, I can see how the choice made sense. Unlike a Linux system, you can't upgrade the Cygwin DLL in-place while it's running.
apt-cyg [1] is a nice, simple bash script for managing Cygwin packages. I find it very useful for avoiding the GUI installer when I just need to add or remove a few packages. Anything that doesn't require an upgrade of the Cygwin DLL itself can be installed or removed from inside Cygwin this way.
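Typical usage, for anyone who hasn't seen it (subcommand names from memory; check apt-cyg --help):

    apt-cyg install rsync wget   # fetch and install packages
    apt-cyg remove wget          # uninstall
    apt-cyg show                 # list what's installed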
I used Cygwin when I was on Windows but my advice is to use Windows for the client programs and run a headless Linux VM for everything else. Export its filesystem so you can use Windows to edit files but ssh to Linux to work with a terminal. You'll get a real package manager too.
Did you stop reading after the third sentence? I addressed that specifically later on. A lot of Cygwin's behavior is no more acceptable in Unix than in Windows.
And no, I don't use Cygwin to "escape" Windows. Windows is a perfectly functional GUI for casual daily use. I use Cygwin because it provides useful tools and utilities that aren't available in Windows.
Still runs in the Windows terminal, right? Small as it seems, one of the first things to push me to OSX from Windows was the ease of copying and pasting in and out of the terminal.
For the past couple of years, it has used mintty (based on the same code as PuTTY) as its default terminal. I find mintty to be a very fine terminal emulator, and it's easily configurable; its default colour for rendering ANSI blue was too dark for my liking, but I was able to change it to the same colour that xterm uses.
It supports UTF8, 256 colours; copy-pasting is simple and easy. I suppose the only mainstream feature it's lacking is tabs -- but that doesn't bother me.
This is fixed in Windows 10, thankfully, along with a few other features that have been missing for a while [1]: multi-line copy, good paste support (correct formatting), line reflow on resize, transparency, etc.
>Every new major release of OS X is a day or week spent disabling things
To be fair, I'm a .NET dev who has to do this in most Windows releases as well. It's a part of the customization of the OS. They don't tailor it to devs, they tailor it to normal users.
I'm in the same boat, basically using Macs for the hardware at this point. Nothing that I'm aware of comes close to MacBooks in build quality, display, and lack of driver issues, and the retina iMac is a fantastic developer workstation. But not being an iOS user, I find most of the changes since 10.6 either irrelevant or negative.
The Razer has a much worse processor, half the RAM, half the storage, half the battery life, and it's heavier than the 15" MacBook even though it's only 14". And according to reviews "it runs incredibly hot".
Battery life is 5-6 hrs compared to Apple's "up to 8 hrs" for the MBP 15. That's amazing considering the Razer runs a 1344-core GPU. What does a pitiful MacBook Pro have? Integrated graphics. Hahaha. And the highest-end MBP they have? A puny 750M. I'm sure you won't get 8 hrs with that one. Compare the two on any graphics benchmark. There's the mudhole I mentioned. The 870M is 2x faster than a 750M.
The Razer also delivers a higher res touch screen. The Razer is thinner. It's also black and green which IMO looks much better than those boring silver MBPs.
Most reviews say the Razer gets between 4 and 4.5 hours on battery. The MacBook is 9 hours and some reviewers got more than 9. Color is personal preference. I think aluminum looks better than plastic.
You've never even seen one, have you? Razer is aluminum too.
>Nice job trying to use the specs page for the outdated non-Retina MacBook
Must be all Apple's gadget spam confusing me. That was the first result returned by Google for "MacBook Pro specs". If you want to quibble over 0.01 kg, then take a look at that 0.01 inches of thickness while you're at it. Oh gosh, that MacBook is just TOO THICK to use!! roll eyes
>And nice job ignoring the processor,
Oh noes! Apple's newer hardware has an extra 300 MHz of CPU on the top-of-the-line MBP vs the Razer. I know, Razer can break out the MHz Myth! Yay MHM!
Oh, sorry, I forgot. MHM only applies if Apple has the lower clock speed. I must have stepped out of the Reality Distortion Field for a moment.
>RAM,
LOL. Let's talk about RAM on the iPhone shall we? Oh, RAM isn't an issue on the iPhone because <blah blah blah>. I'll use that same excuse then ;)
>storage,
Razer available with 512GB of storage. MBP available with 512GB of storage. What's your point again? Oh, I see. You can overpay for 1TB by spending an extra $500 as a BTO option. Good for you. I'm sure you're proud of that.
Ever heard of an external drive? They're pretty neat. You can hold big files on them, but you aren't punished by carrying around all the weight of a bigger main drive all the time. You might want to check into that. They're pretty nifty for the obviously 0.01Kg weight conscious traveler that you are.
At least the Razer team was smart enough to direct heat to a no touch zone above the keyboard. Look at that. The heat is all up in the keyboard on the MBP. What a shame. Your fingers must be cooking as you type your responses.
In the meantime, Apple's still low res. Apple's still lower pixel density. Apple's still missing a touch screen. Apple still has a missing or crippled GPU. In hardware that really counts, Razer comes out on top in a big way. But yeah, you're 10grams lighter on system weight, so you win. lol
I bought a linux laptop from System76 early this year, and aside from a fiddly trackpad, everything just works. I've seen a lot of people say the same about thinkpads.
Thinkpads have pretty much worked on Linux, for me.
Overall, Linux is great these days if you have compatible hardware. A random Windows laptop might be a problem, but a Thinkpad or a System76 or such should all be fine.
While it worked acceptably for three years, I gave up on my Thinkpad W520 this summer and switched to a Macbook Pro. Partly it was due to size (going from a bulky 15" to a really slim 13" computer is awesome), and partly it was due to wanting application support again, after seven years of almost exclusively using Linux.
One thing I'd like to say about looking at Thinkpads, or other laptops, for running Linux on: don't get one with hybrid graphics. My experience in trying to deal with it was a huge pain.
Maybe my problem was in going for the W-series. The older T-series laptops I've installed and used Linux on were great.
My T42 worked like a dream on Ubuntu 14.04 until recently, and now it refuses to awaken from sleep properly at all because of some update or other. I'm dreading trying to track this down and possibly finding out I'm stuck with using Windows 7.
I had (well, still have, but don't use) a w510. It worked just as well as the t61 before it (which was to say, they both did great under Linux), but it didn't have hybrid graphics... it was Nvidia all the time. The only reason I've replaced it was that it had absolutely abysmal battery life. Old job bought me a comparable ultrabook that weighs 20 pounds less and has 3x the battery life and I couldn't be happier.
I have a T430s - fairly thin with hybrid graphics. You should know that you can switch off hybrid graphics in the BIOS. Please feel free to get a hybrid graphics thinkpad - you can use it whichever way you want.
+1 - Thinkpads work out of the box in my experience, and with a bit of tweaking, I managed to move my parents to Linuxes (Mint) on "random Windows laptops" - a Compaq and a 17" HP.
Same; I tried Trisquel and it just worked, except that some Flash media doesn't play, which actually increased my productivity. Coursera videos and YouTube work, but spending countless hours streaming TV links won't happen anymore unless I manually install a nonfree Flash plugin from a repository outside Trisquel, which is enough of a transaction cost that I haven't bothered.
I've got a 17" model. It is plastic, but I have no complaints about the overall build. The keyboard has pretty nice long-travel keys. The trackpad is horrible, the cursor tends to jump around a lot when you try to click. This is my first linux laptop so I don't know whether that's a common driver issue on linux. If not for that I would be 100% happy.
Another model got a lot of complaints about the keyboard a year or two ago, and System76 responded by sending everyone a better one for free.
Same for Ubuntu GNOME. There are a few add-ons (or plugins, or whatever they call them) that I install to suit my preferences, but I can do a clean install + additional packages + tweaks to my needs in under an hour. If you don't know the package names you'll want, maybe double that. The past few upgrades have also worked flawlessly for me, which makes it even easier.
That said, I intentionally purchase hardware that's known to work well with Linux (if you buy a system with all Intel chipsets, you'll probably be fine). Also, the power management is still abysmal. I still have to tinker with powertop to get battery life comparable to other OS's.
Amen to that. After doing some magic with powertop, I get about 5 hours out of my Thinkpad X201's 9-cell battery, which is still less than the 7 they claim on Windows. But I guess the battery is getting older too; I've never replaced it (it's about 2 years old).
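For anyone starting down the same road, most of the "magic" reduces to two powertop invocations. The report is the safer first step, since auto-tune can toggle things like USB autosuspend that break mice:

    sudo powertop --html=report.html   # write a report of tunables and power stats
    sudo powertop --auto-tune          # apply all of powertop's suggested settings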
I hated Unity at first, enough to stick with 10.04 when I ran Ubuntu (I was never impressed with Kubuntu/Xubuntu/Lubuntu, there was always something broken somewhere in all of those). But I finally gave it a go in 14.04, and it has vastly improved. It may still not be your cup of tea, but in my mind at least it's better than Gnome 3 and is worth a look.
That said, for my money you can't beat Mint with Xfce for a quick and easy casual GNU/Linux box. I always skip Xfce's compositor and install Compton for tear-free window dragging and video watching (Nvidia-specific issue I believe), but otherwise the default install is very good.
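Concretely, this is all my Compton setup amounts to on an Nvidia box; the flags are standard compton options, but your mileage may vary on other drivers:

    # -b daemonizes; the GLX backend plus opengl-swc vsync stops the tearing for me
    compton -b --backend glx --vsync opengl-swc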
>I just dread the idea of moving to Linux again. I don't want to tinker that much.
The thing about Linux is that once you've tinkered, things stay the way you left them.
Copy your home folder over to your next distro and just about everything goes back to the way it was. When I wiped Xubuntu 14.04 and installed the 15.04 alpha, I didn't have to do anything to get XFCE back the way I wanted after I copied the contents of my home folder over and rebooted.
This is far, far more friendly than what OSX forces you to go through after each and every update.
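The "copy over" step itself is a one-liner if you keep the old home around; rsync preserves dotfiles, permissions, and timestamps (the paths here are made up for the example):

    # Trailing slashes matter: copy the contents of the old home into the new one
    rsync -a /mnt/old-disk/home/yourname/ /home/yourname/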
Well, most of the time, but not always. I use Ubuntu, and I don't like having a bar at the top of the screen and the modal menu at the top, which are the primary reasons I don't use OSX. I was using Gnome 2 with all the menus and icons moved to the bottom bar, plus Compiz for the virtual-desktops cube, which I find a more natural way to remember where I am than sliding desktops. One day Canonical introduced Unity, with a bar fixed to the top. There was no way to remove that, so I started using Gnome fallback mode, or whatever it's called. Enter Gnome 3, with many more workflow-changing features. We can work around almost all of them now; not so much years ago. I kept using Gnome fallback, which unfortunately requires more and more tweaking to mimic a subset of the functionality of Gnome 2 (which was all I needed to work). So I ended up with an Ubuntu 12.04 with kernel upgrades (the hardware enablement stack from Canonical) and a DE that doesn't work as well as it used to (some quirks here and there). It's very much like the first lines of the post about OSX. At least Linux gives me more flexibility than OSX does.
> The thing about Linux is that once you've tinkered, things stay the way you left them.
Until they don't – I supported Linux desktop users for years and, even ignoring fun with the occasional kernel/driver update rendering systems unbootable or breaking sound/video, every so often I had to troubleshoot something which turned out to be caused by a backwards-incompatible change. It turns out that Linux developers are just like developers for every other platform and make mistakes or intentional changes for things they no longer wish to support.
> This is far, far more friendly than what OSX forces you to go through after each and every update.
My experience with every release since 10.0.0: install, reboot, go back to work. The thing to remember for every platform is that you hear about complaints from the small percentage of people who encounter something unusual because relatively few people spend months camped out on forums to remind everyone that an update didn't break anything.
Given how common sentiments like this are (I certainly share them enthusiastically), I'm kind of surprised there hasn't been more support for projects that attempt to address this by taking OS X compatibility beyond OS X itself.
As a heavy user of OS X, I have absolutely no idea what you're talking about related to your difficulties upgrading to new versions. I've gone from 10.4 all the way to Mavericks and have yet to have a single shred of trouble.
I'm not sure you're aware of the extent of pissing that has taken place from Apple on its Power Users' faces between 10.6 and 10.9 (likely 10.10 even, I've given up).
This isn't just "WHY IS THIS 10px LEFT TO WHERE IT WAS ARGH THIS SUX0RZ" - really basic things were broken, such as multi-monitor full-screen. Also, multi-core machines actually benefited (performance-wise) from upgrading up until 10.6. Not so after Lion.
It's easy to blame the users for being change-averse, but... was I really meant to stop using 'full-screen'? "Just resize it from that little corner there?"? Please.