GPU glitches in 2016 MacBook Pro models (9to5mac.com)
88 points by binaryapparatus on Nov 25, 2016 | 69 comments



Wouldn't be a MacBook Pro if it didn't have some kind of GPU problem.


You spelled Wi-Fi wrong. :-)

On a more serious note, I haven't had any issues on my 13" MBP with an integrated GPU. Here's hoping this'll be fixed by the time my 15" MBP with a discrete GPU ships ...


Why is it that the GPU always seems to be the failure point on Mac notebooks? I've personally had the discrete GPU on the 2010, 2011, and 2014 models all fail right around the three-year mark.


New 15" MBPs run at 100C under load. Similar with older ones. Apple runs those too hot and they die, easy as that.


Apple doesn't believe in ventilation. Vents are ugly and destroy that iconic Ive look. Every single Apple laptop is cooking inside.

Edit: 15" 2016 runs 60 Celsius IDLE, and 99 (heavily throttled) under pure CPU load. Probably same 99 with GPU load.


The integrated MBP models seem to be a bit more reliable and run cooler. Too bad you can't get a 15" without a discrete GPU anymore.


You can always force the power settings to keep the discrete GPU off.
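For example, a minimal sketch (not from the original comment; the pmset "gpuswitch" key is undocumented and only applies to dual-GPU models, so treat it as an assumption):

    # 0 = integrated only, 1 = discrete only, 2 = automatic switching
    sudo pmset -a gpuswitch 0

Third-party tools like gfxCardStatus expose the same toggle from the menu bar.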


... I had a GPU issue on the 2007 model :-). It was a known issue, so Apple paid for the repair in 2009. So yes, very prevalent.


Nobody can call the new MacBook Pro "best in class" when they don't even ship with best-in-class GPUs from Nvidia.

These machines are essentially beautifully built traps. So, please don't give Apple your money. You don't need the most recent MacBook Pro. Just wait 6 months and buy a used one. Please don't support a company that actively works against you in order to secure their infamous 40% profit margin for doing nothing more than severely limiting your choices.

I always buy used or third-party refurbished Apple equipment and I avoid buying apps from either of their app stores. I hope you do the same.


Ten years ago, I was a staunch Apple fan. I liked their product line-up - it was simple, easy to understand, and the products were good. If you needed power, you bought a Pro. If you needed value, you bought the non-pro. Both sides had good parts, were generally well built, repairable, and at the time used standard stuff. The OS was improving at a rate of knots as well.

Sometime in the last 5 years, I've gone from being a big fan to really not liking Apple much at all. Primarily because they've gone back to their older attitudes toward standardised parts: they've removed user-serviceability on batteries, memory, and storage, yet haven't offered any real concessions on their ridiculously expensive upgrade prices.

Their non-US pricing is infuriating, too. At current exchange rates, the base MacBook Pro is US$250 more expensive in New Zealand than in the USA once all taxes are taken into account, and it only gets worse from there.

Computers do not need to be disposable appliances. The used market for Macs is awful now: it's flooded with 2GB/64GB MacBook Airs that are near useless, the lower-end rMBPs come with only 128GB of storage, and the only upgrades available are expensive custom third-party ones from OWC or Transcend, which puts a premium on the more spec'd-out models.

In the past, you could just buy whatever used (or new) Mac was approximately right for your needs, and fill it to the brim with RAM and a fast SSD when you needed to. I got 3 usable years out of my 2006 MacBook, 4 usable years out of my 2008 MacBook Pro, and only 1.5 years out of my 2013 15" rMBP. At the time, it was already just over the limit of my work machine budget, and I had to retire it because I needed more RAM. I'm quite positive I would still be using it today had I been able to upgrade it. But instead I bought a second-hand MacBook Air for iOS development and put up with Linux and Windows on an Ivy Bridge-E desktop that I could only use properly half the time.

Many things they've been doing recently have been actively hostile to users, with basically no legitimate recourse. Pentalobe screws? Bricking iPhones with third-party home buttons? Limiting availability of genuine parts, causing a flood of Chinese counterfeits? Multiple proprietary SSD blade card formats that are essentially pin swaps of existing standards? Gluing batteries? Lightning instead of USB?

It might be somewhat alright if you live in the USA near an Apple Store and can take advantage of some of their better policies, but the point is that they don't need to do any of this stuff in the first place.


Same here. I live in Mexico and the pricing situation is just ridiculous. We are paying 20-25% more than in the US.

Not only for buying the products themselves, but also for the repairs, which are way more expensive. During the whole 2011 MBP GPU fiasco, I was asked for $1,500 when in the US Apple charged about $300 for the exact same repair.

I've been with Apple for over a decade and just bought a Surface Book. Windows is not perfect, but Microsoft is making all the right choices lately, while Apple only produces eye rolls.


> when they don't even ship with best-in-class GPUs from Nvidia.

Bah, when heavily constrained by TDP and required to lower voltage and clock speed accordingly, it's far from clear NVIDIA would have any advantage in those MacBooks. In fact, it would bring Polaris cards back into their sweet spot.


Yeah, given a sizable power budget and the right benchmarks, Nvidia can outperform Radeon cards, but Apple has an existing relationship with AMD and there is no compelling performance case to switch to Nvidia.

Plus, with supercomputing clusters eating up most of Nvidia's production, Apple is not guaranteed to get the quantity and bin of GPUs they need on their schedule. This is the same reason Apple didn't release any Fusion-based MacBooks: the bin of chip they wanted was not available in the quantity Apple needed by launch.


> Please don't support a company that actively works against you in order to secure their infamous 40% profit margin for doing nothing more than severely limiting your choices.

Or, better yet, don't support the people who did support Apple's profit margins, and buy your stuff from a less crappy manufacturer instead.


Just picked up an XPS 15 9550 from eBay with a quad-core i7-6700HQ, up to 32GB of DDR4-2400 (mine has only 16, but it's SODIMM, so upgradable), an M.2 2280 PCIe 3.0 x4 slot plus a 2.5" SATA drive bay, a GTX 960M, a 4K IGZO touchscreen (I don't personally care for the touchscreen), and dimensions slightly smaller than the 2015 15" rMBP.

For $1200.


I used to be baffled and surprised. But now I know: developers and hardware nerds are different. Or basically, they have different fields of knowledge.

There isn't a better CPU at Apple's required TDP. And there isn't a better GPU at that TDP, either.

If people constantly complained that they should have waited 3-4 more months for Kaby Lake, or 6 months for Nvidia, then the MBP would never get released.


For modern Macs, this seems like expected behaviour. My 2012 MacBook Pro and my late 2014 5k iMac regularly glitch out like this.


I assume by 2012 MBP you mean the Retina model? I've never experienced anything like this on my mid-2012 non-Retina MBP. (Hard drive issues, yes, but not video glitches.)


At least those glitches look professional


I purchased one (15" MBP). I did have some issues when using an external video card adapter (with DisplayLink drivers) that was plugged into a USB 2 hub, which was in turn plugged into a USB-C to USB adapter. But since removing the driver (and the video card adapter), I haven't had any issues.


2011 all over again, I guess.


The 2011 was worse, since it happened after the warranty expired. At least now people can still get a refund or use AppleCare.


What's worse is the battery life. It's ridiculously bad :-/


Go into Activity Monitor, and in the Energy tab make sure there is nothing keeping the GPU on. (Look in the Requires High Performance column.)

Just browsing the web, battery life is really good, but unfortunately there are a couple of apps I use all the time that use the GPU for no reason. And when that happens, the battery life instantly drops from 10+ hours to three.

With Java apps, you can theoretically pass a command-line parameter to prevent them from being able to use the GPU, although I haven't had any success with this yet.


Source?


Me using it, colleagues using it. It's not good at all for the workload I usually have.


Hijacking the thread: was the cause of the GPU glitch in mid-2011 iMacs (https://support.apple.com/en-us/HT203787) ever determined?

Mine had this problem. I've been obsessive about keeping all temperature readings below 50°C ever since it came back from repair.


That's horrible PR for AMD, whether it's their fault or not.

I don't really care about Apple losing some money over this; they are ridiculously rich, and they will be fine. AMD, on the other hand, is the only company forcing CPU and GPU prices down. Though recently they have fallen quite far behind Intel.


This is a bit naive.

AMD hasn't been competitive with Intel for almost a decade... there's nothing recent about that.

AMD doesn't keep Intel's prices down, the market does.


> AMD doesn't keep Intel's prices down, the market does.

This is a reductionist view. For the market to bring those prices down, there need to be at least two manufacturers of the same product who are not colluding. If AMD went under and closed down, the market you have faith in would not magically create more companies manufacturing x86 components, because only Intel and AMD have the patents, and the licenses to patents, required to manufacture them. It's possible that someone would salvage those patents and licenses from AMD and try competing with Intel... but what sane person would do so after watching AMD try and fail for a decade?

In short, AMD's continued survival and success is required for a healthy x86 market.


Here's to a competitive Zen launch.


As a side issue to the current crop of first-adopter issues (I'll wait till this model is on its second generation; then I'm sure it will be a great piece of kit), I was thinking about the outrage concerning the 'Pro' denomination on these models, with people crying that they're no longer suitable for 'Pro' use. But I think I've come to the conclusion that the 'Pro' user of 5 years ago is a dying breed, and Apple sees a future where they no longer exist.

Think about it: music production, video editing, photo editing, graphic design, coding... these are all things which used to be considered extremely difficult but have now been made easier with fantastic, accessible software. Huge amounts of specialist hardware are no longer necessary, or indeed desirable, in these fields, and the new MacBook Pro recognises this. Software development (and other areas) is moving increasingly to the cloud, and a huge development machine is no longer a necessity.

I suspect Apple sees the future of its products and services in the cloud, and the machines it provides will be slightly different gateways to that future.


Calling these "first adopter" issues seems like it's allowing Apple too much leeway. The product line is a decade old, and none of the hardware being used here is untested or unproven.

This strikes me as another example of Apple's reality distortion field falling apart at the seams. Why are these issues occurring in the public's view, and not on some Apple developer's workstation or testing lab?


> Apple's reality distortion field

I always hear talk like this implying Apple's products are only acceptable if they live up to the legend of their hardware and software "just working".

Honestly, for me they could put a 100% tax on your average off-the-shelf, not-promising-good-build-quality, already-made-in-the-same-factory-as-a-macbook consumer laptop just to provide OS X with official support and I (and a lot of people I know) would be just fine with it.

Of course it doesn't help their bottom line or strategy to tightly couple the OS and their hardware so they'd never do it, but can we stop pretending that all there is to Apple hardware is branding? OS X is a pretty big deal. I'll take average hardware with OS X over average hardware any day.


> their hardware and software "just working"

Yes, how dare we expect something we spend multiple thousands of dollars on, and have to work with 8+ hours a day, to "just work".

> OS X is a pretty big deal

With the Windows Subsystem for Linux, Ubuntu, and other such systems, OS X isn't the big deal it once was. I have to restart it weekly to fix issues with software just not working right. I have to buy five to ten extra utilities just to ensure that my extra mouse buttons, windowing system, and keyboard shortcuts work as expected.

Sure, it could be worse, and I'm glad you have good luck with it; but it could be a hell of a lot better too. And it's not wrong to expect more when we spend so much money on our tools.


> but can we stop pretending that all there is to Apple hardware is branding?

Sure. All there is to Apple hardware and software is branding.


The day I see a Linux distro put together as coherent an experience as OS X is the day I'll agree to that.

Even in its supposed decline, OS X still manages to put together a more compelling UX story than every Linux distro I've ever tried.

I get that some people don't really care about the UX their OS provides as long as there's a terminal, and deem GUIs unproductive for those who "really know what they're doing" (I often hear something about moving from the keyboard to the mouse taking too long?), but to me it's paramount.

I spend a lot of my life using a computer and I need my desktop environment and the tools I use in it (GUI ones included) to feel like a joy to use, not something I'm trying to duck out of the way of to drop in to a terminal.


> Think about it: music production, video editing, photo editing, graphic design, coding... these are all things which used to be considered extremely difficult but have now been made easier with fantastic, accessible software. Huge amounts of specialist hardware are no longer necessary, or indeed desirable, in these fields, and the new MacBook Pro recognises this. Software development (and other areas) is moving increasingly to the cloud, and a huge development machine is no longer a necessity.

While some of these professions may be moving towards cloud-based solutions, video editing especially is still difficult. You'll end up waiting half your life to upload half a terabyte of footage to the cloud before you can start doing any work.


> the 'Pro' user of 5 years ago is a dying breed... Think about it... coding... huge amounts of specialist hardware are no longer necessary or indeed desirable in these fields... a huge development machine is no longer a necessity

Mac (not MacBook) Pro user here, and I disagree. I bought a Mac Pro when it came out because it was a Mac, so it was reliable and intuitive and the dev environment rocked, AND because it was fast (for the time). My team's builds and tests require a lot of processing power, and our company likes to keep things in-house.

I don't think that all of the world has progressed as far into the cloud as you propose, and I just want a fast mac.


I get what you're saying. But the things you can't transition to the cloud are microphones, cameras, keyboards, and other HIDs. If they see a future where there is no "pro" because we can all be pros, wouldn't you want to reduce the cost of entry as much as possible? I.e., make it trivially easy for anyone to plug in almost any HID and begin authoring content.


Maybe when done for personal use. The same tasks are still difficult and demanding on hardware when done for professional purposes.


I develop virtual reality games professionally. I cannot do that on a Mac Pro or a Macbook Pro.


Can you do it on any currently existing laptop that is not much heavier and much thicker than a Macbook Pro?


There are VR-capable (GTX 1060) laptops around the size of the 2012 MacBook Pros.


OK, could you link to one?


Not the parent, but you could literally get off your high horse and google "laptop gtx 1060"

I did that and had several options within 10 seconds.


They tend to be 15" or 17". The new 15" MacBook Pro also has better 3D graphics (Radeon Pro 450). It's not as fast as the GTX 1060, but it uses 35W of power in comparison to 120W, and needs less active cooling. Apple generally don't compromise on noise and battery life.

I'm sorry for the snarky nature of my previous comment, but in my experience, all of the suggested alternatives require compromises which Apple are justified in not making. I'm not sure why people think that Apple are obliged to make laptops to suit every use case. If you just want a portable desktop, there are plenty of other options.


To be fair, OP included the Mac Pro. If the top-of-the-line product from Apple's Mac line can't cut it, maybe they shouldn't have abandoned that line so prematurely.


> coding... these are all things which used to be considered extremely difficult but have now been made easier with fantastic, accessible software.

Coding!?


16GB was not much even 5 years ago. That "fantastic, accessible software" eats a lot of memory.


Apart from VM usage (which I have no idea whether it can be minimized), maybe we should ask app developers, and even Apple is guilty of this, why their apps are using gigabytes of memory. Why is Slack using 3GB? iTunes 1.2GB? OS X consuming 4GB?

The abundance of memory and CPU has let developers create monster apps that don't take performance or resources into account.


I have 8GB on my laptop and have never managed to use more than around 5GB.

Using Visual Studio, Android Studio, and NetBeans.


VS is a 32-bit process, so it will never use more than about 3.5GB (the 4GB limit of a 32-bit address space, minus what Windows reserves). However, that limitation is super annoying when you have more than 3GB of symbols trying to load.


VS was just one example on my list.


Hey, I'm with you. I would love for VS to use more RAM so it would stop crashing due to running out of address space all the time.


Is your point that claims of memory need are exaggerated, or that everyone else is doing it wrong?

Chrome alone can take up 5GB. Right now, DataGrip and RubyMine on my machine are taking up 4GB, and even iTunes is taking up 1GB for some inexplicable reason.

Then I fired up a Windows 10 VM. 4GB by itself.


That most people are doing it wrong.

I configure my set of running programs for the work being done at that specific moment, not for the work being done during the week.

For example, I only open multiple browser tabs when searching for documentation, and even there I barely go over 10, and I close them once I'm done.

Each application only runs for the time it is actually needed and provides value to my workflow; otherwise it's just a waste of CPU, memory, and screen real estate.


Try running a VM and compiling inside that...


The nature of VMs is that you can assign memory to them. So if your machine is already light on memory, you wouldn't give the VM much to start with, and it'd run (just slower).
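A minimal example, assuming VirtualBox (the VM name "win10" is just a placeholder; other hypervisors have equivalent knobs):

    # cap the guest at 2GB of RAM; run this while the VM is powered off
    VBoxManage modifyvm "win10" --memory 2048

The guest then swaps internally sooner instead of eating host RAM.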


So you're saying the Mac "Pro" is OK for running things slowly.


Not saying that at all. Saying that if someone claims it'll run in 8GB instead of 16GB, or 16 instead of 32, it's possible, but those claims ignore performance. Such statements mean absolutely nothing absent context (for instance, the required performance of a Windows 10 VM isn't the same for someone just testing a site on IE versus a developer running a 300GB SQL Server database).


You mean like using Swift, D, and Rust on a VMware instance running Ubuntu Linux 16.04 LTS?


Linking some amount of C++ code can get very painful very quickly on a constrained VM.


Linking C++ is painful anywhere, and this is coming from someone who has liked the language since 1993.

Modules might finally solve it, but it will be a Python 3-like story until they get integrated across major compilers, libraries, and frameworks.


I run 2-5 instances of Visual Studio, and each one takes up 1-2GB.


Why on earth do you need so many open?!


You can only have one solution open at a time. For example you may be working on a complex DLL (composed of multiple sub-projects compiling to static libs) as one solution. You may also want to have the solution for a consuming application open so you can test changes there as well.


Yes, but that usually means 2, not up to 5.


I don't use the Visual Studio IDE for my work (just msbuild from the command line), but if I did, it would be more than 5 solutions that I use daily. In rare cases I need to enlist in even more projects. I'm not the OP, but I can certainly empathize with his problem.

That said my work gets done on a Xeon workstation not a laptop, so in that respect we are different.



