New MacBook Pro has first ‘DIY-friendly’ battery replacement design since 2012 (ifixit.com)
718 points by tailspin2019 on Oct 27, 2021 | 613 comments



A lot of comments seem focused on the incentives for Apple and what's motivating the change. All fair questions. To me, though, it just feels like a radically different approach this year.

The 2016 MacBook felt like leadership got in a room and said "okay, let's make a list of all the sexy things we can think of that would make the MacBook unique". This netted things like thin-beyond-practicality, the Touch Bar, removing all the ports, etc.

The 2021 MacBook feels like leadership got in a room and said "okay, let's make a list of the top things everybody is complaining about most." And they just fixed everything (well, most things) on that list, one by one.


The 2016 wasn't leadership, it was Jony Ive without Steve Jobs bringing him back to reality.

Touch Bar? This was nothing more than adding expense to raise the ASP (Average Selling Price) of MacBooks, which had fallen precipitously low from a shareholder perspective because of the superb value-for-money proposition that was the 13" MacBook Air.

The butterfly keyboard was Ive shaving 0.5mm off the thickness for a worse user experience, a higher production cost and less reliability.

USB-C only was a philosophical move rather than a practical one that forced people everywhere to carry dongles. The USB-C cable situation was and continues to be a nightmare as different cables support different subsets of data, power and video and, worse yet, different versions of each of those. Worst of all, it was the loss of the much-beloved MagSafe. Also, the ports weren't all the same. You were better off charging from the right (IIRC) rather than the left.

Replaceable RAM and SSD being lost is still painful. Personally I don't believe this was about forcing users to pay for upgrades primarily. It was about shaving off a small amount of volume.

Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.


People are commenting on this post and saying it is speculation, and until someone who was directly involved in these discussions shows up to comment, I suppose it is.

I have been in design meetings with Jony, and Scott Forstall, and many others whose decisions were micromanaged by Steve at every step. You can argue that a lot of Steve's design decisions were questionable: rich Corinthian leather skeuomorphism, lickable Aqua widgets, brushed aluminum window title bars. But he owned them.

Steve and Jony would sit for hours outside of Caffe Macs going over designs. Steve would spend even more time inside the industrial design area going over prototypes. He would spend a couple of hours every week meeting with every software team that had user facing features. He had input on almost every pixel on the screen and every button/port/display/etc on hardware.

Once he was gone, the drift began. It was inevitable that focus would shift. Scott no longer had protection by Steve. Jony fixated on the new campus and things like watch bands. No one had Steve to rein in whatever impulse they had. Sure, people would ask "What would Steve do?" but we also had Tim Cook pushing to optimize production, lower cost of goods and increase margins.

Apple still has Steve DNA, but it continues to be diluted. You may disagree with Steve's vision and opinions, but they were strongly held and enforced. I feel almost everything about the last generation of MacBook Pros went against what Steve would have wanted, and I am glad I wasn't there when those decisions were made.


Pretty much sums up my opinion from 20+ years of following Apple.

It is sad. No one knew what to do with Apple Retail. That was the most neglected part of the business.

Ron Johnson left. Scott Forstall was forced out. Katie Cotton retired. (I felt both Scott and Katie had a bit of Steve Jobs in them.) Mansfield retired. It sometimes feels like Apple is now largely run by Tim Cook and Eddy Cue.

Although the new MacBook Pro does seem to show there are people at Apple who still give a damn, and that their voices may have previously been drowned out. A quote from Steve:

>It turns out the same thing can happen in technology companies that get monopolies, like IBM or Xerox. If you were a product person at IBM or Xerox, so you make a better copier or computer. So what? When you have monopoly market share, the company's not any more successful.

>So the people that can make the company more successful are sales and marketing people, and they end up running the companies. And the product people get driven out of the decision making forums, and the companies forget what it means to make great products. The product sensibility and the product genius that brought them to that monopolistic position gets rotted out by people running these companies that have no conception of a good product versus a bad product.

> They have no conception of the craftsmanship that's required to take a good idea and turn it into a good product. And they really have no feeling in their hearts, usually, about wanting to really help the customers.

Really wish Steve was still alive.


Johny Srouji seems to be in a very important position these days. I bet the positive current you claim you are still feeling comes largely from his direction.


If he were, I do not think he could be running the company.

His working style and some of his personal history seem like they would not have worked in the 2020s.

I could see him as a board member, occasional informal consultant. But not as head of the company. And I don’t know if he would have liked that too much.


I disagree. It's true that the management style he had during the 1980s definitely would have drawn a stronger and faster backlash in the 2020s - he would have been forced out of the company even sooner than he really was. But this style also didn't work all that well in the '80s and led directly to him being ousted and Apple entering a slow death spiral they had to be rescued from a decade later. His second act really was different and he was much more mature as a leader; certainly he still worked people hard and was unusually blunt compared to some other current FANG CEOs, but you didn't see the sort of petulant and intensely personal abuse that you would sometimes see from him in his first act.

I think if he was CEO of Apple in the current environment, certainly there would be less hero worship outside of Apple, and a lot more people would be writing pieces critical of his leadership style and effect on the world, but there wouldn't be anything serious enough to force him from the company given how well things would be going. In other words, he'd be much closer to Musk than Kalanick both in terms of the severity of the criticism and his ability to weather it.


You know what they say... you either die a hero or live long enough to become a villain. That definitely would have happened with Jobs.


People do change, and I wouldn’t dismiss the capacity of anyone to adapt to new circumstances. With Apple growing, Steve would have grown, too.


That's a problem for the 2020s, then.


One could argue Elon Musk's style of working and personal history is in some ways similar to Steve Jobs, and Musk is doing fine in the 2020s.


I wish he had listened to basic medical advice and not abused his power/wealth/influence to steal somebody else's liver, only to die anyway.


Contrary to popular belief, there is no evidence that he didn't listen to medical advice.

https://www.livescience.com/16551-steve-jobs-alternative-med...


Your link does not support that conclusion in the least. It’s just speculation.

Versus other sources that say Jobs did delay treatment.


My link absolutely supports that conclusion completely. Other sources are also speculating based on hearsay. There is no evidence.


Sorry, but I can't find any sympathy after they squeezed every penny out of developers in the app store.

> And they really have no feeling in their hearts, usually, about wanting to really help the customers.

Developers are customers too.


When Apple's focus was on education, the philosophy followed that. Welcoming newcomers who Think Different. Consistent UI design. Good documentation for developers and users. Building an ecosystem. Optimising for connectedness. Community-centered.

In the past decade, Apple's focus is self-centered: optimising for profit. That has come at the cost of community. Will the new hardware solve that? It's a welcome improvement! I use a laptop for data storage, and am worried about being unable to quickly swap my SSD over to a spare laptop (restoring 8TB will take a long time). I'm very pleased about the return of MagSafe, though, and battery replacement. Will Apple's software improve? Will Apple listen to developers? Or will the Linux community act first to welcome more newcomers? We're going to find out this decade, and I'm excited to see what changes it brings.


Apple has always had insane margins. Even the Apple //e cost more than its competitors.

Apple became the most valuable company in the US before Jobs passed.


A lot of these comments are missing the context. When Steve introduced the App Store, he said, and I quote:

>"Maybe it'll be a billion-dollar marketplace at some point in time. This doesn't happen very often. A whole new billion-dollar market opens up: 360 million yearly run rate in the first 30 days, I've never seen anything like this in my career for software,"

>"Music is a two and a half billion-dollar business a year for us. I'm thrilled at $360 million a year run rate. We'll be dancing on the ceiling if we cross a half a billion. Maybe someday we'll get to a billion."

He also said he didn't believe the App Store would overtake iTunes in revenue. But that was at the launch of the App Store in 2008.

Fast forward to 2010: the App Store was nearly $2B. Still not as much as iTunes, but it was growing fast, and projected to overtake iTunes in 2011. He didn't understand the App Store market, so it made him uneasy. He spoke about it in multiple interviews after 2008. But he was sick, and well aware of his health. On his last trip to Japan in 2010, he wrote "All Good Things"; the last part, "must come to an end", wasn't written out. But he must have known his days might be numbered.

Remember that from 2010 to 2015, Apple was about to, or at least was perceived to be about to, repeat the Macintosh-versus-Windows mistake again. The current iPhone 13 sold more units in its launch quarter than every iPhone from the 2G to the 4 combined. The iPhone revolution had barely started, even if most nerds realised how big a change it would be.

By 2015 the war had pretty much settled. Android would never be able to destroy the iPhone, and iPhone growth projections had lost zero momentum. In 2014 I wrote on AI about how Apple would reach 1B iPhone users by 2020 at those projected rates. At least Phil Schiller was aware of it in 2014 and floated the idea of a lower commission rate from a position of "Strength". I am sure that if Steve were alive and Scott Forstall were still on the team, that idea would at least have had some support.

The problem is that right now no one has the conviction to say "no, this isn't a business we should be in." Certainly not the numbers guys like Tim Cook or the CFO. iTunes was used to sell iPods. The App Store should have been used to sell iPhones. Instead it is now used to extract value from iPhone users, aka Services Revenue. Scott stood up for developers; Steve did too, as shown in emails released in court. And I am pretty sure Katie Cotton would have smelled the PR disaster before it even started. But all three are gone.


Before the App Store: the carriers took 70%

After the App Store: Apple takes 30%

But let’s not pretend that most of the App Store revenue comes from poor indie developers. It comes from pay-to-win games, loot boxes, and other in-app purchases with zero marginal costs.


Oh, so it's not that big of a loss since most of the people don't deserve that revenue anyways! So Apple on the other hand is totally entitled to that money?


I was hoping that Apple Arcade would kill all pay-to-win games or at least take a bite out of their revenue.

But again, you’re not standing up for the poor starving indie developer. You’re standing up for companies that are doing far more sinister things going after “whales” than anything Facebook has ever done.


> You’re standing up for companies that are doing far more sinister things going after “whales” than anything Facebook has ever done.

There are several things wrong with that statement.

1. If these companies engaged in truly destructive business practices (and it could be proven), then Apple has a duty to remove those apps. Every developer has to pay them $99/year to be registered, so that money should be funding the removal of these supposedly exploitative and 'sinister' apps.

2. You cannot prove that the majority of these businesses are all big.

2a. Even if you could prove that, it's common knowledge that Apple is the largest company in the world, which renders that entire argument moot.

3. How do you delineate between indie developers and 'whales'?

3a. When you do differentiate the two, how is it ethical for the largest company in the world to ask for more of their profits?

4. Apple isn't standing up for the poor indie developer either, which is why it's perfectly reasonable to ask them to do better. There is not a company on this planet with more liquid cash than Apple, so there's nothing wrong with asking them to just improve their treatment of developers when consensus is that they're one of the most exploitative and destructive companies in the field of consumer electronics.


1. I agree. While Apple hasn’t gotten rid of the scammy apps, they have introduced an alternative - Apple Arcade - where they do fund indie developers so they don’t have to make scammy pay-to-win games.

2. I said that most app revenue comes from “whales”. Those are the players who spend the most money on pay-to-win games. https://medium.com/shopify-gaming/mobile-gaming-is-a-50b-ind...

3. Indie developers are selling a product that has value. The “whales” are the 5% of consumers buying loot boxes and Candy Crush coins (?).

3a. Most of their profits are not from the App Store. Most of their profits come from a simple, ethical business model - I give them money, they give me stuff. Unlike Google and Facebook or the aforementioned games.

4. Who are all of these poor developers Apple should be standing up for? In the link above (and the numbers were confirmed during the Epic trial), most money is coming from games with in-app purchases. If you look at the top selling apps on the App Store now, the last one I can remember that came from an indie developer is Widgetsmith. How many indie developers would be successful if Apple took a 15% cut instead of a 30% cut?


> Who are all of these poor developers Apple should be standing up for?

Well, there's the FlickType guy[0] who got kicked off the App Store only to have his product completely cloned by Apple. Then there's the Hey! email people[1] who went to hell and back just to get an innocuous update approved. Not even a year ago there was a class-action lawsuit against Apple by developers[2], and Apple's "compromise" was to charge less for a service that was provably garbage. They know their 30% cut is illegitimate; that's why they backed off so quickly. Even still, they charge certain people 30%, others only 15%, and then big companies like Netflix get away with 0%[3] because of insider deals that other apps cannot benefit from.

So fixing that would be a good start. Then they need to allow alternative payment processors (as pressure mounts from countries like South Korea and France), and hopefully get rid of their asinine restrictions on sideloading, which do nothing for the user except make it harder to get the functionality they want. These feel like non-negotiables to me, and as a developer I have no intention of supporting their software or using their hardware until it's fixed.

[0] https://www.inputmag.com/tech/apple-blocked-the-flicktype-wa...

[1] https://www.engadget.com/apple-basecamp-hey-email-app-store-...

[2] https://www.axios.com/apple-settles-developer-class-action-c...

[3] https://twitter.com/TechEmails/status/1444367219509637123


Your point was that Apple was taking money from developers - Hey isn’t going through in app purchases. Apple makes no money when people subscribe to Hey.

Netflix hasn’t allowed in app subscriptions for years.

Every developer with in app subscriptions only pays 15% after the first year.

If Apple’s 30% is garbage, so is Google’s and every console maker’s.

Do you support Android? Have you bought any game consoles?

As far as not allowing sideloading - that’s a feature, not a bug. Are you really unaware of all of the malware that is on computers because of the lack of sandboxing?

I’m referring to well known software companies.

- Zoom installed a web server surreptitiously on Macs so when you uninstalled the software, it reinstalled it (https://www.zdnet.com/article/researcher-says-zoom-web-serve...)

- DropBox does all sorts of evil on Macs.

- Fortnite side loading introduced a vulnerability on Android (https://arstechnica.com/gadgets/2018/08/fortnites-android-vu...)

- Google and Facebook convinced users to install VPN software using a corporate certificate that allowed them to track users in other apps.

- not on purpose. But if you had system integrity protection turned off on the Mac and installed Chrome, it made the system unbootable (https://arstechnica.com/information-technology/2019/09/no-it...)


> If Apple’s 30% is garbage, so is Google’s and every console makers.

I agree. We have to hold Apple accountable first though, because they're abusing it hardest.

> Do you support Android? Have you bought any game consoles.

I do support Android, through F-Droid, and while my app wouldn't benefit from existing on a console I've sideloaded several apps to my Xbox One without any problem. I played through Castlevania last week and it ran flawlessly, so it's mostly Sony and Nintendo who are holdouts at this point (and I say that as someone with a hacked Switch).

> Are you really unaware of all of the malware that is on computers because of no sandboxing?

I'm fully aware. I just don't think it matters when the iPhone has far more serious attack vectors, like zero-click iMessage exploits that cut straight through BlastDoor as if it didn't exist. Maybe once Apple fixes their more egregious security vulnerabilities and embraces transparency they'd have an argument; but right now it's a poorly disguised and obvious excuse for lock-in.


So Apple is “abusing the hardest” even though, to develop a first-class console game, you have to pay much more up front than $99 and you have to pay a license fee for each game sold, whether physical or digital? All of the console makers make it much harder to develop than Apple or Android - yet you bought an Xbox One from Microsoft.


In the eyes of the Supreme Court, games consoles are not general-purpose computers. Even if they were, Apple still drives wider margins than any of these console manufacturers do. The cost to manufacture an iPhone is about 40% of its retail price. The cost to manufacture a game console is ~90-105% of its MSRP. Without the ability to drive hardware margins, they stand a much better chance in court than Apple does.


There has already been a court case: Epic vs Apple. Epic lost on every point in trying to show Apple to be a “monopoly”.

But it wasn’t about court. It’s about you selectively choosing who to have moral outrage against. So Apple makes too much money as the most valuable company in the world. But Microsoft and Google are the good guys even though they are also making obscene profits and are worth a trillion+ dollars?


I have a contrarian view of Steve that completely goes against the current zeitgeist - namely, that he was evil, squeezed employees, drove them to exhaustion, believed in voodoo magic health potions and was a total asshole. But then no one dares to ask: despite all this, how did he inspire so many to follow him, to look up to him and to worship him? Usually that's scapegoated with "He had that magic aura". This is totally unfair. He loved many people, had excellent taste in design, changed his mind often, was sympathetic to people he trusted, pushed back hard on things he knew sucked, took risks, and generally kept Apple away from the riffraff endeavors of HP/Compaq/Dell/IBM.

I think Walter Isaacson did a massive disservice by not focusing on the things Steve was uniquely good at, instead building a largely negative narrative around him; he squandered an opportunity to show his work ethic, his approach and how he inspired people. I recommend reading "Becoming Steve Jobs" by Schlender and Tetzeli instead. One of the best moments in the book is when Steve wanted the I-beams to be absolutely perfect in the new office building.

It would be better if we pick out good things about any accomplished personalities and try to benefit from them, instead of dishing out vile hatred that is oh-so-common at discussion boards like this.


Second this, Becoming Steve Jobs is the superior book. It helped a lot that the authors had interacted with Jobs over decades and also had an understanding of business and technology that Isaacson was missing.

There’s also an old documentary about the founding of NeXT that does such a great job of showing what it was like to be in the room with him as a member of a small team, and this was before he fully “became” Steve Jobs. People underestimate how much meaning can be found in being pushed hard by someone with a clear and inspiring vision.

https://m.youtube.com/watch?v=Udi0rk3jZYM


I agree with most of what you say, but based on personal experience, I have a hard time really knowing who he loved, and I didn't often see sympathy expressed for individuals who worked for him.


Well, Steve is being martyred into a figurehead for people who disagree with Apple's direction, often in contradictory ways. For example, one person can claim "too thin" goes against what Steve would have wanted, and someone else the opposite.

Come on, the guy has been dead for 10 years. His opinion on anything cited here is *UNKNOWABLE*. Stop saying he would approve or disapprove of some idea... that's a literally meaningless statement...


Of course you are right. I have a snapshot of Steve in my head that I apply, but his opinion changed frequently, as evidenced by the various permutations of the OSX interface designs.

That being said, I just can't believe he would have been happy about the various issues with the old MacBooks. So many things feel so wrong.


I think it’s fair to say that Apple was more responsive, faster, with someone like Jobs. There was just a bit more push through the company to fix X, Y or Z. It’s hard to say that any features in particular were delayed for iOS, but I think it’s possible macOS would have seen a bit more churn, arrived at the macOS 11 design sooner, and maybe already have a redesign in the works to handle the new “notch” at the top.

That said, pure speculation on my part, but I think the notch would not have launched on the laptops without some other benefit - e.g. Face ID - or it would have been on pause until it was small enough to match the current menu bar’s height. There was sometimes more of a push to get things “just so,” I think. Either way, I miss the old showy product introductions. I like the polish of the videos under lockdown, but it feels like the format drains the enthusiasm a bit.

And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park. Most Apple hardware seems like incremental improvements rather than flashy impulse buys. Maybe I’m just more impatient than I used to be.


The notch does have a benefit. The area below the notch is the same 16:10 display you would have gotten without the notch. Now the menu bar that has been at the top of Macs since 1984 doesn’t take away from that main area.

The Apple Watch has been much more profitable and will have a longer lifespan than the iPods.


>I think it’s fair to say that Apple was more responsive, faster, with someone like Jobs.

Only partly. Apple under Jobs sold a completely unusable mouse (the infamous "hockey puck") for years simply because it looked cool. And it took them ages to move away from the butt-ugly skeuomorphic design in iOS. Only after MS first went with flat design, and later Google got the design language of Android right, did Apple throw that out.

>And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park.

Uh, IDK. Considering how first the M1 and then its derivatives rocked the world of CPUs, I'd really disagree. The ripples of that really shook Intel and might cause quite drastic changes there.

Me, personally, I was looking at buying either a Dell XPS or a Framework laptop next to run Ubuntu, but considering the latest MBPs, I will buy one of those as I really love quiet machines with great displays.


> And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park.

I mean… the evolution of Apple’s in house chips are absolutely park-knocker-outters, and the thing that makes me bullish about the company’s future. It alone may be enough to secure a front runner position when it comes time to transition to AR.

They would benefit again from an iconoclastic head of product, but the new laptops and the stability improvements introduced in Monterey suggest that the bench is stronger than a lot of people think.


True I love the new chips, but the new screen tech is a bit of a battery hog and isn’t perfect (which now that I write this reminds me of when the Retina displays were first introduced…)

Yeah, I’m not knocking Apple’s tech - but I was hoping for “the latest specs” and what I got was an HDMI 2.0 port, UHS II SD card slot, and so on. It reminds me of when they removed Thunderbolt Display input from the iMac, so you couldn’t use an old iMac as a secondary monitor. Or when they removed the optical sound out from the headphone jack. I won’t even ask why the new Macs don’t have cellular or why Apple hasn’t thrown money at game companies to make better Mac ports or otherwise entice PC gamers to consider Mac hardware instead of Windows. Speaking of gaming, the lack of a “console-like” Apple TV is also a bit of a letdown. As is the lack of an Echo Show competitor.

I don’t doubt that this will be like the iPhone all over again - Apple’s late to the party but gets it right. But I also kind of worry it will be Siri all over again - fantastic at first, but ultimately a cancelled HomePod and still a work in progress…


I’m very curious about a few things - how does someone like Steve gain so much respect from so many different types of people? Was it the ‘we’ve won before with him, so I must believe’? - a Nick Saban like persona. Or was it that he was unbelievably empathetic? — that doesn’t make sense, because not all empaths are able to rally people to a cause due to the bleeding heart syndrome.

I ask because it is almost as if you see the bricks change shape at Apple trying to fill the missing piece... they know they need that influence, it’s just not there, and honestly, I want to be a part of an organization that operates in the post-kicked out Steve aura.


This is such a difficult and interesting question to try and answer. Of course, I can only offer my viewpoint as someone who worked for (with?) him in four different contexts.

He was most definitely not empathetic to me. He could be very empathetic to an abstract construct of a person. He would act as an advocate of the "user" but I felt that the user was always him. How did this align with reality? I guess it did, to a certain cross-section of people who appreciated whatever guise of a user Steve represented. This user construct changed over time and I could see it in early Apple Steve, lost in the wilderness Steve, NeXT Steve and return to Apple Steve.

Steve had charisma. Younger Steve charisma was different to me and it left an imprint. I felt that he was attractive physically and mentally, could engage with you and make you feel like you and he were doing something that could really make a difference. We knew we weren't curing cancer, but that somehow the pursuit was noble in a similar way; enabling human potential that was being lost. I still want to feel this way about technology.

Once you got to know Steve (if one could) he was oddly two-dimensional. His lack of real personal connection or concern about you as an individual was troubling. In some ways, the more abusive he was to you, the more it showed his interest in you. It was dysfunctional. He was never mean to random people (in my observation); although you hear stories of him being abusive to people he didn't know, I never saw it. I saw him open doors for people, let people cut in line in front of him at the salad bar, normal sorts of politeness. The higher his expectation of you, the harsher the abuse you could expect to receive. You could never really become numb to it, but after a while you just began to adjust your calibration.

As you point out, Steve did win quite a bit. Of course he lost and sometimes he lost big. I really didn't care about winning the way a lot of Silicon Valley people care and I saw some of this in how Steve lived his life. Yes, he drove a nice car and had some property. But he wore well-worn clothes, drove himself to work and generally seemed like one of us. Later in life, odd things like private jets, yachts and Central Park condos showed up.

Everyone that I worked with at Apple who is still there knows it is not the same company. It can't be. The scale, management structure, market, political climate and more is different. And there is no Steve. I left Apple when Steve finally left, but I knew that "my" Apple was gone around 2006.


All I can say is thank you for pouring this out into the ether, these attributes and recollections are beautiful, or dysfunctionally beautiful you could say.


> how does someone like Steve gain so much respect from so many different types of people?

Simple. Steve is undeniably one of the most influential people in the last 50 years. You don't have to agree with him to recognize this. To dismiss his impact because you don't like him personally is shallow and reductive. No one is saying he was a nice guy.

First, he and Woz brought us the Mac. Sure, Woz was the tech guy but Steve's product influence cannot be overstated. Macs really lost out to the IBM PC and then Steve was forced out of his own company by his own hire (John Sculley from Pepsi).

Apple languished for a decade while Steve started NeXT. It ended up on the brink of bankruptcy and had to be rescued by a $150 million injection of funds from Microsoft.

In the next decade, Steve took the NeXT OS that became the foundation for OS X and iOS, and released the iPod, iTunes and then the iPhone that ultimately turned Apple into what it is today: a trillion-dollar company that literally prints money.

Steve was basically the ultimate product person and really a visionary. It was often stated that he generated a reality distortion field as he literally changed industries around him. The sea change that was the iPhone took control over code distribution on mobile phones from the wireless carriers. The price of this was several years of AT&T iPhone exclusivity in the US. The popularity of the iPhone bent carriers to his will. He unrelentingly refused to ship bloatware on the iPhone (unlike what you get on basically every Android phone other than the Pixel).

Apple developed a track record for taking terrible technologies and making them great. One of my personal favourite examples is connecting to Wifi. Many here probably aren't old enough to remember this, but in the early 2000s that involved going to a settings window in Windows and entering the Wifi type (802.11b and then 11g), the encryption type (eg WEP, WPA, WPA-PSK, WPA2 or WPA2-PSK) and an encryption key, which may or may not have been a password.

On OSX you simply selected a network and entered a password. Why ask the user for a bunch of stuff they don't care about, probably don't know and you, the computer, can figure out?

To me this is classic Steve influence.

Fun fact: Steve is the reason we don't have DRM on downloaded music. iTunes launched with a DRMed format and all songs were $0.99. DRM was demanded by the RIAA. The RIAA didn't like the pricing model. Steve ultimately made the bargain that gave them pricing tiers, but his demand was no DRM. That's how iTunes ended up distributing DRM-free AAC files instead.

Steve was by far the most user-focused of any of the tech titans of the last half-century.

Some people don't like Apple's walled gardens and that's fine but again, to dismiss the impact of the iPhone (for example) because you personally prefer the "freedom" of Android is a shallow judgement.

We stand on the shoulders of giants. And Steve was a giant. That's why he was and continues to be respected.


<OS flamewar>

Not sure about Windows 2000, but connecting to wifi on Windows XP took a single double-click... and I believe OSX Tiger was released years after XP.

</OS flamewar>


Was that true before Service Pack 2? I suppose SP2 was still 6 months before Tiger in any case.


You're probably right.


I'm late to the discussion. And I don't have near the personal insight to offer that @diskzero does. But I think there's another aspect as well.

Apple came back.

I can think of no other company in recent history that has come back from such a long run on death row to periodically become the company with the world's highest market valuation. And we tend to associate that (for many good reasons) with Steve Jobs. Steve's persona and the admiration that follows I think derives in large part because it has been the ultimate come back story.

This "come back story" is one of the most classic and inspiring stories that has resonated through myth and fable throughout history. It's what makes us stand up and cheer in a theater when Daniel Laruso delivers the take down kick. It is Miracle when the US beats USSR in hockey. It's why we like to interpret David and Goliath as a little guy takes down big guy story (vs the kid with the rock gun kills the big lout https://www.ted.com/talks/malcolm_gladwell_the_unheard_story...).

It's interesting to note that with Steve gone, Elon Musk has risen to become the new Crazy Successful Celebrity Leader figure. Like Steve, you can see Elon from a variety of facets, some very flattering and some very damning; what we love about him is that he has had some success at defeating the status quo.


The comeback was amazing, wasn't it? It was cool to be there and I think about the dynamics a lot. What I really keep trying to figure out is how we were so effective with such small team sizes and why this can't scale. When I was at Amazon, the mobile application team was three times larger than the OS X engineering team was circa 2001.



    That being said, I just can't believe he would 
    have been happy about the various issues 
    with the old MacBooks. So many things feel so 
    wrong
It's an interesting question for sure.

During his life, he certainly did champion a lot of form-over-function decisions: the "cube" G4, the hockey puck mouse on the iMacs, etc.

And then he also championed some similar decisions that most people regard as roaring successes: the removal of legacy ports on the MacBooks felt an awful lot like the decision to ditch legacy ports on the original iMacs.


> the removal of legacy ports on the MacBooks felt an awful lot like the decision to ditch legacy ports on the original iMacs

Sort of, but this disregards some important product context. Having a multi-port dongle or adapters on a desktop machine is a very different experience than a portable.


Totally agree - my understanding is Steve Jobs just was 100% committed to an opinion, until someone convinced him to go 100% in on a different opinion.

I'd also add that with Jony Ive on the way out for years, there are a lot of decisions attributed to him that he likely did no more than sign off on.


Whenever someone working at Apple has a Big Idea(TM), invoking "This is what Steve would have done" is now pretty much a mandatory tactic in arguing your position.


"Conceptual integrity in turn dictates that the design must proceed from one mind, or from a very small number of agreeing resonant minds."

― Frederick P. Brooks Jr., The Mythical Man-Month: Essays on Software Engineering


> rich Corinthian leather skeumorphism

In my opinion this design was a lot more usable than the current 10 year trend toward having no borders around anything anywhere.


I agree with you. I am hoping the pendulum will swing back to more humane interfaces.


He was somehow a *volent Dictator For Life :)

With this knowledge about Steve Jobs, it's fair to assume that he knew that too, right? He probably knew he was part of the glue that made Apple Apple, and took measures to avoid the dilution happening too fast.

Also says a thing about leadership and human groups :)


I think these MacBooks are precisely in Steve's vision. They're ultimately just great machines for the users.


The notch is dumb. Wait until the next iteration for the validation of that statement.


Do you think the sensor will get smaller or the top bezel will just increase to the old height?


My mother has an A51. And apart from the OS, I cannot see any reason this phone is not the perfect form factor.


Thanks for sharing this. From an outsider's perspective I always assumed it was a yin-and-yang relationship where the sum was superior to the parts and in balance. Sounds like that was fairly accurate.


This is pure speculation, ungrounded from any evidence.

The touch bar is a very flexible (effectively) analog input + rich display device. If adequately supported by software it can be an amazing input, affording a range of useful functions not replicable with discrete buttons. In general, I really wish modern computers had more analog inputs available. Analog knobs, jog wheels, sliders, trackballs, etc. are tragically missing.

I have seen no evidence that Jony Ive was its patron, and no evidence that including it had anything to do with making laptops expensive as a goal.

The problem with the touch bar is that (a) it only shipped on a limited subset of devices so software authors could not depend on it, (b) after its initial functions, Apple made limited effort to adopt it in all of their own software, improve its integration into the system, or push boundaries of what it could do as an input device.

> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.

No, this was some Apple-internal mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype vs. full-scale factory production + poor estimation of reliability in a wide variety of contexts over a longer period of time. Nobody ever set out to make a “worse experience” or higher cost.

There are many suboptimal features of the common rubber dome + scissor stabilizer laptop keyboards, and I wish more companies were brave enough to experiment with alternative designs in search of improvements. (Disclaimer: my favorite "laptop" keyboards are https://en.wikipedia.org/wiki/IBM_PS/2_portable_computers and https://en.wikipedia.org/wiki/Macintosh_Portable)


> Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.

Also the parallel port. I remember the drama!

It goes the other way, too. When Apple put cameras in all of their laptops, the press relentlessly bashed them for wasting BOM on something so useless and expensive. Then the industry realized it was a good idea and followed suit. Similar for retina displays -- the term "High Definition" had become synonymous with "good enough" and ground PC monitor advancements to a halt for a decade. Phones were coming out with higher resolutions (not pixel densities, resolutions) than full-size monitors. Then Apple figured out how to market higher resolutions, the press mocked them for wasting money, but word got around that HD might not be the end-all of display technology and consumer panel resolutions started to climb again.

Here's a counterexample, a niche that could really use the Apple Bump but hasn't gotten it and probably won't get it: 10 gigabit ethernet. 1GbE became synonymous with "good enough" and got so thoroughly stuck in a rut that now it's very typical to see 1GbE deployed alongside a handful of 10 gigabit USB ports and a NVMe drive that could saturate the sad, old 1GbE port many times over.

Sometimes taking risks results in a Touch Bar or Butterfly Keys. That's just the nature of risks. The only way to have a 100% feature win rate is to limit yourself to copying features that someone else has proven out, but if everyone does that then the industry gets stuck in a rut.

I'm glad Apple exists, even if I don't personally feel the need to fund their experiments.


> 1GbE became synonymous with "good enough" and got so thoroughly stuck in a rut that now it's very typical to see 1GbE deployed alongside a handful of 10 gigabit USB ports and a NVMe drive that could saturate the sad, old 1GbE port many times over.

This has a few reasons:

- 10 GbE was, until quite recently, pretty power intensive and it still is more expensive and hot than gigabit

- Devices on the LAN, especially those with high bandwidth usage, have become far rarer. A lot has moved to the cloud, and most people's internet bandwidth can't saturate 100 Mbit, let alone gigabit.

- LAN as a whole has become rarer. A lot of people now only use WiFi with their phones or laptops, up to the point that most people now have (theoretically) faster WiFi than LAN.

Combined, there are few reasons to take the expense of putting a high-speed ethernet port on a device. Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit and you see those ports on a few consumer devices now.


10 GbE over copper is still iffy even with CAT6 cabling, which complicates deployments and user experience. As a result, prosumer devices like recent AMD X570 motherboards and the upcoming Intel Z690 based ones are including 2.5 GbE ports. These are rated to work over CAT5E, provide a few hundred MB/s of bandwidth with a lot less power usage on the switch side (something like < 4W / port seems common), and make it easier to build low-cost, passively cooled switches, since the switching SoC doesn't need to be terribly sophisticated to hit the latency requirements of 2.5 GbE.


10GBASE-T is power hungry and unreliable, but dirt-cheap 10GBASE-LR and 25GBASE-LR transceivers work great up to 10km. If only they could figure out how to fit the transceivers into mobile-friendly packaging. But for a workstation they're great.


That's true, I actually run fiber in my home for that reason. I think the problem with fiber is, though, that the technology is pretty unknown to consumers and working with fibers is a lot harder than working with cables; they take a lot less abuse before breaking, for example. But if someone is going for 10Gbit+ in their home network, I can highly recommend fiber.


I really wish LAN would make a comeback. There hasn't been a week without a video conference where someone had internet issues due to Wifi problems. In fact, in my experience, most of the time people talk about issues with their internet it's really Wifi issues. But few non-technical (and even technical) people consider connecting devices like TVs or laptops by LAN, even if they hardly ever move and the router is close by.

All this talk about how fast Wifi can be made people think it's all they need. But in reality, building a Wifi network with fast speeds across a whole house while avoiding too much interference from networks around you (esp in inner cities where you can easily have 40+ networks in reach) is more work and more expensive than pulling LAN cables to the right places.

But LAN isn't sexy and no one advertises how fast a copper cable can be, it just doesn't sell products as well as talking about Wifi 6.

/rant


I am partly with you. But considering you have to actually put wires through walls and install sockets vs. just setting up a Wifi 6 access point, I don't see Ethernet making a huge comeback.


> Combined, there are few reasons to take the expense of putting a high-speed ethernet port on a device. Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit and you see those ports on a few consumer devices now.

I think the only thing driving 2.5/5/10GbE at all is that WiFi Access Points need it.


Compare the cooler for a 2.5GbE card [0] to that of a 10 GbE card [1]. The fact that WiFi (which is what most consumers use) now supports those speeds surely helps, but 2.5GbE is also simply far easier to integrate and power.

[0] https://www.amazon.de/XIAOLO-Netzwerkadapter-Unterstützung-L...

[1] https://www.amazon.de/XG-C100C-Netzwerkkarte-RJ45-Port-802-3...


I agree that 2.5GbE is easier, and I still think AP backhauls are the primary driver for it. AP makers cannot sell multi-gigabit WiFi APs without a backhaul that can support them.


> 10 GbE was, until quite recently, pretty power intensive and it still is more expensive and hot than gigabit

PCIe 3.0 transceivers were 8Gb/s and supported preemphasis and equalization, closing the sophistication gap with their off-backplane counterparts. How many PCIe3+ transceivers has the average person been running (or leaving idle) for the last decade? These days a typical processor has 16Gb transceivers by the dozens and 10Gb hardened transceivers by the handful. I just counted my 10Gb+ transceivers -- I have 36 and am using... 10 (EDIT: 8/4 more, HDMI is 4x12Gb/s these days).

The reason why 10GbE is expensive has nothing to do with technology, nothing to do with marginal expense, nothing to do with power, and everything to do with market structure. Computer manufacturers don't want to move until modem/router/ap/nas manufacturers move and modem/router/ap/nas manufacturers don't want to move until computer manufacturers move.

These snags don't take much to develop, just "A needs B, B needs A," and bang, the horizontally segmented marketplace is completely immobilized. That's why the market needs vertical players like Apple who can push out A and B at the same time and cut through these snags, or high-margin players like Apple who can deploy A without B and wait for B to catch up. Otherwise these market snags can murder entire product segments, like we've seen happen to LAN.

No, it isn't because of reduced demand. People are recording and editing video more than ever, taking more pictures than ever, streaming more than ever, downloading hard-drive busting games more than ever, and so on. LAN appliances would have eaten a much healthier chunk of this pie if LAN didn't suck so hard, but it does, so here we are.

> Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit

Yaay, PCIe 2.0 speeds. 2003 called, it wants its transceivers back :P


Power is a big differentiator. You need to send 10GbE over 100m (some break the standard and only offer 30m). Have you ever touched a 10GbE SFP module or the heat sink of a card? They're quite hot, and you need to provide that energy, which is not a problem on a desktop but a big one on a laptop. If the laptop has RJ45, that is.

> modem/router/ap/nas manufacturers don't want to move until computer manufacturers move

Modems and routers only make sense once they serve a link that actually goes beyond 1Gbit - which is rare even today. Also, these devices are minimal and the hardware required to actually route 10Gbit is a lot more expensive. Even MikroTik's cheaper offerings today can't do so with many routes or a lot of small packets (no offense to them, their stuff is great and I'm a happy customer - it's still true, though).

APs are a bit different, as WiFi recently "breached" the Gbit wall (under perfect conditions). But there are already quite a few with 2.5Gbit ports to actually use that.

NAS, on the other hand, are a bit held back by the market. Still, high-end models have offered either 10Gbit directly or a PCIe slot for a long time now.

> People are recording and editing video more than ever, taking more pictures than ever, streaming more than ever, downloading hard-drive busting games more than ever, and so on. LAN appliances would have eaten a much healthier chunk of this pie if LAN didn't suck so hard, but it does, so here we are.

The professional video editing studios with shared servers are already on 10 Gbit LAN; the stuff has been available for years. Pretty cheap even, if you buy used SFP+ cards. Switching was expensive until recently, but I'd say that the number of people who need a 10G link to a lot of computers is even smaller.

And LAN competes with flaky, data-limited, expensive 100 MBit lines (if you're lucky). 1GbE is beyond awesome compared to that and yet it lost, anyway.

> Yaay, PCIe 2.0 speeds. 2003 called, it wants its transceivers back :P

I'm not happy either, but it's better to at least go beyond gigabit speed than to stay stagnant even longer.


Are you really suggesting that PCIe and ethernet are equivalent? There are so many differences, starting with the distance...


> ... a niche that could really use the Apple Bump but hasn't gotten it and probably won't get it: 10 gigabit ethernet

10GbE was a bit of a mistake on several fronts.

We had become used to these 10x iterations with Ethernet, from 10Mb to 100Mb to 1Gb, such that 10Gb seems like a natural extension. But running that bandwidth over copper remains a significant technical challenge. For a while I was using a Thunderbolt 10GbE controller and it was huge (basically the size of an old 3.5" external HD), and most of it was just a giant heatsink.

In commercial situations, the issues with copper often result in using fiber instead. At that point there are fewer barriers to even higher speeds (eg 25Gb, 40Gb, 100Gb), which make a lot of sense in data centers.

Added to this, there's not a lot of reason to run 10GbE in a home setting or even in many small corporate settings. Even in larger corporate settings, you can go really far with 1GbE using switches, bridges and routers, possibly using higher speed backhaul connection technologies.

What should've happened is what has started to happen in the last few years: interim speeds (eg 2.5Gb and 5Gb). Hopefully these become more widespread and become relatively cheap such that someday they just displace 1GbE naturally.

On top of all of this, Ethernet is an old standard that uses 1500-byte frames. This actually starts to become an issue at 10+ GbE, such that various extensions exist for very large frames (eg 9000 bytes), but these run into issues with various hardware and software.

Probably largely because of Ethernet's 1500-byte frames, the de facto standard TCP/IP MTU is pretty much 1500/1536 bytes, and this has become a self-fulfilling prophecy as more and more infrastructure is deployed that makes this max-MTU assumption.
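To put rough numbers on why frame size matters at these speeds, here's a back-of-the-envelope sketch in Python (assuming standard Ethernet preamble/header/FCS/inter-frame-gap overhead and plain IPv4+TCP headers with no options):

    # Why 1500-byte frames start to hurt at 10 GbE.
    WIRE_OVERHEAD = 8 + 14 + 4 + 12   # preamble+SFD, Ethernet header, FCS, inter-frame gap (bytes)
    L3_L4_HEADERS = 20 + 20           # IPv4 + TCP headers carried inside the payload (bytes)

    def goodput(link_gbps, mtu):
        """Return (TCP goodput in Gb/s, frames per second) for a saturated link."""
        wire_bits = (mtu + WIRE_OVERHEAD) * 8
        payload_bits = (mtu - L3_L4_HEADERS) * 8
        frames_per_sec = link_gbps * 1e9 / wire_bits
        return frames_per_sec * payload_bits / 1e9, frames_per_sec

    for mtu in (1500, 9000):
        gbps, fps = goodput(10, mtu)
        print(f"MTU {mtu}: ~{gbps:.2f} Gb/s goodput, ~{fps:,.0f} frames/s")

The goodput loss at MTU 1500 is modest (~9.5 vs ~9.9 Gb/s), but the frame rate is the real cost: a saturated 10GbE link means on the order of 800k frames per second to process, versus ~140k with 9000-byte jumbo frames, which is a big part of why jumbo frames (and their interop headaches) come up at all.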


The scary part? 1GbE is older than I thought. A couple of weeks ago I replaced a 1GbE switch (a GS524T) at work and got curious. That model came out in 2001 or 2002.


Copper gigabit Ethernet is about as old as USB 1.1.


>> Also the parallel port. I remember the drama!

Printers I can see. An entry-level HP LaserJet was $600 back in 2000, something not as easily replaced as a serial mouse or gamepad.

>> a niche that could really use the Apple Bump but hasn't gotten it and probably won't get it: 10 gigabit ethernet.

They stuck it on the Mac Mini


As a $100 additional option...

(which granted, isn't too bad compared to the price of a new 10GbE card, but still...)


The touch bar was fingers on glass. It's not appropriate for a professional device since it requires you to look down and doesn't lend itself to the "mechanical" use of devices that high-paced work requires.

Also, you can actually add the analog inputs yourself. The DIY keyboard community -- which is flourishing with options and new vendors -- has lots of options available. I myself have two analog knobs and one trackpoint on my keyboard. It's absolutely amazing.


I was really hoping Apple had a big leap forward in fingers-on-glass interaction planned. Imagine if the glass could kind of rise or move down so you could “feel” where the buttons were. Heck, even providing a few notches in the chassis, above the Touch Bar, for a finger to “feel” where it was relative to them, and requiring a harder “press” to activate the Touch Bar, would likely have been a game changer.

But they didn’t. And I was always confused that the Touch Bar never got more love from the hardware developers.

That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.

It also wasn’t easy to build software for the Touch Bar from what I could gather. I had lots of ideas for little tools (think iStat-like gauges, but perhaps for things like the mic input level), but it wasn’t very easy to build one when I tried.

RIP Touch Bar. You might not be missed too much, but I bet something like you will come up again in a decade or two.


> That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.

Yes. It wasn't Jony. It came from the software side. I won't name who to protect the guilty.


    I was really hoping Apple had a big leap 
    forward in fingers-on-glass interaction 
    planned. 
Me too, but even if they solved that challenge -- I think there's an even bigger and insurmountable challenge there: a sizable percentage of users, particularly "power users", frequently use their Macbooks "docked" and hooked up to an external monitor and keyboard.

While many people happily type on their laptops all day long with no external monitor or keyboard, there are also many people (me included) who think that's an absolute ergonomic disaster and greatly value the extra screen real estate of an external monitor.

So, honestly, there's no way the Touch Bar could have been good enough for me to use it.

The whole idea was just misguided.

What Apple should have done IMO was allow an iPad or iPhone to fill that role. I still do not understand why my iPad can't be an amazing accessory to my Mac. With the right software magic and integration, it could be everything the Touch Bar was while simultaneously doing a lot more, and is of course extremely compatible with external keyboard use.

Well, I can guess why, actually -- it would require a lot of coordination between the Mac and iDevice teams and that is challenging.

(However I'm encouraged -- Sidecar is a good first step. Let's see more...)


> That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.

I think it was more like they decided to add the equivalent of an Apple Watch to Macs to support TouchID and then asked "what else can we do with it?".


If the analog add-ons are DIY or even cost extra money, then software developers cannot rely on them being present and won't develop good software and use cases for them. At least most of them won't. The best you can hope for is niche software support.

So adding stuff yourself is nice (I do it myself!) but not a way to move the industry or even the Apple ecosystem forward.


Analogue input is pretty much a solved problem. Not only do we have standards for game controllers, but also MIDI control surfaces give you a wide variety of analogue physical controls. MIDI even comes with incredibly rich input automation.

Sadly, the only company I'm aware of producing that sort of hardware for use outside the music industry seems to be Loupedeck.
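To make that concrete, here's a minimal sketch of reading such a control surface from Python (assuming the mido library with a python-rtmidi backend installed; the knob or fader arrives as ordinary control-change messages):

    import mido  # pip install mido python-rtmidi

    # Open the default MIDI input, e.g. a knob/fader control surface.
    with mido.open_input() as port:
        print(f"listening on {port.name}")
        for msg in port:
            # Knobs and faders show up as control-change messages with a 7-bit value.
            if msg.type == "control_change":
                value = msg.value / 127.0  # normalize 0..127 to 0.0..1.0
                print(f"CC {msg.control} on channel {msg.channel}: {value:.2f}")

Any application willing to listen on a MIDI port can map those messages to whatever parameter it likes.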


And yet a developer still can’t assume that the customer will have such analog input tools available.


Niche software support? My keyboard and its analog add-ons work on Mac, Windows and Linux with excellent support, since it runs QMK firmware.


How well does this analog input work with, say, Photoshop?


My touch bar almost always has "Display Connected: [Mirror Displays] [Extend Desktop]" on it. I can fiddle around and get it to show app-specific things, or hold Fn to see the F keys, but most of the time I'm using it, it shows those useless multi-monitor buttons.

I'm sure there's some setting somewhere that defaults to showing the layout for the in-focus app, but it's never made me care enough to go figure it out.


> It's not appropriate for a professional device since it requires you to look down and doesn't lend itself to the "mechanical" use of devices that high-paced work requires.

Sure it could; it just has to beat the cost of looking down. If it had let you do some complex operation trivially that couldn't really be done with a keyboard shortcut, being a dynamic visual field would be fine.

Of course, volume sliders don’t fit that bill, and I don’t think anyone really found something that did… but it’s not some fundamental guarantee that it would be useless.


> It's not appropriate for a professional device since it requires you to look down

Apart from developers, many professionals do look down all the time because they typically have other devices connected, e.g. synths or photo/video editing rigs.

And the Touch Bar was designed much more for that audience.


I can see how you might make that assumption based on how the Touchbar has exposed functionality, but this was not the goal of the Touchbar as it was sold internally. It was sold as one of the next great UI affordances. It came from some of the same people that brought us Mission Control, the Dock, Exposé, etc. I worked on a lot of these features and I never use them. Shame on me.


On a laptop, that "look down" means looking at the bottom pixels of the screen.

The looking was never the problem, IMHO; the problem was execution and utility. It was actually distracting when there were adaptive completion results continually flashing. And the rest of the buttons were never great.

If every single dialog box flashed the buttons, that would be a win as it is easier and faster to tap the touchbar than it is to navigate the cursor and then click. But this obvious use case never really materialized.

And if, like most "professional" users, the laptop is operated via an external keyboard, the muscle memory never develops.


What keyboard do you have with a trackball?


The Ultimate Hacking Keyboard has trackball options:

https://ultimatehackingkeyboard.com/



Oops. Should have been "trackpoint" and not trackball. It's a pimoroni trackpoint breakout board.


G80-11800 of course


> This is pure speculation, ungrounded from any evidence.

It's not ungrounded from the anecdotal evidence that these changes are coming after Ive's departure.

> I have seen no evidence that Jony Ive was its patron, and no evidence that including it had anything to with making laptops expensive as a goal.

Holy evidence, Batman! Leadership 101: When your title is "Chief Design Officer", the design buck stops with you. When your company releases an updated design to an existing product, you had some kind of say in that design. Period. Even if your "say" was just that you were aware of it, and didn't veto it.


> When your title is "Chief Design Officer", the design buck stops with you.

Agreed with this. When you're coming to the CDO position after 20 years of being a hands-on designer at that company, most recently as the head of both human interface and industrial design across the entire organization, and having been described as being the person with the most operational power at Apple, after Steve Jobs himself, even before being promoted, my guess is that these design changes did not sneak under his radar. It is most likely that he set the goals that produced these designs, and that he was aware of and approved of them from the beginning. And I suspect that as a new C-level, he was probably even more hands on than that.

But since in this thread we are being asked to hold ourselves to a very high standard of rigor, I should note that I have not submitted this comment to peer review, or made my data available for replication at this time. I'm just basing this on, you know, how jobs work.


> my guess is that these design changes did not sneak under his radar

Ive is part of the Senior Leadership Team.

No major decision in the company sneaks under his radar.

But that doesn't mean he is responsible for every decision.


There's a big difference between "Jony Ive, as CDO, must have signed off on this, and thus bears responsibility for it" and "Jony Ive was pushing for this, for these specific reasons".


> Holy evidence, Batman! Leadership 101: When your title is "Chief Design Officer", the design buck stops with you. When your company releases an updated design to an existing product, you had some kind of say in that design. Period. Even if your "say" was just that you were aware of it, and didn't veto it.

This is just shifting goalposts because you got called out.

You didn't word your comment as "these things happened on Ive's watch"; you consistently worded your comment as if Ive was personally pushing for something.

It's a common refrain on HN and it's never backed with proof.

And speaking of your first comment:

> Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.

... you realize that this is a new generation of MBP landing on the exact same cadence they've come out on in the last few decades?

So it makes perfect sense to have drastic changes land now regardless of who's in charge?

Not to mention the fact that it hasn't even been two full years since Ive left. And the fact that the HDMI port's return leaked at the start of the year.

So unless you seriously think Apple designs a laptop in the course of a single year, it's highly unlikely he had no input on the current machine.


> You didn't word your comment as "these things happened on Ive's watch"; you consistently worded your comment as if Ive was personally pushing for something.

This is not a meaningful difference when he's in charge and it's a flagship product.


[flagged]


You're naming people that are managing entire companies.

The person in charge of design, for a company that has a handful of physical products, is a completely different situation. It's reasonable to blame them for top level product design decisions. What happens in that specific realm is what they want. The top priority of their job is those few dozen decisions. The opposite of a CEO that's overseeing ten thousand different things.

Be a little less stuck on the word 'pushing'. The fact is, when it's one of the main things you're in charge of choosing, and you allow a decision and then stand by it for a long time, you are now pushing it.

Also, wait, you're the one that inserted the word 'pushing' into the conversation! If you're upset with that wording, you're upset at a strawman.


Kevin Scott is a CTO. He's in charge of top-level product technical decisions.

The top priority of his job is those few dozen decisions.

Be a little less stuck on the word "Director". The fact is, when you're one of the main people in charge of allowing decisions, it's not the same as personally championing them.

-

> Also, wait, you're the one that inserted the word 'pushing' into the conversation! If you're upset with that wording, you're upset at a strawman.

You know you can just go back and read the comment I referred to if you've already forgotten, right?

> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.

Does that sound like personally assigning blame to Johnny Ive for something? It'd be one thing if it said Ive's team or something, but it's the common refrain parroted on this site

-

If John manages Joe and Joe deletes a database in prod, do you say "John's subordinate deleted a database in prod" or do you say "John deleted a database in prod"?

You see how there's a difference there even though both acknowledge that John has a part in what happened?

It's not that complicated to see the difference if you've ever interacted with any sort of situation where the buck actually stopped with leadership, but I guess that's not universal.


> Kevin Scott is a CTO. He's in charge of top level product technical decisions

Then it's probably fair to blame him for some high-level decisions. But technical decisions go well beyond design, and microsoft has so many products, so it's harder to say how much you can point at him.

> The top priority of his job is those few dozen decisions.

I honestly have no idea which few dozen you mean. Across all of microsoft? I could list a bunch for "apple product design", like the way airpods fit, the decision to have no holes in airtags, the keyboard and touch bar choices in macbooks, etc.

Maybe the start menu location? You could probably blame him for the choice of xbox models too. I'm not singling out Apple in saying that executives should be considered responsible for certain high-level decisions.

> Be a little less stuck on the word "Director".

I'm stuck on the word "design". He's the design guy.

> The fact is, when you're one of the main people in charge of allowing decisions, it's not the same as personally championing them.

If it's one of the top few most important decisions under your job purview, the difference is so minor as to not matter outside the company.

> Does that sound like personally assigning blame to Johnny Ive for something? It'd be one thing if it said Ive's team or something, but it's the common refrain parroted on this site

Assigning him blame is not the same as saying he 'pushed' it. The buck stops here for design. He gets the blame because he strongly approved it and he could have easily spent entire days on the decision because that's the core of his job, and spending enough time on the decision is also his job.

> If John manages Joe and Joe deletes a database in prod, do you say "John's subordinate deleted a database in prod" or do you say "John deleted a database in prod".

John decided to delete a database in prod. Ive decided to go with this keyboard.

Assuming the delete wasn't accidental, because the keyboard definitely wasn't accidental! If it was an accident this analogy isn't relevant.


Just putting this out there [1] - Steve Jobs would have rightfully put responsibility for these design changes under the Chief Design Officer.

"Somewhere between the janitor and the CEO, reasons stop mattering," says Jobs, adding, that Rubicon is "crossed when you become a VP."

In other words, you have no excuse for failure. You are now responsible for any mistakes that happen, and it doesn't matter what you say.

[1] https://www.businessinsider.com/steve-jobs-on-the-difference...


I think it's fair to say that Jony was ultimately the DRI ("directly responsible individual" in Apple-speak) for all industrial design, so he "owns" it, which is a bit above "signing off" or "accepting", whether or not he was personally pushing for something.

This is a bit of a quirk of how Apple structures responsibility, and makes it a bit more fair to say that "Jony made a disliked change" in a way that doesn't quite apply at Google or Microsoft, where responsibility tends to be a bit more diffuse.


DRI has expanded throughout the tech industry, I can't remember the last time I was on a team that didn't use the concept.

But I provided a simple analogy above.

Say John manages Joe and is the DRI for data storage. If Joe goes and deletes the production database, John has some blame even though he didn't personally delete the database.

Do you not see the difference between saying "John deleted production?" and "John's subordinate deleted production?"

Both are assigning some blame to John, but only one is factually true.

This entire conversation almost feels like the typical HN inability to realize the world is not black and white.

It's like people need Johnny Ive to have personally opened up a CAD drawing and shrunk the MBP, because it's utterly impossible that a larger team decided on the vision for an entire flagship product.

-

Lol the replies. What a weird way to dodge a simple question lol.

"John's subordinate deleted production" implies that John is partially responsible, but accurately reflects he did not personally delete it.

You're not even mentioning Joe, you're accurately reflecting John was in charge, but you're also not lying and saying John did it.


I work at Apple, I'm a senior engineer - been there for almost 2 decades. I'm DRI on a few things here and there.

Not a single decision is made on things that I am DRI on without me being a part of that decision. I may not get my way if I'm over-ruled for corporate reasons, but I know about it, and being the DRI, I get a slightly-larger-than-average say in what happens. Generally it takes a director or VP to over-rule what I want, and then the Radar (Apple's internal ticketing system) is very clearly marked as such.

Apple takes the concept of the DRI very seriously. You don't give responsibility without also giving power.

My opinion: There is zero chance (not "a small chance", zero chance) that Jony Ive didn't sign off on, and explicitly endorse, the Touch Bar. Something that obvious, in such a commanding position in the user interface, would never have escaped his personal input and attention.


Thank you for saying this, your personal experience here is just about the best insight we could ask for. Subjectively, there's an odd lack of current Apple engineers weighing in on threads here at HN relative to other FAANG companies. I've often wondered if the company's rules were stricter.


Apple engineers are generally shy to weigh in as Apple engineers. Apple takes the "opinions my own" more seriously than most.


Isn't it well-known that Apple's culture is extremely secretive?


"John was responsible for production having been deleted" because of the systems and processes he did or did not put in place. At a high enough level of abstraction, this is all that matters.

Jony was responsible for the Touch Bar.

Anyway, some evidence: "For years, Apple Chief Design Officer Jony Ive has expressed a desire for the iPhone to appear like a single sheet of glass", suggesting that this could have been part of a larger overall design direction. (https://www.wsj.com/articles/apple-unlikely-to-make-big-chan...)

I'd be willing to bet that they mocked up MacBooks with full touchscreen keyboards.

Further, I don't think it's a coincidence that I don't mind typing an email (core C-level activity) on an iPad on-screen keyboard, but I'd find it infuriating to try to code on.


John’s subordinate asks John if he should delete production. John says go ahead.


People say a lot of things about Gates' Microsoft, Ballmer's Microsoft, Jobs' Apple, and Larry Page's Google.


> When your title is "Chief Design Officer", the design buck stops with you.

That's only because in your ignorant reality you have made it so.

The actual reality is that what constitutes a product is so much more than just the design. For example, it includes which features should and shouldn't be there, a decision largely made by the Product team, and how it works, which comes from the Hardware Engineering team.


All Apple had to do was put the Touch Bar above the function keys, and nobody would have complained at all.


THIS! The Touch Bar is cool, but when they went and removed _the ESC key_ they failed. Both would be ideal.


Touch Bars have had a separate, physical ESC key since 2019. I'm looking at one right now.


ESC itself doesn't cut it for me. With my resting hand position on the keyboard, my fingers touch the touch bar, and it always causes something either catastrophic or very frustrating. On a similar note, I'd gotten used to pressing Fn keys without looking at the keyboard. With touch bar, I have to carefully analyze the touch bar before doing anything with Fn keys. It's a very problematic experience overall. If it was a separate bar, I wouldn't have any of these issues.


Due to years of vocal complaints from developers since 2016.


Sure, but "both would be ideal" exists. You can have an Escape key and Touch Bar, if you want.


I think two rows of function keys would be ideal!

Each with a little OLED display, please. (Why hasn't this happened yet??)


It's not "ideal". Function keys are exactly that: physical keys. Something you can use without looking down at your keyboard.


I thought the touchbar was a great idea but I hated that the function keys (and especially esc for a while) were sacrificed for it. They could have taken that 1cm of vertical space from the ridiculously huge touchpad instead and given us a ridiculously huge touchpad along with function keys and a touchbar.


> The touch bar is a very flexible (effectively) analog input + rich display device.

It just can't work with people like me that never look down at their keyboard. I'm not trying to be elitist, it's the honest truth. I wanted to love the Touch Bar, tried plugins like Pock, but in the end, no matter how hard I tried, I couldn't force myself to interrupt what I'm doing and look down; it just doesn't make sense.


You're not meant to look at your keyboard. It's simply not efficient.


It might have worked with some kind of haptic feedback device?


It was tried. Many, many people spent many, many hours inside of Apple trying to make the Touchbar more useful. The simple fact was that looking down at it was a context shift and, in general, no one wanted to do it. It exposed functionality that you would eventually learn to drive from the keyboard.


> Analog knobs, jog wheels, sliders, trackballs, etc. are tragically missing.

Fwiw they are readily available by way of USB (e.g. MIDI) controllers. There are loads of dedicated knobs, faders, pads, etc. with a large amount of software to customize and translate those inputs (in addition to the array of software supporting them natively)

Obviously that would be external to the computer, but I think given the highly specific nature of analog controls it makes sense for these to be external. I'm having difficulty imagining a set of analog controls that would be at the same time universally useful and efficient in terms of weight and space utilization.


I actually really like the TouchBar except for the dramatic input lag. The input lag is so damn high that I never ever use it. If you could swipe left/right on it without holding down first, as on an iPhone, and if touch events generally had the same responsiveness as on an iPhone, I think everyone would have loved it much more. RIP TouchBar.


I used BetterTouchTool to add a second volume slider with no delay and no on-screen UI, for changing the volume while I watch something :)

I highly recommend using BetterTouchTool to get the most out of it if you still have a device with the touch bar.


I love BetterTouchTool, and it does help, but the input lag is still too high. And as far as I know, even with BTT, there’s no way to get a proper swipe interface that removes the “touch and hold” delay.


What I don't get about the touch bar is: Why didn't they put it above the F-keys as an additional input? There's room there. Why remove the F-keys (and, even more insanely, the ESC key)? People were more upset about that than any limitation in the touch bar's usability. It obviously adds some possibilities. It's just that some programs actually use the F-keys and have for decades. They provided a no-look keypress feedback for many tasks. What ever was the genius idea of taking that away?


The touch bar was just awful. Not only did it lack software support, but occasionally it would just become non-responsive. And that's really bad when you are trying to mute or turn the volume down quickly, and you have to wait a couple seconds for it to recognize touch input again, or fall back to the mouse.

Also it's on the keyboard where you're not looking. If you want analog controls, either make them tactile - like a physical dial or slider - or just make the display a touch screen.


> mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype

And this is why I believe the Ive story more than this one.

If you're a mechanical engineering group, a mismatch between prototype and production is MechEng 201, pretty much.

You wanna put out a new keyboard? Fine, but test the heck out of it. Let people have a go at it, put force on it, spread some food on it, etc.

Unless that mechanical group really thinks keyboards are toys and don't need reliability, in which case they don't belong there.


> No, this was some Apple-internal mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype vs. full-scale factory production + poor estimation of reliability in a wide variety of contexts over a longer period of time

Okay, but...what caused them to try to make a keyboard that thin in the first place? GP is suggesting that it was driven by Ive, which you dispute, but you only give an alternative explanation for the "what", not the "why".


>and no evidence that including it had anything to with making laptops expensive as a goal.

It is product differentiation. The MacBook Pro 2016 redesign was delayed by a year due to Intel's CPU problems. The Touch Bar MacBook also had a higher ASP from the start. It was the Post-PC era: everyone was supposed to leave the PC (including the Mac) platform for tablets. It doesn't get any clearer than that. They even made an iPad ad, "What's a Computer?". Raising ASP is a typical move in a market you want to milk. Did I mention they completely neglected the Mac Pro for years?

>Jony Ive was its patron

Despite what the media wanted to claim at the time (with shills covering for it), he spent most of his time on the Apple Retail redesign and Apple Park. But the iPhone X and the MacBook / MacBook Pro were his vision of how the ultimate MacBook Pro and iPhone would be, as he said so himself. He was named CDO in 2015, along with some design team restructuring. The “Designed by Apple in California” photo book, chronicling 20 years of Apple design, came out in 2016. When he finally left in 2019, the media and shills were suggesting he hadn't actually been on product design for quite a few years. His earlier work at Apple from 2011 on was the iOS 7 redesign (after Scott Forstall was out), and we all know how that went, as they spent the next 3 years iterating out of it, to the point that their old UX design head had to retire. And if you look at the changes to the Apple Retail Store redesign, they were the same: form over function. Partly Jony's fault, partly Angela's.

>Nobody ever set out to make a “worse experience” or higher cost.

Apple filed many patents where they were looking at a keyboard on a flat piece of glass with Force Touch and 3D Touch. These patents were specific to computers. A higher BOM cost is often used as a moat for luxury items.

>to design the best extremely thin keyboard they could...

If it wasn't for the butterfly keyboard, the internet would not have a group of people and product reviewers now talking about key travel distance. The thin keyboard has a typing experience similar to typing on glass...

>There are many suboptimal features...

No one faults them for trying. But the first reports of the keyboard problem came out in 2016 from MacBook users, less than one year after its launch and even before the MacBook Pro with Touch Bar. Apple constantly deleted reports of the problem on its support forum. The whole thing only got attention when an online outlet decided to blog about it themselves and the post went viral. That was 2018. They stopped reporting Mac user satisfaction in 2018, both in keynotes and in investor meetings. It took nearly 3 years of ranting before Tim Cook even made a Keyboard Service Program.

Basically, without Steve Jobs, no one had the guts to say: fuck this, this isn't working, close it down, work on an alternative or go back to where it was and we'll make a Service Program. Instead they dragged it on for years, without product sensibility or direction.


>Replaceable RAM and SSD being lost is still painful. Personally I don't believe this was about forcing users to pay for upgrades primarily. It was about shaving off a small amount of volume.

Louis Rossmann gets a lot of things wrong because he does not have a computer engineering background. For example, he does not understand why Apple used SPI on the MacBook Air instead of USB despite it having USB capability. I had to correct him and explain that when your design goal is extreme power saving, you have to cut everything, including running your data over SPI instead of a more power-hungry USB bus.

Furthermore, one reason they ship soldered-on RAM is technical. It has been explained here from time to time that they are achieving much higher memory bandwidth with the memory modules they are using, and that necessitates their being soldered on. If the design goal is to build the most responsive laptop while maintaining excellent power savings, then this is the right approach to take.


I can understand the soldered RAM on M1 — yes, speed of light and other laws of physics get in the way. But why solder the SSD? What's the technical benefit of that over putting an M.2 slot in there or something? How do you recover your data if you spill coffee on your laptop? What used to involve simply yoinking the SSD out of the slot now requires a fully working motherboard.


As things currently stand, the flash controller is internal to the M1 SoC. This results in significant cost and power savings, as well as some rather impressive performance.

Using an external SSD format like M.2 means the added monetary and power costs of 1) an external flash controller, 2) the mechanical components to interface with it, and 3) multiple PCIe lanes which can never be fully powered off.


> the flash controller is internal to the M1 SoC

Okay, it was part of the T2 chip before. And they were able to put replaceable SSDs into the Mac Pro. Those only contained the NAND chips and were proprietary, but still.

The problem I have with soldered SSDs is that flash memory has a limited lifespan in terms of writes. Once you're out of writes, then what? This is aggravated by the fact that M1 can't boot even from external media if the internal SSD isn't working.


It's worth noting that Apple's SSDs tend to be on the high side of the industry for write endurance. I think their calculus is that no reasonable user would exhaust the lifespan before the rest of the computer is well and truly obsolete.

(This was briefly not the case with that Big Sur SSD thrashing bug, but that's been fixed now)


It's actually split. The NVMe controller is in the M1 SoC, but the flash modules contain their own dumber controller and connect via PCIe in the back end (using a custom protocol, not NVMe).

Apple could and have put their SSDs in modules - that's how the T2 Mac Pro does it. They're still proprietary though, of course, due to the split design.


I don't. If it's on-chip, yes. But in the same place, with the same-speed interface, it should not matter once the wires are connected.


You can't take on-package RAM and stick it on a separate module through a connector (what connector? You need a 512-bit interface for the M1 Max...) without causing a problem or massively increasing your power consumption, for the same reason you can't string together 30 meters of USB3 extension cables and expect it to work. This is why USB 3.2 chipsets are little bare chips and run at 10Gbps, while 10Gb Ethernet chipsets need a big heatsink. More distance means a higher power consumption for the interface.

Interfaces are designed for specific characteristics like maximum capacitance and signal loss, and going beyond the design parameters means you need to change something. In this case, it would involve significantly increasing the power consumption of the interface.
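For intuition, the textbook dynamic-power relation for a switched line (stated loosely) is P ≈ C · V² · f: a connector and longer traces raise the capacitance C, and holding signal integrity at the same data rate usually also means a higher swing V, so the cost compounds quadratically rather than adding a fixed overhead.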


Even if the SSDs were modular, you still wouldn't be able to do offline data recovery on them, because they're encrypted with system-tied keys. So this is tangential to them being replaceable or not.


>I can understand the soldered RAM on M1 — yes, speed of light and other laws of physics get in the way

So you do know the speed of a signal in copper? ~20cm per 1ns. 5-7cm of extra track length to a slot would not hurt anything.


It absolutely does. It adds capacitance, which means you need stronger drivers and higher voltage levels, which means you end up with a quadratic power increase to maintain the same performance.

You cannot push a 512-bit wide memory bus like on the M1 Max through a connector and longer PCB traces without massively increasing the power consumption of the interface. Remember, those laptops have a memory bus equivalent to 8-channel traditional DDRx RAM. You'd need 8 SO-DIMM slots just to get the bandwidth, at a huge increase in power. LPDDR RAM isn't even available in modules for this reason; it's designed for low power, and using connectors goes against that design goal.
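As a rough sanity check on those numbers (assuming the commonly cited M1 Max configuration of a 512-bit LPDDR5 bus at 6400 MT/s):

    512 bits ÷ 8 × 6400 MT/s ≈ 410 GB/s
    one 64-bit DDR4-3200 SO-DIMM channel = 8 bytes × 3200 MT/s ≈ 25.6 GB/s

So matching that bandwidth with socketed memory would take on the order of 8-16 conventional channels, depending on DIMM speed, before even counting the power cost of driving them.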


> Touch Bar? This was nothing more than adding expense to raise the ASP

I thought it was a great idea, and I still do, but am so glad they removed it. It sounds great on paper, but practically, I used it for nothing other than adjusting brightness and volume.

> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.

Agreed. It really wasn't great. From a design perspective, it's quite clever, but from a usability perspective it was horrible.

> USB-C only was a philosophical move rather than a practical one that forced people everywhere to carry dongles.

Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.

I'm fine with it in moderation, frankly. USB-C is so clearly the future that where I take offense is that the rest of Apple's lineup doesn't work with it (iPhones, AirPods, some iPads, etc.).

> Replaceable RAM and SSD being lost is still painful.

Perhaps, but now that RAM is not only part of the SoC but a significant reason that the SoC is so good (high bandwidth shared memory between CPU and GPU), it's a change I'm more than fine with.

> Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.

Agreed. This finally, truly, feels like a Pro machine: "design first" is an approach for consumer products and, to Apple's credit, works very nicely on the iPad and iPhone and consumer MacBooks (generally). "Design first" for pro machines is great for the 3 minutes after opening the box, but when trying to do real work, you'd sacrifice all the bezels in the world to shave 30% off compile times.


> Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.

There's an important distinction here you're glossing over: unreliable, wireless, software-controlled Rube Goldberg-esque connections (bluetooth, wifi) can't possibly supersede reliable wired ones. Wired connections "just work" 99.999999% of the time, and when they don't, you can actually see and inspect the thing that connects your devices to troubleshoot it. Wireless works only when it feels like it.

VGA, on the other hand, was fully superseded by various digital video interfaces, and floppies were fully superseded by optical media and then various forms of cheap flash memory.

And I mean it. People do still miss headphone jacks, and people do still buy ethernet dongles for their laptops. People don't really miss floppies and CDs.


> Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.

The MBP designers still bravely include the 3.5mm headphone jack [0], though it is certainly true that the iPhone designers courageously jettisoned the jack.

[0] https://www.apple.com/macbook-pro-14-and-16/specs/


> The MBP designers still bravely include the 3.5mm headphone jack

And in Jony Ive designs they've had it on the wrong side for years. The 2021 model finally moves it back to the left side.


> iPhone designers courageously jettisoned the jack.

There are a LOT of us that still believe this was a horrible choice.


That was probably sarcasm, Apple was widely mocked for referring to the removal as "courageous" at the time.


Indeed, I was lampooning their terrible decision to remove the headphone jack and their gall to refer to it as "courage." I'm still salty about the whole ordeal.


That they decided to remove it in the face of many people saying it was a terrible decision was exactly why they referred to it as "courage". A lot of phones have removed it since for most of the same reasons - the modern 3.5mm is a pretend spec.


"Pretend spec" misses the forest for the trees: amongst the vast, complex ecosystem of audio-visual ports, 1/8" audio stands out: we actually managed to converge on a connector format that is universal, omni-present, excellent quality, DRM-free, and which boasts an intuitive "no-code" UI affordance that Just Works.

I love my AirPods. But the instant I have to start digging through screens to do a pairing dance with some third-party speakers, car stereo, etc, I pine for the simpler times when you could just plug in the AUX, and I mourn for what has been lost.


Everyone says "bluetooth is great, just use that". Only, there's TONS of bad bluetooth implementations out there. There's very few bad headphone jack implementations.

You can literally hear how much worse the bluetooth audio sounds in my wife's car compared to mine. She uses the aux port because it's so bad.


3.5mm is very much _not_ universal.

In particular there is a wide variety of required gain/impedance, as well as multiple different proprietary options as people attempted to adapt it to add microphones, controls/signals, and/or video. This is why Android, Apple, Microsoft and PlayStation all have incompatible accessories.

The number of contacts can vary between 2 and 5. The length of the primary contact is not consistent, thus the mechanism to retain the plug often does not engage. Likewise, there is no specification on overall plug size to guarantee a jack will fit into the device case.

That's excluding other extensions such as TOSLINK. It is also worth noting that making the wrong connection, e.g. attaching an AV output to an audio input, can physically damage equipment.

Generally what people see is that the headset they got for whatever device works, and they assume there is broad compatibility - but to work best, the jack on the headset is designed for and tested against a particular subset of supported phones/controllers/music players. And broader support is typically not possible without separate connectors (e.g. a headset cord sold for Apple, for Xbox, for PlayStation, for Android).


That may well be, and everyone, I'm sure, saw it differently. I happened to perceive their use of "courage" as condescending, hubristic elitism.

Anyway, I'm happy they are including a genuine, pretend-spec'ed 3.5mm audio output jack in their newest laptops.


I've never had any consequence from this except that now my headphones also don't have the option of being wired headphones, which means I have to ask the flight attendant for headphones to watch a movie on their screens. Everyone else is in the same situation though, so it's commonplace. Don't know if you've flown anywhere in the last year, but a lot of people have.


I think most over-ear Bluetooth headphones have the ability to also act as wired headphones, via some flavor of USB -> Aux cord/adapter.


I don't think Apple decided to "undo Ive" immediately, though. The 2019 models attempted an incremental fix approach - fix the keyboard layout, add a physical escape key - while trying to preserve the 2016 features (usb-c only, thinness, touchbar). It seems clear that it wasn't until 2021 that Apple decided to throw all the crap out entirely.


Yeah I agree this wasn't a binary switch.

Pure speculation: the forcing function for the big changes in the latest models was the M1. That forced a redesign of probably the entire unit anyway (eg different thermals, chipsets, power requirements, etc). Prior to that the path of least resistance was incremental changes and fixes.


> forced a redesign of probably the entire unit

Both laptops (Air and 13" Pro) that the M1 launched in kept their previous designs. So the switch from Intel to Apple Silicon itself wasn't the cause. The switch from M1 to M1 Pro and Max, maybe. But even the previous Intel machines had some serious TDP (and thermal issues), and even that wasn't enough to justify a redesign.


Where do the 2021 MacBook Pro Intel models fit in?


What 2021 MacBook Pro Intel models?

As far as I'm aware, the last Intel MBPs were released in 2020, and simply continued the 2019 design language.


The 13" Pro was updated to M1 together with the Air.


Sorry, still confused. That happened in Nov 2020 (so it's not a 2021 model), and it's an M1 (so not an Intel model)...so what 2021 Intel MBP is being referred to?


It's the 2021 model. They were working on this in 2020 maybe even 2019, but it takes time to deploy. Not that they didn't think about it until 2021. </pedantic_mode>


i have the 2019 mbp, bought reluctantly earlier this year after my 2015 went wonky. based on what i’d heard, i wasn’t expecting much, but it was clearly an improvement, even if it was incremental (touch id, bigger/brighter screen, better sound). the biggest obvious lack in the upgrade was the performance-battery life tradeoff, which is entirely on intel stagnating for over a decade. apple addressed these most glaring issues via the combo of m1 and more battery in the 2021. usb-c or touch bar were minor issues in comparison (that notch tho…).


> Also, the ports weren't all the same.

This was the most shocking to me. I learned that the left and right side ports run through different buses, but each side does not have enough capacity to supply both ports at full speed. This meant I had to buy a long USB cable to run to the other side of my MacBook in order to supply 3 monitors. I have a port just sitting unused.

Also, there seems to be a problem with left-side USB ports when charging. They cause the system to overheat (or at least think it's overheating).

https://www.forbes.com/sites/barrycollins/2020/04/24/why-you...


Ive was probably cursing at the announcement: "WTF are you doing! You're giving people what they want? That's not the Apple I left, mate. By this time, in 2021, there should be NO ports on the Macbooks, and it should be so thin you could shave with it. What the hell is all this?"


If Ive had stayed for one more year they probably would have been selling 1mm thick rectangles of anodized aluminium. There's probably an Onion report for that.

It's ironic the extremes Apple design has ended up at, because there was a point in time (when both Ive and Jobs were working together) when Dieter Rams claimed Apple was the only company that really followed his ethos, minimalism with a critical qualification of honesty, where form follows function... but for the last decade it's been more like the aesthetic of minimalism at all costs... it's funny, because that sounds quite a lot like skeuomorphism; it's pretentious.



"What the hell is all this?"

It's okay, Ive. It's called "something usable" or even, if I might be so bold, "something the users want".


Sometimes I wonder if folks wear rose-colored glasses when thinking about MagSafe. I don't miss MagSafe, and I enjoy the interchangeability of chargers.

I could not keep my 13" MacBook plugged in while using it on my lap. The MagSafe cord repeatedly fell off when my leg, or a pillow bumped into it. My MacBook MagSafe port had black marks on it from arcing. At least three times I couldn't get it to charge because a metal fragment (once a folded paper staple) became magnetically stuck to the receiving side. On one of those occasions I couldn't fix it till I got home and grabbed some tweezers.

It was a smart solution, but IMO to a tripping problem that wasn't widespread.


No, it's universally loved because it has universally saved macbooks from nasty falls. I upgraded to a sans-MagSafe 2020 16" MBP when my 2015 MBP had a $600 screen failure. I broke the 2020's screen in the first week. Sold it, repaired my 2015 and have been happy ever since.

Through all its faults and problems, I can't shake how good my 2015 MacBook Pro has been: great keyboard, MagSafe, good enough everything, great keyboard. Did I mention how great the keyboard is? It's on my lap right now, purring and tethered to power as usual.


Yes, I have none of the experiences that thedougd has had, but have had multiple instances of the wire being yanked on.


And the cables were fraying constantly because the connector didn't have strain relief and used some weird rubber compound that crumbled. I had the misfortune of having the L-shaped one, which was really bad (and they knew it, because the next iteration returned to the T shape). It didn't disconnect to prevent my MacBook from falling off my desk, and it left my relatively new MacBook all dented and beat up. When that fragile L-connector inevitably fractured and failed, and since the cable wasn't detachable, I had to spend around a hundred dollars on a whole new power brick. And that one was ruined within a year.


I really missed the orange/green charging/charged LED, I was a bit surprised they didn't throw that on a USB-C cable or the exterior of the machine.


That was a really nice touch and not too bright. I stumbled on a Dell USB-C charger with a white LED. It doesn't change colors and it's bright enough to light a room.


> I don't miss MagSafe and I enjoy the interchangeability of chargers.

The new magsafe uses a normal usb-c power brick with a special charging cable. The cable has a usb-c plug at one end and a magsafe plug at the other end. So it should be intercompatible between chargers now, and there shouldn't be any more need to buy a new charger when the cable frays. (And apparently you can also charge with a normal usb-c cable in any of the regular usb-c ports).

I'm quietly hoping they might be able to run data through that magsafe port too. Being able to have a small usb dock which connects over magsafe would be sweet.


I am appreciative they did that. I hope the USB-C ports on the laptop can handle the highest charging available today (~100 watts, not the new spec 200+ watts).


> The MagSafe cord repeatedly fell off when my leg, or a pillow bumped into it.

This sounds better than having your laptop's power connector internally damaged, rendering your laptop unchargeable, just because your charging outlet was on the wrong side or your dog wanted to step on the cable.


The shape of the MagSafe 1 cord end was better for lap/lounging use because it didn't stick straight out as far as MagSafe 2 did. It came out into a fairly low-profile cylindrical shape that made the cord turn an immediate right angle, so things didn't lever it over and make it fall off.


That’s actually the second iteration of the Magsafe 1 charger, FYI — the original was just a chunkier version of Magsafe 2/3. I liked the cylindrical version as well and I’ve always wondered why they did away with it.


I agree and I suspect this was less of a problem for people with a 15/16" MBP where the edge of the laptop was likely past their leg.


I know what you mean, although I thought the disconnection ease was calibrated correctly. The good thing is the modern battery life means an accidental disconnection is less of a problem.


> it was Johnny

Again with this rubbish.

I have worked at Apple and Ive does not unilaterally make the product decisions in the company. It is a combination of Product Marketing, Hardware Engineering, Design, Procurement etc and they are all discussed and signed off by the Senior Leadership Team.

When you're building products at the scale Apple does, decisions are years in the making. And so they need to make them based on what they think the future will be. Mostly they are right and sometimes they are wrong, e.g. USB-C being the standard connector for everything.


> they are all discussed and signed off by the Senior Leadership Team.

I don't know how Apple works, but sometimes, if there is a guy at the top who has to be kept happy, you sign off only on those things which you know will make the guy at the top happy.


I'm quite happy about the move to USB-C and changed all my stuff to it as soon as possible. 5 years ago I had several micro-USB and mini-USB chargers, some of them broken. On a regular basis I had to buy new chargers and cables. The MagSafe power supply cables broke easily (but yes, the port was nice). Now there's just one cable for everything; I still have 2 fast phone chargers, but both actually work. Also I can just charge my phone without searching for the charger, and the laptop can be connected to a screen/keyboard with just 1 USB-C cable.

After all, Apple was also the first to sell desktops without a floppy or optical drive.


> Worst of all, it was the loss of the much-beloved MagSafe.

I wish all of my cables were magnetic. The amount of things I have broken in my life by tripping is downright embarrassing. I do like being able to charge my MacBook from my external monitor, and keeping my Apple power supply in my bag in case I need to go somewhere. It would just be nice if I didn't have to label my cables.


I use magnetic tear away cables for a lot of my gear. They have magnetic tips for lightning, USB-micro and USB-C.

For stuff like charging headphones, LED lights, and other random gadgets with mixed plug types, I use charge-only cables, and it's been super convenient.

There are also magnetic cables that support limited fast charging and data, but only at USB 2.0 speeds, so that could still be a deal breaker for some people.


I'm often tempted by magnetic adapters but when I look on Amazon at the options, it seems like I always see reviews from people who said it nearly caught their stuff on fire.

You have any recommendations for high quality magnetic gear?


> it seems like I always see reviews from people who said it nearly caught their stuff on fire.

This is part of a larger problem with a lack of regulations on high-current accessories. In the US, the FTC should probably be doing stringent inspections of imported cables, chargers, etc similar to how the FCC currently inspects communication devices so the substandard/dangerous ones get turned away at the border.


My experience with magnetic usb-c connectors is that the magnet can’t be very strong, because then it just pulls the adapter out of the port when you try to disconnect it.


I've used A.S., TopK and Melonboy without any issues.


Is the cable going from your laptop to your monitor in a position that you can trip on it?

Some kinds of cable really benefit from easily detaching, and some don't.


> Is the cable going from your laptop to your monitor in a position that you can trip on it?

Not at my desk, but sometimes when I temporarily plug into another monitor or a television it can be a problem.

For my desk, I wish I just had a dock like my thinkpad from 10 years ago, or at least if the connectors were on the back so I didn't have wires sticking out on both sides.

> Some kinds of cable really benefit from easily detaching, and some don't.

It's not like other connectors are difficult to disconnect. My previous MacBook had an HDMI port that would disconnect if I breathed on the cable too hard. USB-C does seem a bit more snug so far, but who knows how well it will last. Magnets done well just have a better chance of surviving if you trip on something, or if you drop your laptop.


USB-C is supposed to be rated for many times more insertion cycles than USB-A, which is a big bonus in its favour; that said, they do tend to get perceivably looser over time.


I have one laptop whose usb-c port has loosened up to the point where I have to wedge something under the cable on the table to create upward pressure in the port or it won’t charge reliably.

I’ve been way more impressed with the durability of lightning ports. They get dirty and need to be cleaned out but their mechanical strength is amazing (apple’s cables on the other hand…). I like that Apple is confident enough in the strength of the lightning port that in the Apple stores, the standard display is to have the phone being physically supported by the port alone, even in an environment where hundreds of careless people are going to be messing with it.


Another advantage is that USB-C has all the springs on the cable, while USB-A has springs on the port. When the springs get loose, with USB-C you can just replace the cable, while with USB-A, you'd have to replace the port.


So far I've found the overall durability to be lower.

USB-A has nice big leads that don't readily corrode away. With USB-C, on the other hand, my phone barely connects to power anymore.


If it's pin damage I'd be kind of surprised. USB C has smaller pins but for power there are four each of positive and ground, instead of just one.


It would be nice if there were some sort of standard like USB for magnetic charge cables! I do like the standard USB-C charger that I can use for my laptop, phone, and Switch, but there are certainly clear advantages to a magnetic disconnect.


Touch Bar I think was a legit attempt at modernization.

Every time I 'see' a touchbar, I want one.

They look cool and useful.

The only reason I don't have one, is because everyone seems to indicate they are useless.

It's also possibly a platform issue - maybe they just didn't get enough participation etc..

Also - 'thinner' is a rational and generally positive thing, it's just that we've reached a threshold where the diminishing marginal returns are starting to weigh on other things.

Even ports - it's not an aesthetic issue only - they're trying to get everyone onto a standard. Frankly, I support the notion - I'd love it if everyone just used the same damn connector. The reason I don't like my 'USB C only Mac' is only because the ecosystem isn't there yet. If the ecosystem were there, I'd be fine with it.


I think that the touchbar could have been fine if it had been spaced one row above a full sized set of function keys.

As it is, when I use the keyboard on my work MBP, my fingers will brush the touchbar and do things. I have turned off the functionality in most apps so that I don't get that. (terminal especially)

For my actual usage, it's a step backwards from the pre-2016 function keys: always-available pause/play/stop, volume, and brightness. I can't be doing something in emacs and just hit pause or volume up; I have to switch over to the music controls and then I can pause.

The lack of a physical escape key also dooms it in my usage. (no, escape is not going on caps-lock, that's where control goes)

So the work MBP is 99% used with an external keyboard/monitor. The personal 2015 MBP is generally used on the lap.

USB-c is ok, and the real magic is when you've got a Thunderbolt/USB-c power delivery monitor with a built in usb3 hub. One wire to the laptop and you're done. (and even better when the monitor has a built-in kvm so that the other computer is just a switch away, without mucking with cables).


Why not the best of both worlds? Bind capslock to "control when chorded, escape when pressed alone." I use Karabiner-Elements for this, it's very easy to set up. https://karabiner-elements.pqrs.org/
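For reference, the rule looks roughly like this as a Karabiner-Elements complex modification (a sketch of the JSON; double-check the exact field names against the current docs before dropping it into your config):

    {
      "description": "Caps Lock: Control when held, Escape when tapped",
      "manipulators": [{
        "type": "basic",
        "from": { "key_code": "caps_lock", "modifiers": { "optional": ["any"] } },
        "to": [{ "key_code": "left_control" }],
        "to_if_alone": [{ "key_code": "escape" }]
      }]
    }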


> Touch Bar I think was a legit attempt at modernization.

I agree, I think they're really interesting. The way we interact with computers hasn't changed much, and having a set of buttons you can configure sounds amazing.

I wish they would have just put it above the physical function keys, and deployed it to all MacBooks and wireless keyboards. Being limited to just some MacBook Pros probably hurt the number of apps that adopted it.


Not everyone says the touchbar is useless. I absolutely love it! I will miss it. And the new MacBook may be a beast, but it looks terribly old - like a yesterday-machine. For the first time Apple released a product I will buy because it will be more capable - but not because I want it!


I'm kinda sad to see the Touch Bar go so fast because I have one and actually like it a lot. The variant where it has a physical escape key is perfect.


> Touch Bar? This was nothing more than adding expense to raise the ASP (Average Selling Price) of Macbooks, that had fallen precipitously low from a shareholder perspective because of the superb value-for-money proposition that was the 13" Macbook Air.

If customers don't like the Touch Bar, how does this make any sense? If pro users will pay (made-up number) $2000 for a MacBook Pro regardless of whether or not it has a Touch Bar because it comes with the CPU/GPU they want, adding a Touch Bar just decreases the margin.

If the MacBook Air is a better value-for-money proposition than the MacBook Pro to begin with, and customers do not actually like the Touch Bar, then why would they start switching to the MacBook Pro?


> If the MacBook Air is a better value-for-money proposition than the MacBook Pro to begin with, and customers do not actually like the Touch Bar, then why would they start switching to the MacBook Pro?

The psychological effects of the word "Pro".


Right, but the word "Pro" existed before the Touch Bar, and as far as we can tell customers did not actually like the Touch Bar.


Yeah it was a bit of a head-scratch moment when those 2016 machines came out. Until then it seemed like you could just trust a macbook pro to be a great machine that could do everything you need, and that each edition would get better than the last.

Around that period it felt like Apple made a pivot away from trying to increase market share through having the best product, and toward maximizing their revenue by leveraging their amazing brand image.

One of the early things was the change from the L-shaped MagSafe back to the T-shaped one. The L-shaped one was such an improvement - those things would last forever. But the T ones would fray in a matter of months. It almost felt like someone at Apple looked at the money they were losing from selling fewer chargers and decided they had to go back.

And I liked mag-safe but honestly I don't miss it. USB-C charging is fine and having one standard charger you can get anywhere work for everything more than makes up for any relative drawbacks.


What I always wondered is why they didn't start with a $999 base model (like the original iBook) that was cheap, but big and slow. If you wanted premium performance/expandability/ports/screen, you could pay $2K for the Pro model in the same form factor. If you wanted portability, you could pay $2K for the Air in a smaller form factor with the same performance as the iBook. The cheapest model being the most portable is bizarre.

Then again, the iPad Mini is more expensive than the larger iPad, so obviously there is something going on I don't understand. Perhaps the cost of engineering the motherboard and battery in an integrated package is so high that they can't afford to split the line any further.

The 2021 14" Pro is the first truly pro model in a while. I hope they keep it up. The keyboard is actually usable for extended periods, it has ports, the screen is great (to be fair, all Apple retina screens are great to varying degrees). Did I need it? No. But I wanted it. The last Mac laptop I bought for myself was the 2015 13" MacBook "Pro", so they're getting more money out of me this time around.


>the iPad Mini [$499+] is more expensive than the larger iPad

Sort of. The $329+ "iPad" has internals that are a few generations old, kind of like the iPhone SE. The iPad Air ($599+) and iPad Pro ($799+) are the "real" current larger iPads.


Each year Apple has been shipping thicker and heavier products. I get that some people are happy, but boy do I feel the heft when holding an iPhone 13 Pro, especially coming from a slim iPhone 6s Plus.

If Steve Jobs were alive he would never have let such products be released. You can feel he left his mark on the 12" MacBook and the iPhone 6, as those products were lightweight and thin - read: pushing boundaries.


All the new ports are pretty useless to me. I think that adding them back in shows a lack of vision and just caters to the lowest common denominator of complaints instead of making a better product from first principles.

HDMI: I guess it's good if you have an old monitor? My monitor from like 2015 has USB C and charges my computer while I use it.

MagSafe: I've never had a computer fall off a table. Seems like a weird overoptimization for an unlikely scenario. If you use it you've got to carry a different cable that you can't use for anything else, unlike the USB C power cable that i also use to connect peripherals if needed.

SD card: I've never used one of these. I guess it's good for professional photographers? Why don't expensive cameras just have 256gb of onboard storage and connect over thunderbolt?


MagSafe isn't just about the convenience of it. It may not help you or have helped you but it's helped many people (including me) and even if it didn't, it provided peace of mind.

But the key reason for it is that it's a dedicated power port. That means that port is designed for that. You don't waste money allowing your laptop to be charged from any of the four USB-C ports. Like, who needs that?

Worse, those ports weren't identical leading to the advice to always charge from one side to avoid overheating.

I do find HDMI to be a bit of a strange choice however. I don't mind USB-C to HDMI/DP cables for this. Like you say, more monitors support DP passthrough over USB-C if not full TB.

The HDMI port is also 2.0 not 2.1. The difference? 2.0 can run 4K @ 60Hz. 2.1 can run 4K @ 120Hz. 120Hz continues to have poor support under OSX but it's clearly the future.
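A rough back-of-envelope, assuming 8-bit RGB and ignoring blanking and encoding overhead, shows why:

    3840 x 2160 x 60 Hz x 24 bpp  ≈ 11.9 Gbit/s  (fits HDMI 2.0's 18 Gbit/s)
    3840 x 2160 x 120 Hz x 24 bpp ≈ 23.9 Gbit/s  (needs HDMI 2.1's 48 Gbit/s)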

As for the SD card, I don't really use this either but this is aimed at photo and video professionals. Why not just connect the camera? Easy. Because you need to keep using the camera so it's far quicker just to swap out the cards and start copying.

A modern digital photo or video setup will have a camera with 2 SD card slots. The camera will write the same to both cards. When swapped out, one will be kept separately as backup. The other will be copied onto another device, which then may also copy that offsite. This way you immediately get 3-4 backups in 2-3 locations, which is a lot of redundancy.
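For the copy step itself, something like rsync with checksum verification is a common choice (the paths here are just examples):

    # ingest a card, verifying contents by checksum rather than timestamp/size
    rsync -avh --checksum /Volumes/SD_CARD/ /Volumes/Archive/2021-10-27-shoot/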


> But the key reason for it is that it's a dedicated power port. That means that port is designed for that. You don't waste money allowing your laptop to be charged from any of the four USB-C ports. Like, who needs that?

USB ports have to be connected to the power system anyway because they have to power peripherals. Maybe it's got to be a slightly heavier connection to charge the computer, but I'm guessing the extra cost is minimal compared to a whole different fancy port with magnets in it and a whole different cable. Who needs it? People who don't want to carry around an extra cable everywhere.


I tripped on the MagSafe cord on my 2011 MacBook Air and, rather than disconnecting, it chucked the whole laptop off the table and onto the floor.

Granted, it just bounced and flipped and sustained no damage, but I wouldn't guarantee you'd have the same luck.


> Why don't expensive cameras just have 256gb of onboard storage and connect over thunderbolt?

People don't want to throw away their camera every time the storage runs out. They want mirroring between cards so the wedding shots don't disappear on a storage failure. So they can swap storage on the go.

Your strong assertions based on your limited understanding of other people's needs says more about you than Apple.


I’m 100% with you. Including all these legacy, soon-to-be-dead ports (seriously, an SD reader?) feels like a big regression.


My Nikon Z5 DOES have 256 GB of memory cards loaded in AND works over USB-C.

That said, I still like the additions. the sd card and hdmi are the only reason I even have a dongle most of the time.


Apple products are excellent, well built, and they last. If they didn't add new things regularly far fewer people would buy. I think a lot of the new features on Apple products have simply been attempts to make things look different enough to be worth buying.

This includes removing the things they've added...


I'm in the minority, but I really enjoy the butterfly keyboard on my Macbook. I'd love to have a fullsized butterfly style keyboard for my PC. Unfortunately the high-end market for keyboards is dominated by mechanical keyboards which aren't my kind of thing at all.


Just plug in the Apple external keyboard and remap some keys. Unpopular opinion but I find the Apple keyboards feel “better” than most mechanical switches (certainly mx brown and clear), they have good tactile bump better than most and a nice lightness to them so they’re not fatiguing.


Apple external keyboard feels like a scissor switch keyboard. It's not too bad, but it's not as flat as butterfly keyboards are.

Personally, the only mechanical switch I could stand was Kailh Choc Blue - low profile blue switch.


Innovative design doesn't work without an internal champion who can rally the company around unconventional ideas. Jobs played that role, but now Apple is led by the operations team. The word "design" does not appear anywhere on their executive leadership page.

Unconventional ideas are inherently risky. They're just not worth pursuing if buy-in can't be secured and leadership is more focused on compromising to increase profit margins, etc. For that reason, it's great (in the short term!) that Apple is rehashing known-good designs from a decade ago. However, I don't see that strategy working in the long term.


I suspect M1X’s excellent memory bandwidth is related to the lack of replaceable memory. x86 CPUs can, as far as I know, max out the bandwidth of their DIMMs, and they don’t get anywhere near the bandwidth of M1X. An integrated design can do better.

Complaining about the price of Apple’s RAM seems reasonable, but complaining that they didn’t choose DDR5 DIMMs when they appear to have chosen something better seems silly.


How do you know all this? Were you at Apple at the time?


> The USB-C cable situation was and continues to be a nightmare as different cables support different subsets of data, power and video and, worse yet, different versions of each of those.

I’m about to upgrade from a 2015 MBP and am wondering - is there a usb-c cable I can buy which works with everything guaranteed?


> Steve Jobs bringing him back to reality

I find it funny to appeal to Steve Jobs for a reality check :)

btw my pet theory about the touch bar is: let's use the apple tech skills synergy again, we made tactile swiped device since the ipod, we made that into phones, now let's blend it into our laptops.


Surely the change to USB-C was a compatibility thing, as the chipset is Intel based, it’s probably easier to supply.

Now it’s their own chip, it makes sense that they have now switched to their own proprietary solutions.


I want this to be true but to be honest, I care less about the Ive angle than them simply doing it! It seems like someone with power said, "hey, what are these letters P, R and O standing for, again?".


I don't agree with the USB c story. To me it's extremely handy to have just usb c. All (most) devices work, can use the same cable for charging my phone. super convenient for me.


You are probably right, but if we are talking about the Touch Bar, there are (at least a few) people (like me) who prefer it to function keys.


I fear 2016 MBP was Jony Ive unimpeded by practical concerns.


Courage!


Good riddance to him.


He still works with Apple as an independent stylist (Apple is LoveFrom's primary client), but likely not with the seniority and hands-off treatment from others that he had as chief design officer, which was clearly an issue with no Jobs to oversee him.


He lived long enough to be the villain, huh. You don't have to love his style, and it got more extreme as time went on, but he did plenty of good along the way, and just struggled when not checked.


That would be nice, but would mean reversing all the decisions from 2016. Also, I'm happy about having more ports, but not all my devices are USB-C, especially pendrives. A noticeable part of my routine is dealing with various dongles just because someone thought they will decide what I need and in which direction I should be pushed. In the meantime, all other vendors continue to support USB-A.


This meeting, however, probably would have been in like 2017. We're just seeing the results now.


Does it really take four years to update a laptop design? That's getting up there with automotive development time, and cars are significantly more complex.


You don't think Apple has had a meeting yet about 2025 laptops?


Maybe some brainstorming ideas. But I would be surprised if they've put effort into actually designing it yet.


So I think we agree! The initial meeting OP mentioned (where they were like "let's make a list!") probably took place around 2017. I'll certainly grant your point about when more detailed work began. I'll also grant that who knows, maybe they didn't sign off on the concept of listening to the feedback until a year later.


If you create a problem, people will beg you to sell them the solution.


Like Coca Cola's New Coke solution.


I've always appreciated the fact that New Coke was more popular with Coke customers in a blind taste test. Pepsi usually wins, too, AFAIK. Probably why New Coke has a Pepsi ring to it.


This is the first I'm ever hearing of someone saying that New Coke was preferred. They don't even make it any more. They switched from New Coke to Coke Classic. What taste tests showed New Coke as popular?


Coke did blind tests at the time that showed New Coke was more popular (it's been too long, but I feel like there were some independent tests during the scandal time as well). The outrage when they released New Coke was all about nostalgia, nothing to do with taste. People were simply offended that Coke would have the audacity to change the flavor they were accustomed to.

I still remember the scandal. It was hilarious. There was a recent redux on a smaller scale when they modified the recipe for Coke Zero.

IIRC, the big difference between diet coke and coke zero is that one is based loosely on the formula for coca cola classic, and the other new coke. That may be an urban legend, though.


It would be hilarious if Coca-Cola decided to produce the new recipe under the Classic label and see if anyone would notice the difference.

Or vice versa - the old recipe under a new brand.


It seems obvious to me that the decision to remove ports and buttons is not only for the design aesthetic: fewer moving parts means fewer problems, less weight, less insulation, less cost and more elegant wireless peripherals to sell.

If you look at it the same way as the iPhone it’s clearly the same strategy as why they removed the headphone jack and why they want to remove all the buttons, and are ultimately aiming for a clear piece of glass.

It seems this 2021 Mac is a backwards step away from that strategy, perhaps recognising people aren’t ready yet, or perhaps competition and ecological concerns are becoming too big to ignore, either way, this will be a short lived regression, I’m sure.


Or perhaps people want to connect three devices together without a bag full of dongles and let all batteries charge at the same time.


So looking forward to the next version of iPhone 4 — hopes higher than ever


Is the hard disk / SSD / NVMe still soldered to the board?


MBPs haven't had a "hard drive" to solder on in a few years: the "T2" security chip is also the SSD controller, managing the "freestanding" flash chips (which are soldered onto the board). The SSD isn't a separate thing at all.


That, however, does not prevent the flash chips from being removable. Sure you would not be able to access the data, but could replace/upgrade the flash without throwing away the computer when they fail. In fact that exists in a real T2 product: Mac Pro. You won't be able to replace it with off the shelf NVMe though.


There are still a few things left for 2022 models


I think it might be unlikely that we see replaceable flash chips come back as it sounds like the built in ones are insanely fast and likely cheaper as well.


I replaced one of the (honestly I don't know what to call them... gumstick sized ssd?) in a MacBook that had it as a separate component and it was a bad idea. It required an adapter board, some large proportion of available products on the market weren't compatible, and even with one that was there was still occasional strange behavior. Soldering in was the least of the problems.


"blade style drive" -> M.2

I also replaced the SSD in my 15" MacBook Pro 2014, and currently use a 4 TB M.2 drive.

When a replacement battery fried my logic board, I didn't lose my data. It didn't take multiple hours to re-clone from backup; I just used a screwdriver to move the SSD over to my spare laptop (13" Pro 2015).

A computer for me is primarily a data storage and retrieval device. Data loss is an existential risk to it. Data security doesn't bother me as much as it does other people; once the hardware is accessible, all bets are off anyway. I do get some strange behaviour with the new SSD (periodic weekly crash/reboot) but still accept that in order to have the larger capacity (4 TB is much more than the 512 GB when the laptop was new).


I did a similar replacement and it was a great idea! But you had to pick the right adapter, also the 2015 MBP had the least compatibility issues.


Wasn't 2015 still in the days of SATA? I put an SSD in mine and it was trivial. It's not at all what the 2016+ models are like.


- 2010 to ~2013: removable SATA SSD in a stick form-factor.

- ~2013 to 2015: removable PCIe SSD (connector not compatible with off the shelf ones though)

- 2016+: soldered


If you go to the other extreme of the spectrum, to the eMMC-based laptops, they are also soldered to the motherboard.

Which is really a shame (I have one Acer Aspire laptop and I mount /var/log as tmpfs without swap so it doesn't devour the eMMC like the RPis do with SD cards)
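For anyone wanting to do the same, a minimal /etc/fstab entry along these lines is enough (the size is an arbitrary example):

    # keep log writes in RAM to spare the soldered eMMC
    tmpfs  /var/log  tmpfs  defaults,noatime,nosuid,mode=0755,size=100m  0  0

The trade-off, of course, is that logs vanish on every reboot.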


To summarize: the MBP no longer has a “hard disk”.

It has flash memory that is coupled with the SoC.

This is to make the memory bottleneck tolerable (considering the cpu and ram move 400GB/s)
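That 400GB/s figure lines up with a 512-bit LPDDR5 bus at 6400 MT/s (rough numbers; the Pro has half the bus width of the Max):

    64 bytes/transfer x 6400 MT/s ≈ 409.6 GB/s  (M1 Max, marketed as 400 GB/s)
    32 bytes/transfer x 6400 MT/s ≈ 204.8 GB/s  (M1 Pro, marketed as 200 GB/s)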


It was all going well until somebody said 'let's have a notch!' when they'd already decided not to add face ID...


There isn't the depth to add Face ID. If the notch is a real concern, you can use a setting which adds black bars to the top so it looks like the older MacBook.


Customer orientation has sadly been neglected by big tech; paternalism about what users want dominated design decisions. Most companies still operate that way, so kudos to Apple for actually listening to customers for once.


Well said! Now they need to do the same with macOS.


AFAIK, replacing the battery on the previous generation meant replacing the top part of the body, to which the battery was glued. That included replacing the keyboard and the touchpad (not 100% sure if the old keyboard and touchpad could have been kept; maybe they were replaced just because of the damage done to them by the expanding batteries). At the same time, the previous generation had battery problems and keyboard problems (as pointed out in sub-comments), which meant many were replaced for free even out of warranty (as it happened to me due to faulty battery).

I suspect someone at Apple realized how much would have been saved if only the battery was not glued to the case.

Edit: mentioned the keyboard problem, which would result in replacing the battery too it seems.


It had the nice side effect that I got a brand new battery every time I had a single janky keyboard key. My battery is always nearly-new! :)


That's definitely a "glass is half full" view of the MacBook keyboard quality situation. :)


Same, but then the last time the battery went bad (swelling) with only about 30 cycles on it.


Yep I had annual keyboard replacements on all the butterfly macs I owned. I considered it a nice feature that the keyboard had this defect because it also meant a free annual battery replacement.


As I get older I start thinking more about our e-waste problem. This seems fairly egregious in that regard.


I wonder if all of the recent interest in right-to-repair laws by various states impacted the new design.


I wonder if the new keyboard design, with plastic rather than aluminium between the keys, will also make it easier to replace - it almost looks like a module that could be swapped. I'm sure we will find out from iFixit soon!

I suppose it also means that the top case is no longer tied to different keyboard layouts - fewer SKUs. That will have helped cut costs!

(written on a 2019 MBP with a duff Enter key)


Some YouTube video said you still have to replace the whole top


it's a black anodized aluminum inset from what I can find online


Unfortunately there is a lot of really wild speculation going on that attributes a lot of the thinking to “upper management” and ignores the realities of engineering and production. IIRC, this is the first time the battery attachment method has been changed since batteries were made fully enclosed units. It could really be as simple as engineering/production standardising battery attachment tech for unibody enclosures, with the switch to M1 being the first meaningful redesign where that could happen.

This is underlined by Apple’s “repairability scores” being mostly random - it seems that apple just doesn’t spend a lot of R&D time obsessing about repairability tweaks in bump-releases, especially if they’re going to get in the way of their vision for the device.


I suspect that they see the right to repair movement and are trying to preempt it, at least to an extent.


It's probably just that replacing too many parts when one breaks was getting too costly for them


Maybe I'm reading too much into it, but adding convenient pull tabs really does feel like a shift in direction to at least be less hostile to non-Apple or DIY replacements. They could have easily opted for a special battery removal tool.


What good are pull tabs when you can't buy an original replacement battery? (1)

(1) Without giving up your business books for 5 years, selling out your customers' privacy, and giving up the ability to do component-level repair.


More good than not having available replacement batteries and also having the original glued to the case?


You know, it could be both...


This is also how I got two new batteries replacements when I had to send my MBP in for keyboard replacement due to the stupid thing breaking (repeatedly).

Each time the system showed 0 cycles on the battery and I basically got a nice reset. Loved that part of it at least.


hahaha, my 2016 MBP chews through batteries - I'm on my fourth, I think - and every time I get a new battery it comes with a new keyboard, so I haven't had any of the keyboard problems some are plagued with.


That's what "authorized service providers" do. Unauthorized ones do replace just the battery by ungluing the old one.


Which involved hours of using chemicals to dissolve the glue with a plastic card. It was ranked one of the hardest repairs on ifixit.


I did it myself with some kit bought online for a 2013 (I think it might have actually been specifically an iFixIt kit). Ungluing the battery took all of 15 minutes and the plastic card jiggering was fairly trivial.

I’m quite positive the actual reason for its high difficulty score is that actually reaching the battery basically requires completely dismantling the whole laptop — the battery is under the motherboard, and every little module has to be removed and taken out fully before the motherboard can go. And putting them back in, the connectors are ridiculously small and fiddly.

The battery being glued was completely a non-concern IME.


I recently did my 2013 MBP, and while it was a massive pain that left my hands hurting afterwards, I was able to do it without solvents or having to dismantle the entire system like iFixit said I would need to.

Probably spent about 20 to 30 minutes using some old sacrificial credit cards to break the glue & pry the cells out.


Huh, no, the hardest one was opening the first Google Pixel without damaging the screen. I did break mine by prying too carelessly.

Still, replacing just the battery ends up being a lot cheaper.


At some point they ran out of that part, so they gave people refurbs. They didn't have the fr-CA keyboard in Germany, so I got upgraded from a 2012 mid-specced MacBook to a 2017 top-specced one for the price of a battery change.

Then I got two more battery swaps when my keyboard failed.


I still remember when you didn't even have to open up the "MBP" and could replace the battery directly from the bottom. Even had green led indicator to show how much charge it had directly on the battery case.


Maybe you also remember those batteries were terrible, especially in durability. It was quite common to see people replacing batteries on less-than-one-year-old laptops. Nowadays batteries easily endure 2 to 5 years without becoming useless. Sure, there are still memory-effect issues and total runtime drops over the years, but replacement is required far less often.


LiPo pouch batteries are not durable; they swell up and deform housings quite often... all it takes is heat.


Durability in this context clearly relates to charging cycles, not physical durability.

Old batteries wouldn't be usable at half the age at which most modern batteries start swelling.


Fair point. I bought at least 2 or 3 replacement batteries for the G3 iBook I had in college.

Then again, I was probably harder on the battery in those days than I am now, with my laptop often plugged in. I suppose my 2011 Air lasted until at least 2018 on the original battery, though.


Wow, I totally forgot about that period. I recall that I used to plug in my laptop when I arrived at my desk, then remove the battery to ensure that it was running exclusively on wall power in an effort to improve battery health. I never did have to replace that battery...


Was this MBP era or iBook/PowerBook era? I seem to recall this being the case on the old Titanium Powerbooks in the early 00s.

Remember the glowing light near the laptop latch that would slowly swell to tell you that your laptop was asleep?


There was a MacBook Pro era - even a couple Unibody revisions - that kept the easy user replaceable battery intact.

The Unibody had this lovely latch mechanism that also allowed you to easily replace your hard drive and/or RAM.

God, I miss those days. :(


Yup my beautiful 2008 "ALU" MBP had this. Was very sleek.


I have never seen "swell" used that way. I thought you were saying your battery was making the entire casing swell.


MBP era - I owned the first of the aluminium unibody MacBooks, released around 2008, which had a removable battery door with a tab.


Man, I wish they would go back to having battery indicator on the case again. Truly miss that as I don't want to wake it just to find out if I need to seek a plug.


I remember thinking it was absolutely genius when they first released it on the 2008 Alu Unibody Macbooks. I found that in practice I never ever used it.


Those were the days. I remember replacing the battery on my iBook G4 when it died. :)


Credit where credit is due. Let this be a trend.


Hear hear. It's frankly amazing how this model did a 180 in these aspects.


Yeah, it's like they took the list of everyone's complaints - touchbar, ports, MagSafe, battery - and fixed them all. Very un-Apple-like.

There are only really minor complaints left, like the non-replaceable RAM and disks, and the lack of a touchscreen.


The "unreplaceable RAM" will only get "worse" from your perspective. Meanwhile, many people, myself included, are very happy to have better efficiency/performance by co-locating the all of the silicon.

Once Apple invests in 3D chiplets, it is very likely that RAM, CPU, and GPU will be all be the same component. This is also likely necessary to eventually get memristors into commercial SOCs. I think maybe even the SSD might get pushed into the chiplet if they can manage the 3D real-estate. Ideally, even colocate the UWB and cellular modem[1] onto the single 3D chiplet or maybe have two SOCs one for compute+storage and one for wireless.

[1] https://www.bloomberg.com/news/articles/2020-12-10/apple-sta...


I'm confused, isn't the M1 RAM+CPU+GPU on the same component?


I'm confused too, lol. The problem is it's all very new and people haven't really settled on standardized naming.

At the end of the day they are all MCMs(https://en.wikipedia.org/wiki/Multi-chip_module).

Previously, RAM, CPU, and GPU were all connected by the motherboard with copper(or similar) based wires. Because the wires can't overlap on the motherboard, there could only be X wires connecting the CPU/GPU to the RAM. To replace the RAM, either disconnect it from the slot or de-solder all X wires if needed, then replace the RAM chips.

Based on what I've read the M1 is a single MCM using a silicon-interconnect fabric, where the RAM and CPU/GPU are to each other's left and right[1]. This reduces the length of the wires a lot, which is a lot more efficient and improves performance. However, it still means you can only have X "wires" going from the CPU to the memory because the "wires" can't overlap. In theory (with very expensive equipment), you could "disconnect" all X "wires" and "pick up" the RAM chips and swap them for new ones.

A 3D chiplet[2] has the components stacked on top of each other, i.e. the CPU/GPU is on top of the RAM. Technically you could keep the legacy number of "wires", i.e. X "wires" between the CPU and memory; however, because the "wires" now go up and down (rather than left and right) they no longer have to fit in a single plane, and you can increase the bandwidth by having 100x or 1000x more "wires" between the CPU/GPU and RAM. While theoretically you could still "disconnect" the up-and-down "wires", because there are so many more "wires" between the stacked chips everything gets more difficult, and replacing them gets even more expensive. At which point, from my perspective, that basically puts them in the category of "single integrated circuit" or "monolithic integrated circuit" (which is probably a better description than my previous use of "single component").

[1] Pic from Apple: https://www.apple.com/newsroom/images/product/mac/standard/A...

[2] https://wccftech.com/amd-discloses-multi-layer-chip-design-e...


>lack of touchscreen.

Please god no.


Touchscreens are my personal nemesis, too. I have literally no scenario where I’d like to lift my hand from the keyboard or touchpad to put my finger on my screen to click something. Maybe it’s a thing with other use cases though. I think the new Microsoft Studio Laptop does it in a reasonable way since you can fold it and use a pen and all. Still not for me but with foldable devices I get it at least.


I actually go against the grain here and say I like touchscreens.

Being able to draw on screenshots with my finger (I do that a lot for work) is really nice.

At least it has the killer feature of having the ability to be completely ignored, unlike other "trendy" features on new laptops :)

Also, I can never find a good use case for the folding tablet modes.

Maybe it's because I'm not an artist, and don't use my computer much for pure media consumption, but almost everything on the computer is designed around having a keyboard.

I find it silly to type on a software keyboard 5x slower than the keyboard already attached to the device.


Huge props to the hilarious polishing cloth teardown, complete with American Psycho quote, shade thrown at what else you could get for $19, and how it's actually two polishing cloths if you cut it in half.


Replaceable battery, after removing the trackpad to access the final set of cells. Not the easiest procedure, and it's still going to result in users doing what they do today: handing the machine off to a technician rather than changing the batteries on their device themselves. What a fall from grace from the first unibody MacBook, where you could remove the battery with no tools and five seconds of your time, since they engineered a door with a latch. I guess we celebrate what small affordances we can get these days.


God, grant me the serenity to accept the things I cannot change, courage to change the things I can, and wisdom to know the difference.

You can stop grinding that axe bud, it’s never gonna swing again.


If it wasn't for people with axes to grind this new macbook would have a touchbar, butterfly keyboard, and only thunderbolt ports.


Those are changes that the majority of users wanted, but very few want a return to clunky pop-out batteries, even fewer in an era of 20-hour battery life.


Thunderbolt “port” you mean.


That's not at all what I think of when I hear the words 'replaceable battery'. Too bad.


I don't get it - the battery isn't glued in, and you can remove it without damaging other parts. Isn't that pretty good?


When I hear "replaceable battery", it makes me think of the days when packing a spare battery was a viable alternative to bringing a laptop charger with you (I don't think any MBPs were like this, but Powerbooks were). It's funny how the Overton window has shifted.

This procedure looks doable and relatively low risk for technical people, but it's not something that my mom can do while sitting at a Starbucks


These days the battery lasts so long you don't have to bring a second battery with you and your laptop could realistically be charged by any phone charger overnight.

You also have the option of battery banks. As long as you can replace the battery when it wears out, thats all that matters to the average user.


Just curious, do existing M1's accept input as low as 5 watts, given it comes with a 30W adapter?


I just tested and mine is showing the charging symbol when plugged in to the 5w iphone brick. Obviously this will be super slow to charge though.


Your spare battery is now a power bank: https://www.powerbankexpert.com/best-power-bank-for-macbook-...

Sure, a bit less power efficient, but surely cheaper than an Apple product, and you can choose your features e.g. AC inverter!


Great, after dongle-ing USB-A, Displayport, and M.2, now my battery can be a dongle too!


I'm not buying a laptop unless it includes DVI-I, DVI-D, VGA, Mini DisplayPort, Mini HDMI, and Micro HDMI.


I'm not even joking, when you dock my Thinkpad it has every single one of those ports.


The first gen MBP (which was basically a PowerBook with Intel) had this. I remember pondering whether to buy one.


I remember borrowing my sibling's iBook G4 battery while sitting at the airport back in the day. You'd just push the grey button on the bottom face and it popped out.


I can flip my notebook over, pop the battery out and put a new one in within 5 seconds. That is replacable.

I also replaced a MacBook battery once, with a hot air resoldering station (to soften the adhesive) and a metal bucket filled with sand nearby in case it caught fire (the battery was ballooning and bending the aluminium frame out of shape). The whole thing took nearly an hour. Ultimately it was also replaceable, but this was needlessly painful.


It's still an entire procedure. You have to remove the trackpad to access all the cells. Most users aren't going to be confident doing that to their $2000 laptop; they will continue doing what they do with glued-in batteries today, which is hand it off to a technician and pay the flat-rate Apple repair fee.

Pretty good was the first unibody macbook (open latch with one finger, remove door with two fingers, remove battery with two fingers, done):

https://www.ifixit.com/Guide/MacBook+Unibody+Model+A1278+Bat...


Yeah there's still a long way to go to get back to where we used to be.


What is the use case? Modern batteries hold their charge for a long time. In a review I saw today, the reviewer - a professional photographer and filmmaker no less - said that he managed to shoot, edit and export a video on a single charge[0]. The need to swap out batteries with these devices has all but disappeared, save for permanent replacement.

[0]https://youtu.be/I10WMJV96ns?t=643


The use case is the ability to easily replace/upgrade the batteries, RAM and drives by the end consumer without requiring any specialized tools or procedures barring removing some screws.

This is what we used to have with laptops 10+ years ago.

Swapping batteries due to charge issues was never particularly important to the majority of consumers and is a red herring.


The benefits provided by SoCs outweigh the benefits of user upgradeable RAM for me.

> Swapping batteries due to charge issues was never particularly important to the majority of consumers and is a red herring.

Why?


>The benefits provided by SoCs outweigh the benefits of user upgradeable RAM for me.

And 90% of laptop users. Utility of computer upgrades for most people’s needs stagnated 5 to even 10 years ago. I think the last big material improvement before M1 for regular consumers was SSD replacing HDD.


You say that like those are somehow mutually exclusive.


They kind of are by definition - System on a Chip. I'll rephrase; the benefits of having on-package RAM (unified would be even better!) outweigh the benefits of user-upgradable RAM. User upgradable/appendable storage is another thing entirely.


The _only_ real benefits of on-package RAM are cost and forcing planned obsolescence. You get maybe a 0.2ns latency difference by not laying out a RAM socket next to the CPU.


Please show me a workload constrained by memory latency/bandwidth on MacOS. I'd really love to see how those benchmarks turn out so the average user can decide for themselves.


I'd love to see less passive/aggressive responses, but this is the internet...

Editing images and video comes to mind. Average users are doing that a lot more, since they have access to high resolution RAW output via mobile devices that struggle less than their desktop to cope with the throughput.


> I'd love to see less passive/aggressive responses, but this is the internet...

You're the guy who decided to drag this argument off onto the soapbox that you wanted to talk about that I never cared about. I wrote you off from the tone of your first response... because this is the internet...


Just noting an alternative use case: When I used to take my laptop (a Dell XPS M1530, later a Dell Studio XPS 13) on my walk between home and university, I'd take it without the battery in as that made it much lighter. There were power points where I was going anyway.


About as replaceable as a Tesla's battery. You simply can't do it yourself. Seems like a trend.


This is dismissive and incorrect. iFixIt, the authors of this piece, provide everything someone needs to do such a replacement: tools, parts, and in-depth user guides. I've replaced an older-generation MacBook's battery using their stuff, and it worked fine. What more do you want?


I had a 2007 MacBook Pro. Replacing the battery for that laptop meant toggling two switches and popping it out, then dropping a new one in.

It's not unreasonable for people to ask for that level of simplicity.


It's similar for most older laptops too. I have a few older HP laptops, and on every one the battery just slides out if you pull it hard enough - no screws at all.


The vast majority of people are not comfortable opening their computer and mucking around the guts. In the old days Apple made this simple for their users:

https://www.ifixit.com/Guide/MacBook+Unibody+Model+A1278+Bat...


> The vast majority of people are not comfortable opening their computer and mucking around the guts.

That's why Apple offers to do it for them.

Sure, completely user-replacable would be ideal but such a battery design surely comes with other trade-offs and compromises.


Coming from an x220/x230 ThinkPad, removing anything at all or needing to use a screwdriver is way above what most are capable of doing. Obviously, what I am willing to do to fix my spouse's iPhone and what the average person is willing to do to self-replace are very different.


Apple used to have some of the best-engineered batteries, now they don't. Simple as.


That comment about the new "Apple Polishing Cloth" is hilarious. $20 for a piece of cloth...

Glad the battery is easier to replace, though (speaking from experience!)


The cleaning cloth was sold out immediately.

$20 is a ton for a cleaning cloth, but a small price to pay for consumers of apple products.

Here is the alternative (personal experience):

1. You search on Amazon

2. Get presented with 1000 products

3. Read some reviews, wonder if this is the best product for your iPhone, and whether it will work on the laptop

4. Decide you don't really need a cleaning cloth

The apple solution, solves your problem for a price.

I haven't bought the Apple cloth, but I have been and am still considering it.


Also: It's apparently not just an overpriced accessory, but a necessary cleaning tool for the Pro XDR display with nano surface: https://support.apple.com/guide/mac-pro/clean-pro-display-xd...

Honestly, I clean my iPad Pro so often with lackluster results I might just buy Apple's cleaning cloth if it's actually better than other microfiber cloths.


Maybe you should consider bringing the ipad to apple for a professional cleaning? They are delivered without fingerprints when they're new.


It’s not really that much effort, is it? You just buy a cleaning cloth? No?

Unless of course all the while one was disturbed at the very thought of not having this revolutionary invention of Apple A Cloth That Just Cleans™ at home.


But it's an artisanal cloth made of unicorn fur!


The lint must be picked one-by-one off the unicorns with extra small tweezers that can only be held by child laborers!


Not just any child laborers, mind you: your $20 is getting you the finest Uighur and political-prisoner labor money can buy!


In a world where Apple is pushing on-device CSAM scanning and serial number locking cameras to motherboards, it's nice to see that some of their products still respect users' rights.

Now if only we could get an iPhone Pro with this kind of respect for right to repair.


You can't wipe the drives on these machines fully and make them functional again without an internet connection back to Apple.

Even the Monterey installer doesn't work offline at present, even the full 12GB one, or even a usb one made with createinstallmedia --downloadassets.

You must transmit your serial number to the mothership.
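For reference, that USB installer gets built with Apple's createinstallmedia tool, roughly like this (the installer path and volume name are examples):

    sudo /Applications/Install\ macOS\ Monterey.app/Contents/Resources/createinstallmedia \
        --volume /Volumes/MyUSB --downloadassets

Even with the assets pre-downloaded onto the stick, the machine still wants to reach Apple during the install.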


On the flip side, these machines are unbrickable, which pretty much no PC is. If you wipe a PC "fully" (BIOS and all) it's dead. That's why you need to call the mothership to fully DFU reinstall one of these Macs - it literally goes and downloads things like factory calibration info from their database.

You can in principle do offline OS installs on these Macs in reduced security mode. I did it once with Big Sur. If that doesn't work with Monterey, that's a bug that should be reported and fixed. Full security mode requires phoning home, as part of their security model (this is so they can stop allowing installs of old, vulnerable OSes, which could be used as an attack vector on someone else's machine).

You need to make sure you install from 1TR if you want to do a reduced security install; again, this is part of their security model, to ensure you've asserted physical presence.


> again, this is part of their security model, to ensure you've asserted physical presence.

And again, it's totally optional, because "reduced" and "permissive" security mode exist.

(Yes, you said that, but I feel it should be re-iterated because it's important! The user is in control here!)


To get into those modes you need to be in 1TR which means you need to assert physical presence (hold down the power button to boot); that's what I meant :)

The user is in control, they just need to prove they're physically at the machine.


Really sucks to see how awesome the new MBP is. It's too bad the CSAM scandal had to happen; never buying another Apple product again because of that shit.


And here I thought I was the only person who just can't get enthused about their new MacBooks, knowing that CSAM could just get added at any point in the future without my consent.


> knowing that CSAM could just get added at any point

CSAM is an abbreviation for "child sexual abuse material", so no, that won't (hopefully) get added without your consent.


The scanning wasn’t to be done by the OS, but by the Photos app uploader, and only if you had iCloud Photo Library turned on. Lots of apps already do telemetry and photo scanning that goes beyond this, as well as nearly every major cloud provider (though Apple seemingly doesn’t scan iCloud photo libraries without a warrant unlike Google and Facebook).

Like a lot of apps, you don’t need to use Photos.app. They didn’t sneak it in without explaining how it worked.


That's why there are self-hosted solutions like NextCloud or home NAS devices.


They also said they're open to opening it up to 3rd parties in the future.


They are late, though. Last year I switched to Debian because of the very few DIY repair options on MacBooks. And I discovered OS superiority as well. Everything in Debian (GNOME) is so fast - opening PDF files, Files (Finder), Terminal, Emacs. Debian running on an i5 is much faster than macOS running on an i7.


is it faster than osx running on m1 though?


I don't imagine it would make much of a difference, unless there's some kind of M1 upgrade kit for older Macbooks I never heard of.


Stockholm syndrome doesn’t look like Stockholm syndrome.

Everyone is cheering Apple on for making the laptop they should have made in 2016. Wow! No Touch Bar, it's so much better.

Semi-replaceable batteries! How great. The funny thing is, if they had dropped the ports on this generation instead of the last, fewer people would have cared.


Yeah, but they also chose to bring all those much-loved things back to coincide with Apple silicon Macs.

Sly but I wouldn't be surprised if they held it back for M1 to make the appeal of the devices bigger.


The SSD also has to be replaceable (without a full logic board swap) or the machine also has a constrained usable lifetime.


They should check if the camera stops working when you replace it.


Very nice to see!

I use my machine in closed clamshell around 90% of the time which means the battery is usually in pretty terrible shape after a couple years of use. Will be happy to see battery replacement times hopefully go down on these new machines as waiting 5-7 days isn't fun to deal with.


>I use my machine in closed clamshell around 90% of the time which means the battery is usually in pretty terrible shape after a couple years of use.

For those of us who usually use our MacBooks attached to the wall, deep discharging (AKA "calibration"; <https://www.newertech.com/batteries/power-calibration-guide/>) on a regular basis is a substitute, but is annoying to do.

I similarly went through batteries every couple of years. I now use FruitJuice (on the App Store) as the menu bar battery indicator. If it finds that the computer hasn't been used on battery long enough, about once a month the app guides me through running a maintenance cycle to run it down to 20%. I presume that following FruitJuice's advice is why my latest third-party battery is still at 103% health after more than a year.
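If you'd rather check health without an app, the cycle count and capacities are also exposed on the command line (key names differ a bit between Intel and Apple silicon machines):

    ioreg -r -c AppleSmartBattery | grep -E '"(CycleCount|DesignCapacity|MaxCapacity)"'
    system_profiler SPPowerDataType    # friendlier summary, includes 'Condition'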


> FruitJuice

Thanks for the tip, I'll have to give this a shot.


I really don't understand why MacBooks are set up to still draw off the battery when under AC power. You can only get a Mac to run off AC power alone if you start it up with the battery physically removed, IIRC. I had a MacBook where I was spinning fans for most of the day and I got it down to 85% battery capacity within a year, since it keeps straining the battery even when it's just sitting on my desk running off a 90W power adapter at 100% charge. It's like, what's the point of paying for these workhorse laptops if you are going to be blowing through batteries once you actually start to utilize the power you are paying egregiously for? Might as well get a powerful Mac mini and connect to it with ssh from a much cheaper laptop.


> I had a macbook where I was spinning fans for most of the day and I got it down to 85% battery capacity within a year since it keeps straining the battery even when its just sitting on my desk running off a 90W power adapter at 100% charge

They released a feature in Big Sur where it learns if you always keep your laptop plugged in, and during those times (which may be all the time, like in my case) it'll hover the battery between 70-80% to preserve battery health, and won't charge up to 100% until you tell it to manually (click the battery icon -> "Charge to full now")


Check out the app Al Dente. There’s a free version, but the pro also comes with some nice features.

https://apphousekitchen.com/


Apple seems to be taking a more repair-friendly approach for high-end products recently (except the iPhone and iPad).

AirPods Max are also great:

- User-replaceable earcups with no tools needed

- User-replaceable headband, just needs a paperclip

- Somewhat user-replaceable battery - you can have them do it for $79 out of warranty, but a repair shop can definitely do it as well because it's just screws


I’m sure this is more to save $$ for Apple, but either way it’s a win. It’s nice to feel like someone is actually listening with this new redesign.


2022 iphone will have a 3.5mm jack. Mark my words ;)


Do people still use wired headphones on the go? It’s been a while since I’ve seen any. You can get decent Bluetooth wireless headphones for ~$20 now. I remember spending $10 every few months when I was using wired headphones back in the day. Every few months because the wired inevitably broke or frayed (don’t forget you have to put them somewhere when you’re not using them). It was hell with jackets and layers in the winter.

I always got the orange version of this: https://refreshcartridges.co.uk/productimages/a_121995.jpg

In any case where I care enough to listen wired, I also care enough to get a separate DAC. The cheap Bluetooth is competitive with the cheap wired these days and last longer.

Example: https://www.amazon.com/TOZO-T6-Bluetooth-Headphones-Waterpro...

I'd rather use that space for extra battery capacity.


>Do people still use wired headphones on the go?

I do, at least. The main reason is that they just work. I never have to worry about pairing with different devices, walking too far away and disconnecting, keeping them charged... I have a pair of bluetooth headphones but I'm pretty sure they're not charged.


I do. I use some Sony MDR-V6 cans when travelling or doing work. They are 10 years old and will last another 40 at least, producing reference-tier sound. I think I paid $70 for them. I have no interest in AirPods or crap like that; I already have Apple's wired ones, which I use exclusively for video conferencing, and I know what sort of sound comes out of those.


I only use wired headphones. Bluetooth = PITA. Wired earbuds are like $1 for crappy ones, which are good enough for my purposes. I also have a bigger set of over-the-ear headphones from back before wireless was a thing, and those work fine in all my stuff too. Bluetooth is sort of like USB, a basically sane and simple idea that got committee-enhanced to the point of permanent brokenness.


I hate having to remember to recharge yet another device. My wired earbuds never run out of battery.


Yes - no need to charge them, less likely to get lost, no figuring out if my device speaks random codec v1 or 2

MMCX connectors mean you can replace the cable, not that I've ever broken a Shure one (they use Kevlar)


I use wired headphones. I don't like charging my headphones. I'm not big into mobile phones or other devices though, so IDK, I'm probably an atypical user. I'm actually annoyed that phones no longer have headphone ports on them.


> You can get decent Bluetooth wireless headphones for ~$20 now

IMO $20 is only barely enough to get something WIRED that won't turn your music into a mushy mess.


I only use wired headphones when the wireless alternative uses AAC.


No. Apart from loud voices on HN, basically everyone is living in a wireless world because it's better in almost every way. I was personally a long holdout on wired, but modern wireless (Bose QC30s and now AirPods) is so much more usable in almost every situation, except maybe never leaving your desk.


Maybe your argument holds in a country like the USA, but everyone is not actually living in a wireless world. It might be that people who can afford an Apple product buy wireless products. But at least in India, which has over a billion people, the majority of us still use wired earphones. The concept of "wireless" things is definitely on the rise here, but it still hasn't been completely adopted.


Yes they do.


Both my teenagers do. And yeah, I have a monthly subscription for headphone cables.


I'm not positive, but by removing the headphone jack they sell more AirPods and force Bluetooth on, which makes the AirTag and Find My network way more functional.


words marked


So they say it can actually be replaced.

Nice!

Hopefully they don't pull an iPhone and individually serialize batteries that authenticate with other chips on the phone - if they don't match you get an annoying warning and supposedly underclocked CPU causing performance issues even if it's a brand new battery.


With the iPhoneification of their product line I'm sure that's exactly what's going to happen.


It’s the right thing to do from a service perspective; the last design had it built into the top case. I’m sure they hit supply issues with all the keyboard failures on the old model. Are the speakers in the old model removable? If not, more top-case constraints, ha.


At least you got a new/clean keyboard every time your battery did need service.


Ease of replacing the battery has never been the reason stopping me from upgrading from a 2015 MacBook Pro. Anyone with the intent of replacing a battery today will be able to manage it after buying some readily available tools.

What I need is the ability to swap out my SSD in the laptop, either for data retrieval, backing up, or upgrades. Unfortunately good removeable storage seems to be a thing of the past. I say "good" because I don't want to boot the newest MBP off a slow SD card.

I could depend on Migration Assistant or Time Machine, but I don't trust anything other than a good raw "cat" or "dd" of a drive.
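With the old removable sticks that was as simple as pulling the drive and imaging it from another machine, e.g. (disk identifier and destination path are examples):

    diskutil list                          # find the right disk first
    sudo diskutil unmountDisk /dev/disk2   # unmount before raw access
    sudo dd if=/dev/rdisk2 of=/Volumes/Backup/mbp-ssd.img bs=1m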


> We removed the trackpad and, lo and behold, there are cut-outs to access the pull tabs that hold the middle battery cells in place.

That's not exactly what I was expecting based on the headline. Still seems pretty complicated.


For me the 2021 MacBook Pro is a single USB-A port away from perfect. I want to use my old but good printer/scanner/mouse/thumb drive/external hard drive/yubikey without dongles.


A lot of dongle talk seems weird to me. A to C adapters are cheap. Buy a bunch and leave them on the things you regularly use. No dongle to track.


It's still way more convenient to have the USB-A port built-in, even if you have myriad USB-C dongles.

I'm trialing switching away from an XPS 13 DE (Developer Edition) to some "cheap"-ish 600 dollar laptop from HP just because I literally cannot stand the fact that the XPS doesn't have USB-A ports. I notice the difference (my cheap laptop is so much nicer) because a lot of things are still USB-A these days (e.g. my Arduino uses a printer cable which is USB-A, my flash drives are USB-A, my HDMI capture dongle is USB-A, my cheap webcam is USB-A, et cetera). Even if I have myriad dongles, if I'm travelling away from home, I don't have to remember to bring those dongles with me anymore.


totally agree. A sucks, C is a way better port. leave the $2 adapters on your old USB-A cables if you have to. total non-issue.


Are those pads for extra DRAM on the bottom left of the PCB?

Will we see hobbyists with a heat gun adding RAM to these things? BGA soldering with tweezers and a heat gun doesn't look too tricky here because the pad spacing is easily within what you could manage with tweezers.

The actual pin layout isn't a standard I recognize - so I assume Apple has ordered custom-made RAM chips.


That's probably far too distant for LPDDR. I guess these pads are for NAND flash, and RAM is right next to processor under the heatsink.


The RAM is "unified" and on the same chip as the CPU and GPU.


I love that they literally did a teardown of the polishing cloth as well. Which received a 0 out of 10 on the repairability scale.


I'm pretty sure that's the same cloth that came with my XDR

I say it deserves at least a 5 out of 10 since you can wash it in the washing machine on cold and it air dries quickly


> we notice ... battery pull tabs

Is the picture supposed to show these new tabs? I don't see anything. Can I get a red arrow?


It's the thing being stretched between the fingers. Like a command strip.


It seems to me that making batteries difficult to service and replace is very consumer-unfriendly, not to mention shitty to the environment. So this is definitely a step in the right direction. Not exactly as easy as pulling it out from the bottom like back in the day of bricks, but hey.


This is wonderful news. I do wonder whether the 2021 MBP will work properly with a DIY battery replacement?!

newer iPhones notoriously don't like DIY battery / screen replacements, and they disable features such as battery health with a DIY replacement.


Well, that's a huge plus for me. It seems that this kind of laptop will have a much better lifespan. It seems that some engineer at Apple is now able to make decisions. I thought that all the decisions were driven by designers.


Important question: Are the batteries serialized or will any 3rd party battery just work?


I have been due to replace my battery on my 2019 MBP (I know, yeah…). The problem is not the price (which is steep) but the fact that I have to leave « my girlfriend » at the shop for 3-4 days. This new design is a game-changer.


Any Apple repair, they are going to make you wait. They don't do much of anything in the store anymore these days; they just scan with their software and send it away. No clue why they don't image a loaner laptop for you in the meantime. They make such a big hubbub about pros using their hardware, then they don't offer services that pros require for their hardware, like no downtime, since they can't just tell the client that their rig is out of commission for four days.


This would be great - they could image your machine & imprint onto the replacement with activation/etc even having you authenticate AppleID as needed. Is disk-imaging like CarbonCopyCloner possible on newer machines?


Apple definitely has the software to do this. When you buy a new laptop from them they can do this for you to migrate your old mac. No reason why they couldn't do it for a loaner.


I’d call that technician friendly, certainly not Do it (Yourself).

What happened to outer compartments you could pop open without having to tear apart your $1500 laptop with a pair of tweezers?


Thank you, frame.work - https://frame.work/ - this is what happens when there is real competition and innovation in the market.


I sincerely doubt Framework was even a factor in the decision of whether or not MacBook batteries should follow the iPhone's already existing design.


100%. It was probably due to some EU lawsuit or other, and rather than make 2 designs (US / EU), they just changed the design to make everyone happy.

It's weird that US companies basically need to be sued by the EU for stuff to be consumer friendly..


> I sincerely doubt framework was even a factor in the decision making ...

Oh, you may be surprised at how closely Apple watches and reacts to PR and its competitors - it's one of the things they do right and are really good at. Some examples:

1. After Apple's very successful launch of the iPhone, they got a huge shock in Europe when Jolla, a small, new startup of ex-Nokia employees launched a phone with a new mobile OS that outsold the iPhone. When Apple realised that Jolla's marketing emphasised "user privacy", Apple strategically temporarily shelved its plan to collect user data (for which it was getting bad PR) and even pretended to abandon their advertising platform. And that worked out very well for them because luckily for them Jolla was mismanaged, and failed.

2. Frame.work has received highly positive reviews from both the media and users / patrons, all of whom have acknowledged and appreciated the creativity and innovative use of existing technology to create a highly repairable laptop. While it may not have outsold any Mac device yet, the PR buzz it has generated has focused the public spotlight back on right-to-repair and created new awareness and appreciation for repairable electronics. Invariably, comparisons have been made with Apple's popular yet deliberately hard-to-repair devices, and you can bet that it has made Apple quite uncomfortable. (With regulators breathing down their neck about right-to-repair, the last thing Apple needs is an innovative competitor that touts repairability as a feature.)

3. When Apple released a Mac Mini with soldered RAM and SSD, the criticism and poor sales forced them to backtrack and release the next Mac Mini with replaceable RAM. (Again, a temporary strategic withdrawal).

4. The current and new Apple iPhones' size and design are inspired by Sony mobile phones; Sony is one of the few companies in the world that still has its own design division and produces amazing phones with great hardware.

5. The whole "thin device" craze at Apple was inspired by a Motorola phone. (And of course, it remains popular as it aids their "planned obsolescence" goal for their devices).

I am not completely disparaging Apple - reacting to PR and their competitors is something giants sometimes ignore at their peril. But Apple doesn't, and they cleverly calibrate their strategy to maintain their competitive lead.

(I'd even say the article linked to is just a fluff piece trying to convey the impression that the new Apple laptop is suddenly an easier-to-repair device because the battery is no longer glued like before but uses stretchable adhesive :). I recently repaired an iPhone SE and the battery adhesive broke even though I was pulling it carefully, and after that it was a real pain to remove without damaging anything - easy to repair, my ass.)


So in just a few months Apple pivoted all their years of planning, engineering, design, supply chain and production for the new Pros to "compete" with a small startup that probably sells less in total than what Apple might sell in 12 minutes?

Right...

Snark aside, the 2021 Pros seem well balanced between moving things forward hardware wise, and still offering enough I/O. They're likely not perfect at all, and for me the M1 Air is more than enough.

But all those things - planning, engineering, design, supply chain and last-mile distribution - take years to execute on, not weeks or months.


Yes, I do honestly believe that Apple does respond to negative PR and potential competitors (minor or otherwise). It's one of the things I admire about them.

But my earlier comment itself was partially snarky towards what is essentially a puff piece trying to convey that the new Apple laptop is now more "repairable" - "Oh my god! Apple has replaced the glue they use on their battery with stretchable adhesive that is now easier to remove!"


Not arguing the fact they do respond.

Just that the timelines on products are very long, especially with hardware. There are no fast or simple pivots, and not usually in months - more like years on many things.


I’ve never heard of this. Only one of my casual tech-blog-reading or coding friends has. I don’t think Apple would do something because of such a small competitor. The Wikipedia entry for it only has an introduction section.


It's gotten a bit more media attention recently as Linus (of Tech Tips/YouTube fame, not Linux/Git fame) has made a rather large investment in them and has been talking about them a lot on his platforms.

https://www.youtube.com/watch?v=LSxbc1IN9Gg


It said his investment is $200K-ish. His YouTube stuff seems insanely popular, so he invested a couple weeks of profit at most?


I have no idea of his financials but that does sound like more than a few weeks of profit. Remember he has a lot of staff to pay, the huge office/studio space and bills for that and running stuff like the LTT Store which will have expenses also. And then supporting his family on top of that.


I imagine the LTT Store is a net positive with fantastic margins.


I wouldn't call it as easily repairable as frame.work - frame.work doesn't use those white glue strips; instead they just have a few screws and you can remove the battery without a struggle.

I once changed the battery in my old iPhone SE and it's a real pain if such white strips break - and they are sooo thin that they're really easy to break. Once one breaks you have to use a heat gun and fishing line as an improvised saw to cut the battery from the case.

BTW, my old 15'' MacBook Pro 2012 unibody had only screws and an easily removable battery.


Same here - I had to use a hair dryer and dental floss to "saw" through the adhesive strip once it broke off ... it took me damn near 30-40 minutes! (And some HP laptops don't even need screws but provide sliding buttons to easily "unlock" and remove / change the battery.)


They’re a competitor by definition, but not by any practical consideration.

Despite seeming like a great company and product they probably sell fewer machines and have less mindshare than the $19 Apple cleaning cloth.

Since all laptops previously had replaceable batteries (including Apple's) and some still do, I'm also curious how Framework gets credit for this innovation.


By bucking the trend and good PR, really. Framework is interesting, but in reality reminds me of when Thinkpads were still tank-like workhorses that were quasi-self-serviceable over a decade ago. For the most part, you could swap components out, which is Framework's whole schtick.

The ship has sailed on replaceable RAM and storage IMO, because most of us want the performance increases that losing upgradability brings. It's a trade-off but one that's worth it in the long run.


> The ship has sailed on replaceable RAM and storage IMO, because most of us want the performance increases that losing upgradability brings. It's a trade-off but one that's worth it in the long run.

That's what Apple marketing and clueless fanboys want you to believe.

M1 is fast because it is fast. Not because it has RAM in package or soldered SSD.

Socketed RAM and SSDs would be just as fast.


Hi, Linus! :P


Is this Apple?

Well, maybe I'll consider one of these over a Framework. I'll lean strongly towards the Framework machine but at least Apple is reconsidering some of their design choices over the last decade.


I recently bought a Framework as a replacement for my 2012 MBP. I ordered it a few weeks before the new M1 announcements because I liked their philosophy. And the hardware is, in fact, great!

But the Linux desktop experience has been ... frustrating, to say the least. And I say this as someone who has used Linux in the terminal at my job every day for 5+ years now. It's ridiculous that the software experience still cannot match my 9-year-old laptop running macOS Sierra on a 2.5GHz Core i5 and 8GB of DDR3 RAM!

I am running PopOS 21.04. There are inconsistent keyboard shortcuts, lack of touchpad precision, glitchy touchpad gestures, inconsistent fingerprint auth and more. Just on Day 2, I somehow ended up with a machine that took 30+ seconds to go from login to desktop and had a 3-4 second lag when typing, with 100% CPU usage when running any GUI app. The only way to fix it was to completely reinstall the OS -- I have no idea what caused this. I have now found some usable options for implementing my preferred touchpad gestures ("fusuma") and system-wide keyboard shortcuts ("kinto.sh"), though there are still many quirks that are irritating.

I like Framework's approach, so I supported them by buying the device. I promised myself I will keep daily driving it for at least a month to give it a chance. If it keeps getting on my nerves, I will sell it and buy one of the new 14" MBPs. Life's too short to spend hours upon hours fiddling with configuration files and reinstalling operating systems to get basic functionality working.


Personally I find Gnome environments pretty ugly. I switched to KDE on my Framework and the difference is almost night and day - so much so that I'm leaving my desktop Gnome install behind for KDE as well.


What distro are you running?


Manjaro on the framework (first time) and Ubuntu and Manjaro on the desktop.


I tried switching to KDE yesterday (still in PopOS) and it's been a much better experience so far. I was able to customize a bunch of stuff including dock/global menu to be more macOS-like which is what I prefer, and the touchpad also seems to be much better integrated. I will look into Manjaro as well.

Thanks!


Hi, glad to hear you made the DE change. I'm also on PopOS 20.04 LTS and I would like to go to KDE. How did you install it from the terminal - the standard or the full version? Which version is best? Thanks in advance for your help.


Awesome to hear you made the leap! I think KDE works wonderfully on Pop_OS since Pop is mostly just a few Gnome tweaks here and there on top of Ubuntu.

FYI, I am only trying out Manjaro because I wanted an Arch experience, and for my gaming build I thought it might fare a bit better since the Steam Deck is built against it. I think Ubuntu is still a perfectly great solution.


What I enjoy most about KDE is not needing to install extensions!

Do it your way vs Gnome "here's what we think matters".


The article says "reasonably DIY-friendly". I think this is an important distinction. Just because the battery has pull tabs doesn't mean that it's easy to get to that point.


As a Mac user, I welcome the improvement in battery replacement design. The cost of replacing the battery (and the keyboard) has been one of the worst design decisions by Apple.


Too bad you can't change out the soldered-on RAM. So I'm still not buying, even if Adobe keeps up the "no Photoshop on Linux" BS forever.


Well, this is great, because, while I don't like to admit it, they launch a lot of trends that other manufacturers tend to follow. Hope they go the same way here.


Whoah! That's fantastic news for the customers and the environment! I was thinking of skipping this model, but now I'm sure I'll buy it.


It's still not repairable...


Well, I'm realistic; I don't believe Apple will stop being greedy anytime soon. But the battery issue is a huge step forward. I mean, of all components, that's the one I know will break first because of the laws of physics (and chemistry). So having it irreplaceable is particularly cruel. It's like saying "buy this expensive machine and watch it die." I really hate throwing hardware away; I try to make use of everything I own and repurpose it if necessary, but the battery literally destroys everything. I have this bulge that damages the touchpad and keyboard, and Apple says they will fix it for a ton of money. I would have done it myself a long time ago if I could have, thanks.


Now if only the rest of the MacBook were DIY-friendly, where I could upgrade my RAM to 64GB and NVMe storage to ~1TB without giving up my left kidney.


Isn’t RAM now part of the SoC?


It is, that's why I'm saying it would be nicer if I could upgrade RAM myself without giving up my $$$ kidney $$$


What is the component between the CPU and the fan on each side?

It appears to be non-rectangular silicon... Or perhaps some power MOSFETs under a cover?


Is it just me or does the 'since 2012' read as if it could also mean "and it has been that way since 2012"?


Damn, it really is the return of the Titanium PowerBook G4, what?


About time. My battery needs service now and I'm dreading going to the store.

Would much rather buy a new battery, put it in myself, and be done with it.


I wonder if this is the effect of Jony Ive being gone.


Does the Framework have an easily replaceable battery?


Depends on your definition of "easy"

https://guides.frame.work/Guide/Battery+Replacement+Guide/85

It's past my personal definition, but just barely, since it involves dealing with a ribbon cable and an apparently fragile connection for the battery itself. It's not too bad, but not something I want to try and walk someone through if they're not used to this sort of thing.


Yes, the Framework laptop's battery is more easily replaceable than the new MBP's, from what I can tell without yet having seen iFixit's full teardown of the latter—there's no need to remove the trackpad or deal with adhesives on the Framework, in contrast to the new MBP.

(I'm still thrilled to see this improvement to serviceability on the new MBP, even if previous generations set a low bar.)


The commentary about the cloth is fantastic


IMO Apple’s peak ridiculousness had to be the Magic Mouse, which requires you to turn it on its back, rendering it unusable, in order to charge it.


I agree it's silly, but in practice it means that once a month I need to plug it in when I go for lunch, or before I go to bed.

And you can still use the scrolling top and click while it's charging if you really need to continue browsing.


Is a charging dock for a mouse ridiculous? You can't use the mouse while it charges! How absurd! How silly!

Did you actually have one? Or are you remembering a picture you saw online once?



Thanks for the direct link; it'd be good if the post linked to it here instead of to a reheated version of the same content.


It's been updated to this link, now.


I feel that, on the ports and touchbar side, this was a step backwards. The SD slot is a good thing, but the replacement of one USB-C with a (vintage) HDMI port and the addition of a proprietary power connector is a leap backwards in time.

The Touchbar was my daughter's favorite - an emoji keyboard. It also did a lot more - bringing up manpages in the terminal, having app-specific buttons (no need to hunt down the Zoom window to mute). People wanted an ESC key and they got it eventually. I couldn't care less about function keys. I get it was very expensive for the little it offered, but it's still useful. More useful than F1 to F12 ever were.

Magsafe is nice from a safety standpoint (I have a kid and work from the couch sometimes), but with all-day battery life, what is the use of it? On my desk, the Mac is plugged into power, two external monitors, ethernet, keyboard and trackpad with a single cable, as God intended it to be. That's 3 free USB-C connectors for anything else (such as the external storage), and there is one power brick that lives in my backpack along with a pair of US/EU adapters for when I need to travel. With USB-C power, it was nice to have the Dell and the Mac sharing power bricks when needed (if I had said that 10 years ago, I would have laughed myself out of the room).

I like the replaceable battery, but I miss the touchbar and actively dislike the reintroduction of Magsafe.


Vintage in what sense? HDMI is literally one of, if not the, most common types of display port on TVs, monitors, game consoles, etc. I think most people would choose a straight HDMI-to-HDMI connection over trying to find the right USB-C/Lightning/whatever cable to fit their needs (which is very hard to actually do). The HDMI port has stayed the same for a long time, yet new HDMI specs come out every few years expanding its capabilities.


I was being a bit cruel. It's an HDMI 2.0 port, which is useful for presentations (a USB-C to HDMI dongle lives in my backpack for that reason). 2.0 will drive a 4K monitor at 60 fps. Never tried that, but, IIRC, the original USB-C port (and HDMI 2.1) could do it at 120 fps (which, for my terminals, would be... totally overkill, just like 60 fps already is).
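
For anyone curious where those limits come from, here's a rough back-of-envelope sketch (assuming 8-bit RGB and ignoring blanking intervals, audio and DSC compression, so real-world ceilings are somewhat tighter; the payload figures are approximate post-line-coding rates from the respective specs):

    # Rough check: can a link carry uncompressed 4K video at a given refresh rate?
    def video_gbps(width, height, fps, bits_per_pixel=24):
        """Uncompressed pixel data rate in Gbit/s (24 bpp = 8-bit RGB)."""
        return width * height * fps * bits_per_pixel / 1e9

    links = {  # approximate usable payload rates after line coding
        "HDMI 2.0 (~14.4 Gbps)": 14.4,
        "HDMI 2.1 (~42.7 Gbps)": 42.7,
        "DP 1.4 alt mode over USB-C (~25.9 Gbps)": 25.9,
    }

    for mode, fps in [("4K@60", 60), ("4K@120", 120)]:
        need = video_gbps(3840, 2160, fps)
        print(f"{mode}: needs ~{need:.1f} Gbps")
        for name, cap in links.items():
            print(f"  {name}: {'ok' if need <= cap else 'too slow'}")

Running it shows 4K@60 needing roughly 12 Gbps (fits within HDMI 2.0) and 4K@120 roughly 24 Gbps (needs HDMI 2.1 or DisplayPort over USB-C), which lines up with the claim above.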


While I agree HDMI 2.0 on a new-for-2021 machine is an odd choice, displays can still be connected to the Thunderbolt 4 ports much like on the previous gen. Apple's spec sheet says two external monitors at 6K/60Hz are supported this way via TB4.

All of this is to ask: has anyone confirmed 4K/120 on the Thunderbolt port? Given the 2x 6K monitor output, 4K @ 120Hz sounds like it should be possible, unless Apple has nerfed the output.


> Apple's spec sheet says two external monitors at 6k/60hz supported this way via TB4.

It’s 2 for the Pro but 4 for the Max (technically 3, plus I think 4K@60, because that’s the limit of the HDMI port).


What's wrong with a USB-C to HDMI cable if you need that connectivity?


You can pick up a 2015 MacBook and walk away carrying nothing else, and handle a wide variety of situations that may occur. Three things enable this: long battery life (don't need your brick); a touchpad that's not absolute hell to use for more than a minute or two at a time (don't need an external mouse); and port selection. Need something off Bill in marketing's USB stick (it'll be USB-A, almost certainly, even in 2021, let alone 2016)? Need to plug into a TV or projector, or even just a normal monitor that's not super-duper-new? HDMI is far and away your best bet, especially if you're not carrying your own cables. Photographer has some pictures for you that need to go on the web site (or you are the photographer)? SDCard reader, no problem.

That ease-of-use—just pick it up and walk away, you don't even need to think about it—is significantly weakened if you need a few special cables and dongles to be similarly-well-prepared.


I mean, let's be honest, how many of us are road warriors that need every single type of connectivity known to exist? Having a single port type on your device simplifies one half of the equation.

Your device's ports are a casualty of the lack of a standard data port. The HDMI port on displays makes you feel like you need HDMI on your laptop; I disagree, since my USB-C port can do that AND a whole bunch of other things.


> I mean let's be honest, how many of us are road warriors that need every single type of connectivity known to exist?

Road warrior? It also meant you could grab it and go to the conference room and not have to take anything else with you, or go back to your desk for something. Packing for a business trip? Maybe you need to throw the power brick in the bag. That's all. Every single type? It had, what, five, including the rather niche Thunderbolt ports? Those are what should have become USB-C ports, as that change would have been 100% an improvement. Keep the rest, including USB-A, which might finally not be the most useful USB port to have in, oh, 2030 or so, if trends continue. Should be right about the time USB-C is being replaced ("Can you believe anyone ever thought that cable situation was OK? LOL.")

> Having a single port type on your device simplifies one half of the equation.

I don't think it's been most people's experience that having only USB-C makes their device simpler to use with a broad range of peripherals, even ~5 years after Apple went all-in on it.


I'm not sure I see it the same way.

At home I have a CalDigit docking station. One cable to my laptop, and I get power, video to a second monitor, a digital audio connection to a DAC for audiophile-quality headphones, an SD card reader if I really need it, and wired ethernet. The laptop still has 3 open USB-C ports. How is it not beautiful that a single cable takes care of all of that? Sure, we've had docking stations forever, but they weren't single-cable; they looked like cash registers you had to literally sit your laptop in to get the same level of connectivity.

When I'm traveling I have a few cables for various scenarios and adapters. USB-C to HDMI for video, also works for my iPad so it serves two purposes. I don't use SD-cards on the road so this isn't a problem for me but a USB or USB-C or whatever to SD-card reader isn't bulky enough to gripe about for traveling.

At work conference rooms can simply provide the needed connectors so when I disconnect from my desk and end up in a conference room I still have the connectivity I need.


I definitely agree that USB-C makes a great connector for workstation-type docking situations, if you're getting to pick all the hardware for that purpose (which is what I've done, too).

> USB-C to HDMI for video, also works for my iPad so it serves two purposes.

I get why they didn't do it at first (lower-end models exist in part to use up parts from previous higher-end models, so you can't just change them all on day one) but it's crazy to me that they're still shipping iOS devices of any kind with Lightning ports. The dongle thing would have been less annoying if I could at least use the same dongles on all my (new) Apple devices, iOS and MacOS alike. As it is, only my 4th-gen iPad Pro has it. [edit] And man, is it so frustrating that they've almost achieved a situation where you can travel with one brick and one cable to provide power for your laptop and phone and a tablet... but no, they kept putting out new Lightning devices for years.


When Apple finally ditches Lightning on the iPhone, people are going to raise holy hell. That's probably the only reason it's still there.


Despite intensely disliking meetings I regularly get pulled into them, and having to go back to your desk to get an adapter wastes everybody’s time and makes you look like an idiot.

It also requires carrying an adapter at all times just in case.

> Your device's ports are a casualty of the lack of standard data port

To my great dismay, HDMI is the standard video port; that’s why it got added back.

> The HDMI on displays makes you feel like you need an HDMI on your laptop

No, what makes them feel that is that every video input, aside from desktop computer displays specifically, is HDMI.


We used to have other cables for video before. However, today we expect interoperability across so many more devices; saying we need this to be a video cable and this to be an audio cable is what put us in this position in the first place.

Over time I imagine we will evolve toward a more standard "data transfer" cable, which is what USB-C is trying to be. The transition isn't always easy and will introduce friction in various use cases.

Remember FireWire? Today when I'm mobile I have a bookbag, a laptop and a small accessory bag. The accessory bag has various cables, dongles and adapters to ensure I have pluggability.

Going back to your desk is a work problem, the office should just have those dongles in all the conference rooms and the problem you describe is entirely moot.


> Going back to your desk is a work problem, the office should just have those dongles in all the conference rooms and the problem you describe is entirely moot.

You run into this at a lot of places where only some developers and maybe the artists use MacBooks. Everyone else has fat Windows machines that have every port known to man and don't need dongles. Past the initial (annoying and unnecessary, but oh well) adjustment period, it wasn't that bad for all-Mac shops, but it's a real pain in mixed shops because it's basically just a MacBook problem.


And this is why our meeting rooms have either USB-C cabled directly for video or an HDMI to USB-C adapter tie-wrapped to the HDMI cable.


With my 2014 MBP, my Windows laptop, or my Chromebook, I can plug straight into a hotel TV and watch a movie from my laptop. With my 2019 MBP, I need an adapter.

This doesn't seem like a big deal, but the first time I went on vacation with my family after getting the 2019 MBP, I forgot that I needed to pack the adapter, and we couldn't watch whatever series we were currently binging on Netflix or HBO, which was pretty annoying. I'm happy to see HDMI ports showing up on more laptops these days!


> I'm happy to see HDMI ports showing up on more laptops these days!

HDMI is going to be especially sticky, and great to have built-in, for years to come, most likely. AFAIK USB-C cannot replace it, because, like most data cables that aren't HDMI or Ethernet, it has really, really short max-length limitations. Meanwhile, HDMI can have runs of 20+ meters and work totally fine, no repeaters or anything. If you're building in a ceiling projector, or have a TV at one end of a room but the connection in a conference table, you will use HDMI. Something might replace it, but it'll be a cable we've not heard of yet, not USB-C.


You can send USB C and/or Thunderbolt over fiber. It's rather expensive as of now, but it's possible.

Example 50m cable: https://www.rockshop.de/corning-thunderbolt-3-optical-cable-...


Or, you know, buy a $10 dongle.


That's actually really cool if you wanted to send input over the same connection. Like, if the thing that's remote is the computer, not the monitor or projector. Put your noisy graphics workstation in the utility room or something, run Thunderbolt & USB3 over fiber to your desk on another floor, but still feel like it's local, not like RDP or VNC.

Way too expensive for that, but if the price drops a bunch on that tech, that'd become an option.


> Way too expensive for that, but if the price drops a bunch on that tech, that'd become an option.

You can get those cables for ~$500. It's not super practical yet, but if you can afford a $1500 GPU, you can also add that :) Plus, you'll need to worry about cooling, noise and optics a lot less, which, on a high-end build, might already be enough savings to make it worthwhile.


Yes. Luckily I fetched up at that place around 2015 - so I was given the best MacBook Pro ever made, and the best iPhone ever made - the 5S. Shame to have to give it all back really.

Still baffles me why there was no hash key though.


This is why I never upgraded from my 2015 MBP. I'm still using it every day; God forbid it breaks. My next laptop will be something else entirely, with Linux on it.


It seems that you're not in a line of work where people give a lot of presentations. Where I work, it was almost certain that at the beginning of a presentation session one Mac speaker would have to ask if someone had an adapter because they had lost/forgotten theirs. If lucky, another speaker was on a Mac as well and had an adapter; otherwise someone had to go find an adapter somewhere.


In the world of covid, no, I don't give presentations which are not virtual. In the prior world I did, and if I was traveling I would bring the dongles needed for connectivity. In the office I would use our company-provided dongles for this purpose.

I really don't see the big deal with a one-off use case like that being the reason my machine NEEDS a specific port when a compatible adapter exists.


And that would go away shortly as everything becomes USB-C.


It's great if you have one. It doesn't work so well if you don't.

HDMI is great because whatever random TV you want to connect to probably already has some other device plugged in with a 6' HDMI cable that you can steal.


Ports tend to be common until Apple deprecates them.


Apple is still one of the large personal computer manufacturers.


Nevertheless, when Apple excludes a port the other OEMs start wondering whether that port is necessary or if they're including it out of blind convention.

Likewise, when Apple champions a non-proprietary port on their PCs, other manufacturers tend to follow suit.

Apple is more than just another large OEM. They are the standard bearer for the personal computing industry. They have tremendous influence over what a computing device looks like and how it connects to peripherals and other machines. (WiFi was largely a lab experiment before Apple's AirPort.)


> They are the standard bearer for the personal computing industry.

That they are indeed. HP has a whole line of laptops named "Envy" for a reason.


> I feel that, on the ports and touchbar side, this was a step backwards. The SD slot is a good thing, but the replacement of one USB-C with a (vintage) HDMI port and the addition of a proprietary power connector is a leap backwards in time.

Just because it's a leap backwards doesn't mean it's bad. Sometimes changes are not good and backtracking is.

* HDMI inputs are super common in the professional world; it's become the standard for all video projection and conference room TVs (unless you still have an even more ancient VGA), and having to carry adapters to meetings or to give talks is a pain in the ass. It's also pretty much the standard for digital audio (since Apple also removed SPDIF).

* The touchbar was shit. It could have been OK as an addition to function keys, but it precluded any and all muscle memory and quick access to features. "More useful than F1 to F12 ever were." isn't even remotely true.

* And not all users are at a desk. That the laptop has "all day battery life" (as long as you only watch YouTube videos - you're not going to get 11h of battery if you're actively working with demanding software) doesn't mean people won't need to charge it at places which are not a desk. Plenty of people work at kitchen tables and whatnot, with the power cable in the way and yankable by a pet, a child, or just somebody passing through.

> On my desk, the Mac is plugged into power, two external monitors, ethernet, keyboard and trackpad with a single cable, as God intended it to be. That's 3 free USB-C connectors for anything else (such as the external storage) and there is one power brick that lives in my backpack along a pair of US/EU adapters for when I need to travel. With USB-C power, it was nice to have the Dell and the Mac sharing power bricks when needed (if I said that 10 years ago, I would laugh myself out of the room)

Certainly I can't see how you could ever survive with only 2 free USB-C connectors left. Poor you.


> * The touchbar was shit, it could have been OK as an addition to function keys, but it precluded any and all muscle memory and quick access to featues. "More useful than F1 to F12 ever were." isn't even remotely true.

I mean, the F1-F12 keys are mostly so useless that the majority of laptops require you to press a modifier to use them as such. They replaced them with functions that are more broadly useful to most people: media keys, brightness control, keyboard backlighting, etc. I mostly agree that those keys are easier to use than the touchbar, though.


> media keys, brightness control, keyboard back lighting

Gaming and hotkey macros notwithstanding, I think this is what people are mostly lamenting when they complain about the TouchBar replacing F-keys.


When was the last time someone touch-typed F9?


> The Touchbar was my daughter's favorite - an Emoji keyboard

This is the Macbook Pro, they shouldn't be optimizing for children's love of emojis. I get what you're saying about your other usages of it, but I hate the Touchbar with the heat of a thousand suns, and given that Apple did decide to remove it on the Pro I'm guessing my sentiment was in the majority.


I don't get this characterization where professionals don't have fun or use childish, whimsical things like emojis. By a huge, huge margin I use more emojis at work than anywhere else, and having a keyboard for them is quite nice.

The Touch Bar is infinitely more useful than the function key row -- volume and brightness sliders are way better than "louder" and "brighter" buttons. Answering calls with the bar is easier than mousing over to the Teams notification.


I found it useful when changing brightness or adjusting the sound; other than that I don't use it, like, 99% of the time. It would be awesome if you could customize it easily (think i3 status bar, polybar), but I guess this wouldn't happen with Apple. So they abandoned what was a promising feature (and wasted the effort).


Even when using emojis, I am extremely serious and professional.


I may or may not have (ab)used mine as an emoji keyboard after realizing Github Enterprise could deal with emojis just fine.


Cmd+ctrl+space

You can then search for whatever emoji you want and hit enter without taking your eyes off the screen or your fingers off the keyboard. Typing emojis was never a problem in macOS that the touchbar needed to solve.


The default Fn key (the one with the globe icon), pressed by itself, works as well on the newer MacBooks.


Thank you. [hands pressed together]


The MagSafe connector is just USB-C on the other side, so it's almost just a form factor thing (though obviously it doesn't support data).

And you can still power the thing over USB-C (alongside data), just not at the same wattage/no quick charge. It's perfectly serviceable unless you're doing something really punishing.

The only problem I've had so far is getting two monitors working with my particular USB-C dock (it works with an Intel MacBook). Hopefully it's a software/firmware thing, because I enjoy the single cable life as well.


It was a known limitation of the M1 that it only supported one external display even if your Thunderbolt dock supported two displays.

The M1 simply couldn't push more pixels.


This is false. It has nothing to do with pushing pixels. Otherwise it should easily be able to handle 4 Full HD monitors instead of a single 4K monitor.


If it completely replaced the USB-C port and made the power brick also an Ethernet interface, I’d love it. USB-C minus data isn’t what I’d expect from USB-C.


My only real disappointments with the new MBPs so far, on paper (I haven't received mine yet), have to do with networking. I was really hoping the new MagSafe would be precisely that, data+power with ethernet in the power brick. I'd have recommended adding 140W bricks to every desk in our office if that had been the case.

And while I understand the chipset availability limitations, 2x2 802.11ax will at best nearly equal the performance of my 2017 MBP in my current 3x3 ac deployment. Guess I'll have to hurry up transitioning to ax.


> Magsafe is nice from a safety standpoint (I have a kid and work from the couch sometimes), but with all-day battery life, what is the use of it?

I was having a similar discussion the other day. MagSafe was amazing when my Intel MBP had to basically be tethered to power all the time. My M1 MBA only ever gets charged at my desk plugged into a dock/monitor. MagSafe was great when it was needed, but its time is passing.


> bringing up manpages in the terminal

Ha, that was one of my use cases as well. Not an extremely compelling one, though, to be fair. (^⌘? does the trick with a keyboard shortcut).


To be fair, you can still charge the new MacBooks via USB-C if you want, and MagSafe in this iteration is just a USB-C to MagSafe cable now.


We detached this subthread from https://news.ycombinator.com/item?id=29015869 (there's nothing wrong with it, I'm just pruning the thread, which got too top-heavy on the page.)


The proprietary power connector is not a leap backwards in time. The current generation of the USB-C spec just can't provide more than 100W of PD charging, which the 16-inch model needs.
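
For context on where that 100W figure comes from, here's a quick sketch of the arithmetic (the profile wattages follow from the USB PD specs as I understand them; actual charger and cable support varies, and the newer PD 3.1 "Extended Power Range" profiles are what go beyond 100W):

    # USB-C power ceilings: watts = volts * amps (5 A max, EPR-rated cable required above 100 W)
    pd_profiles = {
        "PD 3.0 max (20 V x 5 A)": 20 * 5,   # 100 W -- the long-standing ceiling
        "PD 3.1 EPR (28 V x 5 A)": 28 * 5,   # 140 W -- the tier a 16-inch brick would need
        "PD 3.1 EPR (48 V x 5 A)": 48 * 5,   # 240 W -- spec maximum
    }
    for name, watts in pd_profiles.items():
        print(f"{name}: {watts} W")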


It seems I fall in the minority, but I've become quite fond of the touchbar. It's a dynamic user input: if my phone rings I can touch answer on my touchbar to pick it up; if I want to seek or scrub a Spotify song I can do that with a touch and slide.

I don't see the dislike for it.


> I don't see the dislike for it.

1) You can't really make it part of your muscle-memory workflow if you use an external keyboard more than a very small percentage of the time, rendering it nearly useless for lots of people, right out of the gate.

2) Some of us discovered we barely brush the Touch Bar (F-key) area while typing certain keys (numbers, for me, which means also all the symbols that are on the number keys). This meant we had to all-but disable the damn thing to keep it from opening iTunes and doing other crap, apparently "at random" but actually because our fingers were straying 0.01mm onto the Touch Bar without our realizing it. Result, a fair percentage of users had to force like 3/4 of the bar to be empty all the time, or else face constant irritation when using the built-in keyboard.


I'm able to tolerate #1, but #2 definitely makes it suck for me. Haptics and requiring some pressure would have made the touchbar much easier to not hate.


Right, I'm not necessarily anti-touchbar as an idea (though I also wouldn't pay any amount of money to gain even some hypothetical good version of it) and wasn't that bothered by losing the F-keys, but as implemented it was harmful for me, not even neutral, so I lost my F-keys and all I gained in return was an extra step to set up a new MacBook (set Touch Bar to one static mode rather than context-sensitive, then replace most of it with "spacer" elements).


1) We are talking about the MBP - it is just under the screen, so you will always see it.

2) I get it, but compared to a desktop keyboard, F-keys are dead without the Fn key, so unlike the Touch Bar, MBP F-keys are not on par with a desktop keyboard's.


This. So much of this. It made me strip the touchbar of all the icons I could on the left side.


I already have a dynamic interface to control: the one on the display. Having the "keyboard" change throws off my intuition on where to place my fingers and how to interact.

Looking down at the touchbar each time to assess what the current interface is adds a tiny amount of additional overhead to my interactions that I dislike. It's not _terrible_, but it's not my preference.

I also really dislike the lack of tactile feedback.


I was indifferent to the Touch Bar, but Apple needed to release an external keyboard with it if they really wanted it to catch on. I just never took the time to figure out good use cases since I could only use them when using the laptop keyboard.


I can dream. Mechanical keyboard with carefully implemented Touch Bar and Touch ID. Sign me up and take my money. But Apple seems to only want to make external keyboards that use laptop keys.


That makes total sense actually, I'm surprised they never released an external keyboard with a Touch Bar


I think an external keyboard with the Touch Bar would have been incredibly expensive. The one that ships on the laptops was wired into the T1/T2 chips and ran a custom version of iOS. Effectively an interface to a second built in computer


IIRC the dislike was mostly driven by the touchbar substituting for physical F keys. If they somehow managed to have both, I suspect people would be happy.


They would have been happy and probably continued to never use the touch bar. I find it pretty cool and interesting to look at, but I have basically never used it, because everything it does can be done via the mouse, which I am already using for other tasks, so I never felt it was useful to switch to pressing the touch bar.


Touch feedback is important for a lot of us. Especially when I'm concentrating while debugging something, I don't want to have to look at my keyboard. But I see value for other use cases; maybe having both the touchbar and the F keys would have worked fine?


Once you start using it for autocomplete in a text field, the touch bar is pretty cool. You can also glance at it during a Zoom meeting to see if you are muted. RIP.


We detached this subthread from https://news.ycombinator.com/item?id=29015869 (there's nothing wrong with it, I'm just pruning the thread, which got too top-heavy on the page.)


I love my touchbar. Until it freezes. Then I am whining that I want F keys back.


That's the only thing I use the touch bar for (I have a 2018 MBP); to hit the red button after a FaceTime call.



Thanks, I considered posting the iFixit link but the title of that article being “Teardown Teaser: Is the 2021 MacBook Pro Repairable?” didn’t really convey what I thought would be specifically of interest to HN (ie the battery angle which the original article chose to focus on).

Didn’t want to editorialise the iFixit link with 9to5mac’s title but we ended up there anyway :)


Yes, it's a mixed bag no matter how one slices it. (mixed metaphor too, apparently)


I don't get what the news is here and why this is at the top. You could always replace the battery; it's just that it was not easy. Now, to replace it, you need to remove the trackpad. Ridiculous. You still can't purchase a replacement battery from Apple to replace it yourself. The only option is to purchase one from a place like iFixit once your warranty is out, or pay a ridiculous price at Apple.


This is incorrect. We are an Apple self-service shop (meaning we are authorized by Apple to do our own repairs, order OEM parts, etc.). In the 2019 models, battery replacement required a new top shell.


I'm impressed that you dignified that post with an answer.


No it doesn't, you're (understandably) just not willing to do the large amount of labor required to rip out the battery:

https://www.ifixit.com/Guide/MacBook+Pro+15-Inch+Touch+Bar+2...


FTA: "this new MacBook Pro has, at the very least, the first reasonably DIY-friendly battery replacement procedure since 2012."


Sucks that this is now the gold standard whereas for laptops sold a decade ago you didn't even need to open up the chassis or use a screwdriver to replace the battery.


Laptops a decade ago were modular and thick enough to have a frame to support the structure.

The removable battery in your device wasn’t supporting the device. It’s a design trade off that most OEMs have made.

It’s an argument akin to complaining about unibody cars. For the vast majority of people, like 95%, nobody was replacing laptop batteries on a regular basis.


My month old Framework belies this.


My experience with expanded lithium batteries says otherwise.


My team maintained about 30k laptops at one point.

We historically bought <1000 batteries a year, mostly due to environmental issues like cold and good lifecycle management.


I must either have extremely bad luck or I'm doing something wrong with my computers that's triggering battery swelling.

Having > 3% of your batteries go puffy is still significant, though, when you could redesign your machines to be easier to repair.


The main correlations are thermal conditions and how much you plug it in. “Desk bound” laptops don’t fare well.

I’m not running that business anymore, but I’d be interested in changes between 2019 and 2022, as people are running on battery more and using residential electric service.


I had a black PowerBook in the early 2000s where I could swap out the battery while the system was running, if I was fast enough … sub-5sec swap


You did for the MBPs; the last of their laptops I remember needing no tools to swap the battery was the plastic MacBooks.


Early unibody MacBooks had a door built into the bottom. You had access to the battery and HD with no tools. Apple said during that release keynote that they anticipated users would upgrade to SSDs in a few short years, so they wanted to make it easy for their users (imagine Apple saying something like that now!! The heavens would open up!!!). By the time the late 2012 unibody models were released, the door was gone and you had to remove the whole bottom case, but this wasn't so bad as it was only like 8 Phillips-head screws (merciful in a world of tri tip sandwiches and screwdrivers).


Unless you’re talking 2009 only, then no “early unibody” had a door; I have a 2010 on my desk.

Pre-unibody I can believe. I had a whitebook and the battery popped off and revealed the disk bay and RAM.


As asdff said, you're wrong. The first metal unibody MacBook Pro (15") and MacBook (13") in 2008 had batteries behind doors. In 2009 the MacBook was replaced with a) the unibody plastic MacBook and b) 13" unibody metal MacBook Pro, while that year's 15" MacBook Pro was almost unchanged, but all three 2009 models lost their doors.



Aluminum MBPs that had batteries that were swappable without opening the case existed as well, in the first few MBP generations around 2006 or so.


I have a 2012 MBP, and Apple won't replace the battery. The laptop works great, but Apple considers it "obsolete". Sad.


If you have a 2012 you can easily replace the battery yourself; just check the relevant iFixit guide for the screwdriver bits you need. IIRC from my 2010's replacement, the battery has a pair of screws (one of which I think is behind the HDD bracket, though I may be misremembering) and an adhesive strip on the back which can easily be overpowered (though it's probably a better idea to soften it using heat from the trackpad side).

According to coconut, I replaced my battery back in 2017~2018 or so (the battery was manufactured in October 2017) and it was a breeze. "New" battery is at 6934/7000mAh (99% capacity).


According to ifixit, just 106 easy steps to replace and re-assemble! :)


I think there are early and mid 2012s that differ quite a bit. I have a mid 2012 and the battery replacement is a nightmare. I'm waiting for my 14" to arrive; I'll switch over, then give it a go so I can hand the 2012 down with some life left in it. And if I blow it, so be it.


You must have the 2012 retina!


I have this computer too and I recommend the replacement sold on B&H Photo Video. I just put it in and I can somehow get like 7+ hours of use from this computer doing very light stuff (like reading HN). Impressive for a computer so old. Still very performant for me imo with 16GB of RAM and an SSD upgrade under the hood. I was looking at the M1 but I'll hold off; nothing is pushing me away from this device currently, and it seems like I will have software compatibility issues on ARM until they refine Rosetta or offer Boot Camp again.


Do you have the Retina model? How long did the swap take you?


I have the non-Retina. The swap took me probably 5 minutes, if that. However long it takes to unscrew 10 screws.


Replaced the Retina 2012 15" MBP battery fairly recently, as well as the SSD a long while ago (with an adapter), and it took maybe 8 minutes, minus the screws, each time. The worry is that 3rd-party batteries won't last as long or will swell sooner than OEM parts.


If I get even half as much usage out of this third-party battery as the OEM one, it will last me another 5 years. Not a bad deal at $80 imo.


The unibody (non-Retina)'s battery is very, very easy to swap. The Retina's is more difficult, but I do not have personal experience.


I'm not sure what is supposed to be an improvement here. I replaced a 2015 MBP battery myself, and that did not require removing the touchpad.


How long did it take you? What level of expertise do you have in terms of disassembly / electronics in general? Did you read this quote from the article:

"Even better, it appears the battery isn’t trapped under the logic board. That could mean battery swaps without removing all the brains first—a procedure we’ve been dreaming about for a while."

That...seems like a pretty big improvement?


I was going to say that they may have been talking about the 2016 revision (g4 / touchbar) with that quote and maybe the retinas (g3) were not that bad.

But I went to check the battery replacement guide for the 2015 and…: https://www.ifixit.com/Guide/MacBook+Pro+15-Inch+Retina+Disp...

> Steps 74

> Time Required 2 - 3 hours

However, according to the comments, the iFixit people were playing it very safe; most of the steps in the first half are not strictly necessary and the procedure can be completed in about an hour.


The guide is bullshit and is what one would write if one wanted to minimize liability. Fishing line (or a plastic spudger) will save you those 2 hours.


It would be an improvement if swapping batteries on a 2015 MBP required removing the logic board, which it doesn’t.

As to expertise: you just need to remove a few screws and then the power connector; it's not difficult, just tedious.


All the steps are designed to prevent you from damaging your speakers with adhesive removal solvent.

You don't have to follow each step, but if you don't know any better, you should.

Taking shortcuts just leads to sloppy outcomes, as with any repair method.


The tradeoff is not so clear cut. When disassembling more parts, more can go wrong. This is true for the MBP, and especially for the iPhone. Screws can get damaged, misplaced or swapped, which can be disastrous because they have different lengths. If the whole procedure takes a long time, people will get impatient and probability of errors increases.

This is why I still feel the fishing line method, combined with low quantities of solvent, is the least risky option overall.


Well, the last battery I bought from iFixit was flawed and killed the logic board - a victim, among a few others, of a bad batch. They refunded the battery at least.

Now I can fry my logic board again without having to go through the major pain of removing the original battery.


The 2013 and 2015 MacBook Pros have a battery and charge controller that are so badly glued onto the body that you're bound to damage something else in the process of trying to replace them.


I guess "pull tabs" makes it slightly different...



