My new setup as of November is an M2 Air (16GB/1TB) and a base model M1 Studio.
I really couldn't be happier. I run a small WFH / Music Production / Photography studio in my basement and I really wanted an always-on, always-connected computer, for which the Studio has been perfect.
The M2 Air is pretty close to a perfect laptop. My complaints (low audio volume, slightly cramped screen) are absolutely dwarfed by the long list of decades-old issues this laptop just eliminates:
- All day battery life. Basically like the Apple Watch: charge it overnight and it's really good all day with normal workloads (e.g. not rendering video). But it was absolutely surreal the first couple of weeks, watching an hour-long show while running a dozen apps and seeing the charge go from 100% to like 98%. It makes no sense and I had to adjust my work habits.
- Extreme snappiness. It's in every way snappier than my previous computer, a 32GB 2019 MBP. It _might_ be slower on long CPU-bound or RAM-bound workloads, but photo editing is both and it feels faster on the Air.
- Form factor. It's my first Air. I'm not going back. I'm 40+ and keeping extra pounds off my lower back matters to me. The Air, if it's all you're carrying, is barely noticeable.
It's just fantastic. If you were wondering if you can replace an MBP with the Air yet, it's almost certainly a yes (though I wouldn't try all this on 8GB RAM). At $2000, what I paid, it's a revelation.
Also, if like me you work on a desktop/laptop Apple setup at your desk, the new continuity features are totally incredible. Working with the Studio/Air together is totally seamless (you can extend the screens and use the same keyboard/mouse). It just works. Apple really is getting a lot of things exactly right on the software side in the Apple silicon era.
To reiterate your point about RAM for anyone wondering: don't get the 8GB version! I've run 3 Apple Silicon Macs in different configs since 2020. Real-world performance on a range of workloads is substantially worse than the 16GB version. Don't even think about having more than one user logged in!
I'd imagine it's relatively easy for the OS to just swap out the memory from the inactive user. Unless they're actually running a task in the background, but how often does that happen?
I have used the 2020 M1 Air with 8GB since last summer without any issues as a documents/productivity suite and web application development tool. Zero complaints here. I had a 16GB Core i9 MBP before it. This feels just as fast.
I have a really old MacBook Pro (mid 2014) with 8 gigs of RAM, and I'm regularly running out of RAM. I think that should be independent of processing power, right? Or is there some other way it feels snappier? Maybe the SSD got so fast it can better handle memory pressure... or you just don't need more than 8 gigs and it wasn't a limiting factor before.
The SSDs in the Apple silicon machines are likely much, much faster than whatever is in your MacBook Pro, so when they do spill over into swap it's much less noticeable. That being said, I'm pretty pro Apple silicon (purchased an M2 Pro this morning) and I would never recommend anyone buy a new computer with less than 16GB of RAM unless the use case is strictly web browsing/streaming/MS Word type stuff.
Our 286 (which was the first computer my family owned) also had a 40MB Hard Disk, which seemed like plenty at the time. It could run Windows 3.1 with that, although it wasn't exactly fast.
I scrounged enough chips and a carrier board to max my Compaq out at 3.6MB. That may have been a model specific limitation.
I believe that this computer had a 20MB hard drive, and that I eventually ran Stacker to compress the volume and squeeze out a bit more space. I know for sure that I never got anywhere close to 100MB.
On one of my computers in this era, I managed to get a paperwhite VGA and Windows 3.11 (for workgroups) running despite not meeting product specs. My 40-something brain is not quite reliable enough for me to state this with any certainty.
The Apple silicon chips handle RAM consumption differently. Maybe it's the SSD speed, but I hadn't had an 8GB RAM computer since I bought a MacBook Pro in 2011 that supported 16GB, and I have 64GB of RAM in my Windows desktop. I really don't hit a wall with the M1 over RAM. It's an incredible machine, but YMMV, especially if you use virtualization.
It really depends on how you are using it. If you are primarily a single-tasker, then the 8GB should be enough. A lot of people just use one app at a time. I imagine that most people on this forum would be more technical and more likely to be multi-taskers where the extra RAM would matter more.
I'd be so up for a 16-inch 'Air'. I so love my M1 Max MacBook Pro, but Christ it's heavy.
It's heavier, and I think thicker, than my 2012 MacBook Pro. Now, I expected it to be heavier, but was expecting it to drop back maybe one iteration, to 2014-MacBook kind of heft. But we're a full decade back in beefiness.
I need a big screen though, so it's the only option. Unless they make a nice thin 16-inch Air. I'm pretty confident it would sell extremely well.
I sympathise. I have a 16in M1 Max, coming from an LG Gram 17. I can't fault the power of it, but the heft isn't ideal for me.
I'd welcome a 15in or 16in Air. The rumour mill suggests a 15in Air is in the works. However, I'm now used to the 120Hz display of the Max, which I suspect wouldn't be an option for the non-pro models...
I am as happy with my 13” M1 MBP 2020 as you are with yours. My only gripe is that the screen is too small and, more importantly, after two years I went from a stellar 21h battery life to 13h for lightweight work, and it now also seems to drain during sleep.
And it seems to lose charge quicker under load.
The battery capacity is still reported as 91% of spec, which does not explain this behaviour.
I recently noticed a decline in battery on my M1. It is ridiculous to complain about 10h battery life but the decline was noticeable. I fixed it by:
1) Turn off Siri Suggestions & Privacy and all the app watching it does. System Settings -> Siri & Spotlight -> Siri Suggestions & Privacy. I turned it off for every app.
2) Turn off Siri altogether. System Settings -> Siri & Spotlight -> Ask Siri toggle
3) Turn off Spotlight for everything you do not need. System Settings -> Siri & Spotlight -> Search Results check boxes
4) Turn off login items. System Settings -> Login items. Remove what you do not need from the list.
5) Turn off run in background items. System Settings -> Login items -> Allow in background
6) Remove items from ~/Library/LaunchAgents/ that you do not need. I had an old docker vnet item that wasn't needed.
Other than #6, you can do the first 5 things on all of your Apple devices and it makes a big difference in battery life. I just did the same steps with my phone and iPad.
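For #6, it helps to see what's actually registered before deleting anything. A minimal Swift sketch (just a directory listing you can run as a script; nothing here is special tooling, and the folder path is the standard per-user one):

    import Foundation

    // Print every property list registered in the per-user LaunchAgents
    // folder, so you can decide which stale entries to delete by hand.
    let fm = FileManager.default
    let agentsDir = fm.homeDirectoryForCurrentUser
        .appendingPathComponent("Library/LaunchAgents")
    let entries = (try? fm.contentsOfDirectory(at: agentsDir,
                                               includingPropertiesForKeys: nil)) ?? []
    for url in entries where url.pathExtension == "plist" {
        print(url.lastPathComponent)
    }

Inspect each plist before deleting it; stale entries (like that old Docker vnet item) are usually obvious from the name.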
Thank you! I have turned off a lot of background helpers from apps I don't use anymore and removed some search result types. Docker was in the background helpers multiple times. Let's see how much this helps.
In Activity Monitor, I also saw that Steam and iStat Menus both had an "impact" of 50, which made them the top 2; the next had an "impact" of 5.
This might be an old issue, but try turning off the hard drive space indicator in iStat Menus. It was a major drain for me a couple years ago. After a restart, energy use went way down.
I wonder if you have installed software over that time period that is doing a lot in the background. I'd check activity monitor and see if there is anything suspicious.
I had some major battery and charging issues during sleep with two older Macs that were a challenge to isolate. It turned out that in one case, the charging brick itself had started having problems and couldn't keep up with how quickly Docker was demanding power. Got that replaced under warranty.
The other was just a straight-up hardware issue with the computer. I'd close the lid near a low battery and let it stay closed long enough for it to hibernate, and then when I'd open it back up it would have completely drained and I'd get a kernel panic. They replaced it after quite a lot of back and forth, once I was able to prove it.
I have an M1 Air with basically the same comments. The con is that it's no longer the newest Air out there; the pro is that I've had it since November 2020 and I really don't have a single complaint with the hardware.
I thought killing MagSafe was pretty dumb, too, but every time I tripped over the USB-C power cable on my 2018 MBP it came right out, no problem. Plus it's really nice to be able to move the power cable to whichever side the outlet is close to. When I upgrade, I think I'd choose to continue using USB-C power if possible.
I'll actually miss the Touch Bar. It's an interesting idea that seems useful, but isn't, except in one very useful case: it's really nice being able to slide the volume to exactly where I want it. With the keys back I'll have to memorize Shift+Option+VolumeUp (or whatever it is).
I've also tripped on a magsafe cord and watched my 2011 air fly off the table and bounce+flip on my wood floor, so it's not a panacea. I think the right-angle ones were more prone to yanking your machine off the table, especially if pulled perpendicular to the magnetic connection.
The MacBook Air at least still supports USB-C power, which is useful for USB-C monitors that also supply power. But I still think you need to plug it in on the right side to avoid overheating the laptop (the MBA at least doesn't have a fan, so I can't tell if this is still true or not).
Back then I liked MagSafe, but my new MacBook's MagSafe cable is still in the box. With 12+ hours of battery life I barely charge it when I use it, and when I do charge, it's usually at my desk, connected to my monitor via USB-C.
When I'm on the go I only want to bring one charger and cable for backup for every device, which is USB-C. (I don't own an iPhone)
I have never used the MagSafe connector for my 14" MBP. It either sits docked with my monitor which provides power over USB-C, or when I take it, I don't take the power brick. I've taken it camping over three days and never had to charge, so having a dedicated charging port doesn't really suit my use (also not a fan of them adding HDMI back to the machine)
I’m glad to have it as an option, but I think I’ve used my magsafe cable for charging two times in the past two years. The incredible battery life + switching my desk setup to USB-C means I pretty much never need to charge outside of normal usage at my desk
In retrospect, I think I would have bought the Air! I somehow thought the Air still had only two USB-Cs, one of which would be occupied by power if plugged in, which was not enough. Two USB-Cs plus a separate MagSafe would, barely, be enough for me.
The price difference is not huge once I put in 16GB of RAM and a 1TB drive. But the smaller size of the Air is nice! I suspect even the Air is enough CPU for me, although the new M2 MBP seems like a lot more CPU.
As a photographer and DJ even the MBP runs out of ports pretty quickly, so I had a standing collection of travel hubs for this already + a CalDigit thunderbolt hub for home.
I'm really looking to get a similar setup. I don't think they'll refresh the Studio this year; if they do, it'll probably come at the end of the year.
Oh, please don't let anyone at Apple hear you complain about "thicker". I'm still not back to trusting them after years of downgrades pretending to be upgrades: "We've made the thinnest, best-looking Mac ever while removing less functionality than we've removed in most previous upgrades!"
My sentiments exactly. Go buy an iPad if you want thin light and all day battery life. I want what Apple is delivering today, real power with real portability, even if it weighs a bit more and happens to have some love handles.
I went from an M1 Air to an M1 Pro 14” and agree with you. The Air weight is a bit magical. Very chuckable, makes you want to reach for it whenever you want to get something done quickly or perch it on the arm of the couch.
By contrast the M1 Pro 14” just tips over into unwieldy. So much so that I regret not just going the whole hog and getting the 16”, since at this weight I prefer to have it sitting on a proper table and could’ve used the extra screen real estate.
I had intended to sell the Air but ended up keeping both, mainly for this very reason. The Pro rarely leaves its dock.
Agreed - it's great that the battery life on the pro is so good now, but that extra weight is, to me, the difference between noticeable and not noticeable.
How is the heat? I've heard from anecdotal reports that under heavy workloads, the M2 Air can get super hot, and so Pro is the way to go if you're using heavy workloads. But you're doing music/photo production, etc, so I wonder if you agree. I know that's not the same as video.
Well I do light music production - mostly on the Studio.
For photography work it's been fine. But I haven't done multiple-minute exports since buying it so I can't really say.
I _can_ say that under my normal workloads (coding web apps, office applications open, chrome with say 20 tabs, often playing chess, live streaming / zoom) heat has never been an issue. It's also my DJ computer for gigs and no issues I've noticed there, although it's not on my lap.
Contrasted with the 2019 Intel MBP it's again night and day. Heat was very noticeable, as were heat-related throttling issues. It would just grind to a halt under normal workday loads, especially when screen-sharing on Zoom with all my normal apps and IDEs running.
If you do sustained work, like render or transcode video, the Air only goes full tilt for 5-10 minutes then starts throttling until you stop doing the heavy work.
If that's something you do more than once in a while, definitely opt for the Pro. For any other workload that's less than maxing out CPUs, or just bursty, the Air has been excellent.
I still think the M1 Air was a better value overall. And I liked the curves to it more than the M2's boxy design.
Speaking of form factor I am really surprised more people don't want an Air. My work laptop (purchased by my employer) was an Air in the Intel days, and I loved it even though it was dog slow. It didn't bother me because I basically only need a browser and ssh. And, I bike to work so weight reduction matters.
After the Apple silicon transition, my employer no longer offered an Air. The reason was that too few employees wanted it and it was too much trouble to maintain a separate SKU in inventory for the few of us. Really quite a bummer.
Did your employer provide a specced-up Air or just the base model? Last time I chose the Pro only because it had twice the RAM and storage; otherwise I would have gone with the Air.
Yes, they upgraded the RAM to 16GiB. The storage didn't need upgrading because almost everyone in engineering (maybe except designers) ssh'd to big, powerful computers elsewhere.
You can, but honestly, if you can afford 24GB, go for it. Remember that unlike in the Intel days, the GPU uses main memory for its RAM. (Well, I guess integrated Intel GPUs did some of that, too.) That can be painful when you've got a few clips in the timeline and some plug-in wants to have multiple copies of each frame for doing its processing. (Like it applies a blur, then mixes that with the original, or whatever.) I have a 16GB Mac mini I sometimes edit video on, and it works, but there are times when things get choppy/slow. I also have a Studio with 64GB, and it's smooth as butter. (I mean it's possible to load it up so it slows down, but it takes a lot more to do so.)
I used FCP on an M1 Air with _8GB_ to edit 4K videos for over a year. Assuming you have enough fast storage (or are happy to use a proxy workflow), it certainly can be done, although everyone's workflow is different. Assuming no storage speed bottleneck, timeline scrubbing was smooth up to about 2x forward or backward, from memory - after that you hit decoder limits. Proxy workflow gets around that too, though. Render times on even the M1 were a big jump over Intel chips. If Apple has a 2-week return policy where you are, why not give a 16GB M2 Air a try and see if it meets your particular needs?
Thanks, that's great to know. It sounds like it will be a huge step up from my 2014 MBP which grinds to a virtual halt on 4k. For that reason I've limited myself to 1080 for now, but when I upgrade I figure I should step up.
I plan on getting a refurb, not sure if I can return one of those...
I don't do this specifically, but if you can go without $2k for a few weeks you can take advantage of Apple's permissive return policy and just demo for IIRC 14 days.
How come in all these new-announcement threads there is always someone talking about how their current setup is great? Good for you, really happy for you, but it's not really relevant to the discussion, right?
It's relevant because if the current setup is working great for people in various scenarios, other folks can compare against the requirements relevant to them. They may decide to go for the newer version, which can be even better, or buy the older version if available and save some money in the process.
The lack of dedicated DisplayPort ports is fairly disappointing to me, since their implementation is just "native DisplayPort output over USB-C" and it doesn't support MST, so you can only do one DisplayPort signal per port, even with a Thunderbolt 4 hub with some bidirectional USB-C <-> DisplayPort cables.
I guess my main point is that, if we're going to move to displayport over USB-C, we (A) need Apple to put out official $100 cables, since I tried some cheap Amazon cables that didn't work and had to purchase some $60 moshi cables that I knew would work[0], and (B) they need a minimum of 5 ports since being limited to 1 peripheral port+3 monitors (2 over DP, one HDMI) on the mini is terrible. I know they want your mouse and keyboard to be bluetooth but no thank you.
For your information: AmazonBasics cables are very reliable and reasonably priced. You can get a USB-C to DisplayPort cable from them for around 10 bucks and you should have a lot less trouble with it.
In my experience spending big on video cables is only necessary for large distance and bandwidth requirements (like HDMI at 2160p60 over 20 meters).
They never really took off on laptops to begin with. Most PC GPUs are still majority-DisplayPort, though, despite USB-C being tried as an easy connector for VR headsets, and most monitors use a mix of DP/HDMI.
Yes, although that's a MacOS restriction, and MST is already used in MacOS to carry multiple streams to a single high-resolution monitor. And as Asahi Linux keeps getting more usable, I hope it can be a viable alternative, so Apple's hardware can be used without MacOS's arbitrary restrictions.
This obdurate refusal to support MST is indeed awful (and, actually, a minor one of the few reasons I mainly stopped using Macs).
But it is "just" a software problem, right? So they could come to their senses and change their mind.
It seems insane to me to keep putting DisplayPort ports on computers when we have USB-C, but it also seems insane to ship machines with crippled USB ports that don't support more than one monitor... so I wouldn't be surprised if they fix their OS in this regard.
(Apple being Apple though, the software fix might well "require" an M3 processor....)
I used an M1 Air docked with DisplayLink for about 8 months.
1. The DisplayLink software keeps the machine slightly warm in the background, and it gets worse with resolution.
2. The adapter is unreliable and can do some really wonky stuff around docking and undocking due to Apple's locked-down USB policies.
3. The older adapters do not support 4K60, so if you plan on using one with a 4K display, you need to pony up.
Assuming we see more ~$1600 sales on the M1 Pro 14-inch, that is easily $300 more computer than a $1299 M2 MacBook Pro.
Not to mention DisplayLink can be broken by MacOS updates. Tunneling video over USB was a clever hack back in the day, but with USB-C and TB3/USB4 docks, things are much less software-dependent.
DisplayLink is a daily source of frustration for me. If I disconnect my DisplayLink hub, and then plug it back in... I have to open and close my 13" M1 MacBook Pro a few times; restart the DisplayLink software; and sometimes reboot my computer before it will work again. For some reason, Zoom will just _hang_ after all of this. I just keep repeating the above until Zoom works again.
This is my employer provided MacBook Pro. I hate the experience so much that when it's time for a laptop refresh, I'm going to ask for the Windows option if they haven't upgraded their Mac offering to one that supports two monitors.
I love Apple products -- but I never understood how they could release a "Pro" laptop that supported only one external monitor.
That wouldn't work with my desk and screen setup, which uses MST. I use devices that do support MST but not DisplayLink, like my Steam Deck, or out-of-the-box Linux installs.
I'm not sure the downvoting is warranted. For what it's worth, I use display link every day (with an M2 MacBook Air) and it's fine. The refresh rate really suffers when driving two 4k monitors (as well as the built-in panel), but I've found it to be a reasonable workaround.
Of course, it's also helpful to know that others have had worse experiences.
They are diversifying their laptop/desktop lineup like crazy. Even a MacBook Air can almost compete with the entry-level MacBook Pro, which is great, to be honest.
I heard they are exceptionally good.
Honestly, $700-900 for a machine that will not get old very soon is not so bad, especially if you already own a screen (which, due to WFH, you kind of have to have, together with a "decent" keyboard, etc.)
I made the same jump. The Mac Pro was an awesome machine. In fact, it didn't feel slow to me at all and I wouldn't have upgraded if it weren't for the fact that they stopped supporting it in new OS releases, which means critical apps I use would be stuck at their current versions. (Plus, I assume they'll be dropping Intel support altogether soon, so I wanted to get on Apple Silicon for that.)
I felt exactly the same way. There was one app that was a little slow for me, though, and that was Lightroom. Definitely way faster on the Studio.
In the summer, I love my Studio way more, too, because it generates much less heat. In the Winter, I miss my Intel Pro. It really did heat up my office!!
Lifecycles for Macs are really long. If you don’t buy late in the product cycle, they are usually 6 year devices and work pretty well, with the exception of the butterfly laptops.
If you’re pushing the limits, the lifecycle doesn’t matter as you’ll replace annually or whatever anyway.
I tried the older 14" MacBook Pro for just over a month, and it's a world of difference going back to my late 2013 13" MacBook Pro.
I've been holding out for the new one and will be placing an order in a bit. :) The splurge will more than make up for the hours of life I'll save from browsing resource-heavy websites.
I was using Blender on a 12 year old iMac until just a few weeks ago. Desktops don't seem to age as quickly as their portable counterparts (they are certainly less fragile).
I still use a 2012 Air. It has been demoted in the food chain (so no longer my main laptop), but it's mind-blowing how well it still works for basic browsing. Maybe it's time I replace it with a 14” MBP. If that one lasts another 11 years I'm happy.
My motivation for the new Air was that I was tired of investing in replacement batteries for the 2013 Air. Otherwise, it is pretty much fine for most tasks. Though, not having the fan constantly cranking is nice!
I was thinking about throwing some last-gen parts into an old tower/PSU I have to build a desktop for my in-laws, since they are stuck on a 10-year-old hand-me-down laptop. This is a good package at that price ($599): power consumption is likely nothing, macOS is included in the price, it's small and quiet, and it will comfortably browse the internet, play videos, etc. for the next 5+ years. I won't be able to repair it out of warranty, so that's a bummer, but at such a low price and with their usage pattern I guess I can live with the risk.
I imagine a lot of the performance of the M1 and thereafter is a direct result of moving the memory closer to the die. The power efficiency, sure, that's all engineering, but super close and fast memory right next to the CPU is only one step away from AMD's 3D v-cache that has (had) the 5800X3D beating out newer 7000 series AMD chips.
> I imagine a lot of the performance of the M1 and thereafter is a direct result of moving the memory closer to the die
I'm not sure why this Apple marketing point gets repeated so often. The M2 has the same memory support as current Intel chips: 128-bit LPDDR5-6400. Mounting the memory on the same package as the CPU might have some engineering advantages compared to socketed RAM, but it doesn't make the (industry-standard) memory any faster.
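Back-of-the-envelope: 6400 MT/s × 128 bits ÷ 8 bits per byte ≈ 102 GB/s, which matches the ~100GB/s figure Apple quotes for the base M2. The on-package mounting mainly helps power and signal integrity; peak bandwidth is set by the (identical) memory spec.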
IMHO the biggest advantage is the RAM -> GPU VRAM pipeline, or at least that was my experience before: whatever you want to send to VRAM has to be in RAM first, and you have to wait until it gets copied before you can use that RAM again. With unified memory that should no longer be necessary (you just mark that part as input for the GPU). The same should sort of apply to iGPUs, but I'm not 100% sure.
And given that the whole of macOS is built around various GPU-accelerated frameworks, it kind of matters a lot.
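A minimal Metal sketch of what that buys you (the buffer size and fill loop are arbitrary illustration, not anything from a real pipeline):

    import Metal

    // On Apple silicon, CPU and GPU share one pool of physical memory, so a
    // .storageModeShared buffer can be written by the CPU and read by the
    // GPU with no blit/copy in between.
    let device = MTLCreateSystemDefaultDevice()!
    let count = 1024
    let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!
    let floats = buffer.contents().bindMemory(to: Float.self, capacity: count)
    for i in 0..<count { floats[i] = Float(i) }  // CPU fills the data in place
    // A compute or render command encoder can now bind `buffer` directly;
    // there is no separate "upload to VRAM, wait, then reuse the RAM" step.

On a discrete-GPU machine the same data would typically go through a managed or private buffer plus a blit pass, which is exactly the copy-and-wait described above.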
ATX is just a form factor; there's nothing preventing Apple from releasing an ATX board with soldered-on memory. This is actually not even that unusual in the µATX form factor.
It would be cool to get the SoC on a socket like the Pentium II was, and then have some NVMe slots, PCIe slots, and more I/O ports. This would also be better for the planet if it was a standard feature of each Macintosh and just the SoC got replaced. Unfortunately, Apple's Macintosh line doesn't work that way. The first Macs were all-in-one designs, the 90s saw some heterogeneity, and then the Macintosh line came back to an all-in-one design (except for the MacPro line and the G4 Cube). Today, the G4 Cube is like something out of an alternate universe. Not only was it upgradable, it was also incredibly easy to "open up".
Though for web browsing both presumably fall solidly under "good enough".
If I was signing up for Tech Support for Relatives, though, I'd _definitely_ go with the Mac; you don't want to be getting phone calls about Windows 12 or whatever in 2027.
M1 was faster than a 5600g, I think, but you can get a 5-series micro PC for cheaper than even the cheapest Mac Mini. For 'browsing' operations, I suspect it wouldn't matter much. There may be some advantage to using macOS vs. Windows, if this is aiding a family member... since I assume the IT support burden would be lower.
Most JavaScript isn’t dependent on that specific a feature for performance, however. In most cases what matters more is that they have great memory performance and handle branch-heavy code well.
Mozilla also implemented it. I believe Chrome/v8 have as well, but I know there was some work there to avoid accidentally enabling it on older ARM chips which don't have support for it.
That sounds like the PSU spec. They could have clocked it significantly higher than the laptop parts for performance, but I expect it to be in the M1 Mac mini range power-consumption-wise.
I hear you, but in Europe it's around 700€, which also buys a pretty solid Windows or Linux laptop. If you're not locked in to Mac, a comparably priced laptop with a 12th-gen Intel CPU, 16GB RAM and an RTX 3050 is arguably a lot more versatile a machine.
I think I would like Macs more if they weren't so absurdly overpriced (approx. 25%+ over US prices) here in Europe.
Here in Austria we have 20% VAT, and US prices are typically without tax. So the 25% price difference is really just a 5% price difference, which is still shitty, but not as horrendous as you make it seem.
Also, the Mac mini is very special:
- tiny footprint without external power brick
- very high reliability
- completely inaudible for typical developer usage
- extremely low power usage (the 6 core Intel Mac mini was an exception)
It's the perfect home (or office) server. You can put it into a bookshelf or in a cabinet, it won't run hot, it won't disturb you with noise in the living room.... I just don't know of anything else that fits the bill.
The only shitty thing is that they are charging ridiculous prices for storage, and attaching external storage sucks because it ruins the tiny form factor, and also because the USB-C cables that come with SSD drives tend to easily disconnect in my experience.
Yes, but that difference isn't Apple's fault, only the 5% they add on top. Every product you buy in Europe is more expensive because of the higher VAT.
Apple silicon MacBooks have one property that no Windows or Linux laptop will ever be able to give me. I use my MacBook to take notes in class, work on my CS homework, etc., so I spend most of my time in Firefox or VS Code. I charge my laptop once a week.
I'm likely going to order one of their Minis with the top CPU option and 32GB RAM for audio production here. I've got a 2018 i7 model which is fine, but this one should set me up for a _long_ time.
That's my setup and I was thinking exactly the same: a Mini M2 Pro with 32GB RAM would be perfect for me. Except that I don't have many complaints with my current machine, which I feel can easily work just fine for at least 2 more years.
I really just use my machine 95% as a tape machine with few plugins or edits. So as long as I can get 64 tracks of in/out reliably with 128-sample buffers (or lower) then I'm set. I just haven't tried that out in my new studio yet, as it's under construction. If I can do this, then I'll skip the upgrade for now.
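(For context on why the buffer size matters: assuming a 44.1kHz session, a 128-sample buffer is 128 ÷ 44100 ≈ 2.9ms of latency in each direction, which is comfortably low for live monitoring; the sample rate is my assumption, not stated above.)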
Whilst I am utterly impressed with my M1 Pro, they are comparing to the "fastest Intel-based MacBook Pro", which is 4 years old, from when Intel was struggling anyway. Not at all representative of what they could have done with the latest Intel or AMD silicon.
Yeah, it's a bit cherry-picked, especially since the last-gen Intel-based MacBooks had horrible throttling and fan-maxing issues... I think the cooling design was insufficient, or they should have lowered the max voltage slightly.
Funny, I felt personally targeted in the release video. I have a 2019 MBP I have been waiting to upgrade, and I suspect more than a few others are in the same situation.
They do also give, further down the page, some comparison with the M1 Max, saying 20% faster (except for memory bandwidth, which is twice as fast at 200GB/sec).
The comparison exists because the majority of buyers will probably have an Intel-based MacBook Pro. From a marketing perspective, it is more useful to focus on how much faster it is than that product than on how much faster it is than some very specific Windows laptop, or an M1 Pro.
It's worrying that they are shying away from specifying generational improvements, or at least comparing to its competitors. The "fastest Intel MacBook" is not really the competitor for an M2 MacBook Pro; it's a predecessor. I'd rather hear about generational improvements from the M1, especially because both Intel and AMD seem to be making great strides in the low-power space lately.
It's no secret that Apple's M2 series processors are a pretty minor improvement over the M1 series. Expect much bigger improvements in the M3 series which will likely be using TSMC's 3nm process (and then probably only minor improvements again for the M4 series).
It seems that Apple's ridiculously impressive, consistent year-on-year gains up to the M1 were in large part possible because they were behind the state of the art. Now that they've caught up with Intel and AMD, we should probably expect the same slower, more gradual improvements from Apple that we see from the other companies.
Yes, the year-over-year improvements were 90% just the TSMC node changes. But the ridiculous (and IMO amazing) performance per watt is a mix of iOS/macOS and their chip design. Those efficiency cores are unmatched, paired with the right scheduling from the OS.
The M2 is underwhelming, as it's on the same 5nm process the M1 was.
But rumors say the M2 Pro and Max are on the 3nm node, and that the release took so long because 3nm was delayed.
Apple has a monopoly right now on 3nm, which is a shame. But if we only got TSMC to fire on all cylinders because Apple stuffed them with money, then so be it.
In the beginning I thought the “performance per watt” thing was just folks who want Apple to be better clinging to a stat that made it look like that was the case. But in late 2021 I was long overdue for an upgrade, so I got an M1 Max MacBook Pro and holy hell, now I get it.
The battery lasts stupid long. Like, the fact that I can bring this thing on a plane or train without worrying if they’ll have working electric sockets is huge. The fact I can stash this thing under a desk with no power available and have it run graphics-intensive projections for a 3-hour theatre show is huge. The fact that I can achieve all that with zero throttling (I turned on High Energy use and never turned it off) is huge. The fact that I can achieve all of that with basically no fan noise is nonsensical.
Like, compared to the 2020 Razer Blade (RTX 2070 Max-Q) I use for work, it’s like a completely different class of device. My Razer Blade spins up like a jet engine under load, requires a cable if I’m going to do anything intense for an hour, underclocks from ~4.4GHz to ~1.5-2.5GHz when on battery (seemingly no way to disable this), and really only beats my MacBook for tasks that specifically require an Nvidia GPU.
> underclocks from ~4.4GHz to ~1.5-2.5GHz when on battery (seemingly no way to disable this)
digression: I believe there's no way to disable that because the laptop's battery is physically incapable of reliably supplying the necessary amperage at those higher clock rates. It's an 80 watt-hour battery for a laptop that needs a 200W+ power adapter. The battery can't drain that fast even if it wanted to.
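Rough numbers to back that up: drawing 200W from an 80Wh pack is a 2.5C discharge (200 ÷ 80) and would empty it in about 24 minutes; typical laptop lithium packs are rated for something closer to 1C continuous, which here is ~80W, and that lines up with the reduced on-battery clocks.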
Yea, I upgraded to an M1 Pro and it's truly amazing. I used to always look at benchmarks and always want to upgrade to the newest thing that was coming out. Now I have a laptop that feels like it just works. It doesn't get hot, I've never heard the fan, and just so many of the typical worries/concerns just aren't there for me anymore.
Yesterday, there was an article about Intel showing off a "35 watt" chip that was benchmarking better than last year's 125 watt chip. Except, you go into the comments and people note that it'll draw 107 watts for short periods of time - probably just long enough to get good benchmark scores.
The magic of the M-series processors is that they have really great sustained performance without needing to draw power and create heat like Intel's. The Razer Blade machines are amazing in how they deal with the amount of heat generated, with their liquid cooling and everything (it's quite a design), but as you note it's still a jet-engine-sounding machine that requires being tethered to power.
One of the big things that Apple showed off when introducing the M1 Pro/Max was the idea that you could be a creator doing your job and no longer feel like you were tied to a power outlet. Photoshop from a park or edit video wherever you are: you really don't need more power than you'll always have with you.
As I said, I bought the M1 Pro and specifically upgraded to the 8 performance core version of it. I'd always bought the high-watt Intel processors for my MacBook, not the paltry 15W Intel parts. I had a 2020 MacBook Pro with 2.3GHz Core i7 (28W, 10nm) and I'd regularly have it get really hot, the fan often wasn't too loud but it was always a constant drone, and things felt slow.
My next machine is going to be a MacBook Air. I feel like I'm using nothing on this machine. I have 8 performance cores and 2 efficiency cores, but it just feels like there's no point to having that much CPU. It's a really crazy feeling for me. Yes, more performance will be useful for some people, especially people who do video editing, graphics, etc. But damn does all my work never even come close to stressing this thing - something I'd regularly have a problem with on Intel's greatest drawing a lot more power and making a lot more heat just a year before. Intel has made some strides since then and I'm happy that their newer processors are doing well, but for the first time in my life it just feels like I don't care.
I might grab a 3nm laptop from Apple when those come out, but I'm not anxiously awaiting them. I might just wait for whatever comes after TSMC's 3nm process. I am kinda excited that I can go to a MacBook Air.
Stories like yours made me upgrade my home computer to an iPad Air 2022. It’s basically an M1 laptop with a touch screen, and when you pair it with a Bluetooth keyboard and mouse, and attach it to an external display, it feels like a full pc.
The hardware is basically equivalent, but unfortunately that's far from the case with the software. It's definitely usable as a home computer if you mainly browse the web or use office apps. But there's a lot of software that just isn't supported on the iPad. It would be pretty useless for me as a software developer.
Correct, it’s not a good tool for programmers, but many other jobs it’s a great tool: photo organization and editing, video editing, note taking, learning and playing music (connected to a guitar), writing and dictating texts (Dropbox paper), creating texts and presentations (ms office or pages/keynote), reading news (browser, Flipboard, get pocket), watching videos/Netflix/prime, controlling music on the HomePod mini, banking, home control (control thermostat).
The only reason I still have a laptop is software development.
I use the slightly older 14” MacBook Pro at work, and since I'm always in different meeting rooms and only at my desk for half the day, I charge it like I charge my phone. If I notice it's low, I charge it when I go for lunch or whatever.
Well in that case: whether it's the CPU, or the way CPU/GPU/RAM are tightly coupled on the SoC, or the 4Kish MiniLED screen (which never seems to be dim), or some sort of magic glue between the battery cells... Apple is doing something spectacular, and other manufacturers need to figure out how to do it too.
Most signs point to it being hardware-related. Without any Mac-specific optimization, the Asahi folks got fairly long runtimes out of the CPU (which makes sense, ARM has low idle draws).
Frankly though, I don't want most manufacturers to make ARM machines. While Apple is allowed to monopolize the latest TSMC silicon, it's completely pointless trying to compete with them. May as well focus on delivering a great x86 experience with AMD and switching to something more open like RISC-V when the time comes.
I agree with your sentiment for the most part. I would love to see someone take RISC-V or Power seriously to compete in the consumer space with another ISA. However, I don't think Power is quite cheap enough and RISC-V just isn't there yet. So I'd like to see someone throw their hat seriously into the ARM ring until Power becomes cheap enough or RISC-V gets there. Plenty of people are producing Intel/AMD machines; we have too many there.
I think the ChromeOS world has done a pretty good job of getting there. Yes, it's not quite as impressive as the MacBooks, but they're able to get surprisingly close with a much smaller budget.
> But the ridiculous (and IMO amazing) performance per watt is a mix of iOS/macOS and their chip design.
Not disputing this, but even Asahi Linux without GPU acceleration has pretty much whole-day battery life on an M1 Air. I'm not sure how well their kernel manages scheduling on the different cores but I suspect the vast majority of the perf/Watt is on the actual hardware.
According to Apple's press release^1, at least the M2 Pro is fabbed on a 'second generation 5nm' process. Since the power/cores/performance claims seem to be sub-linear going to the M2 Max, I'd expect it continues to share the same node.
It's worth noting that TSMC is looking to exploit Apple's demand for the hottest silicon. The latest reports suggest that they'll cut the prices to lure in others, so Apple will be forced to ante-up or stay on an old process: https://www.techspot.com/news/97269-tsmc-may-cut-3nm-wafer-p...
It's fascinating to think about. If the Mac is currently a low-margin product, Apple might be forced to stay on one node until the better silicon becomes cheap enough to use in production. If Apple doesn't play their cards right, TSMC could shanghai Apple in the exact same way Intel got stranded in the sea of process enhancement.
I'm not so sure that's true. Linux seems to handle these SoCs fine as well. The secret sauce in that regard seems to be much more the very large number of co-processors rather than the OS itself.
Apple’s M1 chips already offered improvements compared to their previous generation Intel chips, and Intel hasn’t exactly been innovating in the last 2 years. By all accounts, they have been ahead of Intel, at the very least.
I’m not exactly sure how Apple was “behind the state of the art” and has caught up. Can you explain?
> Intel hasn’t exactly been innovating in the last 2 years.
What on earth are you talking about?! Alder Lake was a huge step up for Intel, and Raptor Lake has improved things more than I would have expected as well. Before Alder Lake, Intel's CPUs honestly sucked. AMD still seems to have a modest competitive advantage in terms of efficiency and battery life (making AMD not that far off from where Apple is in hours of practical battery life when doing something other than looking at a static screenshot for hours on end), but Intel makes up for that with performance, and as I recall, Alder Lake is still more efficient than what Intel had before in laptops.
Sapphire Rapids is also a huge innovation compared to what Intel had been doing for years in the server market. Intel has also been introducing some really interesting GPU products now to bring competition to AMD and Nvidia, but it will probably take another generation or two to iron things out. If none of that is "innovation", then I don't know what is!
The last 2 years are the first time Intel has really been innovating in the past 5+ years!
When M1 came out, it was awesome. Since then, Intel and AMD have released processors that are significantly more competitive, while M2 was a mediocre step forward. I appreciate my M2 MBA, but Apple Silicon needs a huge upgrade with M3 to remain competitive.
For a lot of legitimate use cases, there are at least half a dozen Windows laptops coming out right now that I would instantly pick over a MacBook, and the same would not have been said 2 years ago.
> For a lot of legitimate use cases, there are at least half a dozen Windows laptops coming out right now that I would instantly pick over a MacBook, and the same would not have been said 2 years ago.
Would you mind listing a couple? I'd prefer to buy a Windows machine for my next laptop, but I've been SO burned by hibernation bugs, CPU throttling, battery issues, etc. And I've had an OG 13" M1 MBP from work for the past year or so, and the performance and battery life have been unreal to that point that I'm very much considering one of these 14" M2 MBPs for myself.
> Would you mind listing a couple? I'd prefer to buy a Windows machine for my next laptop, but I've been SO burned by hibernation bugs, CPU throttling, battery issues, etc.
Those have a huge tie in with the software, so I wouldn't expect hardware innovation to help.
CPU throttling and battery issues aren’t necessarily related to software at all, but “battery issues” is very nebulous, and I expect (problematic) CPU throttling to be less of an issue with the more efficient chips and cooling solutions that we have now.
I didn’t bother addressing those points because it’s hard to know where things were going wrong without a lot more detail.
Sleep issues are definitely a hallmark of Windows Modern Standby, but one might hope that will be addressed soon. Supposedly Microsoft is looking into it now.
For me, there are basically two "categories" of new Windows laptops that I'm excited about: innovative form factors, and compact laptops with RTX 40-series GPUs. As much as Nvidia has been charging ridiculous prices for desktop GPUs, it looks like laptops using their GPUs aren't going to cost crazy amounts this year, and DLSS 3 Frame Generation should be a huge benefit for laptop gaming.
In terms of innovative form factors, look at these:
- This dual screen (dual OLED, even!) concept seems like it would be challenging to pull off, but every hands-on review I've seen was really impressed with it, and I can totally see use cases for this: https://www.theverge.com/2023/1/5/23518872/lenovo-yoga-book-...
- The Acer Swift X 14 comes with a nice 120Hz OLED screen and an RTX 4050, so it seems like a nice, balanced laptop, but I wish it had the option for more than 16GB of RAM: https://youtu.be/va3OmoYHKYs?t=298
- The Flow X13 now comes with up to an RTX 4070 and 165Hz display and up to 32GB of RAM, and this is significantly more compact than the Swift X 14, but the display is not as nice as an OLED. This is also a convertible, so you can use the 360 degree hinge to make it into a tablet: https://rog.asus.com/laptops/rog-flow/rog-flow-x13-2023-seri...
- The new Zephyrus G14 is probably the one I find most exciting, offering a 165Hz MiniLED display that's almost as nice as OLED, as well as up to an RTX 4090 (Laptop) and 32GB of RAM, but it is about the same size as the Swift X 14, so not as small as the Flow X13: https://www.techradar.com/reviews/asus-rog-zephyrus-g14-2023
A number of these also have user-upgradeable RAM, SSD, or both. I really like OLED, so it's great to see so many options for that, and MiniLED is going to be significantly more common this year than it was last year. It would be nice if my M2 MBA offered OLED or MiniLED.
Thanks for sharing all these! I'm in total agreement with you that there's an insane amount of cool stuff and innovation happening in the OEM laptop/tablet/hybrid space.
But I'm still really concerned about the consistency of hibernation and sleep so I'm not taking a searingly hot machine out of my bag at 1% battery life. Or the weird quirks of some new hinge or keyboard design that I have to live with for 4+ years of owning a machine.
I'd honestly take half the specs and double the thickness/weight if it meant that my keyboard felt like a keyboard and that I can confidently close the lid at any time and re-open it back to whatever I was doing.
I don't _like_ Apple. But they can do that. And I don't even have to compromise on anything except using macOS (which I can tolerate).
> I’m not exactly sure how Apple was “behind the state of the art” and has caught up. Can you explain?
The M1 was when they caught up. They were behind the state of the art (of laptop/desktop chip performance) with their A-series chips starting with the A4. They gradually caught up over the course of several years, to the point that when they released the M1 chip (which was really not that different from the A14) they were slightly ahead.
Yeah, as the other person says, it’s weird to combine A-level chips which were mobile only (minus that short stint where they put it in a Mac Mini before the M1 was ready) with the M-level chips.
Sure, they're all "Apple Silicon", but the chips should be compared to their equivalents: M-level with Intel/AMD and A-level with Snapdragon/Exynos. We don't really compare Snapdragon to Core any other time, as they both optimize for vastly different things.
I would also argue that Apple is continuing to make big boosts in their A-level chips compared to their mobile counterparts.
I thought the M-level and A-level chips are pretty much identical CPU cores (just different numbers of them). As such it seems reasonable to me to compare single-threaded performance between A-level and Intel/AMD (making allowances for power envelopes). And indeed one could compare Snapdragon, except they're quite far behind.
> They were behind the state of the art (of laptop/desktop chip performance) with their A series chips starting with the A4
...but A series chips were never intended for laptops or desktops? And by all accounts, the A series nearly always out-competed comparable QC Snapdragon chips.
They weren't... but evidence is that that's only because they didn't have competitive performance for those applications. Once the performance caught up they did choose to put them in laptops and desktops.
I think A chips were quite considerably ahead of low-power Intel chips (e.g. the i7 in MBA) several years before M1 was released. But I guess they didn't believe they could scale it up to compete with higher power i7/i9s shipped in pro macbooks.
I mean that's true. But that doesn't prevent comparing performance of natively compiled ARM binaries in A chips vs natively compiled x64 binaries on x64 chips.
20% improvement is substantial even if not worth an upgrade. Add the same for the next iteration and the gap gets sizeable and harder to resist. I'm actually relieved I can keep my less-than-a-month-old 16-inch, since it was 700 euro cheaper than a new M2 Pro and not worth returning for a 20% improvement.
> It seems that Apple's ridiculously impressive consistent year-on-year gains up the M1 were in large part possible because they were behind the state of the art.
I was nodding along about M2 being a minor bump over M1, but I'm not even sure what you mean here. Apple previously used Intel processors - how is Intel not caught up with Intel?
Apple was key to the adoption of many computing technologies: USB, Thunderbolt, even early WiFi. Now that they're making their own desktop processors and entire SoCs, I think they can move in ways that are not constrained by Intel or other players. It's possible we see the laptop market move more like the smartphone market.
You’re right in that TSMC / Samsung were for many years behind the state of the art, and by extension so was the technology that Apple was using. Apple’s CPUs defined the state of the art for mobile though.
And now of course TSMC / Apple haven’t caught up, they have surpassed comparable offerings from Intel.
> Now that they've caught up with Intel and AMD...
I'm an enormous fan of the M series. But there is an interesting consequence of a couple of fundamental design decisions.
The M series has huge memory bandwidth, but look at its focus on I/O. It reminds me of one of the design decisions of the Alto (the memory bus ran at 3/2 the screen refresh rate, a mind-boggling decision for its time). The fact that the M1 can go in an iPad is insane, but is enabled by the way the M series was designed. Their design for long battery life while drawing onto directly connected displays is unmatched.
However I believe that same decision has hobbled the M1 in a way that may make a Mac Pro version "impossible" (i.e. too much change to be worth doing). M series are optimized to dash rapidly then quickly go to sleep.
I feel the Intel and AMD guys are still thinking of sustained performance, a holdover from the desktop world and its mainframe, or at least minicomputer roots. Psychologically their mobile chips look to me like scaled down desktop machines.
If my belief is right, AMD and Intel aren't really catching up on mobile, while Apple will probably never produce a Mac Pro worth buying (for me they never were, but I'm sure there are people for which they were a great deal).
Apple's CPUs do very well in sustained performance. They fit 8 high-performance cores and 2 efficiency cores into a 40W peak power envelope while hitting peak clockspeeds (that 40W includes all the I/O, the mostly idle GPU, the idle NPU, the SSD controller, etc). Based on the M2, their new chip should be 3.4-3.5GHz with 4 efficiency cores that get 30-40% better IPC, all within that same power envelope.
AMD puts 96 cores into a 360W TDP (not counting spikes that go higher than that) at 3.6GHz. Apple could most likely fit 72 high-performance cores and 36 more efficiency cores into that same 360W TDP.
Given that AMD's chips require much higher clockspeeds to hit the same total performance (nearly 5GHz for Zen 3 to match a 3.2GHz M1), the final product from AMD would likely be quite a bit slower overall.
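Taking those numbers at face value: roughly 4.9GHz ÷ 3.2GHz ≈ 1.5, so Zen 3 needs about 50% more clock for the same single-core throughput, i.e. roughly two-thirds of the M1's per-clock performance.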
I believe the real reason for the Mac Pro not hitting the market is their insistence on unified memory. At that size, unified memory and controlling latencies explodes in complexity.
Even worse, the Mac Studio already appeals to most of the higher-end market meaning this $20-50k system probably doesn't have very many buyers either. They could sell such a product in the server market, but they left that market years ago and the reinvestment costs would be massive and very high-risk.
Yes but people buying current gen Mac Pros probably value upgradability and want/need discrete GPUs. Just try to imagine how much would Apple charge for 256GB based on M1/M2 pricing.
Apple was never competing in the same space as Intel and AMD. From the beginning, Apple made ARM-based RISC chips, while Intel and AMD used their own x86-64 architecture. Apple's was great for iPhones because of power efficiency. They improved their chip designs so much that they smoked the competition away with the release of their first Fusion chip (iPhone 7, I believe) and have been miles ahead of everyone else since.
They then scaled up performance so much that a desktop ARM chip was made. That had never been done on a large scale before. So, no, imo Apple was never behind Intel and AMD, they were never competing in the same space.
You seem to have a short memory. In the beginning, the Apple I had a 6502.
The Macintosh ran on a Motorola 68000. Power Macintoshes used PowerPC chips.
Then came Intel Macs, they started with 32bit CPUs, then 64bit Intel chips. And finally, we got ARM Macs from Apple.
They did say 20% faster CPU and 30% faster GPU than M1 series in the same thermal envelope.
Intel doesn't really seem to make any strides in low-power designs; they just throw more mid-power cores at the problem to achieve better multi-core efficiency, as well as manipulate power brackets. Short-time (benchmark-relevant) consumption of Intel chips is insane, and their published TDP figures are utterly meaningless. AMD has very scalable cores and Zen 4 performs admirably at low power, but AMD too falls victim to power inflation to keep pace with Intel.
They do give some numbers vs the M1s, but most people actually considering buying this will be coming from the Intel ones (no one replaces their laptop every year), so it makes sense to lean on those comparisons in the marketing material.
Traditionally Apple's marketing compared against the previous generation, not multiple generations old hardware.
It seems tacky to me because the M2's competition isn't really 2019 Intel MBPs- it's laptops using modern Intel/AMD CPUs (the M2 may still be better than those, but if that's the case those are the benchmarks they should be giving us)
I think the reason for the big push is to get people to switch over. At this point, they are still supporting some Intel Macs, and needing to keep some parity with previous Macs is probably just a cost sink given their future trajectory. Get all the people off the Intel Macs ASAP so they can drown Intel support in the bathtub.
Granted, I don't think they will 100% kill Intel support. My guess is it just won't be public. They will keep a few internal machines that macOS is minimally functional on, so if they ever want to switch back, they already have a base. That would be my guess.
Apple is not particularly concerned with competition, it’s all about the upgrades. If you are still on Intel Mac, the numbers look great for an upgrade.
Less is more. I'm guessing the % of M2 customers upgrading from an M1 is less than 10%. If ~75% (my guess) of target customers for the M2 are Intel Mac owners, Apple should focus on them first. They can then do more targeted ads with relevant stats to win each subgroup in the remaining 25%.
I didn't buy an M1 because I didn't want the "first" Apple processor. Maybe it's just my bias from seeing lots of Intel competitors come-and-go? I don't really care about specific improvements, I just want all the minor things that the engineers didn't get to finish the first time around.
In my case, living in the US, we all do our taxes in the first three-ish months of the year. Last year TurboTax told me they wouldn't support my 2013 MBP this tax season, so I decided to wait until fall 2022 and get whatever Apple released. Fall 2022 came without a release, so I've been biting my nails: TurboTax won't run on my MBP, and I don't want to upgrade to an "outdated" M1.
(As you can infer, I'm planning on using my next MBP for ~10 years.)
I know it's already purchased so not a big deal, but if anyone is in a similar situation, there are other free and/or well-featured tax processing tools.
I use FreeTaxUsa, which is entirely online and free for your federal return. I think it may be $15 or so for state returns, but I live in Florida so it's not really something I think about.
Any half decent state also offers free online tax returns. If all you are is an employee with stocks/bonds in a brokerage that gives you the necessary forms, it is super easy.
It does some calculations itself, and some you have to do yourself, but it is very easy to use and file online, and get a pdf return to save for yourself.
I've been using TurboTax for almost 20 years. Yes, I could save a few bucks by doing it a different way, but I learned how to do it one way and I don't want to learn how to do it a different way.
The bigger issue has to do with how US politics work. "What makes sense," is what most countries do: the government sends you a completed tax return and you just verify that it is completed correctly. Due to lobbying, the US government is prohibited from telling you how much tax you owe. Instead, what happens is the government does your taxes for you in secret, and if the amount you pay is significantly different from what they believe you owe, the government sends you a bill with a suggested payment. You don't have to actually pay the amount that they suggest, but if you don't, you better have a darn good reason to explain why the government calculated your tax wrong.
In general, it is to your benefit to err in your favor. The interest rate on a "tax mistake" is extremely low. But, if you accidentally overpay, the government will very rarely send you a refund automatically.
“MacBook Pro with M2 Pro features a 10- or 12-core CPU with up to eight high-performance and four high-efficiency cores for up to 20 percent greater performance over M1 Pro.”
Performance will be comparable to top Zen4 mobile CPUs at 45W TDP; it's just that Apple will use 50-60% less power in single-core and 20-30% less power in multi-core. That’s about it.
Edit: I no-longer hold this opinion as I missed that there's already been an M1 MacBook Pro and it should be comparing to that.
For as long as I can remember, Apple has compared new products to the previous product. And it is a more sensible approach than picking some arbitrary competitor from a huge collection of possible options, because it gives a fixed frame of reference, one which many of the target audience are aware of. 6x faster than the previous model is very good marketing.
Honestly, I think this is also partly them patting themselves (or their silicon team) on the back for a job well done. And as everyone else has said, their competitor is the fastest Intel/AMD chips if they're trying to win market share.
Does anyone know what "the fastest Intel-based MacBook Pro" means? In my mind, I've always pictured them using the 16" i9 model that throttled before it could hit the boost clock, let alone sustain it.
> Results are compared to previous-generation 2.4GHz 8-core Intel Core i9-based 16-inch MacBook Pro systems with Radeon Pro 5600M graphics with 8GB HBM2, 64GB of RAM, and 8TB SSD.
My guess would also be the 2019 16" Pro, the last one released before the M1 wave. That happens to be the one I have now, I'm still pretty happy with it! Great battery life, doesn't get lava hot unless I have a bunch of plugins going in Ableton or something else demanding.
I had that machine for about 6 months and found the swapping between discrete and integrated graphics to cause such unbearable graphical glitches [1] (well, from my self-diagnosed OCD-influenced perspective, anyway) that I sold it on Craigslist for a $500 loss immediately.
I'm afraid you are going to need to get used to it.
The £ is depreciating against the dollar, and wages in the UK have been stagnant for 10+ years and are set to remain stagnant for another 10 based upon current forecasting.
Based upon current trends, countries like Poland and Slovakia are going to overtake the UK on many economic metrics within a generation, which is both a credit to the growth of those countries and the enormous decline of the UK relative to places like Germany, France, and the US.
Imports are going to just keep getting more and more expensive.
brexit was - from a purely rational economic viewpoint - obviously an unforced error. the resentment that eventually got channeled into this populist surge is very real and stems from the same deepening inequality that plagues the US and other globalizing economies.
people scraping by outside London see the decades going by and their lot not improving, and eventually the busy beavers crafting messages found the right ones ("the UK is paying so and so much pounds every day to the EU instead of funding the NHS more")
this solved nothing, people scraping by still have low economic productivity. infrastructure development outside London is late by decades (because every time there was a cost increase it slowed down basically, and there were a lot of cost increases - every recession, interest rate cut, Baumol effect, etc meant costs going up)
similarly development inside big cities is stunted because of cost increases.
because a lot of globalization made local services relatively more expensive (this is the Baumol effect basically) and all the affected areas are very labor heavy: NHS (healthcare), education, construction.
then all this recently culminated in idiots getting elected (Boris Johnson, etc) who then made things worse by not doing anything.
https://noahpinion.substack.com/p/why-the-uk-is-having-an-ec... - in this paywalled post Noah argues that the UK is going through a mild version of the emerging-economy crisis (like Sri Lanka), because it's import-dependent and runs a trade deficit, with capital flight, a de-facto pegged currency (basically, for price stability, the Bank of England has to defend it), lots of borrowing in foreign currency (USD, EUR), and accelerating inflation.
Totally agree regarding the UK, I was mostly trying to understand where the belief in the "lockstep decline" of the other countries (Germany, France, ...) comes from.
Apple products are produced in Chinese Special Economic Zones (SEZs), so they have to be technically imported into China if they are sold there, and are produced with machines and labor paid for in dollars.
The pound has gone to shit and the UK price includes VAT. Take away VAT and use current exchange rates and you are paying a $200 premium in the UK, so 10% above US price is actually a smaller difference than it has been in the recent past when the pound was $1.5 but the only difference between the two price lists was changing a $ for a £.
This is false. Apple adjusts prices for products when exchange rates change. They don’t do it daily; usually it happens at product releases like today.
I do get sticker shock at the prices, but I have to say I still use my 2015 MBP for dev work and it's been serving me well. That's quite the longevity... I'd hope if I got an M2 (thinking about upgrading in a year or two), that it lasts a similar duration. ~8 years, I wouldn't mind shelling out that price, especially since it pays me back so quickly (some consulting work).
I'd have to check how reliable those thinkpads are, but I also just enjoy the physical experience of a Mac (I know it's not for everyone, and it's subjective).
I used a 2013 MBP for work, and when work upgraded me to a 2019 MBP I kept the 2013 for myself.
Since then, the 2019 macbook has been having tons of problems, crashing, thermal issues, software issues, keyboard issues. But the 2013 has been working like a dream. My coworkers have many similar experiences.
Mid-2010s was definitely a peak for macbooks. Hopefully M1s/M2s will be another peak.
2019 MBPs unfortunately weren't great. I had horrible battery life issues after years of heavy use and got my work to replace it with a M1 max 16". Night and day difference, a huge huge step up. The thermal issues on the 2019 models were really ridiculous.
I run Turbo Boost Switcher Pro to help with the heat, disabling Turbo Boost except when I actually need the performance. It was crazy to hear the fan on this laptop-shaped room heater just go on and on while I was just working on code.
I ran thinkpads for years. There is a nice market for previous generation ones new in box with 3 year next business day warranties for stupid money. I got my daughter a 13 month old T495s for £600 for example. That’s still under warranty now.
But yeah, I'll bitch and moan and just give Apple the money like I always do now.
A new Thinkpad X1 with an i7, 16GB RAM and 1TB SSD is at £2400. And you can't get more than 16GB of RAM. IMHO the MacBook Pro is not unreasonably expensive.
While the value of the £ has rebounded somewhat since a low in September, it is still well below what it was just a couple of years ago. Apple is basing its prices in USD, and that has hit the £ and the Euro over the last six months or so.
Yeah, I get that it’s a reflection of the state of things but it doesn’t make it any less disheartening. I just expect every new generation of their products to come with a £100-400 price jump now.
Or, in the case of the iPads last time round, a price jump for the exact same product.
Same for me, I bought it thinking I'd use my desktop more anyway so I wanted something very light, fanless and with long lasting battery – and I've ended up using it as my daily driver since the summer.
It's absolutely flawless, and very good value for the price (even with the overpriced RAM)
Yeah, that's a fair point and probably my only complaint. It's probably necessary (for now) to get the rest of the hardware to be what it is. A high-res, high-refresh display would probably require upgrading everything else.
My personal machine is a 2014 13" MBP. I haven't needed to replace it, but that time is coming relatively soon. I've been contemplating just going with an MBA instead of a MBP for my next machine. I don't do so much heavy lifting that I think I need it. It's a weird feeling because I think of myself as a "power user" and in the past that always meant getting the most computer you could afford. Now I'm not so sure...
That makes me feel good. I ordered a M2 MacBook Air last week knowing that Apple might release new MacBook Pros at any time. I have family coming to visit and most Apple products are significantly cheaper from where they are vs where I live, so I was hesitant about what to do. Looking at the prices, I think I made the right choice. A 32GB MacBook Pro 14 w/ 1TB would be about $1500 USD more than the MacBook Air M2 with 24GB and 1TB, since I'd have to order it from here. Obviously the extra ports and RAM would be nice, but not that nice!
Possibly, although the M2 Pro mini almost entirely cannibalizes the base Studio. For the same $2000 you can get the mini with the same memory and storage, a significantly faster CPU (12-core M2 vs 10-core M1) and an approximately equal GPU (19-core M2 vs 24-core M1). The Studio now only makes sense if you need the M1 Ultra or lots of memory.
Yeah, I understand from a product perspective, but the studio feels like a bit of a weird product to me, I'd much rather either buy a mini form factor for the size, or just get a proper desktop machine (since the Mac Pro has been outdated for a while)
I just got a Studio (still in the box waiting to be set up), and I'm really curious to see how it does. I love the idea of it; it's specced more than I could get in a laptop, which is exactly what I want for my desktop work.
I never really got the appeal of the Mini, because it wasn't specced much better than a MacBook. The Studio seems to be a nice middle ground between a MacBook and a mac Pro, which I know I'll never need.
> I never really got the appeal of the Mini, because it wasn't specced much better than a MacBook.
The appeal is in the form factor: you buy a Mini so you can hang it behind your TV using a VESA mount and use it to watch Netflix on a 65" OLED screen. Or you put it in the corner of the room and use it as a home server. You do not use a Mini as a laptop replacement :) Although I have to say this: Intel NUCs are a quarter of the size of the Mini, so much better for both of the purposes above.
If you were to go over 32GB of RAM you are probably better off switching to the very similar Mac Studio with the M1 Max which can go to 64GB. If you want an M2-based SOC, I suspect that the Mac Studio will get an upgrade later in the spring.
I really hope the next Macbook Air supports multiple external monitors, that's the only thing holding me back from it right now. I know there are DisplayLink solutions but not sure how well they would work.
I didn't even notice the mac mini bump. Got mine in aug 2021 - more ram would be nice. And with medium drive, that puts it at ~ $2k. At that price, I'm more tempted to go back to the laptop world.
> With up to 96GB of unified memory in the M2 Max model, creators can work on scenes so large that PC laptops can’t even run them.4
...
> 4. Testing was conducted by Apple in November and December 2022 using preproduction 16-inch MacBook Pro systems with Apple M2 Max, 12-core CPU, 38-core GPU, 96GB of RAM, and 8TB SSD, as well as a production Intel Core i9-based PC system with NVIDIA Quadro RTX 6000 graphics with 24GB GDDR6 and the latest version of Windows 11 Pro available at the time of testing, and a production Intel Core i9-based PC system with NVIDIA GeForce RTX 3080 Ti graphics with 16GB GDDR6 and the latest version of Windows 11 Home available at the time of testing. OTOY Octane X 2022.1 on preproduction 16-inch MacBook Pro systems and OTOY OctaneRender 2022.1 on Windows systems were tested using a scene that requires over 40GB of graphics memory when rendered.
Two things:
- The Quadro RTX 6000 shipped in 2018 and the GeForce RTX 3080 Ti is a 12GB card vs. the 24GB 3090 or 3090 Ti, much less a 4090. I get that it's a marketing eye-roll claim, and it's cool to see a laptop post up against those specs, but why is Apple even bothering to measure performance against 4-year-old or under-specced cards? I wouldn't expect a 40GB OctaneRender scene to run on a 12GB gaming card or 4-year-old Quadro on any system.
- If 60% of VFX workstations are running Linux vs. 11% of macOS,[1] how does the M2 Max MBP stack up against a garden-variety Linux workstation?
It's not? Are you getting A/B tested or something?
I see the RTX 6000 Ada and A6000 in both the hero image and the first products listed. The Quadro RTX 6000 is a 24GB, 4,608-CUDA-core PCIe 3.0 x16 card[1] and the A6000 is a 10,752-CUDA-core, 48GB PCIe 4.0 x16 card.[2]
I know NVIDIA's branding and product line naming sucks, but those clearly aren't the same cards, or even same generation of card.
I see. So NVIDIA has actually made a completely new GPU with the exact same product name, with a new architecture? That's cool. Totally clear and understandable.
Though there's one detail: Apple probably couldn't have tested against the new "Ada" one because it's not even available yet, so they still tested against the best available workstation GPU, unless I'm missing something.
"Notify Me" button on the page for RTX 6000 Ada says "You will receive an email when the new NVIDIA RTX 6000 Ada Generation graphics card becomes available." I think it's unrealistic to expect companies to benchmark against future products. (not saying you expect that haha)
I like how it shows that we all just casually know that everything websites show you is constantly in flux and tailored to a particular person via Orwellian tracking measures
So I still don't know why their benchmark is a 2.5-year-old laptop.
I'm also noticing that Apple didn't list the generation of i9 in that marketing disclaimer, so this makes me think they're also possibly comparing against 2- to 3-year-old CPUs.
They are not. The person I responded to also quoted the text I will quote from Apple's page:
> Testing was conducted by Apple in November and December 2022 using preproduction 16-inch MacBook Pro systems with Apple M2 Max, 12-core CPU, 38-core GPU, 96GB of RAM, and 8TB SSD, as well as a production Intel Core i9-based PC system with NVIDIA Quadro RTX 6000 graphics with 24GB GDDR6 and the latest version of Windows 11 Pro available at the time of testing, and a production Intel Core i9-based PC system with NVIDIA GeForce RTX 3080 Ti graphics with 16GB GDDR6 and the latest version of Windows 11 Home available at the time of testing.
> a production Intel Core i9-based PC system with NVIDIA GeForce RTX 3080 Ti graphics
To be clear, NVIDIA (and Intel) use identical names for their desktop parts and for their laptop versions. There is the 3080 Ti (GA102-225-A1), which has 10240 shaders clocked at 1365 MHz with a 350-watt TDP, and the 3080 Ti (GA103) for laptops, which has 7424 shaders clocked at (up to) 1230 MHz with an (up to) 150-watt TDP. The former does about 305 fps in userbenchmark and the latter does 193 fps (until it thermal throttles). https://gpu.userbenchmark.com/Nvidia-RTX-3080-Ti/Rating/4115 https://gpu.userbenchmark.com/SpeedTest/1775790/NVIDIA-GeFor... There is an equivalent bait-and-switch for the Intel i9 desktop vs mobile.
It's unfortunate that Apple isn't making themselves 100% entirely clear on this, but to be fair, neither is the competition.
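For a rough sense of the gap, multiply the shader counts by the quoted clocks. A back-of-the-envelope sketch using only the numbers above (it ignores memory bandwidth and sustained thermals, so treat it as a sanity check, not a benchmark):

    # Two very different parts, both sold as "RTX 3080 Ti".
    desktop = {"shaders": 10240, "clock_mhz": 1365, "tdp_w": 350}  # GA102
    laptop  = {"shaders": 7424,  "clock_mhz": 1230, "tdp_w": 150}  # GA103

    def peak(gpu):
        # shaders x clock ~ relative peak FP32 throughput within one architecture
        return gpu["shaders"] * gpu["clock_mhz"]

    print(round(peak(desktop) / peak(laptop), 2))  # ~1.53x on paper

That ~1.53x paper gap lines up with the ~1.58x gap in the userbenchmark numbers above.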
Apple have caveated the comparison by specifying "PC laptops" and according to NVidia's own site [1], and clicking through to see the actual specs [2], there aren't any currently available that out-spec a 16GB 3080 Ti.
I'll admit I haven't gone spelunking down the specialist laptop manufacturer sites, but on the surface it doesn't seem to be an unrealistic claim.
> creators can work on scenes so large that PC laptops can’t even run them
You can't open a 40GB scene entirely in GPU RAM on any PC laptop, because there aren't any laptop GPUs with 40GB+ of memory.
But you can open a 40GB Octane demo scene with out-of-core loading enabled on a Windows laptop with 64GB of RAM. Hell, Otoy has a demo from 2018 of a Windows system loading and editing that "worst-case" 40GB scene entirely out-of-core at 60fps.[1]
So the suggestion is that the M2 is doing it without enabling out-of-core loading because all system RAM is in-core. Which is cool, and something Otoy's CEO was boasting on the M1 MBP's release day two years ago.[2]
So why bother going through the motions of benchmarking anything like this against 2+-year-old systems, just to make a claim the M1 also made, just less precisely?
With the recent Stable Diffusion boom, it's been funny that the tutorial requirements look like this: some beefy workstation hardware, or a MacBook Air M1.
The unified memory architecture is bliss, and they are telling the truth about some popular contemporary workloads being out of reach for most PCs.
"PCs can't even run this!" ....when using a PC with specs too low to run the specific thing we chose to do.
"Mac's can't even run notepad.exe!" would be fair using Apple's approach to these kind of claims (i.e. choosing to run software with requirements that you know they cannot meet)
Would an eGPU enable certain laptops to do this? I get that that's 'cheating', but also is it? I need a shit ton of peripherals to run my Mac workstation
I wonder what the breakdown of software used for VFX is in this report? There are tools like Adobe After Effects that don't run on Linux. I know about Fusion and Nuke, but I wish the report recorded what software was being run on each OS.
Apple has a point tho: what PC laptop or what typical PC for that matter (and that price) can give you 96 GB of VRAM?
If you could run ML models properly on that machine, it would be pretty nice for inference on larger models.
Now, of course, Apple being Apple they expect you to hand-code ML algos in Swift (lol. lmao.) but still, 96GB of VRAM is 96 GB of VRAM.
Edit: Just to clarify, I understand that this has 400 GB/s of bandwidth, not 3.2 TB/s like an Nvidia accelerator of the same memory size. But the latter costs tens of thousands and requires, well, a whole datacenter probably.
This allows you to run some GPT-X or Diffusion model on your laptop. In theory.
I think Apple is making a point that their GPU has 96GB of ram which no enthusiast PC Card can touch. The A100 (2020) goes to 80GB but is $15K (but also much much faster). So if you need that extra 16GB…
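As a napkin-math sketch of what that trade-off means for inference: token generation is roughly memory-bandwidth bound, since every weight gets streamed once per generated token. The figures below are the rough ones from this thread, so treat the outputs as illustrative only:

    # tokens/sec ~ bandwidth / model size in bytes, for bandwidth-bound decoding
    def tokens_per_sec(params_billion, bytes_per_param, bandwidth_gb_s):
        model_gb = params_billion * bytes_per_param
        return bandwidth_gb_s / model_gb

    # A 30B-parameter model at fp16 is ~60 GB, so it fits in 96GB unified memory.
    print(tokens_per_sec(30, 2, 400))   # M2 Max at ~400 GB/s     -> ~6.7 tok/s
    print(tokens_per_sec(30, 2, 3200))  # accelerator at 3.2 TB/s -> ~53 tok/s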
There are several comparisons scattered throughout the announcement:
- Rendering titles and animations in Motion is up to 80 percent faster^1 than the fastest Intel-based MacBook Pro and up to 20 percent faster^5 than the previous generation.
- Compiling in Xcode is up to 2.5x faster^1 than the fastest Intel-based MacBook Pro and nearly 25 percent faster^5 than the previous generation.
- Image processing in Adobe Photoshop is up to 80 percent faster^1 than the fastest Intel-based MacBook Pro and up to 40 percent faster^5 than the previous generation.
Those numbers are somewhat surprising since they're for the M2 Pro, and later on they say the M2 Max, presumably faster than the M2 Pro, "delivers up to 20 percent greater performance over M1 Max."
"MacBook Pro with M2 Pro features a 10- or 12-core CPU with up to eight high-performance and four high-efficiency cores for up to 20 percent greater performance over M1 Pro. "
"MacBook Pro with M2 Max pushes workflows to the extreme with a much larger GPU featuring up to 38 cores and delivering up to 30 percent greater graphics performance over M1 Max"
I assume their main target is those still on Intel Macs as those who are already happy with M1 can't really justify already upgrading to M2 as it doesn't offer any major bump.
All PR is meant to sell something. Those with M1s are mostly already happy (and most of those in-the-know crowd are aware that M2 isn't as big a step up from M1).
However, if you're an Intel MBP holdout, you're exactly who this announcement is targeting. I have both (M1 Air, 15" 2018 MBP) and I've stopped using my Intel MBP despite its bigger screen and better storage/RAM.
I've been waiting over a year for this. My M1 8GB is simply not up to spec for iOS Dev work. Apple Store literally had nothing with 16GB when I had to buy it. I asked about when they could get one in -- Apple store assistant told me it would be at least 3 months. This was in Ireland during Covid.
For reference my 2016 MacBook Pro had 16GB RAM so it was surprising to me that 6 years later I would be unable to buy one with 16 GB of RAM. So while the CPUs are amazing what the hell are Apple doing selling machines with such little amounts of RAM?
I would strongly recommend the M2 Air at 24GB. I too bought one of the first M1 Airs with 8GB RAM and that laptop is just gathering dust right now, but the latest ones are great; the increase in RAM really lets the processor shine.
Would you recommend it over the 32GB 14" M2 MBP? The price difference is $500 but for the same amount of storage and cores, you get another 8GB of memory, multi-monitor support, faster wifi, another tb4 port, etc. Honestly despite their newness I was hoping the airs would see a small price drop.
Only if you are doing long sessions of heavy processing. Even then it just slows down by 15-20% so if you aren’t doing that on a regular basis it shouldn’t be a problem. Under normal loads there is no throttling. If you do video editing on a daily basis, you probably want to look at the MacBook Pros.
To make money just like you showed. You got the 8gb and now you want to pay more to get another one with M2 and 16gb+. Apple product segmentation is legendary for extracting the most from consumers.
It's also not enough for VSCode and Web Development. Opening VSCode on a medium sized Next.js project takes a good minute or so before Intellisense kicks in.
But I'm not willing to pay the prices on the UK website. I think an M3 Air with 32GB will be my next purchase.
One advantage that I have with the M1 Pro 32 GB RAM over my gaming desktop is that I'm able to run large ML models such as Bloom, Whisper, and Stable Diffusion with reasonable performance.
How are you running them so well? What do you use as your device, since CUDA is obviously not supported and 'mps' is not very impressive compared to just about any NVidia GPU, including the aging 1080 Ti[1]?
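Not the parent, but for context, the usual route is PyTorch's MPS (Metal) backend. A minimal device-selection sketch, assuming a stock PyTorch 1.12+ install:

    import torch

    # Prefer CUDA on NVidia boxes, fall back to MPS on Apple Silicon, then CPU.
    if torch.cuda.is_available():
        device = torch.device("cuda")
    elif torch.backends.mps.is_available():
        device = torch.device("mps")
    else:
        device = torch.device("cpu")

    model = torch.nn.Linear(1024, 1024).to(device)
    x = torch.randn(8, 1024, device=device)
    print(device, model(x).shape)

It won't touch a recent NVidia card on raw speed, but the unified memory means models that would OOM on a consumer GPU can still run at all.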
Out of curiosity what are your desktop specs? I have stable diffusion running quite well on a few different systems of varying spec. That said, it's great to be able to take it on the road with you.
I bought a MBP 14 (max 32, 32GB, 2TB) Dec 2021 and I mostly don't regret it (probably should've only gotten 24 GPU cores and gone silver). Here are some pros and cons I don't think get enough play:
- Turning on "Low Power Mode" when on Battery gives me roughly 10h of battery life. Functionally this is more than good enough for me.
- Backlight leakage around the keys feels janky
- It's pretty hard to pick it up unless you get it by the back vent, also yeah, it's heavy
- The resolution is pretty weird. I wish it were really 4k, then bumping it down to FHD with an integer scale would make sense. As it is, the integer scale is 1512x982, which many websites think (including some you might build!) is a tablet.
- Viewing angles on the screen are not at IPS levels. At regular usage distances you can definitely see a difference around the outer 1/3 of the screen.
- Dark text on a light background is different than light text on a dark background. When I do `set bg=light` in Vim I usually need to use a thicker font or turn on antialiasing.
I think most people should get the M2 Air (or the M1 even). I have a couple of reasons that going 14 max makes sense for me: I moved abroad and my brother and I keep in touch via gaming (a lot of Steam games, etc. work pretty well on it), and I'm planning on getting back into music. Otherwise, I would argue that:
- the IPS screen on the Air is better for text
- the Air's battery life is another class
- the Air is much lighter
- the Air's resolution is more mainstream and usable
Some Air things are dealbreakers. M1 tops out at 16GB, M2 at 24, and they only do 1 external monitor. But w/e.
Another reason the Air is a good choice for most people is that it doesn't have a fan. I've cleaned quite a bit of dust/lint/etc. out of various laptops. Without a fan, that's much less likely to be an issue.
In one recent case, a family member was about to buy an entire new laptop because their old one was feeling very sluggish. I checked it out and realized that the fan had completely died and the system was constantly thermally throttling. A $12 replacement fan (and a thorough cleaning and CPU repaste) allowed them to put off a $1k+ purchase for at least a couple more years.
Another vote in favor of the Air for most people. I have an M2 Air with 24gb and a 16" M1 Pro or Max from work with tons of RAM, and while I'm aware the 16" is vastly superior in a lot of ways on paper, in practice I hardly ever notice (or actually can't tell). The M2 has been powerful enough for everything I've thrown at it and then some: CaptureOne, dev work in Scala, fairly large Rust projects in VSCode, the occasional VM, 3D modelling; it has never felt bogged down. 24GB has been plenty for me, and I'd been kinda worried since I had more on my previous machine, but even with a bunch of containers and a VM here and there, plus hundreds of tabs, half a dozen VSCode windows, CaptureOne in the background, a bunch of Electron apps... it holds up just fine, with room to spare. Not sure how that's possible, but I'm pretty sure it's managing to fit almost as much into 24gb as a previous Intel did into 32gb.
Having no fan is an absolute game changer, not just because of dust: this way I can have the machine running near its thermal ceiling for extended amounts of time without going crazy from fan noise, very handy when gaming, which I wouldn't want to do on the other Macbooks for that reason. I had to put a CPU indicator into the menubar because there's no way to hear high load anymore, so it's very easy to miss that Steam has crashed again and is now keeping all cores maxed, etc. The silence is really nice when using it in a quiet space, e.g. for tethered shooting without disturbing people, or in the quiet coach of a train; I absolutely love it. I'd recommend the one with fewer GPU cores: it might give you slightly worse graphics, but it'll thermal throttle later and less overall when gaming or rendering.
Other differences aren't as drastic as the specs suggest. The display, other than the difference in real estate, I can't say I've ever really noticed to be worse; it looks gorgeous despite its significantly lower specs. The 16" has very nice sound for a laptop, but the Air sounds way better than anything that thin should; it's better than the stereo speakers in my Dell monitor, and fine for a quick round of casual gaming and the like. Connectivity might be a dealbreaker (esp. the single external display), but it fits what I need well (except for the SD card reader, I would love to have that). And it's pretty lightweight, very sturdy, thin, and compact, which is what ultimately made me pick it; it's much more suitable for travelling and having in a backpack when cycling and the like. It'll even run off a tiny 30W power bricklet!
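If you're curious how macOS stretches 24gb like that, compressed memory and on-demand swap do a lot of the work. A quick sketch to watch it yourself (assumes pip install psutil):

    import psutil

    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    print(f"RAM used:  {vm.used / 2**30:.1f} GiB of {vm.total / 2**30:.1f} GiB")
    print(f"swap used: {sw.used / 2**30:.1f} GiB (macOS grows swap on demand)")

Run it with everything open and you'll usually see less pressure than the raw app count suggests.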
How do you know this stuff? My old 2014 MacBook Pro has the battery swelling up. I can’t afford more than a few hundred to fix everything. It’s a 2nd comp I use for a remote coworking Zoom session all day.
I could have the same problems. I’ve never cleaned the laptop either..heh. I am a programmer so I am not afraid of hardware.
First, as raincom mentioned, an expanding battery is a major concern. I'd recommend putting it somewhere fireproof immediately (such as in a metal trash can, or on a concrete pad), and making a plan to have the battery removed and properly disposed of in the near future.
Running the battery down to below 30% might also be a good idea, but only after it's somewhere fireproof.
To answer your question, though, I've learned from iFixit guides, youtube videos, just taking things apart and tinkering with them, etc. Apple uses some oddball screws sometimes, but you can get a toolkit with everything you need for $30 or less.
I'm more of a software guy, but I mess with hardware occasionally. (I'm also willing to pay a professional if I feel like it's over my head.)
Battery swelling = battery replacement. I suggest you replace the battery. Don't wait, as swollen batteries can explode. Apple will charge $200 to replace it, or try a local authorized dealer. You can also try installing the battery yourself by ordering one from iFixit.
If I were you, I would just get a refurbished M1 MacBook Air from Apple, instead of spending $200 on the battery.
I had an M2 Air and ended up swapping for the 14 M1 Pro. The Air was plenty powerful and I loved the weight and battery life, but:
- The 14" Pro's native resolution is 3024x1964, which means at 1512w you're getting perfect doubling of the pixels. The M2 air is natively 2560x1664 but it defaults to 1470x956, so you don't get that clean doubling and the sharpness suffers IMO. You can drop it down to 1280w to increase sharpness but the cost in screen real estate is significant. I'm not sure understand your point about the Air's resolution being more mainstream or usable
- The black levels on the MBP's mini-LED LCD screen are vastly better than the M2 Air's.
- As someone who's only used MBPs over the last 10 years, the speakers on the Air sound truly awful - tinny, harsh, low volume. Not just for music; I found listening to a podcast or watching a TV show fatiguing and unsatisfying.
- I upgraded my Air's SSD because the default SSD is slow, so the price was $1500+tax; Best Buy was running a sale on MBP 14" that put it at $1600+tax, so all told the significant price gap was almost entirely negated.
Those are good points; I will say in particular the speakers on my MBP are amazing and have provided the music for lots of occasions, and video calls are great. But my partner has an M1 Air and they're totally fine to me. Maybe if you're sensitive you should beware, but then you should beware of all non-MBP laptop speakers.
I use the incredible SwitchResX [0] so you can get the res and refresh rate you want. I wear contacts so I tend to use screens at native resolution, and I'd probably either have the Air at native or one tick down. Native on the 14" is too small even for me (this is less font size and more low-contrast UIs though), but I had my 2018 MBP 16" at native and it was pretty good, so maybe what I'm complaining about doesn't apply to that size.
Black levels are good, but I suspect the blooming is responsible for the difference between dark on light and light on dark. If I were in video or photography this might be a good tradeoff, but for Vim it's not ideal--I'd prefer an IPS.
I'll also say my MBP was $3,700, so jumping down to the M2 Air (24GB, 2TB) is a $1,200 savings. There's a lot you can do with configurations, but higher-end MBPs are just in their own class. I justify it by saying "well, of course I would spend $5/day to use a computer" - at that rate it pays for itself in about 2 years, and if I stretch it out to 3 it's a pretty good upgrade cycle (plus resale, maybe).
Yes, the M1 Air still kills it. I use one for app development, and the gains of the Pro just aren't worth it. Plus it's super light and fanless, and it can easily drive a 1440p screen at 165Hz, so I don't even miss the high-fps screen.
> As it is, the integer scale is 1512x982, which many websites think (including some you might build!) is a tablet.
That's odd, I've never noticed that on my 14". Websites usually choose layouts according to width and 1512 is well into desktop territory.
> the Air's resolution is more mainstream and usable
The Air is 2560x1664, and at the default (non-integer, so less sharp) scaling it presents as 1470x956 which is also non-standard and lower than the MBP.
It's pretty rare; usually it's in the form of a sidebar defaulting to collapsed or a hamburger instead of nav buttons. The main thing is you can't have anything else taking up horizontal space.
Ah yeah, here [0] MUI's xl breakpoint is 1536px, which is 24px wider than the integer-scaled width (1512px). You'd also ideally like a little margin, so really 1600px is kind of a bare minimum for web dev work IMO, which is why I imagine most people are scaling up to (the non-integer) 1800x1169.
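To make the arithmetic concrete, here's where common MacBook logical widths land against MUI v5's default breakpoints (defaults only; any given theme can override them):

    # MUI v5 default breakpoints in px; themes can override these values.
    BREAKPOINTS = {"xs": 0, "sm": 600, "md": 900, "lg": 1200, "xl": 1536}

    def bucket(width_px):
        # Largest breakpoint whose minimum width the viewport still meets.
        return max((n for n, w in BREAKPOINTS.items() if width_px >= w),
                   key=BREAKPOINTS.get)

    for width in (1280, 1470, 1512, 1800):
        print(width, "->", bucket(width))
    # 1512 (14" MBP integer scale) lands in "lg", 24px short of xl layouts.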
> The governing members of the Alliance for Open Media are Amazon, Apple, ARM, Cisco, Facebook, Google, Huawei, Intel, Microsoft, Mozilla, Netflix, Nvidia, Samsung Electronics and Tencent.
They are there now, but are they founding members? Did they push for av1 anywhere? It seems their AOM membership is mostly a signal to the license cartels that eventually "paying for codecs is not going to be ok, unless you give us a good deal".
They aren't founding members, sure, but they are governing members. I doubt they'd be allowed to lead the organization built around AV1 without helping it in some way.
> It seems their AOM membership is mostly a signal to the license cartels that eventually "paying for codecs is not going to be ok, unless you give us a good deal".
Yeah, I really hope this isn't the case. A universal, free, open codec like AV1 would be great to have.
There is some evidence[1] of Apple adding support at least.
As an alternative to this, you can grab a thunderbolt dock. You can input 2/3 monitors into this, and all your other peripherals - then one cable into your macbook.
My Caldigit TS4 charges my macbook at 90W, has 2 x 27" monitors running at 120Hz (I could go higher but I can't perceive the difference, so I'll save the bandwidth), has 2 external drives, and can read my SD card to import photos. The thing has so many ports, and if you have lots of peripherals it doesn't matter how many ports your laptop has, as you'll not want to bother plugging them in and out if you move around a lot (I do). I can't rate the TS4 highly enough.
I spent months researching these docks to be sure one fit my needs. You can go cheaper if it's not Thunderbolt, has fewer ports, you don't need power via the cable, or you don't want to drive your monitors at a higher refresh rate.
Or just buy the TS4 and never have to worry. It's by far the best-reviewed and most reliable.
I use a thinkpad USB-C dock on my M1 Max MBP. My only complaint is that the displays do not work with FileVault. Apparently the driver will not load until the boot volume is decrypted.
It would be nice if apple made a good, trusted docking station that just works with their laptops.
Probably not. Last I looked into it, it seemed like a driver issue, where macOS chooses to set up the MST screen as a mirror of the other MST display instead of as a separate display.
Well, the OS-level support for multiple monitors is essentially user-hostile, so I'm not sure why you would try this anyway. I gave up on dual monitors as soon as ultrawides became available, and my Mac life is better for it.
How is the development setup on the M2? Specifically Docker and Python. The Docker VM has always seemed rather meh on my M1, and installing different Python versions with pyenv either goes well or needs _a lot_ of workarounds.
Docker Desktop on Intel Mac (the "supported" config) is not super stable as is. I'm going to wait for official Docker support until upgrading to an M2 (or M3?) Mac.
Seriously thinking about getting a fast AMD Linux box to run Docker workloads network-locally, I'm working from home 99.9% of the time anyway.
Shameless plug on this topic: I've been working on a new Linux+Docker+Kubernetes solution for macOS recently! Already has quite a few improvements over existing apps including Docker Desktop, Rancher, Colima, etc: fast networking (30 Gbps), VirtioFS and bidirectional filesystem sharing, Rosetta for fast x86, full Linux (not only Docker), lower CPU usage, and other tweaks.
I have no issues working with Docker/Python on a M1 Macbooks. Docker Desktop/Colima does the job well, and most of the images I use are available on arm. Running amd64 images is slower, but works well.
For Python, Python itself is rarely the problem, but more often some older libraries that don't have wheels for M1 Mac, or incompatibilities. Upgrading your dependencies, if it's an option, is often enough.
Asdf + PDM are also nice tools to work with multiple projects/Py versions easily.
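On the wheels point: a common gotcha is a terminal or Python build silently running under Rosetta, which makes pip pull Intel wheels. A quick, macOS-specific check (sysctl.proc_translated is 1 for a translated process, and the key doesn't exist on Intel Macs):

    import platform
    import subprocess

    def rosetta_translated() -> bool:
        # macOS reports 1 here when the current process runs under Rosetta 2.
        try:
            out = subprocess.run(
                ["sysctl", "-n", "sysctl.proc_translated"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            return out == "1"
        except (subprocess.CalledProcessError, FileNotFoundError):
            return False  # key missing on Intel Macs and non-macOS systems

    print(platform.machine())  # 'arm64' when native, 'x86_64' under Rosetta
    print(rosetta_translated())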
That's OS-specific and doesn't really have much to do with the chip. Managing Python will be frustrating on every OS, but running Docker will always be uniquely slow on macOS. If you want native Docker performance, you need to be running a native Linux kernel.
I was a holdout for a while, but try Podman. I finally upgraded my Intel Mac and it went pretty smoothly. The only thing I recommend to people is to ignore the "alias docker=podman" crap you see everywhere. The only true way to make things work well is to make a docker symlink (on your path somewhere) to podman. If you do the alias thing you'll quickly discover bash scripts "can't find docker" and you'll end up doing a lot of shopt crap that doesn't work.
That all being said I don't have an M1 (or an M2 for that matter) so I don't know if Podman works "well" on it.
x64 virtual machines on M2 are almost unusably bad for me. Everything else is (near) perfect. Apparently it's somewhat better on Ventura, but I don't want to take the risk of upgrading.
The M1 and M1 Pro chips are absolutely insane. Been using them ever since day 1 and for me there is no way to go back to using Arch Linux natively unless Qualcomm or someone else starts making some good ARM chips.
If Hector Martin doesn't say that they fixed the memory-mapping bug in their PCIe controller[0] I think it's safe to say that we're not getting a new Mac Pro this year.
And to be honest, the whole concept of a Mac Pro goes against the design philosophy of Apple Silicon. The whole point of those chips is to maximize efficiency by having everything on-die or very close to the chip. For example, the M2 Max has like eight memory channels in a laptop - you could not do that even with CAMM[1] modules.
A desktop machine without replaceable RAM or slots that can accommodate GPUs is going to be a nonstarter no matter how good the SoC's integrated accelerators are. A significant chunk of the cost of the Mac Pro is just the fancy desktop case, specifically designed to accommodate lots of expansion cards and compute accelerators. If you don't care about slots, you can just buy a Mac Studio for less money than the Pro. If you do want slots, only being able to plug in storage and I/O devices and nothing else is going to be a huge drag.
[0] Apple Silicon PCIe controllers have a unique limitation in that they prohibit accessing external memory devices. This is why Thunderbolt eGPUs don't work - the M1 and M2 can't actually write to the memory on the GPUs.
[1] Compression Attached Memory Module, a JEDEC standard for a thin dual-channel RAM card promoted by Dell to allow socketed high-performance RAM on business laptops.
I was thinking that they'd go down the Risc PC route ( https://en.wikipedia.org/wiki/Risc_PC ) - allow expansion cards with both CPUs and memory on. Apple already trialled this with the accelerator cards in the current Mac Pro.
Obviously there would be a penalty for off-die memory access, but I'm sure somebody cleverer than me could think of a nice way of architecting such a beast.
> This is why Thunderbolt eGPUs don't work - the M1 and M2 can't actually write to the memory on the GPUs.
I thought the major reason was simply lack of drivers.
When I consider that I can use direct mapped M.2 storage over Thunderbolt; or use 100GBit network cards over Thunderbolt 4 (with a PCIe adapter) I wouldn't think that I was limited to mapping memory in the SoC.
ARM has several different classes of memory access mappings; the normal one is called... well, Normal. I/O and storage is mapped using Device mappings, which don't cache reads and can be further divided into how little the CPU is allowed to try and optimize writes[1]. GPUs need Normal memory because applications written for modern graphics APIs expect to be able to map GPU memory into themselves and read and write to it like CPU memory.
The ARM spec for I/O is that you are always allowed to use whatever mapping type the device needs, and that less-strict mappings should, in the worst case, "fall back" to stricter ones. Apple handles this differently; the SoC fabric requires you use the specific device mapping that it expects for a particular device, and if you try to use something looser or stricter than what it wants, it will drop the transaction and raise an exception. And of course the SoC fabric will not allow Normal memory reads or writes to hit a PCIe device.
As far as the Asahi Linux team is aware, there isn't a way from the CPU to turn off this behavior. It's also not the only implementation of PCIe on ARM that locks out PCIe memory. Raspberry Pi 4's PCIe support[3] also has the same design flaw. If it was just a driver problem, someone would have ported AMDGPU to ARM and ran it on Asahi Linux by now, and we'd be posting cool benchmarks between the internal and external GPUs.
You don't notice this problem for I/O or storage because those never need to be mapped as Normal.
[0] And probably still skipping over more details, since I'm not an ARM expert. This is just what I've gleaned from reading other kernel developers' Mastodon and Twitter feeds.
[1] Which, BTW, the M1 also screws up. You're supposed to be able to pick posted writes[2] or non-posted writes; Apple Silicon specifically refuses transaction types that don't match what the hardware expects.
Considering the complexity of a SoC like the M1, that there is zero documentation for it, and the Asahi team can only reverse-engineer it, what are the odds that they just don't know the magic bits to twiddle to get this functionality?
> Considering the complexity of a SoC like the M1, that there is zero documentation for it, and the Asahi team can only reverse-engineer it, what are the odds that they just don't know the magic bits to twiddle to get this functionality?
I'd consider that very low, because you've got a lot of eyeballs trying to get the most out of easily available COTS hardware. The goal is to make it the best-in-class Linux support. Personally, I believe it can be done.
I follow their progress with great interest, as some of the non-technical limitations of the macbook (OLED and touchscreen) could easily be fixed.
Just because something is COTS doesn't mean it will get great Linux support. The Nintendo Switch should be the best-supported Linux board available by that logic, but alas, even an unlocked Tegra is only so capable.
At a certain point, vendor cooperation becomes outright necessary. Asahi has come a long way, but getting to "best-in-class Linux support" takes more than community effort with commodity hardware. Stalled-out projects like Nouveau don't lend a lot of hope for the future.
One of the thoughts on how a mac pro would work was to make the SOC/Memory combo an Add-in card and then use an Infinity Fabric-like interconnect on the motherboard with some nutty controller to allow clean intercommunication to split off instructions.
By doing this, massively multithreaded workloads would not see any real impact... as long as you don't need to send anything card-to-card. Finally (assuming the M2 Ultra maxes out at 192GB of RAM), this enables adding in 1 to maybe 6 M2 Ultras, topping out just over 1TB of RAM. This almost starts to work, but will fail if your tasks can't be cleanly cut up into 192GB chunks; maybe that's the magic of that controller.
On the M1 Pro, the supplied power brick is a USB-C charger. You can plug it either into the Magsafe port or the C ports on the laptop, and it works the same either way.
On my 2021 MBP, I have to use the USB-C/MagSafe cable to get 140W charging.
Using a USB-C to USB-C cable (with the same charger) gets 100W charging instead because of a limitation on the laptop side (USB PD 3.1 on the MagSafe port vs USB PD 3.0 on the USB-C ports).
Hopefully they've fixed that for this iteration of MBP, but I haven't seen that explicitly claimed anywhere.
Likely not, assuming they keep the same design for M1 Pros.
On the 16" M1 Pros, you could do magsafe charging at 140W (using higher wattage negotiated USB PD), however the USB-C ports were limited to 100W (96W?) and did not offer the full 140W charging using USB-C.
Apple has MagSafe connectors for charging on laptops, which was the original MagSafe. The name comes from the connector being magnetic, and being "safe" since tripping over the cable would make it disconnect without pulling your laptop off the table. They discontinued this in favour of charging through USB-C cables/ports a while back.
They then introduced magnetic charging for iPhones, similar to (but IIRC not exactly) their Qi charging support. This reused the MagSafe name since they didn't have any products with it. The evolution of this is speculated to be incorporated and standardised as Qi2.
Then they introduced the MagSafe charging connector back to their laptops (but updated), and so confusingly there's two basically unrelated features both called MagSafe. Maybe that'll get cleared up a bit once Qi2 comes out.
Well MagSafe (the macbook charger one) is closer to USB-C in almost every way than it is to Wireless charging. I think it's just basically a different usb-c connector with a magnet attached.
This may be a stupid question, but does anyone know if you could charge through one of the Thunderbolt ports instead of having to use the magsafe connector? My work uses Macs for all employees, and they've been generous enough to supply me with 2 docking stations (one for office, one for home) that both charge through a Thunderbolt port. I'm due for a MBP upgrade and they were specifically waiting on the M2 Pro/Max versions before doing mine, and I'm trying to figure out if I can stick with my old docks or if I'll need to request new ones.
Thanks. I know that was the case with the M1 MBPs, but I wasn't sure if that would carry over to the M2 versions, and I didn't see anything about charging via the USB-C ports in the press release.
How would this compare with the best current Intel-based laptop (running Linux)? Primarily cost-wise and computational-performance-wise. The comparisons offered are with previous MacBook models, and that is not particularly interesting or illuminating if you are not an existing user.
Afaik the Intel Integrated GPUs either have a hard limit on the addressable RAM or are capped to half the available RAM.
I don’t know of any AMD G series equipped laptops with that much memory , but admittedly I haven’t looked closely at the options.
Then there's the issue of memory sharing of actual data resources. Admittedly, this is down to software, but more software for Mac can assume shared memory and take advantage of it than on other platforms, because iGPUs elsewhere have historically been very limited.
The AMD laptops don't need to be in the G series to have an iGPU - they all do. The G series only exists for desktops.
> Albeit, this is down to Software, but more software for Mac can assume shared memory to take advantage of it, versus other brands because iGPUs have historically been very limited.
AFAIK there is no such software that only supports Macs. You can query these features at runtime, and it's easier to do so than to rewrite your renderer for Metal.
Ah yeah I always get tripped up by AMDs offerings.
I didn't say the software needed to only support Mac, but if they have a Metal backend (as many things have multiple backends), they know they can spend the time to get a bigger ROI, given the number of shared-memory Macs as a percentage of all Macs.
...Not exactly and this is a silly comparison anyways
Apple dynamically scales the GPU memory, while Windows GPU memory must be reserved. If you have 64GB of RAM and want 32GB of VRAM, you now - always - have only 32GB of system memory.
But furthermore...that iGPU is like 1/10th the performance of an M1 Max lol, absolutely nobody should EVER do this on an iGPU. Their argument is that on a dedicated card you just can't do this.
It is not true that iGPU memory always has to be reserved anymore. Some programs and APIs don't support it, but you can address main system memory from many iGPUs and vice versa. In Vulkan, for example, you can do this by requesting the HOST_COHERENT, HOST_VISIBLE and DEVICE_LOCAL property bits for a buffer's memory. Of course, this is subject to driver bugs, and you have to be careful with caches that may or may not be shared depending on the specific iGPU.
Beyond this, newer iGPUs like the RX680M are comparable in performance to an M1 Pro. Certainly sufficient for any model visualization task, since that's what we're talking about.
In those applications you're likely to be limited by raster performance, and I can't find any metrics for that. TFlop for TFlop, the 780M which is the main competitor should be within 10-30%, but that is not relevant for this application.
The only other application where this could be relevant would be 3D rendering, where the 780M would win by a large margin due to RT acceleration.
They are talking about how the M2 Max can flexibly address the near entirety (like 90GB) of that as VRAM; the example here is in video memory, not system memory. On that P15 you linked, you get only 4GB of VRAM. Comparatively, it's kind of a joke for this specific task.
I haven't looked into this (and couldn't care less) but it probably has something to do with newer technology in the RAM. Not all RAM is created equal.
The biggest issue I have with macbooks is that they are ridiculously expensive. They are intended to be portable devices but bringing them anywhere outside my apartment or office is just a huge risk.
What makes you say this? Have you specced two systems lately? Last I did, the MacBook was comparable or cheaper on price/performance. Of course, a 10lb mobile desktop with a 30-minute battery was the closest competitor.
The PC industry uses wildly inflated prices, especially MSRP. A more realistic price for laptops can be found on various deal sites. A comparable Dell, Lenovo, or HP laptop will cost about 0.5-0.6x the equivalent MacBook Pro price. The vast majority of laptops, even after accounting for 1-2 TB of storage and 32 GB RAM, cost less than $1.5k. MBPs start around $2k.
Not too familiar with the UK market, but in the US, various AMD Thinkpad T14(s)/T16/P16 gen 3 configurations as well as some HP elitebook ones were in the 1-1.5k price bracket around Black Friday. I got an elitebook 845 g9 with 6850 HS and added a 2 TB nvme ssd and 64 GB RAM to it as aftermarket purchases for a total of ~$1200 after tax, additional warranty and a thunderbolt dock.
I think the other 364 days of the year are more relevant, since Black Friday sales often have extremely limited quantities and are anything but predictable.
People are claiming 4-6 hours on the hidpi display version. That’s definitely not comparable. I leave my charger at home with my M1.
> The decision is about whether you can match other aspects.
I disagree. The whole package should be compared, because that's what you're paying for. If 2x battery life cost a few hundred more, it's a no brainer, for me.
edit: We're out of reply depth, but this all started with "they're ridiculously expensive" and ended with "there's nothing else in the market".
You need to decide what it is that you're looking for. If you are in the Apple battery-efficiency camp, there is nothing else to look at. You won't be swayed by pricing. There is nothing else in the market, and all the pricing talk is meaningless. Would you be swayed if I offered you a laptop for $500?
This all started with "they're ridiculously expensive" and ended with "there's nothing else in the market". Those two are intimately related.
The complete package is reasonably priced, when compared to other systems. If you completely ignore one part of the package, then sure, you can find something comparable, as my initial comment pointed out with the portable desktop replacements.
But the price on the shelf is for the complete package.
I paid $2,869.00 + $269 for a 2015 MBP 2.8GHz/1TB/16GB/Radeon R9 M370X in Nov 2015. It's still going strong and does everything I need it to do, though as of a few months ago Apple finally dropped support for it in their latest version of macOS, Ventura.
This comes out to $440/year.
I could probably sell it on eBay for $300-400 (actual recent sales data), bringing that cost down to <$400/year.
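Spelling that arithmetic out (assuming roughly 7.2 years of use, Nov 2015 through early 2023):

    price = 2869 + 269   # purchase price plus the $269 add-on, as quoted above
    years = 7.2          # Nov 2015 -> Jan 2023
    resale = 350         # midpoint of the $300-400 recent eBay sales

    print(round(price / years))             # ~436, i.e. "about $440/year"
    print(round((price - resale) / years))  # ~387, i.e. under $400/year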
Dunno, I dropped mine pretty hard at airport security; it skidded across the floor and picked up a very small dent. My wife had hers on a lectern and someone bumped into it and it fell off the stage; bigger dent, but the show went on. And our kid has sat on or stepped on both multiple times. They hold up well enough.
I'm a sucker for robustness and build quality so that's why I like thinkpads. Dropped one on the tarmac of a race track as an engineer and the magnesium body held up like it should.
I do pay separately for things like my roadbike, camera, laptop, etc, and for these things to not have a deductible, but all in I am only paying like $15 a month for my entire renters insurance policy.
I discontinued the use of macOS for programming/work stuff for a few reasons, but a huge one was lack of 8K display support. When my Linux workstation got the (old) 31.5" Dell 8K UP3218K, and it ended up on my desk next to my Mac's (insanely expensive) 31.5 Apple 6K XDR Display... the difference was literally night and day.
The 220ppi Apple displays suddenly looked like deep-fried dukey to me; it bugged me all day, every day after that. YMMV, but the text rendering on the 280ppi Dell was phenomenally crisper, better looking, and less blurry. It's something you can't unsee.
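The ppi figures are easy to recompute from the panel specs (nominal diagonals, which is why the XDR lands a couple ppi under Apple's quoted 218):

    from math import hypot

    # ppi = diagonal pixel count / diagonal size in inches
    def ppi(w_px, h_px, diagonal_in):
        return hypot(w_px, h_px) / diagonal_in

    print(round(ppi(7680, 4320, 31.5)))  # Dell UP3218K (8K): ~280 ppi
    print(round(ppi(6016, 3384, 32.0)))  # Pro Display XDR (6K): ~216 ppi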
I'm not sure that there are any HDMI 2.1 displays on the market yet at 31-32" (I haven't checked out this year's CES coverage yet), but these new Macs (MBP and Mini) mark the first non-"Intel Mac with an eGPU" machines from Apple that support 8K.
To me, that's the most interesting thing about them.
In the feature list, the most underwhelming item to me is the '1080p FaceTime HD camera'.
Aren't pretty much all cameras at least 1080p by now? I'd expect a top-of-the-line laptop to have a multi-megapixel camera able to push 240fps for slow-mo, and windowing abilities to allow software zoom while still outputting a high-enough-resolution feed that someone viewing on another MacBook Pro can't see the pixels (i.e. more than 1080p, please!).
Heck, on a top of the line macbook, you could even have cameras on each corner so that clever processing could be done to get multiple angles and automatically pretend to have a 2nd cameraman for closeups from another angle.
Can anyone provide insight why Apple didn't put more effort into cameras?
> In the feature list, the most underwhelming item to me is the '1080p FaceTime HD camera'.
Why ?
Most people use their laptop webcams for Skype with friends or Zoom/Teams with colleagues. The camera is merely a means to display your face to others and I doubt anyone else (especially on work sessions) gives a shit about what your image looks like.
Do people really need to stare at your wrinkles, nose hair and unshaven face in 8K? Most people also use their webcam in godawful lighting conditions, so it matters even less.
Added to which, in relation to the MacBook Pros, the clue is in the name "Pro". People will be buying it as a work machine. Who cares about the webcam (or indeed, if you look at certain other web forums where people are complaining about the lack of a Space Gray 16-inch... who cares about the colour!).
My understanding is that it's actually quite difficult to fit a high-quality camera into a laptop lid. If you look at how thin the lid is, you'll notice it's less than half as thick as a cell phone.
Doesn't stop them using compound-eye like designs... They can be very thin and have 100+ 'cameras' over just a few square millimeters, all using the same CCD.
As a bonus, they (after a bit of processing) output 3d information, allowing them to use the same system for face ID etc.
There just isn’t much market for it. People don’t feel the need to project high res images showing every little flaw. The video conference services generally downscale the video to 1080/720 by default.
You can get up to 4K on external cameras or on a few desktop monitors with thick cases but those cameras are much too large for laptop cases.
> Can anyone provide insight why Apple didn't put more effort into cameras?
I assume because for users who need more than a basic webcam, Apple has Continuity Camera. And of course there are lots of people at the high end using DSLRs as their webcam.
I use my phone for stuff. I don't want to kill its battery by using it as a webcam all day. Maybe that's just me, though; I'm in Zoom meetings all day.
Contrary opinion: I don't see the need for a high-quality camera on a laptop. I can't think of another good use besides Zoom calls and Facetime and the like, when your counterparties in the meeting are all going to be seeing a scaled-down image anyway.
Stick in some decent and cheap camera, and put the money into stuff that matters like computing capabilities, RAM, storage, networking, screen, etc.
Who says it's a small tile? It's common for me to have meetings where nobody is sharing screen. In a one-on-one meeting, someone's face might be taking up my entire screen as we talk.
Sometimes my team record short presentations or updates to each other to share in Slack and it's nice to be in higher quality. It's noticeable when someone is using a laptop camera or something else.
Until very recently, MacBooks were famously stuck with 720p webcams. So from that perspective 1080p is a feature worth mentioning
In any case, I personally don't really care how much fine detail my friends and coworkers can see when I'm talking with them. If I were a streamer or something I might care, but those people probably have dedicated hardware. I think for the average person 1080p is enough
I think any effort being put into laptop cameras these days (and for the last 5+ years) is to make them smaller in all dimensions.
I'm guessing companies have found that it's always a trade-off between camera quality and bezel size, and a thinner bezel impresses people more than a better camera.
I find Apple webcams perform better in video calls than most/all others, so they don't really have much to worry about. If you do want some insanely high quality "webcam", you could buy that magsafe gimmick they showed off last year that allows you to use your iPhone's back cameras as the webcam for your MacBook.
With the recent developments in real-time video upscaling, maybe higher resolution cameras will be made obsolete for video calls, the main use case of laptop cameras.
No idea why they don't just stick one of the iPhone cameras in there. Heck, use the last-gen camera. If the camera isn't too big for the iPhone, then it's not too big for a laptop - especially with the notch.
iPhone cameras are absurdly thick compared to the thickness of a laptop lid. If you've noticed, they've also gotten even thicker over the years (the camera bump has grown). Nobody will want a laptop lid as thick as their iPhone + camera bump.
Yeah. The back camera sits behind the screen whereas the front one pokes through it. I wouldn't be surprised if they were fairly even in terms of thickness.
My iPhone 14 Pro is at least twice as thick (not counting bump) as the MacBook lid and the selfie camera is only 4k@30/1080@60. Not sure if there is a clever way to pack the camera or not but it doesn't sound good at first check.
If you don't care about max current performance (frankly, you probably don't), then picking up a used M1 Pro or M1 Max is a much better value proposition. The other upgrades don't seem very substantial: maybe Wi-Fi 6E, but the display and design are unchanged, so it's likely to feel essentially the same.
Honestly, unless there is a specific added feature that you find important (the Wi-Fi update or whatnot), a refurbished/discounted M1 Pro seems like a no-brainer choice under a more restricted budget.
It's a spec bump, possibly meaningfully so, but if you are on an Intel machine or want two monitors and can find an M1 14-inch for ~$1500-$1700, that's a killer upgrade no matter what.
Shoot, you are right, they are port-limited on the fourth display (the M1 Max has double the display hardware of the M1 Pro, so one can assume it could support quad 5K, but it has only three Thunderbolt 4 ports).
I'm happy the Apple web site shows at least one 'stock' config for the laptops with 32GB of RAM and 1TB. That means Costco, Best Buy and Amazon will be slinging these around at sale prices soon enough.
The screen protrudes up into what would otherwise be a solid black bezel on either side of the camera notch, making the screen area taller than 16:10 aspect ratio.
The OS puts menu items and icons in that bonus space, freeing up _more_ of the uninterrupted rectangular main body of your screen for your apps.
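To put rough numbers on that bonus space: the sketch below uses the published panel resolutions for the current 14-inch and 16-inch models, so treat it as my arithmetic rather than anything from Apple's spec sheet.

    # Extra menu-bar rows above the 16:10 main screen area
    for name, w, h in [("14-inch", 3024, 1964), ("16-inch", 3456, 2234)]:
        main_body = w * 10 // 16        # height of a 16:10 area at this width
        print(name, h - main_body, "extra rows")  # 74 extra rows on both sizes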
I think if you go into fullscreen mode with an app, it'll blank out the area to either side of the notch and just use the main screen space (or maybe that's configurable for notch-aware software to decide? IDK)
I never notice it. It's usually dead space of the menu bar. I agree that it is a strange design, but I don't think it should be a deal killer for anyone.
It's not magical, but if Apple's 1080p camera is 'very low quality', I have plenty of experience with real potato-cams to talk about, even in the previous generation.
I also suspect that the design was intended for Face ID, but that obviously hasn't happened yet. I'd keep touch ID, thank you very much.
We’ve seen the actual camera/sensor module and its mounting hardware. It is pretty close to the size of the blacked out area in the menu bar.
That notch is a very clever way to expand the size of the screen without increasing the size of the case. There is very little downside: that middle part of the menu bar is not used by most apps, and when there are more menu items, the system automatically skips over the notch.
From a marketing standpoint there’s probably a lot of users still using intel based MacBooks, and they’re trying to convert those existing customers to the Apple Silicon architecture.
Usually, those spec bumps are placeholders that let the company cancel more significant, riskier announcements without giving a surprisingly short presentation. The next keynote is expected in March, which feels a bit farther off than usual for that pattern.
It could be that the recent events in China made that announcement tricky, and given the worrying speculation, Apple wanted to reassure people.
Not really -- Apple has increasingly pushed more niche updates (spec bumps and ancillary product lines like the mini) via press release, and reserves their keynotes for more notable updates like laptop redesigns and broad-appeal products like the phone and tablet.
> Is it strange these were not announced during a live event?
My conspiracy theory / gut reaction is they want to hide this launch? or at least don't want too many eyes on it? I don't know why I think this but yeah. I have no insider information. This is an unsubstantiated guess.
I think 32GB of RAM should be the base for Pro models. In a lot of countries, custom configurations mean almost two months of waiting. The M2 Max models have 32GB, but they cost $4000.
So the battery life is alluring me to splurge the money on this. But I'm a bit worried because my wife's laptop, bought in 2018, already shows 5% wear on its NVMe. There I don't worry, as I can simply replace it, but these MacBooks have it soldered.
So I guess my question is: what's the expected lifespan of their SSDs? My current laptop will turn 11 this year, and after a battery and SSD replacement it still works great. Can I reasonably expect the same lifespan from MacBooks?
Are you saying it's used 5% of its total life? That seems pretty good over 3+ years, for an estimated overall lifespan of 60 years. No doubt the machine will be replaced for some other reason by then.
IIRC there was an issue early on with the M1 Macs that involved way higher than normal swap usage, and was potentially bad for SSDs. But I thought that was fixed within months.
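For what it's worth, that lifespan estimate is just a linear extrapolation of the drive's wear counter, which is roughly how NVMe "percentage used" behaves. A tiny sketch, using the 5% and ~3.5-year figures from the comments above rather than a real drive:

    # Naive linear SSD wear extrapolation (figures from the thread, not measured)
    wear_pct = 5       # SMART "percentage used" reported so far
    age_years = 3.5    # rough age of the drive
    print(age_years * 100 / wear_pct, "years at this write rate")  # 70.0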
Yeah, that sounds like a plausible culprit. And I wouldn't consider it outside the realm of possibility for Apple to replace the SSD for free - even if the laptop is now out of warranty.
Probably, but only in the higher storage models. The lifespan of an SSD increases dramatically as you move from 500GB to 1-2TB. If you REALLY want the longest-lasting one, then go 2TB. I went 1TB when I bought a MacBook Pro in 2013 and it lasted about 7 years before the logic board had some fatal error that neither I nor Apple techs were able to fix.
I find the M-Mac to best Intel-Mac comparisons humorously deceptive.
I imagine the Mac marketing team now, giddy happy at the thought of increasingly ridiculous M-Mac to Best-Intel-Mac comparisons over the next 10 years. The claims will either come across as helpfully deceptive (for Apple), or as a funny inside joke (for informed customers).
Intel desperately needs to create their own bootleg Intel-Macs to defend against this sneaky Monty Python marketing strategy!
Can you wait, or do you need a machine immediately? There will probably be some lead time before the new machines are delivered.
If you have a computer you can still use the move is definitely to cancel. Then either buy the new shiny or buy the older models (unused or refurb) at a reduced price. These M processors are so good that for me, the ones currently shipping are perfectly fine.
(Depending on your definition of compression, my monitor shows more colors than there are particles in the Universe at 1 billion frames per second. It's just that there's a little bit of quality loss from the compression.)
I love the MBP with the M1 chip, but my biggest gripe is it lacks dual monitor support natively. I shouldn't have to use DisplayLink to get the second monitor working. Just been working on a single monitor since I got this thing because I heard DisplayLink is a buggy resource hog.
Ahh so that's what it is. I was confused because my coworker has the MBP with an M1 and dual monitor works fine for him. But that ran contrary to all research on my MBP specifically. Seems I'm due for an upgrade at some point in that case...
The 13” MBP has an M1 processor that only has support for 1 external display. It has the same internals as the MacBook Air.
The 14” and 16” MacBook Pros have M1 Pro and M1 Max processors and they support more video streams. The M1 Pro supports 2 external displays and the M1 Max supports 4.
Adobe products lag even on new Macs with the 40-billion-transistor M2 Pro. Hilarious. How hard can it be to scale a bitmap without lag? https://youtu.be/6Ij9PiehENA?t=457
This is Illustrator, not Photoshop, so it's vector. It doesn't really lag; the bounding box moves smoothly, but the preview is absent. To get the preview you need GPU acceleration[1], which is either absent on ARM Macs or turned off here for some reason.
I’m looking forward to picking up a second hand M1 Max mbp now that these new models are out. M1 is more than fast enough for me, and I’m specifically looking for a native 3 external monitor setup, without any DisplayLink shenanigans!
For those on the M2 Pro Mac mini path, the Mac Studio looks a lot more appealing: 32GB RAM, a Max CPU and more ports for the same money. Will have to wait for benchmarks, but it seems like more computer for the money once you start upgrading the Mini.
Thanks for this. Yesterday I picked up a new Mac Studio M1 Max with 32G RAM and 1TB NVMe, and today I see the new Mac Mini M2 Pro with 12 CPUs, 32G RAM, and 1TB NVMe for about the same price.
Any idea the performance delta between the M1 Max and M2 Pro? Wondering if I should trade my new system in for the new Mac Mini Pro...
The sweet spot for the Mini is the 16/512GB M2 Pro at $1299. It is significantly less expensive than the Mac Studio. Unless you need 32GB of RAM and a lot more GPU cores, the Mini will do nicely.
Interestingly enough, I've noticed that Best Buy quietly cut their Macbook Pro M1 Pro / M1 Max 400 dollar discount to 134 dollars. I guess they're hoping more sales will trickle in for people looking at the new models.
I just ordered the M2 Pro Mac Mini that was also released. The Mac Studio was a bit excessive for my music production hobby needs, but I do need the extra cores of the M2 Pro models. Really thrilled they released this.
I own a Mac Mini M1 and a Macbook Pro 16" M1 (the later given to me by my employer), and as a software engineer - I don't feel that the current M1 processor is limiting me in any way.
I, however, wanted to get a MacBook for personal use to be more mobile. And I postponed my purchase until the release of new MacBooks with one anticipation: lighter devices. Instead I got a new CPU with a bunch of abstract performance claims by Apple (performance was never an issue for me) - in the EXACT same form factor. Same bulky width. Same heavy weight.
Might I ask, with no intention of being rude, why you thought Apple would introduce lighter laptops? The 14" and 16" chassis are barely over a year old, so a redesign was unlikely, and the M2 Macbook Air is even more recent. Were you expecting a 12" Air perhaps?
Before I got the M1 MacBook, I had the opportunity to use Intel 15" and later Intel 13" MacBooks. They were thin and light.
The first thing that comes to mind with ARM-based MacBooks, after the speed of the M1, is the bulkiness.
And so I naturally assumed that Apple would try to reduce the bulkiness. I just don't see the point of pumping raw processing speed every year, because not everyone needs it. And my general feeling is that Apple is moving away from mid-range machines - light, portable, fast enough - to high-range and ultra-high-range machines instead. Not everyone who uses a Mac is rendering 4K videos all day long, and I'd appreciate having a lighter, thinner laptop at the expense of raw processing speed.
I have a Mac Mini M1 driving a 4k screen on my desk-facing wall. Exactly zero complaints! These M-series chips are really great.
I will probably replace the Mini with an M2 Ultra Studio, if one comes out soon, but I can't say I really need it.
(I highly recommend getting screens off desks and onto walls. Large 4K TVs are cheap. Less clutter, frees up desk space, easy to see & read from anywhere in the room, makes giving demos easy, and -< big bonus for me >- I don't need reading glasses to see it!)
The Mini is much more exciting than the MacBook, actually. Shame it only gets (up to) a Pro CPU, so I guess the Studio will get the Max/Ultra in a few months...
What the hay, only 32GB RAM max and 1TB SSD max? I've been over both for the last 2-3 years on my 2019 Intel with 64GB/2TB. The 2TB was critical for a project in 2021.
I still run a 2017 MacBook Pro with the 2.9 GHz i7 (admittedly with a recent keyboard replacement.) I only do office productivity stuff on it. The heavy lifting is done on my Dell Xeon desktop workstation. Anyway I'm looking at the M2 mini. That would be cool to poke around with. I'll keep the MacBook Pro as it still runs stellar and with the new battery, I can run all day with careful use.
And the laptop comes with a one-year warranty. Extending the warranty costs pretty much the price of another laptop from other OEMs. RAM is still 8GB in many base models (not sure if that's so in all the lines), and then there's the question of how easy these are to repair. I wish my laptop-buying strategy was like everybody else's: shiny new toy, just get it! :)
An extended warranty is CAD 300 for 3 years and CAD 100/year after that. I think that is more than reasonable for the best product-service available. I would be a lot less comfortable using such an expensive device if the service wasn't so comprehensive.
AppleCare is one of my favourite "features" that comes with every apple product.
Starts at £2149 in the UK. I bought an equivalent M1 last year for £1729. That's a price increase of £420, roughly 24%. Wonder if that's closer to the true inflation figure than the official 11% narrative. Regardless, they look like good machines; surprised they went this low-key with the announcement.
Yeah, to be fair, I did buy it off Amazon and got it cheap. I bought it the same day they released the new iPhones in October, because I noticed the massive jump in prices and that they'd bumped the prices of some other stuff up. So I went on Amazon, which still had a bunch of stock at the old prices, and bought an iPad Air and the MacBook.
* 32GB RAM would not have been sufficient, glad I went for 64.
* CPU is exceptional, but significant gains would be seen with more power (or code/process optimisation): unit tests (Robolectric) still break the 'flow' threshold (1s). Xcode compiles + full test suite runs break the 'attention' threshold (10s).
* GPU is occasionally useful, but I don't do much ML/video work
Usage:
* I have enough RAM to keep 3 IDEs open + various electron apps + Office Suite + Windows/Ubuntu VM + 2/3 phone emulators + up to a few hundred Chrome Tabs. Fans are silent and laptop is cold.
* Fans spin heavily when running a full unit test suite (JVM/Android)
* Fans spin heavily when gaming (via Parallels)
* Fans spin slightly when running Stable Diffusion
I'm an Air convert after years of using 15" macbook pros. Small, light, amazing battery life. Feels just as fast on all 'normal' tasks. If I'm doing some work that needs a lot of power, even a maxed Pro wouldn't be enough.
Agreed. Now they just need to ship a 15" MBA. It would be the perfect laptop and more than capable for most people and devs. I have the 13" M2 Air and I've realized the screen is just too small for me. Maybe we'll see it this spring?
This release literally made the M1 Pro the sweetest deal on the market.
I own a first-generation M1 Air, and it would be an incredible machine for 99% of the population. For the MBP crowd, the 15% improvement is IMO not worth the $.
No increase in max configurable storage (8TB). 96GB RAM is cool, but these are targeting A/V production among other things, and 8TB is not huge in that regard (unlike 12 CPU cores, 96GB RAM, etc.).
File synchronization/backup is a thing. For those of us who travel full-time (hence the maxed-out powerful beast laptop), having dongly things dangling off is a hazard.
But looks like I misspoke; the thing was actually ordered Dec 29, delivered Dec 30.
So actually two and a half weeks, a few days past 14, which does look like the return deadline in the US. I found a "check return eligibility" button... yep, nope.
Would really love to be able to run Windows natively on the M2. Anyone running Windows as their daily driver on an M2 via emulation? How does that experience compare to macOS on the M2?
The Mac mini seems so cheap in my country compared to a brand-new MacBook. I wonder if they want to move as soon as possible to an ecosystem made up of only Apple Silicon-powered devices.
A maxed-out Mini is pretty much half the price of a Studio Max. I wonder which one ends up being a better long-term purchase, in terms of obsolescence and OS support.
Building a PC doesn't make much sense anymore for most people now with cloud computing, Steam Deck, Apple silicon and GPU prices being ridiculously high.
That's when I realize how much my weird hardware preferences set me apart: I mostly care about an OLED screen and ECC RAM. Then a touchscreen supporting pen entry, and a good keyboard (with the edit keys like home/insert/pageup/pagedown directly accessible; bonus if there's also printscreen), especially if it's not a foldable or a tablet.
Apple's got good CPU and battery life, but unless the core features are present, I really don't care.
Here, 0/4 of my core features are present. It's certainly nice hardware, but it's not for me.
I have a Galaxy Book Pro 360 and a 14 inch Macbook Pro. The Galaxy book has like 2.2 of those core features (OLED and touch + Pen).
The level of beatdown that the MacBook Pro 14 gives the OLED display is nuts; the brightness difference is crazy and gets even crazier with HDR. OLED really isn't there yet against mini-LED for outdoor brightness performance. Something also feels wrong with the OLED white balance too, but I think that's a fact of life with many Windows machines' color tuning: white just never feels... correct? It always feels like changing the display brightness is ALSO changing the color tuning; white feels not just less bright, but contrast falls off in an uneven way.
Even compared to the Samsung-built OLED in my iPhone 14 Pro, just massively different tuning.
Depending on how I hold my laptop, yes I like to click and scroll with my finger, especially when taking notes or mindmapping.
@temptemptemp111 you seem to be shadowbanned, but your comment is spot on: OLED is about comfort. Sometimes I like working at night, sometimes in the daytime. I want a gorgeous screen that doesn't hurt my eyes, that can go very bright or very dark.
Having a powerful CPU is irrelevant when working on the console. Having days of battery life is irrelevant when having a plug nearby.
Would I take a powerful CPU and better battery life? Of course! But only if it was a bonus: for my specific usecase, there's 0 benefit from using a Mac. I'm sorry if that's hurtful to apple fanboys - you certainly have a wonderful machine if you care about battery life, a powerful CPU, and if you carry a cellphone with you at all times. I don't mean to be hurtful: I just have different tastes and priorities.
Like, I refuse to use or carry a cellphone except in very specific circumstances. Replacing my laptop with a Mac would be a net negative, as having a 5G or LTE modem in my laptop is more valuable to me than long battery life: integrated cellular connectivity enables me to spend an hour at a coffee shop without wifi. There are various options for PCs (Microsoft Go, Thinkpads, Dell...) but zero options for a macOS laptop with an internal modem.
I'm looking at the Ryzen Thinkpad with great interest! But so far I haven't found one with both the screen I like (OLED) and the keyboard I like (pageup and pagedown around the up key), but as soon as I get 4/4 of my core requirements I'll move to AMD, as a Xeon is a bit power hungry :)
> integrated cellular connectivity enables me to spend an hour at a coffee shop without wifi
So you go an hour-plus without the ability to make phone calls or return texts? Can you dial 911 on your laptop if an emergency happens while on the way to the coffee shop? Does the coffee shop not have Wi-Fi, or do you not use public Wi-Fi? How much is the additional data line for your laptop? Is it just data or voice too?
> So you go an hour-plus without the ability to make phone calls or return texts?
Yes. Actually, that's most of my days too, as I rarely have a cellphone turned on, even when I'm at home. I've done my best to remove that from my life.
So that's also true when I leave home, except in super specific circumstances.
> Can you dial 911 on your laptop if an emergency happens while on the way to the coffee shop?
I think I might be able to, even if they don't guarantee service? TBH I don't care about that much: if an emergency happens to me, either I'll be unconscious and someone else will have to call 911 for me anyway, or I'll remain conscious and then I can decide what to do.
> Does the coffee shop not have Wi-Fi or do you not use public Wi-Fi?
Most of them do, but I don't like looking for the password/asking for it/fighting connectivity issues. It's too much of a hassle.
Instead, I type Win-A, click on "cellular" - and that's it: I'm online and everything works! Even if I only took my laptop and paid for a coffee! It offers a unique peace of mind: no distractions, no interruptions!
> How much is the additional data line for your laptop?
I don't remember the exact price, but quite cheap, about 20 a month. I mostly use it for ssh outside home.
> Is it just data or voice too?
It's just data, as voice requires some specific audio and call routing that's tricky. But if you use Skype or Google Voice, you can make voice calls.
> Touchscreen on laptop is like touch on a phone! once you use it you can't go back!
Totally this! I'd put OLED and even 4k in there too.
And I'd say that's true even for console work: 4k is great as it gets me crispy fonts in my terminal :)
It's a net positive, as I use Windows, so I don't have the DPI compatibility issues that still plague Linux distributions (especially with multiple displays).
Thinkpads. I like the Fold https://csdvrx.github.io/ but it doesn't have ECC, so it's only 3/4 (the keyboard is not ideal, but it's a foldable, so I can bring my own keyboard, which grants the "keyboard" point)
The P7x, P5x, and some of the previous P1 models have Xeons and ECC, so that's 4/4 core requirements.
Yeah, I was a bit sad to see the rain of downvotes, but 1) I did expect that, even if HN may be better than other websites, and 2) it's against the rules to complain.
So I take it on the chin and I laugh all the way to my sofa with my ECC OLED 4K dual NVME "wonderful keyboard" thinkpad in my hands :)
Different people like different things. If they're happy to get a Macbook Pro with a M2 Pro, I'm happy for them too!
No, not weird, it is called having good taste. For OLED that means DC dimming not PWM dimming - big difference. And with all of the Ryzen Thinkpads there is no excuse for not having ECC when all of the Ryzen mobile CPUs they're using already support it. I don't get the touchscreen thing, but you can always use one of those artist pads via USB and not affect your screen and be decoupled from the rest of your system (USB peripheral).
What laptop would you recommend that use DC dimming with OLED? Also, does this compromise color accuracy? (I was under the impression that PWM was used with OLED partly because OLED's color accuracy diminishes at lower brightness levels, but I'm not sure where I read that.)
Except they didn't and recently unveiled this 13" MacBook Pro with M2 chip: https://www.apple.com/macbook-pro-13/. Granted it's not the 14"/16" model, but still a touch bar on a new model.
Given that Apple is now confident with their in-house processors, I continue to wonder how much of the inefficiency is due to processor architecture vs. inefficient/bloated modern application development.
I just checked the trade-in value for a 16-inch M1 Pro (10-core CPU, 16-core GPU, 32GB) on the Apple website. It's only worth $700 USD. I would have thought they would pay more for a 1-year-old laptop. I am not looking to upgrade, but am quite surprised by how low the trade-in value is.
You must be jocular then. He said serious usage. (In all seriousness, I’ve had few problems as well. Sometimes I have to muck with base images but it’s not a showstopper)
It's crazy how far ahead Apple is getting in the laptop game with these new chips. I have the first-gen MacBook Pro with the M1 Pro (ok, maybe they're not winning the naming game), and it's a perfect computer. Battery life is literally all day, every single action is instant, and I have zero regard for how many apps or tabs I have open. It's just perfect.
I have literally traveled with just a 30 watt anker brick for my phone and used it to trickle charge my MacBook overnight and I’m good to go.
The hardware is indeed great. The software experience is getting worse imo, particularly when upgrading. It used to be rock solid for me.
The M1 worked without any issues for 2 months. I upgraded yesterday to macOS Ventura:
* Upgrade finished but laptop froze. Hard reset.
* On first boot, the new System Settings failed to load icons, then froze. Trying to quit it froze the OS. Hard reset.
* I left the laptop plugged in for 12 hours, connected to 2 other displays over USB-C. The laptop screen didn't turn on, the other monitors did. Unplugged monitors, mashed keys on keyboard, replugged; the laptop screen was still off. Hard reset.
I give Apple lots of credit for switching to ARM, but the last time I had this many issues that required a hard reset because the OS became unusable was Windows ME.
I used to praise macOS for its stability. It's still on average very stable, but in the last ~5 years the upgrade experience has not been rock solid anymore. I must have used at least a dozen Macbook Pro/Air and Mac Pro in that time, so I know it's not just 1 buggy laptop.
I feel like people have ALWAYS had problems when upgrading MacOS (or OSX). However, those problems aren't with the OS itself usually, but 3rd party software that lags. For that reason I always intentionally lag 1 major version behind. I'll upgrade to Ventura whenever the next version is released. Been doing this forever and can't recommend it highly enough.
I don't entirely think it's just 3rd party software, I think Apple's decision to bundle major app and OS updates together gives people the illusion that every major release is causing problems.
If they were to separate their first party app updates and OS updates (release wise, they can still market them together), more people would be able to pinpoint "ok it's not the entire update since I was on it just fine for a month, but actually just the notes app updated yesterday eating battery randomly". At least for cautious users.
OS X user since 10.3, and while I share the opinion that their software quality has gone downhill, they've always done bundled app and OS upgrades: they co-develop major updates to the OS and its SDKs with applications that make use of them. That isn't a major delta from prior releases which held themselves to higher standards.
While holding back app updates to stabilize the early upgrade experience might work, it's also somewhat contradictory to the idea that they upgraded the apps and their features and upgraded the OS as necessary to support the apps and features.
The last version that was 100% forward progress for me was Snow Leopard, which was essentially the performance/bug fix revision of Leopard. The Windows 7 from Vista equivalent.
Every update since then has felt slow, or introduced bugs. Perhaps it was because Snow Leopard came on a physical disk, so they couldn't "just ship it".
I remember feeling like how GP (with M1 Pro) feels when I upgraded my 2007 Intel MBP with Snow Leopard and an SSD. Everything was snappy and instantaneous until the wave of Electron apps and Lion and forced convergence with iOS design...
Not just one — two release cycles focused solely on bugs & performance are really in order. The sheer amount of change, necessary and otherwise, that has been dropped on the macOS in recent years has left things a right mess under the hood. Shoot, one release cycle alone should be focused on fixing/reversing the attempts to iOSify the Mac. The controls are wrong, the spacing is wrong, the appearance is wrong. Things are not consistent, and they do not work as the rest of the OS gives reason to expect.
Just look at the Ventura settings window, dude. How are they so sloppy that it cannot even scale properly? I'm not even commenting here on how ugly the experience is; at least make it scale a bit. Oh, and I didn't even mention that it lags when it loads, on a fresh login with no apps.
IMHO Ventura's rethink is a huge improvement — search is now actually useful, 3rd-party extensibility is much better, etc. (It does scale vertically, BTW. The old one did not.)
Same. It's easy to say you don't like the software when that's the ecosystem you're living in. Go run Windows for a month; hell, even my Ubuntu-certified laptop has issues with Ubuntu. There is no perfect experience, but I would argue the alternatives are vastly worse.
Before they switched to image-based updates over APFS, the issue was partially with the upgrade process.
My personal record stood at a bit over one week spent observing a dark screen with unmoving progress bar in the hopes that this time it will actually finish the upgrade.
Two beta streams - dev, where a dev account is required ($99 per year), and the public beta, which is basically a few weeks behind dev, minus early betas. The availability is always announced at WWDC; normally the dev availability is on the day of the keynote.
Much like Windows devs, Apple devs need to pay attention to this. If your livelihood is dependent on a platform, not paying attention is on you.
They could provide a package manager, rather than expecting the Homebrew community to fix it for them. My issues are almost always related to command line tools
My own experience is different. Years ago I would wait at least 6 months or more before doing a Mac upgrade because of the inevitable problems. With Ventura I didn't even wait for the first point release, and had no problems whatsoever. (I did have two complete backups handy just in case.)
I'm not invalidating your experience, just mentioning that others have different experiences.
Same here. Traditionally, I always waited for the point one release before upgrading. This time with Ventura, for the first time I just updated to the point zero release and crossed my fingers. No problems, so I'm quite pleased.
For both iOS and macOS (and Windows, for that matter), it's a good idea to back up everything and reinstall the OS from scratch for each major version. Yes, it's a headache, but it will prevent 10x more headaches that result from upgrading from a previous major version. Many of the problems people have with Apple OS upgrades happen during the process of converting configurations from one version to the next. This is especially true if you've done multiple major version upgrades without starting from scratch at some point.
If you upgrade from Catalina -> Big Sur -> Monterey -> Ventura, for example, it becomes like a game of telephone. Small issues that happen with converting configurations accumulate. This makes me wonder if Apple tests new versions installed from scratch rather than upgraded over multiple generations like what happens in the real world.
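If you do go the clean-install route, recent macOS versions can at least fetch a full installer from the command line. A minimal sketch (the version string is just an example, and this assumes macOS 10.15 or later):

    # Download a full macOS installer into /Applications (may prompt for admin).
    import subprocess
    subprocess.run([
        "softwareupdate", "--fetch-full-installer",
        "--full-installer-version", "13.1",   # example version string
    ], check=True)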
I have a Mac that has been upgraded through, what, 20 major versions and migrated between 6 Macs since 2003 - never an issue, small or large. Never reinstalled from scratch. You must have had bad luck.
I do a full macOS reinstall approximately every 1.5 years, depending on external factors, mostly to be able to clean up obsolete apps/packages/etc. (After the reinstall I move everything into an "Old machine" folder and only move back what I need; the only drawback is winding up with an "Oldmac" folder inside the "Oldmac" folder inside the "Oldmac..." well, one day I'll clear those up.)
But I've never felt like needing to do that with iOS - the only 'reinstall' occurs when I upgrade devices (and then I restore from iCloud backup). What problems did you run into that were resolved by a clean reinstall of iOS?
Battery drain. I managed a big fleet of iOS devices around the iOS 6 to iOS 10 time period when battery drain was a common complaint (https://news.ycombinator.com/item?id=13054056). Completely wiping the phone and setting it back up without restoring the backup fixed those battery drain issues 100% of the time. If you wiped the phone and restored the iCloud backup the battery drain would persist because you restored whatever glitched config file was causing it.
This is still worth doing occasionally on Windows, but iOS users have never had to do this, and there's been no reason for Mac users to do this since the System 7 (pre-Mac OS X) days. Some habits die hard, though.
>The hardware is indeed great. The software experience is getting worse imo, particularly when upgrading.
Yeah, I feel the same. It's 180 degrees from a few years ago, when the MacBook specs were ho-hum but using MacOS was a big benefit (the physical hardware has always been great). The overall software experience with MacOS is now just "okay". The upgrade process is slow and there are a few UI quirks that other OS's just seem to do better.
Maybe it's my nostalgia/rose tinted glasses talking, but I recently used one of the old white plastic macbooks running Snow Leopard 10.6.8 and was shocked at how instantaneous the user interface felt in comparison with Ventura on a modern machine. Even small details, such as opening/closing windows and typing text into TextEdit feels like it has lower delay/input lag on the old hardware/software.
Had a similar feeling while playing with an old iPod touch running the skeuomorphic/pre-flat iOS when compared with a modern iPhone.
Snow Leopard was a high point for stability and speed. Unfortunately, changing expectations of integrations and feature demands meant they couldn't stay there.
What is rose-tinted is the OP's remembering the OSes of a few years ago being great. Mojave and Catalina had a bunch of annoyances and weren't as good at recovering from a bad install or a bad update push as Monterey and Ventura. Not that Monterey and Ventura are great either.
That's the thing. Hardware keeps getting more and more powerful, but somehow the computers its in don't seem to be getting any faster. UI responses should be instantaneous by now. They used to be. Why aren't they? How many cores do these things have? Why is one of them not dedicated to catering to my impatience?
I recently decommissioned my iMac Pro and wiped its storage, which involved a High Sierra reinstall via Internet Recovery (http netboot).
It's amazing how snappy it is. I wish Apple would offer levers to turn off all the new (and laggy) features in their OSes for those of us who just want a computer to be simple and run apps.
I feel like Apple has forgotten what makes their Macs so successful, which is the seamless experience. I always wanted to own the latest one but not anymore, will stick with Linux.
I have an M1 mini and an M1 Pro 14”, both with 16GB. I upgraded both to Ventura 13.1 and have really had zero issues with a variety of hardware setups.
The software _design_ however… yes, Apple is sliding. The new Settings is a mess. It’s so disorienting, even when I put aside the fact that I had used the old, NeXT-derived one for 20 years. You just can’t find anything without search, which adds steps.
There is a relative lack of attention to polish, discoverability, and charm. It’s all just made to be slick and shiny now.
The settings window in MacOS needed to evolve IMHO. Just like the old disgusting confusing Windows settings/profilers/doodads, it worked only for old Mac power users. When the young audience is coming from iOS, MacOS has to satisfy what they’re used to. And Apple needs to stay consistent to not become like Windows.
As I age and switch constantly between OSes and devices for work, search has increasingly become my primary way of finding settings and whatnot. The part of my brain that remembers things spatially just doesn't bother anymore.
Yeah, I can see this. However, I’ve been on iOS from the start (as a developer) and there is not a lot of sense to why some settings are organized the way that they are.
- “General” is a trash fire of hodgepodge
- “Home Screen” & “Wallpaper” are separate top-levels
In general, the top-level items are grouped by task, but the tasks aren't labeled, only implied:
- Radio signals (Airplane Mode, Wi-Fi, Bluetooth)
- Noise & distractions (Notifications, Sounds, Focus, Screen Time)
- Input, output, and appearance (General, Control Center, Display, … Touch ID, …)
- Purchasing (App Store, Wallet & Apple Pay)
- Apple apps (except for the first, Passwords)… Mail, Contacts, Calendar, …
- Apple apps with media (Music, TV, Photos, …)
- All the other apps
This is on iOS. Is it too much to ask for some commitment to naming of these sections?
Oh yeah agreed, hence my approach with search taking over. Who has the time to be pointing around ever changing menus. (Which fails of course when what you’re looking for can’t be easily named to search.)
But for a corporation making these decisions, I would venture that the questions around how iOS should do it better are a separate concern from this merging the iOS way into the Mac way.
The old Settings app was nowhere near the mess Windows settings apps are. And I'm not quite sure what's so complicated about the old settings for iOS users? It is way friendlier to non-power-users than the old Windows control panel, and IMHO actually simpler and easier to use than iOS settings (where I find it impossible to find anything without search, and search doesn't even work that great there...).
> I used to praise macOS for its stability. It's still on average very stable, but in the last ~5 years the upgrade experience has not been rock solid anymore. I must have used at least a dozen Macbook Pro/Air and Mac Pro in that time, so I know it's not just 1 buggy laptop.
If you think that is bad... don't get me started on Screen Time in macOS. It is 100% broken, in every way, for kids and adults, in the most inexplicable ways possible. [Example: The "internet filter" fails open after a few hours. You thought your 8-year-old was only on approved sites? MacOS says yes, but actually no.]
I have never seen any of the issues you're having here and I've had probably 20-30 iterations of apple products over the last 20 years.
The last true issue I had with apple hardware or software was in 2002-2003ish with a snow iBook that had a motherboard failure, but after that I've had nearly 20 unbroken years of basically fault-free existence which is pretty amazing if you ask me.
EVERY OS release will have some issues, and the people who are most impacted are the ones who chime in. If it "just works" for someone, they aren't going to chime in.
This is true of Mac, Windows, Linux, etc. I'd be curious to see a quantitative pre-release breakdown by OS and version. That might be more useful than individual anecdotes, as interesting as those are.
Okay, sure. Fine. Humans write software and humans make mistakes
*However* Apple specifically only targets a very small subset of devices. The amount of obvious bugs that slip through in every release upgrade against such a small handful of devices from a company as big as Apple is pretty disappointing
Windows has to run on the overwhelming majority of "weird" x86 devices on the planet. The amount of work they have to put in with their hardware partners is comparatively staggering
Rolling release distros avoid this by not having these giant catastrophic upgrades.
It seems weird that these companies, which presumably employ some programmers, have this convention of bundling up all the bugs together for some nice combinatorial confusion. Everybody knows you keep your code in a working state from commit to commit, so you only have to handle one bug at a time, right?
I’ll never be overly critical of anyone’s computing choices, but this just seems like the worst of both worlds to me unless you specifically want to tinker with Apple-ARM Linux.
Why not Linux on much better supported hardware from Lenovo or Dell? It just doesn’t make sense to me to buy such an expensive computer only to remove a core part of the package, Mac OS.
> Why not Linux on much better supported hardware from Lenovo or Dell?
Specific Intel laptops aside, something doesn't work on each of them either. These days sleep is a big issue with Linux on modern Intel laptops due to Windows Modern Standby. All in all, characterizing Asahi Linux as poorly supported is probably based on an expectation that does not necessarily match reality. It runs very well, and the GPU is about the only thing that is missing. Plus, Linux on Macs has always had the advantage of a big community focused on solving very standardized hardware issues. Once things work, they work for everyone out of the box.
I have a Dell Workstation from work (Specifically a Linux supported one, whatever that means) using linux and my personal Macbook Air M1 is running Asahi Linux.
I really ran into no issues, the experience was almost scarily smooth. Everything just works. Sleep is definitely worse than on MacOS but it’s better than my Dell.
I can’t name a single thing that is not equally bad or worse on the Dell.
Fair enough, I wasn’t aware they’d made so much progress. I was under the impression it was still a proof of concept and people were glossing over a lot of issues.
I guess that’s the benefit of only having to target a very small amount of hardware, it saves time for improving other things.
I remember preferring Linux on PPC to Linux on intel laptops years ago. Apple PPC laptops tended to “just work” with Linux in my experience.
Because Lenovo and Dell don't make laptops as sleek as Apple's.
I still ditched Apple though, I don't regret having traded battery life for compatibility. Arm64 remains a pain over 2 years later and for many years to come.
Yes! There is an OSS implementation of (parts of) the proprietary protocol for USB communication to iTunes so you can indeed get your photos. I say parts because some functionality is not implemented, like the offline backups feature.
Note though that with only one Apple device, you won't have access to iCloud if your device is stolen or you get locked out because you forgot your PIN or similar. If I log in to iCloud on my laptop, I first have to confirm it on a linked device... so if you only own one device, and that device is the linked one, you can't use Find My, remote locking, or similar. See also: https://news.ycombinator.com/item?id=34407087
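For anyone curious, I believe the OSS implementation mentioned above is libimobiledevice. A minimal sketch of talking to an attached iPhone through its CLI tools, wrapped in Python (assumes the tools are installed separately, e.g. via `brew install libimobiledevice`):

    # Hypothetical sketch: pair with an attached iPhone and read one property
    # using libimobiledevice's CLI tools.
    import subprocess
    subprocess.run(["idevicepair", "pair"], check=True)   # triggers the trust prompt
    subprocess.run(["ideviceinfo", "-k", "DeviceName"], check=True)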
I can't even resume my Ubuntu ThinkPad from suspend without the nVidia driver going absolutely off the rails. I'm generally happy with the laptop, but it's got basic issues that Macbooks solved decades ago. The touchpad is noticeably worse.
To be completely fair an apples-to-apples comparison would be an all Intel (GPU included) laptop or one with an AMD GPU’d laptop as their drivers are baked into the kernel and are 1st-class supported. This, to me, is akin to the Apple top-down, wholly owned model. Even Intel MacBooks stopped shipping Nvidia GPUs years ago.
But… I get you. To have a really good experience you need to have a really supported setup and then everything works swimmingly. My work issued thinkpad is the AMD version of the P14s and it’s brilliant.
You may control _more_ but not certainly everything. You choose from the options available to you. You don’t control the processor internals, chipset, networking hardware, or millions of decisions behind the software and hardware. I’d say you curate, but you don’t really control.
You're being needlessly pedantic: of course I don't control the stuff in the CPU or the firmware blobs in the Wi-Fi... but I control my OS and have the freedom to choose a different distro, or even a BSD, if I like. I love that I'm part of a community of fellow Linux hackers and nerds (and Thinkpad enthusiasts) instead of being beholden to a giant company dictating the direction the OS will take. It has its pros and it has its cons, but on the whole I like it a ton more.
It is interesting: I clarify and point out the limits of your (own) control, which you agree with, but you still call me pedantic.
You didn't have to write "everything". You could have said "more".
It is so easy for open source proponents (which I am, but not to an unlimited degree) to fall into these illusions that are entangled with imprecise language.
Perhaps the answer is not to shoot the messenger? Perhaps the cause of the negativity is the realization that we don't have that much control. You knew this at some level, but your choice of language downplays it. The big manufacturers are in the drivers seat, like it or not. This reality is uncomfortable. Better not to mention it, then!
If I was being too harsh I am sorry. I mean I am open to moving from the traditional AMI bios to CoreBoot but I am not a FOSS zealot. I mean I have to get work done. So I make compromises.
You're welcome! There's a really cool hacker that's been refurbishing old Thinkpads with the coreboot bioses -- https://minifree.org/ and the sales help fund the development!
My M1 MBP wouldn't reconnect to my monitor via USB-C until I unplugged the monitor and plugged it back in. Of course, I had to go digging on forums to find that out.
Permission issues also arise after every upgrade. I purposely wait until the .1 so that other people find all these issues but it's still super annoying.
To offer a counter anecdote: I’ve always had a rock solid upgrade experience in the past few years and remember how something would always break up until around ~5 years ago.
Upgrading used to be an event. Now it’s a meh-fest of clicking a button, waiting a little, and being unimpressed with the new features thinking “why did I even bother? Total waste of an hour”
As someone else mentioned, people have always had issues when upgrading. So much so that for a long time people recommended wiping and re-installing instead of upgrading. Some of the early macOS versions were also really rough comparatively.
I've been using macs for a long time and have never had any issues though. I also tend to run the betas very early on.
I don't doubt that you have issues though, I just think they have always been some % of users. And, now that more and more people are using macs that % turns into absolute larger numbers.
A certain percentage of ram and storage goes bad in ways that cause this kind of instability, and an upgrade will have a tendency to bring this out. I’ve had my own share of stability issues fixed by ram and storage swaps. Too bad new macs no longer allow this kind of repair.
The issue talking to displays was way worse with Intel Macs than it is today. Unfortunately there are definitely software issues with it. I think there was a thread on Twitter about Asahi Linux and how they figured out how to make something related more reliable than macOS by simply doing it correctly.
And why can't my Mac remember what screen applications were on when I unlock the screen? Every time: unlock, drag them back to the right screens, resize them. If someone has a fix for this I'm all ears, but it seems to me it should work out of the box.
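Not a fix, but one stopgap is scripting window positions back via AppleScript/System Events. A minimal sketch wrapped in Python; the app name and coordinates are placeholders, and the calling terminal needs Accessibility permission:

    # Hypothetical stopgap: push an app's frontmost window back to a known spot.
    import subprocess
    script = 'tell application "System Events" to set position of window 1 of process "Safari" to {1920, 0}'
    subprocess.run(["osascript", "-e", script], check=True)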
Actually, I've noticed that if I plug in the displays after I log in, the windows do get repositioned based on the screen they were previously on. I haven't looked into it at all though.
I've just upgraded my two Macs, one MacBook Air around 3 years old, and a more recent MBP M1. Both worked perfectly fine. What could be causing your issues, if not the OS?
Just to leave that n=1 here, I haven't really used a laptop without an external display (and USB keyboard, Magic Mouse, LAN adapter ...) in well over a decade, and I've never had any issues upgrading macOS, neither on Intel nor on M2.
> The software experience is getting worse imo, particularly when upgrading.
Many people chiming-in about the absolutely rock-solid stability of the last few updates, and I'm adding my vote to that.
I have had zero system-crashes or freezes for maybe 2 or even 3 years now. I tend to wait to hear what other people's experiences are, but that waiting-time is getting shorter these days. And the update process is also getting smoother: almost never any need to fix update issues with apps.
If I go back 10 or 15 years, I remember regular kernel panics, tricky restarts into 'safe mode', spinning beachballs, clearing the PRAM (don't even remember what that was about anymore - but it was needed all the time), and so on. It was an ongoing task to keep the OS running.
So this idea that MacOS is "getting worse" is just complete amnesia, in my view. I have never worked with anything so stable and powerful in my entire developer experience over the last 25+ years.
If we're talking about the price of these machines (and especially the measly disks and RAM on the cheapest options) - that's another matter...
I have both a latest-generation Thinkpad P1 Gen5 and a MacBook Pro 16-inch M1 Pro, and while it is true that the MacBook comes pretty close to being a perfect laptop, I still miss the productivity of a Linux laptop (for backend dev work).
From a hardware POV you are correct, though I wish Apple would adopt the carbon-magnesium chassis that Thinkpads use. For the same 16-inch size, the Thinkpad is noticeably lighter. I can happily lug my P1 Gen5 (16-inch) model anywhere, whereas the 16-inch MacBook Pro is hefty.
I have Asahi Linux installed on mine and it works great, including the GPU. I'm mainly developing for ARM64 AWS Lambdas now as well, so it's nice having the same arch. Some things are still missing, like the webcam, microphone and speakers, but the headphone jack works and Bluetooth is OK, just a bit choppy. Incredible project.
I know this isn't an Asahi Linux thread, but I cannot help to ask about it. Plus I am big fan of Alyssa Rosenzweig's work (and Justine Tunney), so I pay close attention to Asahi Linux (and LibCosmo/politan) progress.
First, it's great to hear from Real World people like you about their Asahi Linux experience. It sounds like the baseline is done and now they will pick away at the remaining pieces.
Real question: What is the driver for Asahi Linux to exist at all? Please don't think I am trolling when I ask this question. At 10,000ft, any sane person would say: "Why? It's Apple. Let them do them: Mac OS X." I expect Asahi Linux folks to reply: "Well, duh: Because."
Is it unlocking the insane performance per watt of Apple Mx chips for Linux?
Is it enabling the world's greatest laptops for Linux?
Is it the pure technical challenge of reverse engineering a closed hardware system?
Because your choice of hardware should be independent of the choice of software that you run on it.
This has been the world we've had since the concept of "IBM compatible" existed. Some people prefer Windows (because of available software, or ease of use) and some people prefer Linux (e.g. for efficiency, customisability or desire to run open source software). Why should that choice be tied to whether you've bought, HP, Lenovo or another manufacturer?
Apple has made some amazing laptop hardware, but Mac OS doesn't suit everyone. So well done to the Asahi Linux team for trying to take us back to that world of choice.
You still 100% should choose your hardware for Linux even on 'Windows' laptops.
Ideally it should run everywhere but in my experience you'll never get a positive Linux Desktop experience unless you tailor your hardware purchases to the Linux world - this usually means choosing a laptop that tons of other linux users are using, so the bugs are getting found and fixed, and documentation exists.
The key here is that it should at least run on the most popular laptop brands. It should run on Macbook Pro because it's incredibly popular hardware choice for software/technical people.
PowerPC was an attempt to standardize (at a least a subset of) the industry on a common RISC processor. There were even two attempts at industry standards for PowerPC motherboards (PreP and CHRP, the latter with Apple's active participation).
I have a ThinkPad with Linux on it that I bought for programming and software development and a 16" Macbook Pro w/ an M1 Pro chip that I bought for photography.
I only use the Macbook Pro. The speed, battery life, coolness (to the touch), and quietness make it extremely difficult to have any desire to pick up the ThinkPad.
A good motivation for Asahi is hardware longevity. Apple supports hardware for a reasonable amount of time for use as a primary computer, but its support window is obviously the worst among the 3 major operating systems, and that curtails the long-tail life of a system. In 7-9 years from now, Asahi (or some other Linux distro) will probably be the best way to keep an M1 Mac on an up-to-date and secure operating system.
So it works great except half the hardware doesn't work and it's entirely unsupported by Apple, to whom you paid a significant premium for the hardware?
It's a work in progress; users are generally confident that the remaining hardware will gain Asahi support sooner or later.
The fact that Asahi is such a popular project is a pretty strong indicator of how much room for improvement MacOS has, to put it as politely as I can. Personally, I wouldn't even consider buying a new Mac if there wasn't any good alternate native OS available.
iTerm2 and lima [1] with an ARM64 Linux virtual machine with Rosetta enabled (so I can still run x86_64 Linux code!) made me fall in love with my MacBook Pro in this regard. I feel more productive than on a Linux machine now. MacOS is not without its developer power tools, if you dig around enough!
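In case it's useful to anyone, here's the rough shape of that setup - a minimal sketch from memory, assuming a recent lima with the Virtualization.framework backend (flag spellings may differ between lima versions):

    # create an ARM64 Linux VM with Rosetta translation for x86_64 binaries
    limactl start --name=dev --vm-type=vz --rosetta
    limactl shell dev    # drop into the VM
    uname -m             # aarch64 inside the VM
    # x86_64 Linux binaries now run via the Rosetta binfmt handler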
For me MacBookPro with M2 chips would be perfect if the 13in model allowed 64GB RAM. My workflow requires me to run Docker (Intel images) heavily, with several containers. Having a virtualization layer is pretty resource-consuming; usually the Docker VM plus the containers I use take around 5GB RAM alone.
Under that configuration I've found 32GB to be just within the limit of usability; 16GB is just not enough. Also, for some virtualized Docker workflows, there is that known issue of slow file interaction through the Mac/virtualized file system, which makes working with, say, Magento, Drupal or similar software REALLY slow.
As a comparison, I've also got a 2011 Intel MacBook Pro with 16GB RAM. That one is running Linux Mint. Docker performance is almost transparent, and the system runs pretty snappy for development.
> For me MacBookPro with M2 chips would be perfect if the 13in model allowed 64GB RAM
The new 14" model is only slightly bigger than the 13" (.4"x.4" bigger and .5lb heavier) and supports up to 96GB of RAM, so it may be a reasonable consolation if those size and weight compromises are tolerable for you. Additionally, you'd get a better screen, keyboard, more ports, better wifi and bluetooth, and more CPU & GPU capacity. Of course the cost will also go up quite a bit.
It would also help if the virtualization story on Macs improved so that you wouldn't need to have all that RAM just to compensate.
There is no M2-series MacBook Pro with a 32GB RAM limit. There is the legacy 13in version with the regular M2, which has a 24GB limit (and IMO makes no sense at all - basically everyone would be better off with the M2 Air over that model). And there is the 14in model with the M2 Pro/Max, which has both 64GB and 96GB RAM options.
My personal laptop is an M1 Air (16gb)... for anything dev, I use VS Code + Remoting extensions and just work "on" my personal desktop (local or over vpn/wireguard). It means I can't really do much without an internet connection, but that's often the case anyway.
I will say that a lot of the software I run in Docker has an aarch64 bundle... there are a few things that don't. There's now a beta version of Docker Desktop that includes Rosetta support, which should dramatically help for x86_64 images. I've also found that the FS I/O is not great, but getting better.
(prior job was using M1 Max, current is windows+wsl)
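For the images without aarch64 builds, the platform flag is the usual escape hatch (a minimal example - the Rosetta acceleration itself is a toggle in Docker Desktop's settings, not a CLI flag):

    # explicitly request an x86_64 image on Apple silicon
    docker run --rm --platform linux/amd64 ubuntu uname -m   # prints x86_64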
What software are you using that doesn't have ARM versions available? I think it goes without saying that a machine with an ARM processor is not going to be ideal if you're frequently working with x86-specific software.
I've needed some scientific software that still doesn't have ARM64 libraries available, so I'm stuck on x86. RStudio, for example, has a lot of dependencies and only recently started offering experimental support for aarch64 on Linux.
I really don't understand why Apple is still selling the 13" MBP, as the MBP 14 is much better, and you can get it with up to 96GB of memory now. I would assume they keep the 13" just for people who still like the Touch Bar.
I bought a 13” M2 for work (freelance dev) last week. It’s a weight and size thing for me. The 14” (I’ve had one before) seems to just cross the threshold for what I can comfortably carry when I’m out and about.
My impression is that the air can’t “gun it” for as long because of heat. Sometimes I run intensive stuff and I’d prefer the machine to turn the fans on rather than throttle the processor.
With that said, the air is nice. It’s just not worth it to save 160 grams for my use case.
Chances are Docker would be a dog on that 2011 too, if you were on MacOS. I honestly don't know how Docker got so popular when it sucks so much on anything that is not Linux.
The population that uses Active Directory lives almost exclusively on Windows.
The population that uses FCP lives almost exclusively on MacOS (OSX is dead, long live OSX)
The population that uses Docker does not live exclusively on Linux. In fact, a minority of developers works on Linux laptops, even when they target it for deployment. It's a larger minority than in other sectors, but it's definitely a minority, not even a relative majority. The population of (web) developers is possibly the most evenly-distributed of all, in terms of OS.
> It's not the fault of docker or linux containers that people insist
Obviously, it's the fault of people - as I said in the comment above: I'm surprised it got so popular among people, considering how much it sucks on the platforms lots of them use.
Docker is something that should be cross platform far more than those examples.
The whole idea of using Docker for dev boxes is to eliminate the cross platform dependency issues and make it easier for everyone, without maintaining wikis for each OS version. We don't use Docker at work because of the non-Linux performance issues.
Shameless plug on this topic: I've been working on a new Linux+Docker+Kubernetes solution for macOS recently! Already has quite a few improvements over existing apps including Docker Desktop, Rancher, Colima, etc: fast networking (30 Gbps), VirtioFS and bidirectional filesystem sharing, Rosetta for fast x86, full Linux (not only Docker), lower CPU usage, and other tweaks.
Don't forget about the related project colima[1] that makes it easier to run Docker containers from a Mac command prompt using a lima VM to host the containers. I'm not convinced on using volumes with colima yet, but it does make using dev containers a lot easier with Mac native VS Code.
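The rough shape of a colima session, for anyone who hasn't tried it (flag names from memory - check `colima start --help`):

    colima start --cpu 4 --memory 8   # boots a lima VM with a Docker runtime inside
    docker context use colima         # point the docker CLI at that VM
    docker run --rm hello-world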
It would be perfect if Rosetta supported AVX instructions. For instance, I could then just install pre-built tensorflow binary packages in an x64 container on lima.
Man, I wish Rosetta (or Microsoft's ARM emulation) could support AVX. I understand it might be a patent issue. It's the one significant thing keeping me from running some of the software I'd like to on my MBP.
As a heavy Linux user for two decades I disagree. You can't even talk about Linux vs MacOS because the actual distribution you use matters a lot more than the kernel when you talk about productivity.
Using Mint, Elementary or whichever GUI distribution you would like to use with the Linux kernel has completely different tradeoffs in terms of GUI, which implies different productivity levels.
There is more. Even in the best-case scenario you need to customise the GUI much more than you need to customise MacOS; for example, energy management is a bit of a problem for most Linux-based distros.
This simply happens because Apple owns the whole vertical from HW to GUI and they focus on the user. They have done this for decades. There is no such entity in the non-Apple world and this means there is more work for the user when trying to create a working environment.
And Linux keeps failing because of companies like System76, which cannot seem to build, or partner with a company to build, this kind of competition.
Lenovo would crush Apple if it partnered with the Linux Community and made a cheap (in price!) everyday laptop for the consumer and focused on privacy and no lock in.
“Lenovo would crush Apple if it partnered with the Linux Community”
No it wouldn’t.
Users (the 99%) don’t care about privacy, tech, etc. They happily give their data up to TikTok (the CCP), Meta, Google, Amazon, Apple, etc. The Snowden revelations were met with a shrug or “I have nothing to hide”. This is well beyond established at this point (these platforms have billions of users).
They also don’t care about lock in - they just want stuff that works and adds as much as it can to their lives in as little effort as possible. The Apple ecosystem is incredibly successful because it does this exceedingly well. They view lock in as a positive - “All my stuff just works together - here’s more money Apple”.
The average person cares as much and is as involved in all of this as I am in understanding how my light switch is related to the function and operation of a nuclear reactor. The amount of care, time, thought, and energy I put into that is the cumulative total of the three seconds I spend interacting with light switches everyday.
I’m not disparaging users. I use the nuclear reactor analogy because I don’t care, I don’t need to care, and I shouldn’t have to care. I, (like most people) have only so much bandwidth for time, energy, and passion. I leave worrying and caring about all of that up to everyone from the people who mine uranium to the electrician that wires up the switches.
On HN we're the uranium miners, nuclear physicists, power companies, linemen, and electricians. We're the few people who do know and care about any of this "tech stuff".
"They also don’t care about lock in - they just want stuff that works and adds as much as it can to their lives in as little effort as possible. The Apple ecosystem is incredibly successful because it does this exceedingly well. They view lock in as a positive - “All my stuff just works together - here’s more money Apple”."
Probably the biggest myth about Apple these days, that it just works. Can I tell you the countless hours I have spent helping friends with their Apple and iPhone issues?
And besides, as technicians, our job is to GET PEOPLE to care about this stuff and lead by example. What good is trying to convince someone to not use Apple products if you are buying the latest Apple product?
And in my opinion, Google products "just work" much better and are more open to multiple routes of access to your data. At that, they are way better than Apple.
For another anecdotal datapoint - almost all of the people in my personal and professional circles heavily use Apple products (iMessage is 95% blue).
As the "tech guy" for most non-tech savvy people in my life I can count on fingers the number of times anyone has asked me for help with their Apple products. For the most part I look around and see people easily and happily using their Air Pods, iPhones, iPads, iWatches, and Mac Books.
Yes there are occasional minor issues and bugs but everyone knows to do a little dance of "turn it off, turn it back on", disconnect and reconnect, etc. Just like when I get in my car occasionally, I turn it on, and the infotainment display is wonky. I shut the car off, open the door, turn it back on, and say "Hmm, that was weird". Then I move on with my day. I simply won't accept that these kinds of occasional minor issues don't exist in the Google/Android ecosystem.
In the rare occasion something more significant has come up with Apple products and it's not a five minute or less fix I point them to the Apple Store and let them deal with it. Where's the equivalent for Google, Android, etc? I don't want to get into yet another tired Apple v Android "debate" here on HN but (for me) that's reason enough for me to recommend Apple products for most people.
As far as access to data, again, they don't care. How many GDPR or other requests do you think Facebook has gotten for dumps of people's data? I'm sure it's 1% or less of their user base. Again, as long as people can sign-in to their iCloud account that counts as "access to their data" to them.
It's perfectly fine if you see part of our role is evangelizing these things. For me I don't want to be "that guy" who's a half-step from the cryptobro cousin at Thanksgiving trying to get everyone to buy bitcoin. I'll occasionally drop things like "Oh Tiktok, say hi to China for me" or "If the service is free it's because you're the product" but I avoid harping and evangelizing. To use my power analogy I wouldn't walk into someone's house and start talking about how they should really be using Cree bulbs or Leviton switches. If someone has interest and wants to get into getting big tech out of their life, Linux, or XYZ light bulbs and power switches more power to them but I'm not going to push.
I don't think any amount of education or preaching I'd do is going to get someone to switch from Android to Apple (or vice-versa), Windows/Mac to Linux, pull Alexa out of their house, cancel their social media accounts, etc. Plus, just like crypto, if they do follow my advice I'll get to hear all about it in the event anything negative happens. No thanks but again - if you want to try to fight the good fight I support it.
What really pushed me towards Apple is that Lenovo offers a very limited set of configurations to us Europeans. No way to get the best machine from them, even if we're willing to pay for it.
I had the same problem with Lenovo here in India. I bought an Asus instead and got a Ryzen 9 6800 machine with 32GB RAM and a 3050 Ti. Amazingly good with Fedora.
I was looking to buy a Lenovo Thinkpad T14 Gen 3, and I could not get a better screen resolution than 1920 x 1200. I just checked and 3840 x 2400 is now available. They may have had supply chain issues. But it's too late for me.
Also, their ARM laptop is only available with Windows.
What is your Linux laptop giving you that your MBP doesn’t? And how does it hurt your productivity? Genuinely curious as I don’t see how it can be any different.
From my experience, as a longtime Linux user who got an MBP:
I really miss the desktop environments available on Linux. KDE is great, and even Gnome beats MacOS. The desktop experience on MacOS is a "death by a thousand cuts" situation:
- You can't move windows between desktops with a keyboard shortcut.
- Popup dialogs will rearrange your windows (if a popup is wider than its window, it will move the underlying window so the popup stays centered - I'm not joking).
- Mouse acceleration can't be turned off, which makes MacOS pretty awful for any mouse-centric workflow.
- Blocking animations fundamentally can't be turned off.
That's on top of things that aren't necessarily faults with MacOS, such as getting used to different keyboard shortcuts.
On Linux, I get native Docker, up-to-date coreutils (they're different on MacOS), more precompiled versions of software I use, and having the Linux desktop software that I prefer. (Finder is frustrating, Gedit or Kate for text editing is great, GIMP is much nicer on Linux, etc.) I also miss KDEConnect.
I don't use Xcode, but as I understand, updating it messes with git and python.
But the battery life really is amazing enough to make it worth it. I'm really excited for Asahi to progress to a point that I'm comfortable using it.
Apps like Keyboard Maestro or BetterTouchTool can resolve almost every Macos usability complaint that I've heard. Keyboard Maestro can move windows between desktops with a keyboard shortcut, for example, and there are multiple ways to disable mouse acceleration. For almost every missing feature or annoyance in Macos, someone else has had the same thought and developed a solution.
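The mouse acceleration one, for example, doesn't even need paid software - there's a long-standing defaults knob (undocumented, so no guarantees it survives OS updates):

    # disable pointer acceleration; takes effect after logging out and back in
    defaults write .GlobalPreferences com.apple.mouse.scaling -1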
I should note that I consider this one of the biggest flaws with MacOS. It really should not require someone to piece together disparate pieces of software to reach a usable state.
It's like using Arch Linux, except the software costs money, is proprietary, and people choose Arch because they would prefer their own config over the comforts and defaults provided by other distros.
Configuring a MacOS machine might require spending over $100 on usability software, providing personal information to a myriad of companies (Tools like IINA or iTerm2 are the exception and not the default.), and even after all that you still have a variety of unfixable usability issues.
KeyboardMaestro is $36 and BetterTouchTool is $22. With KeyboardMaestro, it's not clear what the license is (which makes it concerning for use in the workplace.)
> For almost every missing feature or annoyance in Macos, someone else has had the same thought and developed a solution.
I do appreciate the effort, but this isn't true. You can no longer disable blocking animations in MacOS, there is no Spaces API for instantly moving a window from one desktop to another, etc. And any of this can break with a MacOS update, and there's no easy way to automatically configure a fresh install. (IME, MacOS users use Time Machine backups rather than a fresh-install bash script.)
From someone used to the comforts of Linux, MacOS takes a huge amount of effort and expenditure to only get 20% of the way there.
KeyboardMaestro is $36, Hammerspoon can do roughly the same and is free. Best part is: there is no pendant in linux, mostly due to the moving target of system configurations and DEs.
I've never heard that word used this way, thank you!
That said, that makes sense. I don't have experience with AHK or HammerSpoon, but I'd expect this functionality to be very dependent on the display server and overall desktop environment.
Can it resolve me not being able to use MATE as the desktop environment?
My biggest gripe with Mac OS X is the window manager. I want to be able to alt+right-click anywhere on a window to move it around, and alt+left-click anywhere on a window to resize it. I want to be able to click the dock icon for an app I'm using and have the most recently used window of that app come to the foreground, instead of however it currently makes that decision. I'd love to get preview thumbnails of windows when hovering over icons in the dock so that I can select the window I want. Right-click plus reading window titles takes longer to find the window I need.
I want to be able to customize my fonts because I have a hard time reading text on my QHD external monitor.
I want a terminal that doesn't suck (and yes I use iterm2, it still sucks because I can't quickly jump to the end / beginning of a line to edit a command).
I want to be able to select text to copy to clipboard and use my middle mouse button to paste and I want that to work for all programs.
I want to be able to hold ctrl plus use the mouse wheel to zoom in / out of web pages on chrome ... something Linux and Windows both do out of the box but it just doesn't work on Mac OS X.
I want to be able to customize all of this and not feel like I'm locked in to "the Apple way" of doing things.
A lot of this is just familiarity: getting really used to a particular DE over decades of use and taking little things for granted. If all you've ever used is a Mac then I'm sure you've figured out how to be hyper-productive on that DE. I just find it strange that, from a company that somehow positioned itself as a UX leader... I find that I'm 1/10th as productive on my work Macbook as I am on any *nix device (and while the Mac uses a heavily modified BSD-derived kernel IIRC and has zsh and bash... it feels very different from a *nix machine to me).
In the terminal, use emacs-style bindings like ctrl-A and ctrl-E to move to the beginning and end of lines. On the Mac, Home and End are for beginning and ends of documents.
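And if Home/End are the muscle memory you actually want inside the shell, zsh (the macOS default) can bind them - a sketch, with the caveat that the escape sequences vary by terminal:

    # in ~/.zshrc; "^[[H" / "^[[F" are what many terminals send for Home/End
    bindkey "^[[H" beginning-of-line
    bindkey "^[[F" end-of-line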
Use USB Overdrive or Steermouse to bind mouse buttons to whatever you want, including Paste. Select-to-copy might be impossible, though you could certainly do select-and-click-to-copy.
Ah, focus stealing. And the lack of focus-stealing prevention.
Yes. This is why I could never settle down and marry MacOS. KDE and tiling WMs do this right; everyone else is just rude.
- No nagware (no Apple Music pop-ups, advertisements for Safari, login nag in settings, et al.)
- Built-in package manager
- Having (relative) parity between production and development
Between those three, you probably couldn't pay me to go back to MacOS. Adding my own package manager, disabling ads and making my Mac into a Linux-equivalent machine is possible, but it's a lot of work to maintain and set up.
If I was a creative and used Adobe/Microsoft tools, I might be a little nicer to MacOS. As a programmer though? I haven't felt the desire to use a Mac since Mojave existed.
I have an M1 Pro MBP and Linux running on a Framework laptop.
The Linux built-in package manager is only OK. It often lags behind on versions of things I need, so I ended up using Homebrew on both Mac and Linux: for the cases where the Linux built-in package manager is too out of date, I use Homebrew. It's not perfect on either system.
> - Having (relative) parity between production and development
For certain classes of development this is a big deal.
For my container work it doesn't really matter. I'm running Rancher Desktop and doing container based dev in the VM. Windows, Linux, or Mac doesn't matter as the host.
> - No nagware (no Apple Music pop-ups, advertisements for Safari, login nag in settings, et al.)
I must have learned to ignore this as I've had Macs for a couple decades now.
On the flip side, a lot of business software I must use for work isn't available on Linux. I think this is the biggest problem for GNU/Linux as a general OS. There's some biz software that just doesn't run there.
I have a 13" Macbook Pro and a Thinkpad model I forget the name of.
Homebrew is downright bad. There are certainly worse Linux package managers (pacman... looking at you), but if you're using MacOS I'd highly recommend giving Nix a try. Less muss and fuss, and it stopped me from sending my Macbook on a swim in the local river.
> For my container work it doesn't really matter.
That's fine, it doesn't really for me either. The nice part (for me) is the native Docker and fantastic filesystem support. Whereas MacOS feels like a product I'm turning into a tool, Linux systems tend to feel like a tool out-of-box. Different strokes for different folks though, it really just depends on what you want out of a computer.
> I must have learned to ignore this
I must have learned to appreciate living without it, then. It's pretty jarring returning to a monetized OS like Windows 11 or Monterey for me.
> a lot of business software I must use for work isn't available on Linux
Oh yeah, for sure. Like I said in my previous comment, I wouldn't use Linux if I was a lawyer or a video editor. That being said though, pretty much everything I've used in the modern enterprise is browser-based. You don't need a native Jira app or a custom .DMG to run git. Arguably, everything you need is shipped right with most Linux distros.
I won't (and haven't) argued that Linux is perfect, but MacOS is converging with the Windows and Google school of desktop design. It worries me, and it's part of why I left MacOS in the first place. Photoshop is nice, but living on a computer that feels like a rented hotel room isn't very satisfying to me. Again, different strokes.
A few macs ago (maybe around 2017), I switched to a strategy of “either the AppStore or brew”. I’ve never had a problem with anything from brew since. I install some productivity tools, standard OS tooling (Inkscape, Gimp, Libre), everything I need to develop for Python, Android, various embedded arm platforms, Elixir/Erlang. I even add some extra tools for Swift development.
I'd recommend checking out some other package managers for Mac. I'm being a bit harsh on Homebrew, but Macports is generally a better option IMO. The real crown jewel is truly what everyone says: Nix. It's just a brilliant, next-generation package management tool that does what it says on the tin. It works on MacOS, and allows for granular package installation/upgrading, ephemeral shell-based dev environments, declarative system management and more.
It's a bit like comparing cakes. Homebrew is a frosted sheet cake, whereas Macports is that nice double-layer box mix your mom used to make. Nix is a coconut-dusted 6-layer wedding cake that hides a 10 course meal under the fondant. They're all delicious, but I have a hard time going back to the sheet cake nowadays.
The thing that gets me about Linux package managers is how easily they can wreck your desktop. Granted mac package managers aren’t perfect here either but I think it would make a lot of sense for there to be some way to designate things like audio systems or your DE as “system” packages and as such be protected and very difficult to accidentally screw up with e.g. dependency resolution gone awry.
I’d also not be opposed to a package manager more geared towards making sure things work without fuss than trying to reduce redundancy. I don’t really mind if there’s multiple versions of whatever lib installed if that’s what it takes. Storage is cheap, my patience isn’t.
In theory flatpak and such should meet that need, but the implementation is so much more quirky and troublesome compared to e.g. Mac application bundles.
> some way to designate things like audio systems or your DE as “system” packages and as such be protected and very difficult to accidentally screw up with e.g. dependency resolution gone awry.
I do this on NixOS (and used Nix to do the same thing on MacOS). It's really great, but the up-front work of configuring everything can be a bit steep. The end result is pretty nice though - your environments are all sym-linked together from a common package store, and you can group together certain environments/package sets to update independently of one another. The icing on the cake is the rollback feature, where you can go back through the generations of your environment (until the packages get GCed).
It's not perfect (and it would test your patience), but Nix is an interesting commitment to the philosophy of using as much disk space as possible. I'm hopeful that someday it will be the de-facto package manager for Mac systems.
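Day to day, that rollback story looks roughly like this (assuming the newer `nix profile` interface; the classic `nix-env` spellings differ):

    nix profile history                           # list generations of your profile
    nix profile rollback                          # step back to the previous generation
    nix-collect-garbage --delete-older-than 30d   # GC generations older than 30 days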
There's no such thing as "the Linux package manager". If you want something traditional, dnf runs circles around anything available on macOS, and nixpkgs is from another universe altogether.
On my old MBP I regularly get nagged to update to Monterey, despite it not being supported.
On my iPhone it wouldn't stop nagging me to accept changes to the iCloud T&Cs. There was no permanent opt out. You could say no for a bit and then it would go back to nagging you.
Same with Apple Music.
Currently my iPhone nags me to disable background running of Garmin Connect, so that I lose integration with my Garmin watch.
None of this endears Apple to me and is definitely a consideration for my next purchase.
I have a Macbook Air M1 2020 and a Thinkpad X1 Carbon 7th Gen (4 years old!) running Linux Mint.
I have upgraded the Thinkpad's Battery, RAM (16GB), and SSD (500GB) for the cost of the parts. I will not be able to do that on the Macbook. Ever.
For what I do (browsing, videos, some writing and research) the only benefit to the Macbook is slightly better battery life. I have more software options on the Thinkpad for sure, and it does not want to control everything I do.
I paid WAY less for the thinkpad and I get pretty much the same performance for my needs.
The TRUTH is that most people are being way oversold computing power and paying a premium for it because they are locked into the platform.
And just yesterday Safari vanished and re-arranged my bookmarks for no reason.
Getting my MacBook ready to sell as a matter of fact.
In my mind it is stupid (and poor marketing) that the linux community is not crushing Apple with cheap, fast laptops.
A ThinkPad X1 Carbon is cheaper, lighter, has a better keyboard, and on Linux I can run the DE that has the defaults/customizations and keybindings that I am used to.
I also don't have to worry (or at least I think that I don't) that the Linux kernel or my distro silently introduces a hack for their programs to bypass my firewall and VPN because they couldn't fix some bugs by the company-mandated release date:
I work on container stuff, so maybe my POV is a bit different, but:
1. I had a hard time fighting the openssl installed by Homebrew and getting Python to use it. On Linux this is never an issue. In general, IMO, using Homebrew is fairly tedious.
2. Debugging stuff running on Linux. Sometimes logging is not enough, and while remote debugging can be made to work (I mainly use Goland), it is pretty fiddly and does not work reliably.
I am not new to Mac or anything, tbh. I used a Mac exclusively until about 5 years ago, then Linux for the next 5 years, and now I'm using a Mac again. IMO, for the kind of work I do, Linux is just leaps ahead of Mac.
Some workflows on a Linux system can be totally different from what is possible on a Mac. Even with apps like BetterTouchTool, Hammerspoon, Amethyst and others, customizing window management, advanced keyboard shortcuts, or general system behavior on a Mac is going against the tide. It kinda works, but never as well as you'd want, because you can't really get rid of the default window manager and default global behavior.
Some window managers on Linux are more like window manager frameworks - AwesomeWM, for example, lets you customize its behavior via Lua scripting. It's extremely powerful and allows you to get exactly the behavior you want.
But this part of linux is pretty niche stuff for sure though haha.
I wouldn't say I'm more productive thanks to this, but I'm way happier using a system I can set up so that it behaves how I want, instead of having to follow rules I don't agree with and can't change.
There are some really weird things. The only sustained usage of Macs I've had is a Mac Mini (x86) that I had for app development. Even just plugging in a plain old UK layout USB keyboard (not an Apple one) and having it behave itself and give me the right characters was surprisingly difficult.
Biggest things for me
- I've never found a good WM that can replace what bspwm can do.
- Something as useful as pacman; brew is nowhere near as good.
- Full control over my system.
Plus I can run my setup on a £30 chromebook and be nearly as efficient as on my ~£1800 work laptop.
Others have filled in some of the productivity benefits, but it also avoids a thousand little papercuts. It doesn't have non-removable crapware like Apple News.
A VM on the Mac will run longer on battery with less heat than a Linux installation on a native Linux x64 laptop.
Of course if you need to have your laptop be x64 to match cloud architecture then that's different. Or you could just use ARM in the cloud and save even more carbon emissions.
If you need to interact with connected peripherals from the linux environment a VM is still usually a much less productive experience. I do a lot of embedded development, and while it is often possible to do USB passthrough to connected boards, lots of things don't work the same as they do when running Linux natively, and I have never found it to be a seamless experience (on VirtualBox or VMWare)
Exactly. My strategy for this for my personal projects is mostly ARM VMs. In a couple of cases I have to run x86, and I run a used HP thin client at home as what I call a “physical VM”. I then use tailscale to connect from anywhere.
I’m not the person you asked—but a ton of developer tooling just runs better on Linux. Installation is usually easier too—on a Mac, there’s Homebrew, but Homebrew doesn’t have the best user experience.
All the GUI productivity apps run better on a Mac—image editors, IDEs / text editors, apps like Slack or Zoom or whatever.
Fwiw you can use nix (the package manager) and the nixpkgs package repository without ever learning or writing a single line of nix (the language).
My introduction was to use nix instead of brew and ignore the whole "declarative configuration" aspect. It took me a while to get comfortable enough with that part of the ecosystem, but that's irrelevant to my point.
Give nix a try, use it like you'd use brew. Ignore the declarative configuration stuff for a bit.
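Concretely, day one of nix-as-brew looks something like this (assuming nix 2.4+ with the `nix-command` and `flakes` experimental features enabled):

    nix profile install nixpkgs#htop   # roughly `brew install htop`
    nix profile list                   # what's installed
    nix profile upgrade '.*'           # upgrade everything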
If you're happy with brew, by all means continue using it. If asdf is getting you all the languages and tools you need without issue, by all means continue using it. My comment is just adding some details to the parent comment about nix running well on macOS.
EDIT: I doubt your question is in good faith considering the 'cult' comment, but to answer your question at face value regarding benefit the first one that comes to mind is being able to have multiple versions of the same package and being able to rollback to previous versions if something breaks. This also means you can have package A depend on package B v1, and package C depend on package B v2, and both can coexist. If this is not something that's valuable to you, that's fine too. The other killer feature is being able to install dependencies for a project/repository if it uses nix - just clone, cd, and run `nix develop` and you'll have all the dependencies available.
The cult comment was in jest. These are just package managers; don't take things so seriously, it is all in good fun.
Yes, I understand the multiple versions of the same package thing - Go/Rust have package managers that quite reliably solve that problem. Pipx also to a certain extent solves that problem.
Brew is useful mostly for casks (browsers, mac apps, fonts) which don't usually call for multiple versions.
I'm sure it is a hairy problem for some combination of languages/tools but I guess I'm somehow completely side stepping it. Perhaps I'm more likely to encounter it if I treated my laptop as a server because that seems more like where Nix might shine as a sort of ansible/chef/puppet on steroids ;)
Fair enough! Yeah, I was using brew as a replacement for something like `apt` or `dnf` on Linux. For example, installing packages like htop, neovim, emacs, etc. For things like Rust I stick with cargo (which is awesome), though I do manage my Go install through nix.
I’m constantly fighting Brew on all three computers where Brew is installed. Yes, I have wiped them, the problem is Brew, it is not the particular installation. If it were a problem with one particular installation of Brew, then it would not suddenly become a problem on a new fresh install of Brew.
The idea that you would need to wipe anything to start over is just bizarre to me in the first place. One of the many problems with Brew. Brew is extremely slow and it is prone to doing things that I do not want it to do without explaining why it is doing them or how I can alter its behavior (why is it installing package X? why is it updating right now?). Sometimes when I install a package, I get a spurious failure, and I need to re-run the installation command. Sometimes I just want to install one package, but it goes through “brew update”—which is extremely slow.
I used brew for years and it's a fine tool but I found it lacking. It doesn't give you control when you need it, and telling a user that something got b0rked along the way and asking them to start over isn't good developer experience (in all fairness, neither is the current status of the various nix cli's, but trade-offs am I right?). I've lost so many hours because something broke and sometimes it was because one dependency got updated and broke other packages and other times it wasn't obvious what broke, and I would have to start from scratch.
I never used brew to install or manage my python precisely because it gave me so many issues.
Can I just do “nix install foo”? I looked into nix ages ago but opening a text file any time I wanted to install some random package just felt too weird.
I doubt the chassis is adding that much weight. MacBooks are largely battery.
The MacBook Air is also significantly lighter than a pro despite having the same basic chassis.
I upgraded from the 13" M1 Air to the 14" MacBook Pro M1 Max
While Blender performance (rendering especially) is much faster, I sorely miss the MacBook Air's portability. The 14" is built like a brick. Compile times are faster, video editing is nicer with more RAM in the Pro, but it's a really tough call given just how different the weight is
I just got my work machine upgraded to the 14in Pro from the 13in M1 Air recently. I think overall the extra screen pixels are worth the weight tradeoff. It's only a little heavier and I don't really notice it on my back when I'm cycling
I did love the M1 Air though! Probably the best laptop I've ever used pound-for-pound.
Agreed, a high end machine in the Air form factor would be great. The M2 is more than enough compute-wise for my SWE workflow, but it's not a great deal once you start bumping the RAM and adding storage when compared to the base 14 inch (+ the extra benefits you get with the 14" screen, etc).
They aren't though. The fact that they are, ridiculously, comparing performance to the final Intel MacBook Pro - 3 years old at this point, and a dog turd - in their press release, with some very cherry-picked benchmarks, suggests things aren't quite as good as they could be. Why are they talking about 3+ year old parts and not the current competition?
My bet for why they're comparing it to the Intel models: They want to advertise more heavily to the Intel folks so they can more quickly reduce / remove support for multiple architectures. They are trying to convince the Intel folks that Apple Silicon will be much more performant. Also, they probably want people to spend money on their products sooner than they otherwise would have.
That being said, they are still comparing against the M1 chips here. They aren't only comparing against Intel.
“With M2 Pro on MacBook Pro:
Rendering titles and animations in Motion is up to 80 percent faster[1] than the fastest Intel-based MacBook Pro and up to 20 percent faster[5] than the previous generation.
Compiling in Xcode is up to 2.5x faster[1] than the fastest Intel-based MacBook Pro and nearly 25 percent faster[5] than the previous generation.
Image processing in Adobe Photoshop is up to 80 percent faster[1] than the fastest Intel-based MacBook Pro and up to 40 percent faster[5] than the previous generation.”
In other words, they're absolutely using the Intel machine to bury the lede.
Apple is now facing the same struggle as AMD and Intel. They have a generational processor product that relies on new silicon to get faster, and they can't get new silicon fast enough. The M2 was basically just a wattage bump of the M1, and looking at the performance increase per processor count, I'm inclined to call the Pro/Max the same thing. You're telling me you added 2 cores to a 10-core system for a 20% performance increase? That's 12/10 = 1.2x - pure core-count scaling, with essentially no per-core improvement. Wow, color me surprised.
The whole thing feels like Apple's lack of humility here is biting them in the ass. There's nothing wrong with paving your own path in the tech industry, but mocking your competitors is not what a reasonably-educated analyst does. It's what someone in marketing does.
File it under the "botched marketing campaign" folder, then. If you're selling a machine to professionals, the copywriting ought to speak their language.
They are selling to professionals, but probably not ones that speak your exact language. Having a solid understanding of hardware puts you into a small small minority of professionals that this was meant to be for. This is the same for Intel and AMD marketing: hardcore tech people are not the target audience for this material.
If you know what "color grading" is (and your job depends on doing it) you're probably a "hardcore tech" person that knows what the mystical abbreviation "GPU" stands for.
Professional writers using a word processor might not understand what "hardware-accelerated H.264" means, yet there it is on their website under Air! Are those for "hardcore tech" people?
You're referencing marketing material. All marketing material is to sell the product, usually to a somewhat targeted subset of the buyers. Looking at their sales, and around any office, their marketing, and "Pro" products, appear to be extremely appealing to a large range of professional users. I think this shows that their "Pro" label is appropriate. I'm having trouble understanding why you think it isn't appropriate.
Do you have a notebook in mind that is worthy of a "Pro" label, so I can see what your baseline is?
You defined a "Pro" as someone that's "using your computer to make money", like writers using a word processor. Let's suppose all they do is use Google Docs in a browser.
Should they buy the Air, or pay the premium for the Pro? After all, they're "hardcore tech" people.
Because then only people with Intel Macs care. People like me, who remember the 2019 i9 Macbook Pro being one of Apple's least-recommended laptops, are kinda unfazed by favorable performance comparisons. There are probably hundreds of laptops that can destroy it in a fraction of its power envelope; it wasn't a great machine.
Mostly, it's ironic that Apple is comparing themselves to a company that they want so desperately to escape. And, once you do run the math on their actual year-over-year improvements, it's not that different from AMD or Intel. And Apple is increasing their power consumption to hit those numbers. It's quite literally square one in a sense, just with a different CPU architecture and a different manufacturer.
I just got an M1 Pro MBP for work last week, so I was kicking myself a little when I saw this. Then I was less disappointed when I saw the only difference is a 20% speed increase for some apps - no changes to things I would actually care about, like adding a couple of USB-A ports, and nothing like a >50% speed increase.
I do all compiling and heavy testing on servers, so the laptop is just for SSH, analysis software, and regular desktop stuff, for which it is fast enough.
You’ll have to wait until 3rd parties do those comparisons?
Apple has literally never posted “here’s a ream of benchmarks vs the Dell XPS XYZ and HP modelnumbersneeze738462” — pretending this is indicative of some grand conspiracy to hide performance deficiencies of these new M2 models the way GP is, is silly.
Realistically, it’s expected to perform relative to today's competitors about the way the old models did against theirs.
More realistically, this is one of the more drab ways apple announces products. They didn’t even have a presentation or anything. So they’re clearly not marketing with the same goals in mind as a presentation. I’d say the intended audience here are people already going to buy the latest model because they’re in the apple ecosystem.
Not that I’m inclined to defend a multi-trillion-dollar company, of course. It’s just tiresome when people hop into this sort of thread to grind the old axe against Apple.
I am not sure if most people buying MBPs are interested in the synthetic benchmark performance as much as they are interested in energy efficiency and vertical integrations.
Things may be great on the high end of Apple's M1/M2 laptops, given those actually have enough RAM for simple tasks. But those are also insanely expensive. At the low end Apple is actively ripping off users providing computers which can't even edit photos.
My father, a long-time Apple user, decided to get an Apple M1 to replace his old Apple Macbook. He bought the M1 8GB model (without consulting me). Turns out that the Apple M1 w/ 8GB uses ~4GB of that just to run the OS; install Adobe Creative Cloud and there goes another 2GB, and if you then use some other "darkroom-alike" program to open a 20MB photo suddenly you're out of RAM and the software starts crashing.
Apple M1 8GB computers, with RAM stuffed by Rosetta translation and a lack of native software, cannot even edit photos. They're only good for surfing the web and sending email. Apple selling these as actual functioning general-purpose laptops is a fraud.
This is kind of a crazy story to me, and likely pointing to other issues. The "4GB just to run the OS" is probably just macOS preemptively caching... and all Macs use swap extensively so using e.g. 10GB of memory actively is totally fine. I've run out of memory before, but only with serious usage (100s of tabs, VSCode, a 100 MB+ pdf open, etc), and even then there was no actual crashes, I was told that I was using too much memory and suggested some programs to close. What was the "darkroom-alike" program?
Sounds a lot like Lightroom. Last time I used it (on Windows) it was a couple of years ago and while it was incredibly convenient for editing photos quickly, it was massively unoptimized. Drop a couple of pins and Lightroom starts hogging cycles and memory. I wouldn't even joke about running it with 8GB RAM let alone 4.
All that aside, it's borderline criminal that any new computer is being sold with 8GB RAM. This is 2023, for crying out loud. Do you want a 56K modem to go along with your minuscule amount of memory?
Sounds like you have Activity Monitor open constantly. Personally I wouldn't worry so much about RAM usage and instead let the computer do its thing. I have done intense tests with my 8GB and it handled them just fine (compiling Linux while having dozens of tabs open + recording the screen + a UTM VM, etc.).
The only reason I got a Mac for the first time is because the 8GB had good value. I would never buy the higher-end Macs because spending that much money on a fragile thing that cannot be upgraded is ridiculous to me.
Interesting. I have one of those computers with 8gb and have not had problems with Lightroom, Photoshop, or Autodesk fusion360. I'm not saying that there are no issues, but the m1s in my experience seem to need less RAM to do the same things.
I've been on an 8GB M1 for two years, running Adobe CC the entire time. I've edited absolutely massive photos - Gigabyte TIFFs, batch processing, etc - and never had any crashes.
In fact I've done the above while playing HD video from YouTube with 40 tabs open and not experienced so much as a hiccup.
To me this sounds like issues on the Adobe side. I worked for 6 months on an M1 iMac with 8GB ram - it ran Xcode, iOS Sim, figma, slack, safari, terminal, calendar and zoom simultaneously for work on a large iOS app with zero issues. I used it because it was faster than using my i9 MBP 16” with 32GB of RAM - the MBP would be blasting its fans constantly with the same workload.
Now I’m on an M2 Air and I don’t see myself needing to upgrade in the next 2-3 years.
Tell me you don't have anything to contribute other than a weak, ten year old meme without saying that you don't have anything to contribute other than a weak, ten year old meme.
Yes. I feel that the tech media has done a terrible disservice by recommending the 8GB Apple M1 laptops. I am not sure what it is about that architecture, but they are completely unusable under load in a way that's not true of my 2015 i5 laptop, which also has 8GB of RAM (it's slow all the time, but not as bad at its worst as the M1 was).
I believe that the M1 architecture is such that you don't actually need as much RAM as an older model computer. It is a system on a chip so moving data between RAM and storage is so fast that it can swap between the two without much performance degradation.
> It is a system on a chip so moving data between RAM and storage is so fast that it can swap between the two without much performance degradation.
How much faster? I’m not sure what your specific point of comparison is.
Sure, the SoC provides many advantages, many due to Apple’s UMA (Apple’s Unified Memory Architecture) but the system is still limited by storage speed and latency.
But if we’re talking about the relative performance difference between solid-state storage (i.e. SSDs, if that is the correct technical term here) and UMA RAM, I’d expect* a significant difference, perhaps (1) a few orders of magnitude in terms of latency and (2) an order of magnitude in terms of bandwidth. Not to mention that excessive SSD writes would be unwise.
Now I will admit, if there are large parts of the operating system that don’t need to linger / lurk / creep around in RAM, they could be swapped out. And they might even be kept on SSD and not have to be rewritten at all except for patches and upgrades.
* This is a guess based on ‘usual’ data-locality rules of thumb, hopefully allowing for Apple architecture somewhat. I’m happy to be educated / corrected.
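To make the guess concrete with one plausible set of numbers (illustrative rules of thumb, not measurements - call the SSD ~5 GB/s with ~100 µs access latency, and unified memory ~100 GB/s with ~100 ns latency):

    swapping 1 GB in from SSD:  1 GB / 5 GB/s   ≈ 200 ms
    reading  1 GB from RAM:     1 GB / 100 GB/s ≈  10 ms   (~20x quicker)
    access latency:             ~100 µs (NAND) vs ~100 ns (DRAM), ~1000x

So streaming whole pages back in from swap is tolerable, while anything latency-bound would still feel the difference.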
> At the low end Apple is actively ripping off users providing computers which can't even edit photos.
> open a 20MB photo suddenly you're out of RAM and the software starts crashing
Have an M1 laptop with Photoshop and Lightroom and need to edit 16-bit film-scans from a 6x12 that are more than a GIGABYTE each and have had zero issues.
Def the opposite experience for me software wise. Latest joy is chrome hanging whenever tabs are moved. Things randomly seem to crash all the time.
Maybe it’s normal for OS X but my Linux experience is pretty much the opposite - rock solid, everything is fast, and nothing breaks. Granted the hardware does seem really good (except for slow anything to do with x86 and a useless GPU that can’t play games or do VR).
For me, it's the WiFi that is completely botched. The status bar will frequently display incorrect statuses and trying to select a network from the status bar hardly ever works. I have to go into system preferences which has a tendency to totally freeze up for minutes at a time when I select the network tab.
Then it'll occasionally get into a state where it tells me it's connected to the network but doesn't have Internet access, and that it must be a problem with the network (I know it's not, because I have other devices connected to the same network that work fine). To get out of this, I have to turn WiFi off and back on again, but Apple in their infinite wisdom has decided that if I turn it off and back on too quickly, I must've done so by mistake, so it never actually turns off. So I have to turn it off, wait some unknown period of time (I usually wait 30s), and then turn it back on. Drives me crazy.
In the end, every morning when I get to work, I expect it to be roughly 5 minutes between when I arrive and when I'm actually online.
My personal computer running Linux has never had such a ridiculous issue (it does occasionally have issues, but they're usually fixable after a quick Google search), yet somehow everyone claims Apple just works and Linux is too much of a pain.
I got an M1 for a recent job change, I've never interacted with one before that and my personal laptop was a relatively old macbook. The thing that amazed me the most was the speed. I would actually get confused because it did things so much faster than I was used to, I would sit and wait for something to happen only to realize it was so fast I had missed it occurring.
Look into Gallium Nitride (GaN) chargers. I got a 100W charging brick on Amazon for a little over $100 CAD (a bit pricey, yes), with 3 USB-C ports and 1 USB-A. There are cheaper models with fewer ports / lower wattage as well. A single brick fast-charges my M2 Macbook Air, my phone, and my headphones.
Those things are crazy compact. I have a tiny Anker 30W single USB-C charger for my M2 Macbook Air and iPhone for when I don't expect to need a charger and it's hard to not lose it because it's so lightweight and small. The cable is a lot heavier and bulkier and I already bought the lightest one I could find.
I'm also still on the MacBook Pro M1 and it's amazing. But this new one makes me drool. I am happy to see MagSafe back and the SD card slot and USB-C ports on both sides.
Now someone grab my arms before I dump water on this keyboard.
The comment you replied to says that their MacBook M1 does not have those features.
It does not have an SD card slot nor MagSafe because it was a 13” M1 MacBook Pro. Nobody claimed that their MacBook was the latest chassis, just that it had an M1.
> The M2-based 13" still has two USB-C on the same side and no SD
One can easily interpret their comment to say that they are happy it still has USB-C on both sides. Most non-MacBooks have one or two ports, generally on one side only. With the addition of other ports, Apple could have very well removed some of the USB-C ports.
I’m not sure why you are trying to insist that they don’t know their own laptop. They even provided you a link showing the model did in fact exist.
Enjoy it as long as you can. As soon as this kind of hardware is close to becoming commodity, software will get as slow as it is currently on commodity hardware.
The software experience is funky, and I am really disappointed by the lack of support for external monitors...
I happen to have 2 identical monitors, and a third that I can't use on a Macbook Pro with the M1 Pro. They're in an H configuration: portrait, landscape, portrait (the last two are apparently the same model, though different years).
The Mac constantly confuses the two identical monitors (one is rotated 270 degrees, so it ain't great). I've tried two USB video card adapters; one wouldn't work at all, and one from Plugable couldn't rotate the missing monitor (rotated 90 degrees). If I ask it to drive the middle horizontal monitor, it mirrors with the matching monitor (again an issue because it's rotated 270 degrees).
It drives me nuts; there's no fix, and it's a gamble which orientation it'll choose. Making things worse, it also won't wake from sleep with the lid closed without a lot of trickery (the only reliable method I've found is re-attaching the power cable... not great when I need to do this daily).
My old Intel Mac? Hit the spacebar on my USB keyboard to wake up. No issues with 3 monitors, and it could even drive the internal display as well.
I don't care about 2 external 4K screens - some of us are still on 1080p and don't care. How did we regress like this?
My 2-year-old Tuxedo notebook cost less than half of what the M1 MBP costs and renders any given Blender scene 3 to 4 times faster.
I've considered getting an MBP, but quite frankly I wonder if I could really let it crunch "pro" workloads overnight, because either they have infinitely silent active cooling (doubtful) or they throttle the thing (bad). If I pay for top-notch hardware, I want to be able to have it crunch at full blast for as long as I desire, if I need it.
I’ve never had an issue with a 20w Anker Nano charging my non-pro M1. Feels like the future charging my laptop with an adapter the size of an iPhone charger.
I wonder how long until Jevons paradox catches up and degraded software quality eats all of the performance gains afforded by the new hardware. The 2020s version of "Andy giveth and Bill taketh away".
>It’s crazy how far ahead Apple is getting in the laptop game with these new chips. I have the first gen MacBook Pro with the M1 Pro (ok, maybe they’re not winning the naming game), and it’s a perfect computer. Battery life is literally all day, every single action is instant, and I have zero regard for how many apps or tabs I have open
I can say the exact same thing for my Linux XPS, in fairness. My M1 battery will last longer, for sure, but both will last a whole work day. But because the M1 can't run Linux to a good standard, it's mostly just used to test/debug OSX related issues.
Losing external GPU support with apple silicon has become a major deal. In the age of AI tools like stable diffusion, performance per watt matters less if you lose the capability to scale up these kind of workloads. Gurman's report about the mac pro tower gave us reason to hope on this (that the upcoming mac pro will have eGPU support), but when it comes to MacBooks, let alone the pros, no one knows when/if the support will arrive and when/if nvidia support will be added.
You typically wouldn't use a Macbook for that though. If you want to do AI research, buy a gaming rig (< $1000) or rent a cloud GPU instance and then remote desktop or SSH in from the laptop. Then you get the portability and battery life of the laptop, and once you start training you can put it to sleep and the remote instance will just keep humming away.
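The remote half of that workflow barely needs tooling - e.g. (hypothetical host name, assuming tmux is installed on the remote box):

    # kick off training in a detached session, then close the laptop
    ssh gpu-box "tmux new -d -s train 'python train.py'"
    # later, from anywhere:
    ssh -t gpu-box tmux attach -t train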
> It’s crazy how far ahead Apple is getting in the laptop game with these new chips.
Sure, but developing with Java up to 16 is much slower on M1/M2 due to ARM ports not being available, and Rosetta having to emulate. Datapoint: we develop plugins for Confluence and it’s about 1 minute to load each page.
I don’t think developing Java 11 is a corner case of using a Mac.
Depends on the resolution of your external display. I had a 144 Hz 1080p monitor working as expected over a USB-C to DisplayPort connection (it didn't work with an HDMI cable).
It is if you're using Safari. I have two M1 Max MBPs. My personal machine runs Chrome, and my work machine runs Safari. I can run my work machine for well over a day doing iOS development in Xcode with an external monitor attached and not run out of battery. The personal machine will last almost all day (without a monitor attached) but not quite. I'm sure if I used Safari it would last much longer.
I've tried both. Safari does some neat tricks to reduce usage when pages are not active. But if you have a heavy webapp - you have a heavy webapp, and power will be used.
> It’s crazy how far ahead Apple is getting in the laptop game with these new chips.
For people who don't mind an ultra-brittle screen, or who don't mind paying an extra 20% Apple tax to get their screen replaced when it breaks. I had an M1 Mac. The screen broke overnight. Meanwhile my LG Gram feels way more sturdy and hasn't let me down in years. The M1 Mac lasted about 13 months.
Not happy at all with Apple asking 680 EUR to replace the screen on an M1 Mac I paid... 1,000 EUR for. That's 68% of the price, after 13 months, to replace their shitty screen.
And to get what? Another brittle screen.
Sure, it looks nice. But my LG Gram wins, hands down: I can count on that thing.
> For people who don't mind an ultra-brittle screen
Have had an M1 MacBook Pro for 2 years now. Dropped it several times from various heights, up to 4 ft. Dropped books on it a couple of times. Screen still intact. "Ultra-brittle" is nonsense.
(And in the 10 years I've owned MacBooks, never have I broken a screen despite dropping them, etc.)
The M1 Air I'm typing on was knocked off a desk onto the floor by a golden retriever who tripped on the charging cable while running after a ball. It flew about 2 m horizontally, then dropped the height of the desk. There's a very noticeable bump on the corner of the case. The screen, though, is as good as new.
Since you say EUR, I suppose you're in Europe, which means your broken screen should be covered by the EU rules mandating at least a two-year guarantee:
> You always have the right to a minimum 2-year guarantee at no cost, regardless of whether you bought your goods online, in a shop or by mail order.
> This 2-year guarantee is your minimum right, however national rules in your country may give you extra protection.
> If goods you bought anywhere in the EU turn out to be faulty or do not look or work as advertised, the seller must repair or replace them free of charge or give you a price reduction or a full refund.
> For people who don't mind an ultra-brittle screen, or who don't mind paying an extra 20% Apple tax to get their screen replaced when it breaks. I had an M1 Mac. The screen broke overnight. Meanwhile my LG Gram feels way more sturdy and hasn't let me down in years. The M1 Mac lasted about 13 months.
Not to be snarky, but is this a common issue? It's not one I've heard people complain about before; the screens on my current-gen and prior (2012?) MacBook Pros seem very solid, so I'm surprised this is considered a weak point...
That's unfortunate you had a terrible experience, but in all fairness, you're one data point. You can simply google "most reliable laptop" and see that Apple is consistently ranked #1.
As for my own single data point, I have the following in my house and none have had any issues whatsoever:
- 2012 MacBook Pro
- 2017 12 inch MacBook
- 2019 MacBook Pro
- 2021 Intel MacBook Pro (Company machine)
- 2021 M1 MacBook Pro
They're not even that expensive as tools go, either.
My Festool Domino was $1200. It makes one particular hole in a piece of wood. It's very, very good at it--but it doesn't replace my tablesaw ($600) or my bandsaw ($700) or my tracksaw ($400) or my drill/driver ($300) or my jointer ($600) or my planer ($900) or my drill press ($500) or or or--and I did not buy particularly spendy variants of any of those, except the Domino. (I could've easily spent $2000 on a tablesaw alone.)
Contrast this to a $2500 Macbook Pro, which is a future computer from beyond the moon that can do a ton of revenue-relative work without even kicking on its fans, and can watch Netflix besides. It is also small; it provides all those benefits in a thing that can fit on your coffee table, without the ongoing costs of storing actually expensive tools (edit: as mentioned downthread, for me that's about $550 a month based on my mortgage's square footage calculation).
Computer people complaining about computer costs when they're paid six figures for what they do with them is...it really needs to stop. It's silly.
That's an interesting example. Festool are definitely premium equipment.
I (try to) buy quality tools but I can't really justify Festool for my bodgy DIY attempts. I can barely justify the Bosch Professional tools I tend to buy instead.
My friend, an actual professional who fitted our kitchen, exclusively uses Festool. He can definitely justify it and swears by it. He says it's quicker and the tools suck up more dust which means less time cleaning up and less exposure to lung disease.
He'd never consider buying a MacBook Pro though as he thinks Apple is for poseurs ...like me.
I'm a professional software engineer and not a hairy arsed builder[1].
I spend roughly 8-10 hours a day on a laptop and I want a nice one. I'm probably going to switch from my Dell running Linux back to a MacBook Pro.
(That's for the shiny logo to make me look cool in coffee shops and not the battery life, performance or general attention to detail obvs.)
With regard to useful lifespan, I have an MBP from 2013 that's still going strong even if the OS updates from Apple are getting sparser.
-
1. This, apparently, is the vernacular used by friends in the trades to describe a "proper" tradesman. You live and learn I guess. ;)
I only bought a Festool Domino because there's literally no other tool that does that. It's amazing; the thing is like cheating for panels or for trim. Everything else is a mix-and-match. My tracksaw is a Makita (uses the same rails as a Festool, feels fine, isn't Quite As Nice). My shop battery tools are DeWalt, my house battery tools are Skil 12V (which are really nice for the price, even if they're not as small as other 12V brands). My tablesaw is a Delta (also sold as a Ridgid). Though I've thought a lot about going to Festool for a hand sander, mine are...Bosch. ;)
I do agree with your friend about dust collection though, which is why I use the tools I do use. Dust is the absolute worst, and it is worth spending money to solve that problem IMO. The thing about Festool (and also Fein) is that it's good at solving those problems portably, and in that light they're not that expensive. If you aren't a contractor and don't need to bring your entire work life around in a Systainer stack though, something like my 1.5HP Harbor Freight dust collector and a simple dust cyclone can get you very far by themselves. (They are, however, loud.)
I've also gotten into 3D printing, and I've been building better dust accessories for my other stuff. (For example, I've got a set of 4" dust couplers in development that are a WIP but are noticeably better than any of the options I have found online for a reasonable price and that work in a home shop without ducting.)
Consider how expensive it is to store and operate those tools. Square footage isn't cheap most places; my wood shop probably costs me about $550 a month in used space! In that light, refreshing a laptop every five years or so (on the low end for my usual cadence for Macs; both my 2016 and 2011 Macbook Pros are still in use for dedicated tasks) amortizes out a lot more pleasantly.
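(Rough numbers, using the $2500 machine above and a five-year refresh: $2500 / 60 months ≈ $42 a month, versus ~$550 a month just to house the shop tools.)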
I haven't thrown out a MacBook in my entire life. All three are still in use, and two of those were bought used in the first place. I also have three iPads, going back to the iPad 2, that are still in use. I can't say that about the Android devices I've had over a similar span, such as the Nexus 10, whose putrefying back case makes it unpleasant to touch. And while I still have the PC cases I've had for a decade, the rest of both machines has said hello to Theseus at least twice.
You maintain them because they're made to be maintained.
It's pretty hard to wear out silicon. That's why the batteries are glued in and the cases are made out of glass: to make the machines wear out artificially.
... Huh. That is one Apple complaint I've never heard. Had various Mac laptops since 2004 and never seen a screen crack (on my own one; for what it's worth, I do know two separate people whose Apple laptop's screen cracked because their _cat bit it_, but I assume that can happen for any laptop with a sufficiently weird cat).
And, really, a screen is a screen is a screen; any laptop screen is likely to be about as fragile as any other.
The problem isn't that the screen breaks; the problem is the ridiculous repair prices.
Repair prices are artificially inflated because there's no competition and Apple doesn't sell spares, and I think they keep prices high on purpose to push their insurance product, AppleCare. (Which is a great business for Apple, because the repairs cost them much less than what they charge uninsured customers.)
For what it's worth, my MacBook Retina has survived mostly unscathed since 2012 with heavy daily use. Two battery replacements, fraying charging cables and some worn-out key faces; that's about it. Then again, maybe later MacBooks are less sturdy?
I've probably got the same model as you - and yeah, apart from two battery swaps, a new power brick, and the occasional fur removal (cats), it's a trooper. I've dropped it plenty, so it has aerodynamic corners, but it has never let me down. Battery swaps are far easier than iFixit etc. describe - I just used fishing line and a couple of sticks as handles to cut through the glue, without having to remove anything but the bottom case.
That said, I have heard frequent gripes from people with newer models about things like hinges, ports coming loose, and keyboards failing - I mean, from my experience with their software quality over the years, I wouldn’t be surprised if their hardware design, while having plenty of great new tech, lacks attention to detail.
My toddler would stomp and throw my old MBP 2015 and I wouldn’t worry a bit since I know this model is durable enough. I can’t say the same with the latest models.
"They tend to not sacrifice longevity over flashy technologies."
What?
In what universe is this true?
They spent generations of MacBooks on flashy keyboards that didn't work, etc.
Trying to paint Apple's inability to keep up in certain areas as them caring about longevity is hilarious.
In the end, it's not about longevity at all; it's almost certainly about whether they can get the part at the price point they want, in the volume they want.
Well, when they screw up design, they do it in a spectacular fashion, yes.
Considering I buy a new MacBook every decade and they just endure the torture I put them through (I don't abuse them, though), I can say that they are pretty well-built machines.
The same is true for my 7-year iPhone cycle.
I pay a similar price to top-tier laptops from other big brands, and I get what I pay for: use the machine for a decade, move it to spare duty, and get a new one. When someone does this with a ThinkPad, we applaud them. When someone does it with a MacBook, we don't believe them. Funny.
No, they make engineering mistakes while trying new things. We call companies stagnant when they don’t try, and we forgive them for trying if they are not Apple. But everyone loves to hate Apple.
I'm currently using a 2014 MacBook Pro and a 2020 M1 MacBook Air. I have also seen and supported a lot of intermediate machines at work. We don't service them unless somebody spills liquid on the keyboard.
So, while I have owned only a few of these things myself, I have seen and supported way more machines, and my judgement stands based on this experience.
No. Let's not move the goalposts. Those are two different things. One is not incorporating immature technologies into the product; the other is trying to push a known technology further. They tried and failed for 4 years, and gave up.
If OLED were already a mature technology for large HiDPI panels and TVs, people wouldn't have been doing panel endurance tests, and TV manufacturers wouldn't hide panel-life-sucking balance and recalibration features in their products.
I have used a couple of Macs with butterfly-switch keyboards. The problem was not the switch idea itself, but how the key depth was reduced to make the machine thinner. They went beyond the comfortable depth for fast typing and made a mistake. Also, the minuscule tolerances made the keys susceptible to jamming from dirt. I didn't like the keyboards with butterfly switches, to be honest. The depth and weight on the 2020 MacBook Air (M1) are much, much better.
Yes. We criticize every company, but I think everything should be balanced. I'm not a fanboy of any company. There are ones I enjoy using, and ones I prefer not to use. The gist is, I try to reflect my opinion objectively in my comments, and when I get replies in the tone of "my experience does not match yours, so you're most probably lying", I'm genuinely disappointed. I never think the other party is not telling the truth; I'm just pointing out my own situation and saying "I'm in that state, so it's possible". I never discount the other party's state and/or experience.
You can leave personal attacks out of it (they only weaken your position).
> The first we and second we is different [sic]
Nope, objective critics criticize Apple and other companies when they make terrible decisions. Apple's vendor lock-in means that people who use a lot of Apple products are typically defensive and will make excuses for them. Google fans do the same. I'm typing this on an Apple product...
Butterfly keyboards and MotionSense both worked poorly.
I have used these two "we"s in different contexts. Let's not move the goalposts, again.
As far as I remember, I also criticized butterfly keyboard switches up there. I'm not afraid to criticize the tools I like a lot, either. I have also praised and criticized IBM, Lenovo, HP, Apple and numerous devices in other places.
I have never experienced MotionSense, so I can't talk about that.
My daily driver is the "irreparable" 2012 MBP Retina. You may remember this machine as the one everybody bitched about because the RAM and SSD were soldered into the mainboard and weren't upgradeable or replaceable?
Let's see, it's 2023, hmmmm, seems the machine is coming up on 11 years. IDK about you but I think 11 years is incredible longevity for a laptop. You know, it just might be that making the machine "irreparable" has been key to its longevity.
My 2013 MBP is a car wreck. The screen broke at some point and had to be replaced. The second battery is now so bad it won't even turn on anymore when plugged in. I think Bluetooth is dead too. All the USB ports are dead as well - although that one was "my" fault, since I plugged in a broken CD-ROM drive that apparently nuked the USB ports - I didn't think that was even possible. When on, the thing often has the fans blowing at full speed... I'm on something like my fifth charger; they're all frayed garbage. Definitely not a fun experience at this point.
I guess if you keep a laptop stationary at your desk and plugged in at all times, then it could last 11 years. Although the very expensive 8 GB of RAM from back then is basically useless today.
> I guess if you keep a laptop stationary at your desk and plugged in at all times, then it could last 11 years.
I've taken my MBP everywhere, criss-crossing the United States. I've connected it to many different presentation devices where I've been a speaker at conferences. I've connected to many different networks - including hard-wired ethernet. Not exactly stationary and kept on a desk. I can't even remember how many times it's been dropped while in my backpack - from table top heights. I think it has taken quite a bit of abuse.
> Although the very expensive 8 GB of RAM from back then is basically useless today.
Yes, it is - and it has nothing to do with being soldered onto the motherboard. It's too slow for today. I'm not going to take that RAM and install it in an M2 machine, for example - too slow. Honestly, on this point you're just whining.
Obviously we're swapping anecdotes, but you'll see there's a lot of people still running the 2012 MBP Retina machines. Those things were tanks! The other machine that also appears to be a tank is the 2015 MBP. Lots of people are still running those machines too. Maybe Apple has "on" years and "off" years? Wouldn't surprise me as many other manufacturers do.
> I plugged in a broken CD-ROM drive that apparently nuked the USB ports
While USB has resettable fuses and whatnot to protect itself, every protection circuit has its limits. There are even tools that can nuke a whole mainboard by charging capacitors and dumping all that energy into the data lines.
> I guess if you keep a laptop stationary at your desk and plugged in at all times, then it could last 11 years.
Yes. Mine has seen quite a few countries and spent a considerable amount of time trying to cool itself off while I worked on my Ph.D. and tried to squeeze every bit of performance out of its processor, all while using my first Mac's power adapter.
While I take care of my computer the best I can, it's not a "house PC" in any sense.
I had three devices with the 30-pin connector, and have another three with the Lightning connector. I have never replaced a female connector, or any cable that connects to these connectors, made by Apple or otherwise.
From my preliminary research, the 30-pin connector lasted a little less than a decade. From my understanding, though, its replacement by Lightning was not for mechanical reasons. One of my 30-pin devices is still with me, and it works flawlessly.
The 30-pin connector is in fact a compact ribbon cable carrying a plethora of interfaces and signals (analog audio I/O, FireWire, USB, video, etc.) [0]. Lightning replaces all this with a single digital bus that can be configured for a plethora of things [1]. And the connector is much smaller on both the host and cable side, too.
For example, Lightning-to-video-out adapters download firmware from the host and boot first; then they negotiate the protocol and start talking. Similarly, the Lightning to 30-pin converter is a high-quality A/V station inside a small shell. It contains all the hardware required to expand the Lightning port to a full-capability 30-pin connector [2].
This is also why Apple is hesitant to convert Lightning devices to USB-C: it's not a simple USB port disguised in a reversible 8-pin connector. It's a whole software/hardware stack that allows the port and connection to morph and adapt.
The port's introduction was indeed flashy, but I'd guess it was iterated on for a couple of years inside Apple before introduction. So the technology was already mature and boring before it was rolled out.
Another similar technology is "quick burst charging" (charge 5 minutes for 2 hours of use, that kind of thing). Pioneered by Sony almost a decade ago, it's now offered by everybody else (Apple, JBL, Philips, etc.) on their rechargeable devices. The feature is flashy, but Sony had been maturing it for years before everybody else jumped on the bandwagon.
Apple laptops unfortunately give me severe eye strain and headaches; I don't know why. I'm on a laptop with a beautiful OLED screen now, and my eyes are much, much happier. I would switch back to Apple if they offered OLED. I doubt they will, though. They're keen on mini-LED, which is inferior in my opinion.
I think they're aiming for micro-LED, which is basically the same idea as OLED but uses inorganic LEDs, so they don't degrade. They're using mini-LED as a stop-gap.
The mini-LED screen is not great, though. Sometimes I actually miss plain LCD. The brightness isn't uniform (it vignettes towards the edges), and you get lighting artefacts when quickly switching from a black screen (e.g. switching tabs).
It doesn't help that they're shiny mirrors that force you to concentrate on the on-screen content so you don't see yourself or the stuff behind you. I wish they had a properly matte option. Back when I still used an MBP 12 hours a day, I had these seasons of eye strain that I mostly worked around by setting the screen to black and white.
I had the same problem. For me it was font anti-aliasing: the eyes try to focus on the blurry text. After disabling anti-aliasing, I can stare at the screen for ages without any eye strain.
My work machine is a 14-inch MacBook Pro and my personal machine is a Samsung Galaxy Book Pro 360.
The Galaxy Book has a Samsung OLED (I also had a Galaxy Chromebook with a 4K OLED); it is a beautiful display with up to 400 nits of brightness...
The 14-inch MacBook Pro lays a massive beatdown on it in an extremely hardcore way: up to 1000 nits of full-display brightness, with HDR capabilities up to 1,600 nits.
OLED displays are limited in peak brightness, and the difference is brutal when you see it. The MacBook Pro 14 actually caps its default brightness at only about 100 nits more than a standard OLED display to save battery, but grab a tweak like this: https://www.getvivid.app/ and you can crank it all the way up to 1000.
Manufacturing such a large volume of high-end OLED displays with the parameters Apple needs is not economically feasible at the moment. Other brands can afford it because a) they ship far fewer OLED-equipped laptops than Apple ships mini-LED-equipped ones, and b) the brightness/dynamic range is not the same.
There are strong rumors that Apple plans to bring OLED to the iPads and MacBook Airs starting in 2024. It’s not clear if they will replace the mini-LEDs of the MacBook Pros. Apple promotes the HDR brightness of the mini-LEDs over OLED.
The rumors say that Apple plans to use a two-layer OLED, with essentially two OLED arrays one on top of the other. The reason is to stabilize the display, as medium-sized OLED panels are made on a flexible substrate and can warp under some conditions. The dual layer is also supposed to allow greater brightness and to help prevent burn-in (perhaps by running each layer at lower brightness).
Where is the new 27" iMac? That's what I want to know. I ended up buying legacy Intel Macs for staff because the extra screen real-estate is really useful.
Pro, Max, Ultra, and any combination of these. I can't make sense of any of it. Not that it means anything more than artificial fragmentation to bring in more upsell opportunities. Looks like 90s Apple again, at least from this point of view.
I understand the old (simple and brilliant) four-quadrant grid isn't going to fit anymore, but this is just the opposite extreme.
So which is better, the 13500HX, or the 1365UE? What about the 13620H?
What about the 13900/13900F/13900E/13900H/13900HK/13900T/13900TE/13900HX/13900K/13900KF/13900KS? Which is better? And the 13905H is clearly better than all of them because higher = better right?
Apple Marketing under Joz has been in love with this Pro-Max-Ultra tiering, and it's mind-boggling. It's very, very confusing to have Max(imum) be the middle tier. If Ultra were Ultimate, maybe it'd make more sense, but Pro-Ultra-Max would be the more sensible ordering.
Even Max on its own kinda stinks, because it's basically the same word as "Macs". Max Macs, Macs Max. Ugh.
Also: this Pro clashes with the well-known and familiar "Pro" as in MacBook Pro and Mac Pro. I.e.: "Hey, look at my new MacBook Pro with the M2 Pro!"
Then the same suffixes are reused - completely unrelated to the CPU inside - in the iPhone line-up, on a whole new mixed-up level ("look at my new iPhone Pro Max Ultra!"), as well as randomly for other hardware (AirPods Max?).
What a mess. Actually, this is worse than 90s Apple.