M3 Macs: there's more to performance than counting cores (eclecticlight.co)
220 points by ingve on Nov 3, 2023 | 400 comments



I want a better system load metric now that we've got heterogeneous cores in CPUs. The Pixel 8 Pro has three core types: efficiency, performance, and a single ultra-performance core.

If your efficiency cores are always close to max usage, but your performance cores are idle, is your system being heavily or barely used?

I understand that system load isn't really a useful metric between systems, but it's useful for comparisons on a single system, I think. I just want a better at-a-glance indicator of whether my computer is under- or overloaded.

(Additionally, do you have to specify that a particular app can run on efficiency cores, or does the process scheduler do it all without input from a human?)


Linux has pressure stall information [0], which tracks the time during which some processes couldn't run because they were waiting for contended resources (CPU/IO/memory). An N-parallel compute-bound job on an M-thread CPU will stay at nearly 0 CPU pressure if N=M (assuming no background tasks) because the tasks aren't stepping on each other's toes. At N>M, pressure will start to rise.
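
If you haven't looked at it before, the whole interface is three files under /proc/pressure. A quick illustration (the numbers here are made up for the example):

  $ cat /proc/pressure/cpu
  some avg10=0.00 avg60=1.24 avg300=0.86 total=143211762

The "some" line is the share of wall-clock time in which at least one runnable task was stalled waiting for CPU, averaged over 10/60/300-second windows, with "total" being the cumulative stall time in microseconds. memory and io use the same format and additionally report a "full" line for when all non-idle tasks were stalled at once.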

[0] https://docs.kernel.org/accounting/psi.html


Note that for memory, as I understand the documentation, it's less about "couldn't be run" than "couldn't have the real memory allocation necessary to avoid paging".

This is in contrast to another contended resource, memory bandwidth, "waiting" for which manifests at the process level as CPU cycles like any other code. It's possible to use profiling tools to distinguish memory waits from other CPU activity, but as far as I know nobody has built the infrastructure to bubble that data up in some form so it could be tracked systemically by the OS.

I can guess why: The memory hierarchy is complicated, and what you can derive about it from CPU performance counters is indirect and limited. Still, even some basic estimation for how many cache lines a process was responsible for driving over the memory bus would be helpful for those building high-performance systems and applications.
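
To sketch the kind of per-process estimate I mean (heavily hedged: these are perf's generic last-level-cache event aliases, which may be unavailable or map differently on a given CPU, and LLC misses ignore prefetch and write-back traffic, so treat the result as a rough floor):

  # count LLC misses for one process for ~10 seconds; <pid> is a placeholder
  perf stat -e LLC-load-misses,LLC-store-misses -p <pid> sleep 10
  # misses x 64-byte cache lines gives a rough bytes-over-the-bus figure

Getting that kind of number surfaced continuously, per process, by the OS rather than via ad-hoc profiling runs is the part nobody has built.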


PSI is exported by the kernel based on context switches: a switch to the idle process = IO wait, a switch to kswapd = memory wait, a switch to another runnable process = CPU wait. These are already visible to the kernel; it just needs to increment some counters and expose them to userspace. You can get the perf counters with something like `perf stat -ae cache-misses sleep 60`, so that doesn't need to be in /proc.

Additionally, context switches are the same on everything that can run Linux, whereas PMU counters are highly CPU-specific (potentially even different in each CPU stepping), so given the current state of affairs, a generic interface would be very limited.


Isn't this just a subset of the existing capacity monitoring problem where I want to know how loaded a multi-core system is under the following scenario:

A critical-path, single-threaded task is at 100% of a core's capacity; the system has 10 cores -- is my system at 10% utilization or 100% utilization?
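
The standard per-core view at least lets you see both numbers at once. A minimal example, assuming sysstat's mpstat is installed:

  # one 5-second sample of per-core utilization
  mpstat -P ALL 5 1

With one core pinned and nine idle, the averaged figure says ~10% busy while the per-core lines show the critical path is already saturated -- both readings are "true", they just answer different questions.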


Easy: measure power draw (relative to TDP) instead of CPU usage, which is impossible to measure correctly at a given time anyway.


Well, I certainly track capacity by the power used by servers under management, but a system that's pinned on one thread is at 100% utilization for very real definitions of "capacity", while the power consumed will be only a small fraction of what would be drawn if you were lighting up all the cores at 100%.


Sounds like power usage might fit the bill.


Raw power usage is very skewed. Getting that last 20% of performance the chip is capable of might take as much power as the first 80% did, and not every type of workload has the same skew, depending on which components are being stressed and how. Say you did have an adjusted power<->utilization mapping, though: it still assumes "system utilization" means "how much of the total possible capacity of this system is being utilized". That's a valid take, if you can get such a mapping, but it's not necessarily the only valid take on what system load is.


Huh? # of running processes is the usage of the system.

---

If your system is running that work on the CPU, the GPU, high-efficiency cores, low-efficiency cores, etc....that's the performance level of your system, not the load on your system.

This is nothing new. We've had power-saving CPU throttling for ages. Carry on.


This seems to be a lot of effort to rationalize the surprisingly small performance increase from M2 to M3. Initially the assumption was that M2 to M3 would be a bigger step than M1 to M2, not a smaller one. Perhaps TSMC 3nm is showing the limits of scaling?


It's a mixture of several problems:

* TSMC N3B being a bit of a flop (yield issues, too expensive)

* Brain drain from Apple's chip design teams over the last few years

* Tim Cook trying to push the average selling price up to keep revenue growth going in the face of sales declines (e.g. hobbling memory bandwidth, reducing the number of performance cores for M3 Pro)

I don't expect an M1-style generational leap for a long time; expect 2010s Intel style yearly performance gains from here on out.


The GPU is increasing a lot though.

Another point is that the CPU improvements are steady. In about two years, the base M3 has reached parity with the M1 Max.


It’s true but the problem with the GPU is Apple’s addiction to RAM money. Yes the GPU is improving in performance but it does you no good if it has to share a tiny amount of system RAM with the CPU.


Tiny? My M2 Air has 24GB, which is more RAM than any other laptop video card I've had.


Sure, if you pay for it. The 14” starts at 8GB. Eight! $200 more for another 8GB.


The GPU is critical to future profits though. Apple really wants your monthly subscription to Apple Arcade and they want to expand in other game areas which is why they've been paying AAA companies to optimize for Mac. This also ties into their VR headset where gaming will be one of the core features.


Hardly any AAA companies are optimising for the Apple GPU. MoltenVK is where all the interest is.

Even if they did, there is very little in Apple Arcade which taxes the GPU, most target the lowest common denominator in terms of supported iOS/phone combinations.

The original Apple Arcade strategy was for AAA titles, but for whatever reason that wasn’t pursued, so now we have a tonne of casual games and re-releases of old titles.

Apple just seems to run hot and cold on gaming.


> The original Apple Arcade strategy was for AAA titles

Can you prove that statement? I bet you can't, cause it's simply not true. Every Arcade title is playable from an iPhone to a Mac by way of Apple TV. It was never going to get AAA titles.


Here you go. A simple Google search would have found that.

Apple wants more Grindstones.

https://9to5mac.com/2020/06/30/apple-arcade-strategy-shift/


How does cancelling contracts with some developers mean they wanted AAA titles but messed it up?

There’s no correlation.


I mean, they keep putting desktop class CPU+GPUs in their mobile devices - all just to play an Angry Birds remake?

I own almost every Apple device in the ecosystem, but I never game on them…


> The GPU is increasing a lot though.

Is it? Does it matter? Unified RAM is cool. But I’d rather have an Nvidia 4090.


In a laptop?


I was thinking about the desktop variants.

Apple Silicon is amazing. But if you’re doing heavy GPU it’s not very exciting.


It's very exciting in the local LLMs space, where the unified memory allows fast inference of large models.


I read that the bandwidth is 50% lower on the M3, combined with a lower CPU core count. This reduction will impact inference performance. It may be better to stick with the M2 series if you really, really think spending $10k on a Mac laptop to do inference slowly (but faster than a plain PC of course) makes sense.


Weren't all of the new M3 Macs announced at the same price as before or lower? Same with the recently announced iPhones? Or am I misremembering? Prices not increasing, given all the recent inflation, seems like a price decrease pretty much across the board, not an increase in the average selling price.


Entry level pricing for a MacBook Pro now starts at $1599 rather than $1299.


That thing had the same CPU as the MacBook Air. There was nothing "pro" about it. I'm so happy it's finally gone!


The new one also has the non-Pro, non-Max M3.


And it has a downgrade from a Pro chip to a non pro chip.


No it doesn’t. The previous (cheaper) entry level MBPs were non-Pro M2 (and non-Pro M1 before that).


The 13" MacBook Pro had an M2, not an M2 Pro.


> expect 2010s Intel style yearly performance gains from here on out

Intel saw very little gain this year at all in return for a 400 watt power draw under load.

The plain old M3 saw a 20% performance gain alongside efficiency gains.

Having a 22 hour battery life is insane and you certainly aren't going to manage that with a 400 watt power draw.


400 watts is on a desktop chip where there is no concept of battery life.

The 20% performance increase is compared to the M1, not the M2 - which also had a 20% increase in performance over the M1.


> The 20% performance increase is compared to the M1, not the M2

Nope.

> The M3 chip has single-core and multi-core scores of about 3,000 and 11,700, respectively, in the Geekbench 6 database. When you compare these scores to those of the M2's single-core and multi-core scores (around 2,600 and 9,700, respectively), the M3 chip is indeed up to 20% faster like Apple claims.

https://www.laptopmag.com/laptops/macbooks/apple-m3-benchmar...

> 400 watts is on a desktop chip where there is no concept of battery life.

Yes, and in exchange for that ridiculous 400 watt power draw, Intel saw negligible performance gains.

> In some areas, the extra clock speeds available on the Core i9-14900K show some benefit, but generally speaking, it won't make much difference in most areas.

https://www.anandtech.com/show/21084/intel-core-i9-14900k-co...

Intel only wishes they could hit a 20% gain in exchange for all that increased power draw and heat. As that review noted the best improvement they saw in any of the common benchmarks was just 6%.


I wonder how much thermal throttling is going on with these benchmarks? 400W seems ridiculously difficult to cool. The 13900 was difficult if not impossible to cool for throughput without water cooling.


For Apple, yes, an incremental increase, but not for Huawei and the recent Snapdragon X Elite.


Apple seems to be focusing on efficiency more than anything, and maybe a little more market segmentation where the lesser chips are less competitive with the bigger chips. The Pro offers fewer performance cores, trading them for efficiency cores. It has reduced memory bandwidth.

From a generational perspective, the M3 Max is offering the same level of performance as the M2 Ultra. That's amazing, as the M3 Max is 12P/4E vs 16P/8E in the M2 Ultra. The M3 Ultra should be a substantial lift.


Isn't it a 15% increase? I really don't consider any double digit percent increases to be small.


> Isn't it a 15% increase?

At least according to Geekbench, it's a 20% performance increase.

> The M3 chip has single-core and multi-core scores of about 3,000 and 11,700, respectively, in the Geekbench 6 database. When you compare these scores to those of the M2's single-core and multi-core scores (around 2,600 and 9,700, respectively), the M3 chip is indeed up to 20% faster like Apple claims.

https://www.laptopmag.com/laptops/macbooks/apple-m3-benchmar...

Alongside a battery life increase to 22 hours? It's been a pretty good showing.


Benchmarks are what chips get optimized for, specifically.


Are you alleging that a chip which was in development for years was optimized specifically for a benchmark that was released a few months ago?


Actual IPC increase is 1-2%. The rest is from ramping the clockspeeds. This is a problem because power consumption goes up exponentially with frequency. Go up too high and they'll be doing what AMD or Intel does where a single core is using 50+ watts to hit those peak numbers.


How is this a problem? It just looks like they didn't try to improve the general architecture, but took one step to ramp up the clock speeds without increasing the power consumption, thanks to the process step. Which is a great achievement, because any tape-out on a new advanced process - here, for the first time, a "3nm" one - is a big achievement. One has to consider that Apple now has yearly updates in its processor lineup. The next step will probably introduce more architectural changes. Only if those stopped showing up for several years in a row would I get concerned.


With each recent node step, you basically get +10-15% clockspeed or -30% power consumption. They just blew their entire node on a small clockspeed ramp.

Now, if they want a wider core for M4, that means more transistors and more heat. They are then forced to: not go wider, decrease max clockspeed, hold max clockspeed for a pitiful amount of time, or increase power consumption.

On the whole, I'd rather have a wider core and lower clockspeeds then turn the other power savings into either battery life or a few more E-cores.


Power is polynomial, not exponential: dynamic power scales roughly with voltage squared times frequency, and since voltage has to rise with clock speed, the net effect is closer to cubic in frequency.


I don't know where you get the idea of a small performance increase. What I'm seeing so far, also in pre-review units, is actually the opposite, especially when ray tracing and mesh shading get taken into account. For anything GPU-related these new machines are a giant leap.


Are these mainstream use cases? These always feel like they are chosen to demonstrate the processor's strengths, instead of talking about the ways the processor helps in real applications.


Is playing games with insane light effects and mesh shaded scenes on a family computer mainstream enough?


Not really? What are some examples of said games, which are available on macOS?


Stray, Resident Evil Village on iOS, Myst, Lies of P, and a long series of other games that will potentially follow because of these upgrades, and because Apple wants to make iOS - iPadOS - macOS a giant unified cross-device platform powered by Metal and proprietary silicon that offers hardware-accelerated ray tracing and mesh shading across the entire lineup.

In case you didn’t notice, A17 Pro on iPhone 15 has hardware accelerated ray tracing as well.


10-15% increase in single threaded code every year is pretty good these days.


I heard from a hardware developer that the main wins for the M* architecture are really that they integrated the memory onto the CPU die, plus the unified memory architecture between the GPU and CPU, which was "holy grail" type stuff a decade ago.

I don't think we're going to see the performance increases past the ~10-20% each fab cycle now.

Performance per watt and having dedicated HW for video decoding / other tasks (AI?) I think might end up being the answer to the continuation of Moore's law, especially considering that we are in the era of heterogeneous computing.


The memory is not on the die. It's in the package, just like pretty much every other big-boy mobile AP out there.


Scaling based on node size has been limited for a long time. It feels like, starting around 13nm, every shrink has delivered less and less performance uplift, but it's likely been going on for longer and we just had much more room for improvement in chip design.


It's a tick and some people... especially marketing departments want to sell it as a tock.


Wait, is that the sound the minute hand makes?! I never realized.



Assuming you don’t have a single-hand clock.


Every time Apple oversells, there are apologists putting a spin on it. Nothing new to see.


And there are detractors downplaying improvements. Nothing new to see.


The die area of the M3 Pro vs the M2 Pro tells you everything you need to know.


Have a handy link and a few more sentences for my friend who doesn't get it?


Search "M3 Pro" on Twitter and you'll see some decent comparisons.


What's Twitter?


You made the assertion, so you provide the link. I am not doing this "do your own research" BS.


Lol it's not my job to inform you. If you care enough do your own research.


It seems like the M3 favors efficiency and thermals over raw power. This makes sense for the devices they've put it in.

I wonder if we're going to see different processors (M3X or M4) for the next release of the pro desktops, that favor power over thermals and efficiency?

Maybe there will be a kind of tick-tock, with odd numbered processors favoring efficiency and even numbered processors for power?

M3, M5, M7 for the laptops

M4, M6, M8 for the pro desktops

?


Doesn't prioritizing efficiency and thermals make sense for any device? Even a desktop has moments when it's doing almost nothing but has tasks that can adequately be handled by an E core where powering up a P core is overkill. Doesn't being efficient mean that more of the thermal budget can be held in reserve for when it's actually needed? Don't desktop users desire for their powerful machine to also be quiet?


Not really, no. Case in point:

https://www.notebookcheck.net/AMD-Ryzen-9-7940HS-analysis-Ze...

Adjusting the TDP from 80W all the way down to 35W costs you relatively little in performance and gives you about the same efficiency as an M2 Pro. That is not done because it does not sell.

> Don't desktop users desire for their powerful machine to also be quiet?

No. That is not an "also" but an "instead". People claim they would make that trade-off but that is just not the case. Same for "make it heavier but give me more battery life".


> That is not done because it does not sell.

It's not done because it doesn't sell in the PC market. Apple's kind of your only choice and if they say you're getting perf/W you're getting perf/W :)

> People claim they would make that trade-off but that is just not the case.

It's not two-dimensional, TDP vs noise. It's TDP vs size vs noise vs price. You get a large ATX mid-tower, throw some Noctua 140mm fans in there, and you won't hear a thing as it dispenses 1kW of heat into your feet area.


If I can tuck my desktop away under my desk and it acts as a large space heater in the below-zero winter, I don't mind, if it's helping me get my compiles done faster.

There's some jobs I want a fast desktop for and other jobs, like browsing cooking recipes, I'll take my M1 macbook air for.


>Same for "make it heavier but give me more battery life".

Consumers purchase large, heavy phones more than one would expect given their pricing, and one of the reasons they do so is that they have better battery life.


> Doesn't prioritizing efficiency and thermals make sense for any device?

Intel did just ship 14th Gen Core i9 chips with a 400 watt power draw under load and very little in the way of a performance gain.

> In some areas, the extra clock speeds available on the Core i9-14900K show some benefit, but generally speaking, it won't make much difference in most areas.

> The most significant performance win for the Core i9-14900K came in CineBench R23 MT... the Core i9-14900K sits 6% ahead in this benchmark

https://www.anandtech.com/show/21084/intel-core-i9-14900k-co...

Having your best benchmark result see only a 6% gain in return for a more than 400 watt power draw makes it pretty clear that, just as with the Pentium 4 of old, Intel isn't on a sustainable path.

It also makes the M3's 20% performance gain alongside efficiency gains look pretty darn good in comparison.


> Doesn't prioritising efficiency and thermals make sense for any device?

No. If you have the cooling to sustain the higher power mode, it can be worth paying the power bill to get work done faster. Compared to the productivity gains for compute-limited workloads, the increased power bill is little more than a rounding error, or even a net saving, because you need less supporting infrastructure per worker to accomplish the same thing, assuming the tasks can even be parallelized efficiently across multiple (human) workers.


The blue-logo CPU company gives you a choice. You can adopt their most aggressive power profiles and get marginal performance gains if you want them. Or, you can tune them down to levels where they are pretty fast and fairly quiet. Or, you can dial them all the way down to their most efficient operating point, which pretty much nobody wants because that point is around 1W/core and they are pretty slow.

Apple is dictating how much power the CPU is allowed to draw and they don't let it scale up. They are also making a static choice about how much die area and power to spend on the GPU, which might not suit every user. I know I personally don't give a rip about the GPU in my M2 mac mini, beyond the fact that it can draw things on the screen.


For the M3 and M3 Pro, they kept the same number of cores or in the case of the Pro actually decreased them.

On the other hand, the M3 Max gained 4 Performance cores for a total of 16 cores (12P 4E) vs the M2 Max's 12 cores (8P 4E).

The M3 Max beats the M2 Ultra (2x M2 Max dies) in single and multicore benchmarks.

Seems like Apple is leaning towards efficiency on their lower end chips, but also using the extra transistor density to push performance on M3 Max, increasing the gap between the low and high end chips.

That M3 Max is presumably going to be turned into an M3 Ultra sometime next year with 32 cores, which would roughly double the M2 Ultra multicore performance.


The M3 Max beats the M2 Ultra in single-task benchmarks, that measure how well various kinds of software can take advantage of the hardware. Such benchmarks don't scale well with the number of CPU cores, because many tasks involve sequential bottlenecks.

With higher-end hardware, you should get more meaningful results by running a few copies of the benchmark software in parallel and reporting the sum of multicore scores as the true multicore score. That would reflect the common use case of running several independent tasks in parallel.
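
If you can't coax a benchmark suite into doing that for you, a crude shell version of the same idea (bash/zsh; my_benchmark is just a stand-in for whatever single-task workload you care about):

  # launch 8 independent copies and time the whole batch
  time (
    for i in $(seq 8); do
      ./my_benchmark > /dev/null &
    done
    wait
  )

Batch wall-clock time under that kind of load is closer to what "multicore performance" means for independent jobs than a single cooperative multicore score.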


I think it's more accurate to say that M3 favors smaller chips because N3B has atrocious yields (not to mention cost savings).

When they switch to N3E and get better yields, I imagine that we'll see chip size grow. As we're getting close to the number of cores that the average consumer can use or would want to use in a desktop, I imagine that we'll be seeing an increase in GPU, NPU, and other specialized units instead of more CPUs though.


> I wonder if we're going to see different processors (M3X or M4) for the next release of the pro desktops

I don't think they care much about pro desktops, given the pretty embarrassing Mac Pro. They just don't mean much for the bottom line compared to consumer lifestyle devices.

I expect they would be happily rid of the customer segment - annoying power users wanting low-level access and customisation. They keep their machines for too long and don't give much upselling opportunity for subscriptions, fashion accessories, house and family trinkets.


Apple just created a whole new Mac last year. It's a desktop. Last time they had created a new Mac was so long ago no one remembers.

For so long it was just Macbook Air, Macbook Pro, iMac, Mac Mini, Mac Pro. Now there's a whole new family member and it's a desktop. Apple is managing the slow death of desktops very well.


Apple is favoring yields and margin over everything else. They cut the M3 Pro as much as possible so it's half the area of the M2 Pro.


That would explain it. Reduced chip size, presumably due to an increase in price per transistor. But do you have evidence for this theory?


Why else go through the trouble of a custom layout for the M3 Pro with multiple custom reductions (caches, vector units, cores, memory controllers) vs lopping off half of the Max GPU like in the two previous generations?


It can make sense for desktops too, like in the case of the Mac Studio. The role it plays there is keeping fans inaudible while maintaining good thermals rather than conserving battery.


You can just stack more of them for performance, no? You can't magically cool them down for idle periods though.


Dollars are never unlimited though, so you can't just add more chips/chiplets/cores.


Feels like we’ve gone from “it’s just better, it’s obviously better you can see with your eyes and feel it as you use it” when M1/2 dropped.

To trying to justify why the M3 is 20% faster than the M1, when the M2 was also 20% faster than the M1, and weirdly Apple is only comparing it to their older processor.

Like people, maybe it’s just an underwhelming update… no need to pretend it’s not.


I bought my M1 MacBook Pro not long after they were released. It's approaching 3 years old which is a pretty standard age for people to start looking at a replacement - I guess that's why they're comparing it to the M1.

Slight tangent - even though the MBP has been struggling to handle my dev work (partly because the codebase I'm working on has grown significantly in the last two years) I won't be upgrading. Work bought me a MacStudio (M2 Max), which mainly runs headlessly on my desk (also running Homebridge and plugged into my big speakers for AirPlay). This means I'm keeping the MBP as my portable machine (using VSCode's remote extensions and/or CodeServer to do dev work from wherever). I also have a late-2015 27" iMac at the office (with OpenCore Legacy Patcher so it can run whatever the latest macOS is called nowadays). This also works perfectly well now all the hard stuff is done on the MacStudio - and the screen is still lovely. (My previous 27" iMac was in active service for 10 years, although it was almost unusable towards the end).

Another tangent - I bet that's why they're not doing the larger "pro" iMacs. If it weren't for VSCode Remote Extensions and OCLP, I would have ended up with a beautiful monitor that was essentially useless (as was my original 27" iMac). People who need that extra power should probably avoid all-in-ones and I won't be getting one again.

But for most people, who tend to have a single machine, a comparison to the 3 year old equivalent seems pretty fair to me.


Having been all over the Apple range over the years, the M1 Max MacBook Pro I'm currently using has been, I would say, the best of their laptops since the original G4 PowerBook. A couple of years into ownership of this machine, and it's been an excellent experience.

I develop on it every day, and I've no need to upgrade it, so I'll wait for the M4 :)


> It's approaching 3 years old which is a pretty standard age for people to start looking at a replacement

Is this really true? Three years sounds very new to me, and no one in their right minds would replace it.


I think a lot of personal users are seeing what corporate does and mirroring it. 3 years is pretty standard for companies to switch their leased/fleet laptops for optimum returns.

There's also the relatively static pricing of these and generally growing wages, making short upgrade cycles more common.


Comparing against the device 2 generations back does seem to make more sense from a “should I upgrade” point of view, and seems like a fine thing to put in marketing slides (as well as comparisons vs current peers, since that’s relevant to the question of “what should I upgrade to”).

The lackluster year-to-year is a bit of a bad sign WRT the long-term trajectory, but they’ve been doing this successfully for quite a while; they’ve recovered from worse, I suspect.


This is the way: for me, a portable machine should have good battery life, a good screen, and a good keyboard; everything else gets done on a remote, network-connected machine with gobs of RAM and Linux.


Oldest MBPs with an M1 are from late 2021, so only 2 years old.


Those were the redesigned M1 Max/Pro versions.

Mine is the M1-with-Touch-Bar model (same design as the Intel version), which was released at the same time as the M1 MacBook Air - late 2020.


The M3 Max performs similarly to an M2 Ultra. That feels pretty big to me as a current M1 Max user.


Exactly, how are people overlooking this? It's like saying I'm going to drop a near-silent Windows laptop that is thin and cool, oh, and has BETTER performance than last year's desktop Windows machines.


I agree it's impressive, in the same way Ferrari announcing a new supercar is impressive. But I drive a hatchback, and was just hoping they'd cave on 8GB/256GB for RAM/storage, or support for multiple monitors.


Well, they just caved on the 128GB storage last year (2022 [0]), so it seems premature to think that might happen. The last bump in base RAM was 2017 [1] and before that 2012, so one might think that would increase soonish. However, the shift from Intel to Apple silicon changed the nature of RAM in the machines. It's too soon to tell what a base-level upgrade would look like.

0. https://everymac.com/systems/apple/macbook-air/specs/macbook...

1. https://everymac.com/systems/apple/macbook-air/specs/macbook...


It’s also more expensive than a M2 Max or M1 Max if I’m not mistaken.


14" MBP with the highest M2 Max configuration, 64GB RAM, 1TB SSD was $3899.

A new model with the same configuration is also $3899.


Interesting, they seem to be a bit more expensive where I live, although I cannot compare them 1:1 (I spec'd one with the M2 Max 30-core GPU and 64 GB RAM some months ago; however, for 64 GB RAM I would need to go with the M3 Max 40-core GPU now, so it's not a fair comparison).

Edit: But in any case, the difference would not be huge from what I can see, so I guess my point is moot


So, with inflation, the actual price went down.


Wow, that’s overpriced for such low specs.


For a 16-core CPU, the equivalent of a dedicated GPU, 400GB/s LPDDR5 memory, a 3024x1964 mini-LED display, and 18 hours of battery?

The closest I can configure a Dell XPS 15 is $3099, and that's a 14-core CPU with likely lower memory bandwidth and a lower-performance SSD. They claim 18 hours of battery, but only with the base screen; the upgraded screen is presumably less.

And from personal experience using an XPS 15 is a significantly worse experience in stability, heat, fan noise, and real-world battery life.

And here are benchmarks for the XPS vs the M3 Max:

https://browser.geekbench.com/v6/cpu/3367184

https://browser.geekbench.com/v6/cpu/3372431


Yes I guess if you absolutely need those specs in a small laptop and you don’t care about value it makes sense. I think it’s more that the laptop is not a good deal compared to other Apple laptops.


Oh for sure. The value return per dollar gets worse the higher you go up Apple's options list. Especially their RAM and SSD upgrade prices.


Please show me a laptop from another manufacturer that has similar specs, similar screen, similar battery life that costs less.


It’s more that those specs are not worth the price bump. I have an M1 that is slightly less capable but cost much, much less. I use it to access machines that are faster than this M3 laptop when needed.


A 20% performance increase generation over generation, every 18 months or so, seems pretty great to me.

If people were expecting gains like the Intel to M1 transition on an annual basis I'm not sure that was ever realistic.


>Like people, maybe it’s just an underwhelming update… no need to pretend it’s not.

"Buy the new M3, a small incremental improvement on the M1/M2" would probably get a lot of people fired as a marketing campaign.

While I agree with you, we live in a reality where hype, hyperbole, and misleading, intentionally manipulative information are accepted and commonplace for selling things. It sure would be nice to look at a product or service and get a clear comparison without having to read between the lines, pick out subtly vague language, and keep up with the latest propaganda techniques, but alas, it's everywhere.

On the bright side it should teach everyone you shouldn't trust the majority of information at face value, which is a useful skill in life. On the downside we continuously erode trust in one another and I worry it's wearing on social structures and relationships everywhere as more people see and mimic these behaviors everywhere for everything.


1. I think you set your expectations too high.

2. I assume Apple compared the M3 to M1 because most of the customers they are targeting are still on the M1.

M2 customers bought a computer less than a year ago and that pool of people is relatively small compared to the number on M1 today.


> 2. I assume Apple compared the M3 to M1 because most of the customers they are targeting are still on the M1.

I feel like this is giving Apple the benefit of the doubt.

Most people here wouldn't say the same of Intel if they started comparing to 2 or 3 generations ago.

No, most likely they're comparing to M1 because it's the biggest difference that is still plausible to explain.


> I think you set your expectations too high.

Well, they did announce it in an event called "scary fast". I mean there's always marketing exaggeration, but I did expect more than trying to understand if it's even faster at all. Otherwise I'm not entirely certain what the point was.


Literally the only point of confusion here is M3 Pro versus M2 Pro, where Apple seems to have simply made different choices around what the Pro chip should be.

M3 Max is a massive gain over M2 Max. And M3 is a nice improvement over M2.

Hope that helps.


That does, actually. Thanks.


  > Well, they did announce it in an event called "scary fast".
It was the night before Halloween.

  > I mean there's always marketing exaggeration
Apple's marketing people are notorious for exaggeration.

  > I did expect more than trying to understand if it's even faster at all
Their stated performance improvements seem to be accurate. Although I'm surprised at how much faster the M3 Max seems to be.

  > I'm not entirely certain what the point was.
The point was this year's revisions with a speed bump. They do this almost every year.

That said, early benchmarks are indicating that the M3 Max performance is on par with the M2 Ultra. I consider that “scary fast” myself.


It felt like half the event was listing different types of workload and saying that the new chips were 15-20% faster than the M2 generation at that task, and 30-40% faster than the M1 generation.

The only real exception to that is the M3 Pro, which is closer to the performance of the M2 Pro at a reduced core count.


Why would you market to people who just bought an M2 Pro this year? Those who are going to upgrade, will anyways. Seems silly to market to people telling them to buy new multi-thousand dollar laptops every year.


Yeah, and in addition, people know Intel to Apple silicon was a massive leap, so why market how much faster it is than the Intel version? That play has been played.

The stats compared to the M2 are there. They are just not the headline/summary.


The Apple marketing compared their M3 to both the M2 and the M1, saying it was 50% faster than the M1, and 30% (or 20%?) faster than the M2.


Most of the customers are going to be Mac users, and I’ll bet that most already have M series machines.

Comparing M3s to old Intel Macs would also be lame, and a fairly meaningless comparison.

It would be interesting to know how many people compare M series performance to Intel performance when buying, as those are usually going to be different markets surely?


Every Apple update that’s not a new product/major revision people rush to HN to say how underwhelmed they are. This says more about the high bar they set for announcements and people’s own expectations… rather than how steady and efficient progress works in reality.

This is what product iterations look like: the M2 only came out a year ago, and the M3 is a notable speed increase and, just as important, a bump in battery life (22hrs is insane for the performance you get).

They compared to M1 and intel because people almost always wait 1-2 cycles before upgrading because MacBooks easily last 2yrs under heavy use and it’s an expensive upgrade.


My metrics are simple: Does the fan turn on? Can it go faster than I think and type? M2 with lots of ram has been a dream.


The graph needs a legend. Explaining the graph in the several paragraphs below and then cross-referencing shape names and spatial relationships is not a fun game to play.


And yet the Air can’t support more than one external monitor. I know Apple is a small company with only a few engineers, so it must have been too hard.

Sarcasm obviously. As an Apple enthusiast it’s so annoying how they sniff farts.


While I can understand some engineering compromises, most appear to simply be anti-consumer cash grabs. Need more than 8GB RAM? Hope you've got sufficient money. Want the high-refresh-rate screen present in cheap Android phones? Sorry, not for our base models.


Agreed, especially the increasing greediness with regard to RAM and the corresponding upselling is unfortunately real at this point. I just upgraded to the iPhone 15 Pro Max because of the 8GB RAM and I will probably need to do so for my two Macs as well. 16GB/32GB RAM I “fear” is not going to cut it in 2024 for my kind of snappy productivity work anymore, especially due to the unified memory architecture.

QED seemingly works out well for Apple though.. :P

I’m still hoping for some ex Apple folks to create some new version of NeXT computers. I would switch immediately to someone’s alternative offering trying to actually play at Apple’s level of quality.


> I just upgraded to the iPhone 15 Pro Max because of the 8GB RAM

This is a bad reason to upgrade. If you were supposed to know about it, it'd be in the specs.

It's not there to improve performance or be more forward compatible, it's because the camera upgrades need it.


OOM kills are basically the sole reason for me to upgrade iPhones. Nothing is more annoying than force-reloaded/lost state in/between apps while on the go and under tight time constraints. Also somewhat true: the new lenses are amazing, but the “computational photography” defaults are sometimes less compelling than my old iPhone X’s more “honest” output. I’m going to play around with RAW some more if I get the time for it though.


Just curious, I had a 2020 SE as a daily and never had an OOM; how do you do it?


How does software usually do this? :) I guess it’s a combination of heavy multitasking / using it for work + private, plus - again - the constant drum of increasingly bloated websites and applications following whatever the current ceiling is. “Simple” apps like podcatchers come to mind; I don’t know how often Overcast bailed on me with maybe 100 podcasts subscribed / updating.


I still use a 2020 SE and experience apps reloading all the time. I cannot keep more than 2 apps reliably in memory at any point. This includes Safari, YouTube, Reddit (honestly a poorly made app), Spotify, etc. - fairly "common" ones.


Reddit is an especially appalling platform and the only thing more appalling than that is their doubling down on “app vs web” all the while having terrible UX and performance as you say. Something something about Conway’s Law I’m guessing :P


Yeah, that's the actual truth, but Apple fanboys will tell you otherwise. Even though I use an iPhone/Apple Watch/Mac mini, I regularly have to explain to many of them what kind of nonsense Apple puts their users through considering the prices they sell their devices at. They always have some pretty dumb excuse; it's so sad and really frustrating. And it's all in the name of margin. They put severe limitations in their already expensive base devices for the sole purpose of maximizing margin and upselling. Because $100B of profit per year is not enough, of course. If you look at it as a user, the massive profit they have been making did not translate into anything better, in fact quite the reverse. Personally, I'm out. They finally squeezed me out; I guess I'm a bad customer, I'm not willing to spend enough money. Fine, so I am going to spend my cash elsewhere.

In defense of the poorly made apps, it is just a reflection of the general market. Mid-range Android phones have double the RAM of an entry-level iPhone at half the price. The pressure to optimize is not very strong and I can't blame them. It's not worth spending hours of expensive engineering time on optimisation. It would still have limitations, and at some point, when you need more RAM to fit more data, you just need more RAM; you cannot "optimise" the problem away. All of this would be solved by more/bigger RAM chips that cost a dollar at worst. If Apple wasn't so greedy it would be a non-issue. So I think those developers have the right mindset: fuck Apple, and if you are an Apple user, either you spend more for the real deal or you go look somewhere else for a more sensibly priced device.


Whether apps exit in the background actually depends on storage size as well as RAM; there's a kind of swap for them, but it has a daily disk-writes budget.


Yeah I was never above ~180-200GB used storage on my 256GB iPhone X. I splurged on that one as usual.

Sorry, ranting; not aimed at your comment at all, just as I came back to this context: so I'm definitely a long-standing and good customer (often, and over many years, an "evangelist" for free, FWIW).

Even if I can gladly afford maxed out options I now increasingly do so because I feel "coerced" not because I just want the best. It's not great and gives a sour feeling for no good reason, I used to just love what they have on offer for the money spent. At this point they should just eat into their own margins and bump up to 50% more RAM across the board for most of the devices they are selling. That will not only make entry point customers more happy but also the ones buying their highest margin products. Why does the iPhone 15PM only have 8GB of RAM? I know that in no time my apps as a normal (?) multi tasker will get OOM killed again. The top of the line model just doesn't buy me much longevity in that department. The "toy" that is the Nintendo Switch had 4GB of RAM and came out the same year as my "premium" iPhone X at 3GB. Same kind of memory also IIRC. The next "toy" Switch 2 will apparently have 12GB when it arrives next year, I'm somewhat positive that Apple is already planning for their next PRO to have 10GB RAM max, maybe it will even stay at 8GB... the memory situation has really been ridiculous with them for a while.

Rant over, enjoy the sun :D probably / hopefully wrote this into /dev/null haha.


I will say 8GB and the M1 have been LIGHTNING fast for regular dev work with my personal stuff. Go, Docker, etc. - never had a RAM issue.

But yeah, 16GB is so cheap. Apple, stop it.


Even the lowest tier models probably beat any PC laptop when it comes to usability and comfort though.


Ah yes, the 8GB RAM models in almost-2024, with crippled SSDs, are beating equivalent PC laptops for their price... NOT!

8GB wasn't even enough in 2015!

Apple competes with and demolishes anything from the PC world in battery and build quality. Certainly not in usability and, subjectively, comfort.

Also, low-tier PCs support multiple monitors. Try getting that on the MacBook Air.


Strong disagree.

The UI is very buggy and inconsistent.

The CLI tools/shell attempt to follow a more-or-less standard unix/linux setup but fail pretty hard, with many of the same commands existing but behaving just differently enough to be annoying.

It uses its own set of shortcuts, different from what's standard on Windows/Linux and most other operating systems.

Not sure what people mean when they say completely vague and ambiguous things like 'usability' and 'comfort' - are you referring to the fact that it has rounded buttons and corners and pretty ads?

It basically falls in between a standard Windows and a highly modified Ubuntu setup, but does far worse at both. The UI, apps, etc. are far more buggy and unintuitive than Windows, and the OS is far less 'unix' and far less customizable than Ubuntu.


> Unix tools/shell is attempting to follow more-or-less standardish unix/linux setup but fails pretty hard, with many of the same commands existing but behaving just differently enough to be annoying.

I'm pretty sure this is simply you expecting GNU and actually getting FreeBSD-based tooling. It's not wrong, it's just different than what you expected.


Not at all. As a single example, macOS comes with an extremely outdated version of GNU bash (3.2) from 2007. And yes, this has consequences, in that many modern bash scripts depend on bash 4+ (the current GNU bash is 5.2+).
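
To make that concrete (the paths below assume Homebrew on Apple Silicon; adjust for Intel Macs):

  # stock macOS bash is 3.2.x; bash-4 features such as associative arrays fail
  /bin/bash --version
  /bin/bash -c 'declare -A m && echo ok'                 # declare: -A: invalid option
  # the usual workaround: install a current bash outside /bin
  brew install bash
  /opt/homebrew/bin/bash -c 'declare -A m && echo ok'    # prints ok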


> Why? Why do I need to use 3rd party package managers, or manual installations from 3rd party sources to install common tools that aren't a decade+ outdated?

There actually is a reason. It's not lack of maintenance.

It's because Bash 3.2 is the last version licensed as GPLv2. Bash 4.0 and later changed license to GPLv3.

Apple decided it's not safe for them to ship any GPLv3 software with macOS, because of the stronger legal conditions in GPLv3. This is also the reason they stopped updating Samba.

They changed the default shell to Zsh long ago. The old Bash is kept around, so that users who want to stick with Bash can still use it as their shell, and so that existing scripts for macOS (for example in installers) continue to work. If nobody cared about Bash, I expect they would have dropped it from macOS when switching to Zsh, rather than keeping the old version.

So from a certain point of view, Bash 3.2 is the most recent version they can ship.

As for other tools like "cp", "ls", "rm", "touch", etc., I agree they are annoying on macOS when you are used to the versatile GNU/Linux command line options. I sometimes type options after filename arguments out of habit from Linux. macOS commands very annoyingly treat those options as filenames instead. And I miss options like "touch --date".

However, this is not due to old tools. Those differences are just how current (up-to-date) BSD commands work. They are just a different unix lineage than Linux.

The GNU tools were intentionally written to be more user-friendly than traditional UNIX™ tools, which is why the command line options are generally nicer. GNU/Linux systems come with GNU tools of course. macOS never did because it is derived from BSD and comes with BSD tools (with Apple enhancements, like "cp -c").

You get the same on most other unix environments that are not GNU/Linux. People install the GNU tools on top, if they want the GNU command line experience instead of the default. Sometimes with the "g" prefix (like "gls", "gtouch" etc.).

These days, even if Apple decided it's worth the technical fallout of switching from BSD command line tools to GNU, they wouldn't do it for the same legal reason as what keeps Bash at 3.2: The current GNU tools are licensed as GPLv3.
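
A small taste of the difference, for anyone curious (filenames are just examples; the g-prefixed commands come from Homebrew's coreutils, as mentioned above):

  # BSD touch (stock macOS): timestamps use the -t [[CC]YY]MMDDhhmm[.SS] form
  touch -t 202311030000 notes.txt
  # GNU coreutils from Homebrew install with a "g" prefix alongside the BSD tools
  brew install coreutils
  gtouch --date="2023-11-03" notes.txt
  gls -l --color=auto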


> So from a certain point of view, Bash 3.2 is the most recent version they can ship.

Apple could choose to be GPLv3 compliant tomorrow, if they wanted to.


They also seem to be about the only ones who seem to think this is a problem. Even Microsoft has the GPLv3 bash in WSL.


> However, this is not due to old tools. Those differences are just how up current (up to date) BSD commands work. They are just a different unix lineage than Linux.

What up-to-date BSD are you referring to? The commands on different forks of BSD like FreeBSD vs OpenBSD are not the same. And quickly looking at the man pages of latest FreeBSD vs MacOS versions of ls, for example - they are not the same.

> These days, even if Apple decided it's worth the technical fallout of switching from BSD command line tools to GNU, they wouldn't do it for the same legal reason as what keeps Bash at 3.2: The current GNU tools are licensed as GPLv3.

What actually prevents them from including GPLv3 software?


Using tools from the base OS is really holding it wrong. Just like ftp.exe/Internet Explorer/Edge and Safari should only be used to download a usable browser, the Apple provided CLI tools should only be used to download the tools and versions of tools that you actually want.

Otherwise, you have no way to control versioning anyway. When the OS version is tied to the version of so many other tools, it's a nightmare. Apple is trying to push you in the right direction by basically not updating the CLI tools ever, even for tools where upstream didn't change the license to something unacceptable for them to distribute, as happened with bash.


> Apple provided CLI tools should only be used to download the tools and versions of tools that you actually want.

Why? Why do I need to use 3rd party package managers, or manual installations from 3rd party sources to install common tools that aren't a decade+ outdated?

> When the OS version is tied to the version of so many other tools, it's a nightmare.

It doesn't need to be tied to anything. If the OS needs specific libraries/tools then they can be installed in a separate location. If there are new major versions of user-level tools (like bash), those should come by default, not some 15-year-old version of the same tool with the same license. Multiple Linux distros solved these problems in different ways 10+ years ago.

> Apple is trying to push you in the right direction by basically not updating the CLI tools ever

No, they are just neglecting the CLI ecosystem, the package management, and the many many outstanding bugs that have existed and been ignored for years.


Because the newer tools changed the licensing terms, and the corporate lawyers won’t let anything under GPL3 anywhere near anything if they can help it.

GPL2 was viral, but the terms were easier to stomach. Apple is allergic to GPLv3 code because there is a clause in the license requiring you provide a way to run modified version of the software which would require Apple to let users self sign executables.

This is a simple result of the GPL going where Apple will not, so OSX is stuck with whatever is MIT, BSD, Apache, or GPL2 licensed.

I like the GPL, I license my open source stuff under it, but it doesn’t work for all cases. Apple is fine with not using software licensed under the GPL, when it conflicts with other company principles.

Simple as.


> Apple is allergic to GPLv3 code because there is a clause in the license requiring you provide a way to run modified version of the software which would require Apple to let users self sign executables.

Wow, I didn't know this before. Good job FSF! This, should, be, a, basic, right.


You can of course sign your own binaries. You can’t alter and then sign a system binary. I’m good with that.


> You can of course sign your own binaries.

I don't believe the license allows them to charge an extra fee for this, and for good reason. It deters hobbyists from doing it.

> You can’t alter and then sign a system binary.

What's a "system binary"? The bash executable runs entirely in user mode.


> Apple is allergic to GPLv3 code because there is a clause in the license requiring you provide a way to run modified version of the software which would require Apple to let users self sign executables.

Does that really add up though? You can install and run 3rd party software on macos without any signing needed.


> Apple is fine with not using software licensed under the GPL, when it conflicts with other company principles.

Apple can be fine with whatever they want and choose whatever principles they want. Doesn't mean it leads to good software or that I need to like it.


Right, but you’re not important in this context.

- Apple is shipping stuff out to customers.

- Whatever is shipped has to satisfy corporate policy.

- The GPLv3 does not do so.

That’s what matters. The only way what you care about has any impact on the above is if following the policy has a meaningful impact on Apple, i.e. Apple suffering some sort of loss (be it monetary, PR, legal, etc.).

Thus far, the non-presence of GPLv3 software does not appear to have hit that bar.


Of course I'm important in this context. The context is literally me talking about issues that _I_ find with macOS. Saying 'company X does Y because company X's policy is to do Y' is a useless truism that adds nothing to the conversation.


> Why? Why do I need to use 3rd party package managers, or manual installations from 3rd party sources to install common tools that aren't a decade+ outdated?

Because you want to control the version of 3rd party software.

> It doesn't need to be tied to anything. If the OS needs specific libraries/tools then they can be installed in a separate location

The OS installs tools in /usr/bin and you should install tools in a separate location. Apple provides a commercial UNIX, not a Linux distribution. /usr/local or /opt are traditional locations for you to place the 3rd party software you want to use on your commercial UNIX.

If you want the OS to ship with updated tools, of course it's tied to the OS version. Then if you want bash 70, you'll need to run macOS Fresno or later, or install bash 70 in /usr/local. You should just install your version anyway, and then you won't have to worry about OS versions (unless bash 70 requires kernel APIs unavailable before macOS Fresno, in which case you're stuck; but most software isn't intimately tied to kernel versions).


We're not talking about keeping up to date with the latest revision of python / java here, it's a shell that's woefully out of date. Apple is not RHEL backporting fixes so binaries continue to run for 10 years - in fact they seem quite cavalier about backwards compatibility.


Upstream changed the license. Apple doesn't distribute GPLv3 software. Apple's not going to distribute a newer version. That's how license changes work. The old version continues to work as well as it always did; scripts written for the new version don't work, but why would you write a bash4 script for macOs?

Not that they were going to update bash regularly anyway. So if you needed a new version in the base, you'd need a new version of the OS. And then you're back to the same problem you have now. Apple doesn't distribute the version of the 3rd party tool you want for the OS you're on.


> The old version continues to work as well as it always did; scripts written for the new version don't work,

Right, so scripts written for bash4+ don't work - which is most modern bash scripts.

> but why would you write a bash4 script for macOs?

The problem is that you have to write special scripts (among other things) just for macOS. This is a very real problem because most modern software companies run their stuff on Linux, while the dev laptops/workstations are often macOS.


If you want your server environment to match your desktop environment, macOs is a poor choice.

It would be better to have a dev vm or use an os that you can run a real server on. FreeBSD, Linux, Windows, maybe Solaris.


Apple should declare Bash deprecated: they did everything to deprecate it, including replacing the default shell with Zsh, except actually classifying it as such.


Bash is only there for backwards compatibility with old scripts. The default MacOS shell is an up-to-date zsh.


Maybe I’m an oddball, but I download third-party package managers (or wrappers) on literally every OS I use.

Scoop, Chocolatey on Windows. Brew on MacOS. Amethyst on Arch.

Linux package managers eviscerate whatever is available on MacOS and Windows by a long shot.

Frankly if there’s one thing I want my OSes (excluding Linux) to keep their greasy paws off of, it would be my CLI and build environments.

I want to install and manage what I need, not be force fed whatever sludge comes out of Microsoft’s or Apple’s tainted teats. Manage and update the OS, don’t touch anything else.


Linux and macOS package management are very different, because Linux distros generally have first-class, supported package management that comes with the OS.

> I want to install and manage what I need

Good luck installing what you need if someone hasn't made a port explicitly for macos.


Unfortunately this will never be fixed. It's not a technical problem. They refuse to ship software licensed under GPL v3, and bash 3.2 is the final GPL v2 version.

I hate it.


Didn't they stop 'shipping' bash by default a while ago? Now the default shell is zsh?


Bash is still shipped on macOS 14; the default shell has been zsh since 10.15 in 2019.


I wonder why they haven’t fixed it yet by removing bash and the other GPL-licensed tools. Even if they currently need one to boot the system, they have the resources to change that.


I refuse to drop Samba and go back to AFP


Are you sure you’re using Samba? AFAIK, Apple removed that years ago (I think because it’s GPLv3 licensed)

https://en.wikipedia.org/wiki/Server_Message_Block#Netsmb:

“NSMB (Netsmb and SMBFS) is a family of in-kernel SMB client implementations in BSD operating systems. It was first contributed to FreeBSD 4.4 by Boris Popov, and is now found in a wide range of other BSD systems including NetBSD and macOS“

https://appleinsider.com/articles/11/03/23/inside_mac_os_x_1...:

“The upcoming release of Mac OS X 10.7 Lion Server will remove the formerly bundled open source Samba software and replace it with Apple's own tools for Windows file sharing and network directory services”

I also think they completely removed AFP support.


Zsh has been the default shell on MacOS for ages now. If you need a more modern bash then you can easily install it using brew or other common package managers. This is a total non-issue.


Of course shipping a very outdated tool with your OS is an issue. And the fact that you need to use 3rd-party package managers to update/replace it is also an issue.

But if you want to contend that anything that is solvable via customization/3rd party packages is a 'total non-issue', then I fail to see how you could argue that Linux isn't superior to MacOS in every single way.


It's not an issue even if you say it is. It takes one minute to set up brew on a fresh Mac. Third party or not, who cares when it's open source; apt is also open source and, in the same sense, a third party that someone develops - you just get it bundled with Debian-like systems. And if we start counting the minute wasted setting up brew, you waste more time installing Debian in the first place; Macs come pre-installed.
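
For reference, the bootstrap really is a one-liner - roughly what https://brew.sh shows today (verify there rather than trusting a comment):

    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    brew install bash coreutils    # then pull in whatever newer tools you actually want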


It literally is an issue, and the fact that you're pointing out a potential solution should make that obvious to you. Many software companies (including FAANGs) have basically given up trying to support building/running most of their software on mac for these exact reasons, and they have really tried - dedicating hundreds of experienced engineers to the problem.


I call BS on the FAANG statement. I work at a FAANG and 80% of SWEs use MBPs. Actually, if we go by the literal FAANG company list, they don't even focus on desktop software - they're mostly web companies, the tooling is all backend, and SWE machines don't do any heavy lifting.


You work at FAANG and you can run your entire stack on MacOS? Which FAANG?

Or you mean, you work at FAANG and you can run your web application which is 0.01% of the stack on your mac?

> they do not even focus on desktop software and are mostly web companies and the tooling is all backend where SWE machines do not do any heavy lifting.

The web frontend/UI is a relatively small portion of the development that goes on at a typical FAANG. And the reason the macs 'do not do any heavy lifting' is that they literally can't - most of the complex backend systems, low-level/high-performance code, etc. can't be run on macos.


The complex backend code isn’t going to work on your little laptop either. This is a stupid position to take, because people care more about whether they can spin up a development environment, not run all the code that goes to production.


> The complex backend code isn’t going to work on your little laptop either.

Of course it will if it's a linux laptop. Companies even give exceptions for linux laptops for exactly this reason.

> This is a stupid position to take, because people care more about whether they can spin up a development environment, not run all the code that goes to production.

You think being able to run the code you're working on is not a requirement for a good development environment? Or you think nobody writes this code, it just magically appears in production when you deploy your html page? Lmao.


This was the original comment

> Many software companies (including FAANGs) have basically given up trying to support building/running most of their software on mac for these exact reasons

Why would Amazon or Google waste a single minute trying to get software to build on Macs that are targeted to run on Linux servers?

Exactly what point are you attempting to make?


Just because your machine runs Linux and the server runs Linux doesn't mean the code is going to work on your computer. How are you supposed to test failover on a cluster that kicks in when hitting 100k QPS? It's just not practical to run the whole stack locally on your machine.


This is total bullshit. Amazon and Microsoft both support the Mac thoroughly. From what I saw when I was there, most developers at Amazon use Macs.


You're completely wrong. I worked at Amazon (AWS) for years, yes developers generally use macs, but they generally cannot run their entire stacks on mac. Most of the actual software people work on is run on remote linux (AL2) ec2 instances after syncing the code from mac.

Running the entire AWS stack on macos is literally impossible at this point because major pieces aren't even built for macos (and can't be without major rewrites).


No one is going to run the entire AWS stack on their Mac when it’s targeting Linux.

Amazon “supporting” Macs means outside developer support like AWS CLI, CDK, SAM, Amazon Chime, etc.

Of course we all spun up Isengard accounts when we needed a lot of CPU horsepower or to run some dependencies instead of running something locally. Why wouldn’t you? You worked at the only company that never had to worry about a large AWS bill.

All of the internally written MDM tools also supported Macs.


> No one is going to run the entire AWS stack on their Mac when it’s targeting Linux.

You literally just claimed everyone at Amazon uses macs for development and runs their code on their macs. This is not the case for the vast majority of code written at Amazon.

> Amazon “suppprting” Macs mean outside developer support like AWS CLI, CDK, SAM, Amazon Chime, etc.

Nobody said anything about 'amazon "supporting" macs' and what that means except you, just now, when you moved the goalpost.

> Of course we all spun up Isengard accounts when we needed a lot of CPU horsepower or to run some dependencies instead of running something locally.

What? Using remote development environments/ec2 boxes is literally the default at AWS and it has nothing to do with 'CPU horsepower'. Most of the ec2 instances used for development are far less powerful than a macbook pro.


I’m trying to understand what point you are trying to make? If you are going to run software on Amazon Linux, why would Amazon waste time trying to get software to run natively on Macs?

When a developer puts a laptop in their Amazon issued backpack, what type of laptop do most of them use?


Does brew still require mangling OS permissions?


No. This is answered on the Homebrew installation page: https://docs.brew.sh/Installation


That page appears to very much describe that it does, in fact, still fuck your system’s permissions…

I’ll never install homebrew as long as the dev continues this amateur hour bullshit.


Could you explain what you mean by this? I don’t see it on the page, and I’m not sure exactly what you mean by “OS permissions”. See also here from 2016: https://apple.stackexchange.com/questions/253404/how-does-ho...


As someone who uses Windows, Arch with Hyprland, and MacOS almost daily, I am immensely curious about what you find inconsistent and buggy about macOS. In my experience that’s been the least buggy and inconsistent of the three.

My Mac and Linux shortcuts align more than my Windows and Linux/Mac ones. For the most used ones it’s CMD + whatever the standard key is for that shortcut, instead of Control. Overall I prefer that the Super key is more useful than what has been the default for many years with Windows.

I also use the CLI for virtually everything on Mac and Linux, MacOS isn’t all that different and feels more like another Linux flavour than it’s own beast.

The only UI gripes I can think of immediately are no window snapping and that closing the window doesn’t mean you’ve exited the application. The first requires a third-party tool (I recommend Rectangle), the latter is a change in behaviour.

Frankly I’m not really all that interested in defending MacOS, but I hope you realise that saying “very buggy and inconsistent” without naming anything specific isn’t any less vague and ambiguous than “usability and comfort”.

Your reaction comes off as “I’m used to this, therefore the other thing is bad and unintuitive.” I’m sure this isn’t your intention, so specifics would be illuminating.


> I am immensely curious about what you find inconsistent and buggy about macOS?

I posted a list of recent issues I've run into elsewhere in the thread, but there are many bugs and inconsistencies in macos:

  - Dock breaking/becoming inaccessible.

  - Running app windows disappearing/becoming inaccessible.

  - Cursor/caret disappearing when editing text.

  - Can't move windows between workspaces.

  - Constant issues with external monitors.
  
  - Switching between windows via command-tab regularly breaks.

These are all well-known issues that have existed for years. If you google them you will find threads and workarounds from 5+ years ago.

> My Mac and Linux shortcuts align more than my Windows and Linux/Mac ones

Because.. you changed your linux shortcuts to match macos? The most obvious example is that ctrl+c/s/v etc are instead command+c/s/v on macos.

> I hope you realise that saying “very buggy and inconsistent” without naming anything specific isn’t any less vague and ambiguous than “usability and comfort”.

There are literally dozens of examples of issues listed in this thread. I hope you realize that saying 'give me examples' when there are many many examples all around the post you're replying to is a bit disingenuous. I can't keep repeating what I've already posted in the thread in every single comment.


> I posted a list of recent issues I've run into elsewhere in the thread, but there are many bugs and inconsistencies in macos:
>
> - Dock breaking/becoming inaccessible.
> - Running app windows disappearing/becoming inaccessible.
> - Cursor/caret disappearing when editing text.
> - Can't move windows between workspaces.
> - Constant issues with external monitors.
> - Switching between windows via command-tab regularly breaks.
>
> These are all well-known issues that have existed for years. If you google them you will find threads and workarounds from 5+ years ago.

I am not pretending MacOS is bug free, again I was just asking for your issues with the OS. But from what I've seen in your replies it seems you post-hoc justified your opinion by googling for known issues.

Unless you're claiming you have all of these issues, constantly and consistently?

Even a Mac does well with the occasional reboot, I have never had a system that was completely issue free. Be it Windows, Mac or Linux.

>> My Mac and Linux shortcuts align more than my Windows and Linux/Mac ones

>Because.. you changed your linux shortcuts to match macos? The most obvious example is that ctrl+c/s/v etc are instead command+c/s/v on macos.

I literally said in my post that the most common ones have CMD instead of CTRL, this is more than "a bit disingenuous".

The reason I prefer the Super + key, is so it doesn't mess with using CTRL+key in the terminal. As that behaviour is running contrary to expectations set by every other OS. CTRL+C doesn't mean copy in the terminal, it means terminate the currently running program. CTRL+A doesn't mean "select all" it means, jump to the start of the line.

Programs shouldn't be overwriting OS level shortcuts, though I completely understand how it came to be. It is really counter-intuitive for new users. Super + key makes more sense for OS level shortcuts, as it's literally the Windows symbol on most keyboards.

>> I hope you realise that saying “very buggy and inconsistent” without naming anything specific isn’t any less vague and ambiguous than “usability and comfort”.

>There are literally dozens of examples of issues listed in this thread. I hope you realize that saying 'give me examples' when there are many many examples all around the post your replying too is a bit disingenuous. I can't keep repeat ing what I've already posted in the thread in every single comment.

I don't know how you read, but I read from comment to comment, top to bottom. I'm in the top thread where you made the claims without any examples, I don't know why you expect me to look elsewhere for the examples that you might have posted or not. Nor why you think I would look for other people's issues.

If you think that's disingenuous that's your prerogative, I think that's how most people read these threads, including you. You don't need to keep repeating the same thing, you can literally just link your comment with the specifics.


> But from what I've seen in your replies it seems you post-hoc justified your opinion by googling for known issues.

Really? I 'post-justified' my opinion by posting the same set of issues in another comment hours before you replied to me? Why are you even trying to psycho-analyze my comments and come up with your own imagined motivations for them?

> Unless you're claiming you have all of these issues, constantly and consistently?

I've experienced every single issue I've mentioned. Now you're trying to quantify how 'constantly and consistently' issues appear? You're already going in circles - either they are so constant and consistent that I 'post-justified' my entire criticism based on the popularity of these issues, or they are so rare and inconsistent that they should be ignored?

> I have never had a system that was completely issue free. Be it Windows, Mac or Linux.

What is this statement supposed to represent other than blanket blessing of macos bugs? If windows has a bug - macos bugs are okay?

> as it's literally the Windows symbol on most keyboards.

That's not really a good reason.

> I don't know how you read, but I read from comment to comment, top to bottom. I'm in the top thread where you made the claims without any examples, I don't know why you expect me to look elsewhere for the examples that you might have posted or not. Nor why you think I would look for other people's issues.

I don't care how you read and I don't expect anything from you. If you're in a thread where many macos issues are mentioned, you can attempt to read the other comments in the thread, read the other replies to the same comment you're about to reply to, look at my other recent comments, etc. Instead of pretending that macos issues don't exist, then claiming that my criticism is 'post-justified' after I give you examples, and finally waving the issues off because 'linux and windows also have bugs'.


I think the buggy apps and UI isn’t emphasized enough on Macs. I honestly can’t remember the last time I was on Windows and performing an action failed to give me any visual indicator whatsoever that something happened, but that’s common on my M2. I’ll click something, have no feedback, and then a few seconds later the thing happens.

Just a couple concrete examples, if you open an app in a workspace, then switch screens to a different workspace, then click the app in the dock, nothing happens. I would expect to be taken to the screen and have the app made visible, but instead, nothing. I kept thinking that the app must be frozen or something until I switched screens and found it.

Another example, I was carrying my laptop to and from work. I put it in my backpack with padding that protects the laptop. I didn’t jostle it around, I literally just carried it to and from work. I get home and the screen is in some sort of weird flickering bugged out state. I had to forcefully restart it just to get it working again.

With all that said, the trackpads and gestures on Macs are amazing. The displays are also very visually appealing. The performance is good.


> I honestly can’t remember the last time I was on Windows and performing an action failed to give me any visual indicator whatsoever that something happened

I find that more often than not I can't make it through the Windows setup without worse janky stuff happening. Pretty often when toggling off all the bullshit privacy settings that shouldn't be opt-out to begin with, I'll get a visual indication of the switch starting to move after my click, then turning around and going back to the default—so my click was definitely received, but rejected somehow. That seems worse to me than a correct response delayed.

> if you open an app in a workspace, then switch screens to a different workspace, then click the app in the dock, nothing happens. I would expect to be taken to the screen and have the app made visible, but instead, nothing.

There is a visual indication in that the contents of the menu bar change to reflect the newly active app; unlike on Windows a Mac app can be active without having an active or foreground window. There's a system setting to control whether to switch spaces in this scenario, but I don't recall whether the behavior you describe is the default or something you accidentally configured to annoy you.


By default it jumps to the workspace that application is opened up on.


I’m not about to factory reset my machine to see what the defaults are, but I haven’t messed with any settings. It’s probably just another bug.


> CLI tools/shell is attempting to follow more-or-less standardish unix/linux setup but fails pretty hard, with many of the same commands existing but behaving just differently enough to be annoying.

macOS is UNIX certified and POSIX compliant. You're probably expecting GNU commands, but macOS is based on FreeBSD (and is not related to Linux).


First of all, unix/freebsd/linux are all closely related. Second, macos does not come with the current FreeBSD set of CLI tools, and in fact comes with several very outdated GNU CLI tools.
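
A concrete example of "same command, different behaviour" between the GNU and BSD userlands - sketched from memory, check the man pages:

    # GNU sed (Linux): in-place edit
    sed -i 's/foo/bar/' file.txt
    # BSD sed (macOS): -i takes a backup-suffix argument, even if empty
    sed -i '' 's/foo/bar/' file.txt
    # likewise GNU `date -d yesterday` vs BSD `date -v-1d`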


Yes, macOS does include some GNU programs, and both the FreeBSD and GNU programs that it includes are rather outdated.


I was always kind of puzzled that big tech companies seem to exclusively use Macs for software dev. Then when I joined one, I discovered that everyone uses remote Linux VMs for actual development. Which makes a lot more sense - Macbook as a glorified thin client with great battery life suits it pretty well. Although, I still sorely miss Linux/Windows window management.


> Uses it's own set of shortcuts, different from what's standard on windows/linux

Mac OS predates both.

Also, this is personal preference, but I find engaging Command with my thumb far more comfortable than Control with my pinky.


    CLI tools/shell is attempting to follow more-or-less standardish
    unix/linux setup but fails pretty hard, with many of the same commands existing but behaving 
    just differently enough to be annoying.

They weren't following unix/Linux standards; they are following unix/BSD standards. Standards that predate Linux and, for some of us, are very familiar indeed.


What standards are those exactly?

Everyone knows that macos (and windows) took a lot from unix/bsd. But that does not mean they are actively following any kind of reasonable modern standard. And saying they might be following some 30 year old supposed bsd standard is pretty hilarious for an OS that purports to be cutting-edge, intuitive and well-integrated.


Hum, BSD is still very much around.


> Want a high refresh rate screen present in cheap android phones?

What is this for? Games? I avoided this to save money the last time I bought a phone. I honestly can't think of use case where a phone display needs to update higher than 60 Hz


High refresh rates aren't just for games, they make the entire interface smoother. Things like frame rate or latency are quality of life things that you don't need, but once you upgrade, the difference becomes clear. Sort of like going from a slowly accelerating station wagon to a Porsche/sports car, even when you're capped by the speed limit - if you've experienced that. Hopefully someone else has a better analogy.


More importantly, VFR is not just about high refresh rates. I would argue that for most people on a phone, the high refresh rate is the least useful part of the thing.

Instead VFR allows saving battery by slowing down the refresh rates, down to as low as 1FPS for static or high latency content. That's part of why always on display is only available on the 15 Pro: at 60Hz it's a battery killer.

> Sort of like going from a slowly accelerating station wagon to a Porsche/sports car even when (capped by the speed limit) if you've experienced that.

So... functionally useless.


> So... functionally useless.

(Disclaimer, I haven't driven a sports car but I have driven cars between "nice" and "struggles to reach the speed limit")

While it might appear to be very similar, having a sports car that you're capable of driving well/defensively can literally be a life saver. There was this video from a few years back on reddit of this guy in a 911. Vehicle in front suddenly stops/crashes, he needs to rapidly change lanes at highway speeds. The little 911 unsurprisingly handled it excellently. A slow/heavy car would've very likely spun out and crashed.

Having more power if you don't need it doesn't hurt (if you're trained well). Less can hurt.


> While it might appear to be very similar, having a sports car can literally be a life saver.

Or it can be a life ender by planting you into the side of the road, something mustangs and BMWs are well known for, amongst others.

> that you're capable of driving well/defensively

If you need to be a skilled pilot to face the situation, it's less the car and more the pilot which does the life saving. And even skilled pilots who know their cars well can fuck it up. Rowan Atkinson famously spun his F1 into a tree, after 14 years of ownership, having been racing for 2 if not 3 decades.

> There was this video from a few years back on reddit of this guy in a 911. Vehicle in front suddenly stops/crashes, he needs to rapidly change lanes at highway speeds.

Something which ends up in a collision as often as not.

> The little 911 unsurprisingly handled it excellently. A slow/heavy car would've very likely spun out and crashed.

Or not. Or maybe with a less capable machine they'd have driven more prudently and would have kept more space.

> Having more power if you don't need it doesn't hurt (if you're trained well).

So having more power can literally hurt. And routinely does.


So basically while more sportiness can make you safer in theory, in practice you have a human in the mix who is all but guaranteed to drive less safely. I agree.


IME, the difference is negligible. I went from an iPhone 12 to a Galaxy S21 Ultra and yes, there is a difference, but it's not something I notice, even when occasionally using my wife's iPhone 12.

The bigger difference I notice is in loading times for apps.


> yes, there is a difference, but it's not something I notice

Anecdotally, I use the last gen pre-120hz iPad Pro as my daily driver and I notice the difference immediately when I switch from my phone to the iPad. It’s not annoying enough to force me to upgrade immediately, but it’ll absolutely be a requirement for me going forward.


For me the difference is visible, but the added smoothness is more of a cherry on top than anything. I switch between 60hz, 120hz, and 240hz panels every day and the 60hz panels don’t bother me a bit. On desktop and laptop displays I’d take integer-scaling-friendly pixel density over high refresh rates any day.

Variable refresh rates should be standard across the board however, not restricted to high refresh panels. There’s no more reason for a static screen to be redrawing at 60hz than there is for it to be redrawing at 120hz or 240hz.


I'm not sure how the Galaxy handles it but if it's similar to the stock android situation, you're likely often only seeing a 60hz screen rate. You can go to developer settings (or perhaps screen settings) and force 90hz or 120hz on. If your eyesight is otherwise good (especially if you're younger than 50) you should be able to notice the difference quickly.


To stick with your car analogy though, with screens it's more like the performance vs. gas mileage trade off. Your sports car will get you the same place at the same time (capped by speed limit) with more fun (if you like that sort of thing) but cost you at the fuel pump. This is fine if you don't care about $/gallon, but sucks if there is gas rationing.

This analogy is also flawed, of course.


sounds like something I'd rather never want to get used to then. More expensive, uses more power so kills battery life faster and worse for the environment. Where's the win?


> uses more power so kills battery life faster

Not necessarily. On ProMotion iPhones (fancy name for variable refresh rate), the panel is not stuck at 120Hz all the time. It varies from 1Hz when the screen is static up to 120Hz when it's getting animated, down to whatever framerate your content is at (e.g. movies and videos). It actually is a battery saver whenever the phone's screen is static (common on most apps with text) or with <60Hz content (youtube, movies, etc).


> uses more power so kills battery life faster and worse for the environment. Where's the win?

Depends on the device, but often the difference is minimal, a few percentage points, perhaps 3-5%. I would think it can easily be compensated for by undervolting/power-limiting on laptops.

The win is lesser eye strain/headaches for a lot of people. No harm in turning it off if you don't need it, but many do benefit.


I've recently splurged on a new phone (old one was >4 years old and started to run a little slow) and a high refresh screen just adds to the general "snappiness" of everything. I'm also using a high refresh-rate monitor for work and it's always a little weird to go back to a 60hz screen.

This is obviously not a critical feature, mind you, but it does add to the experience of using a device, even if it's just productivity-related tasks.


I can certainly see a difference between 60Hz and say 120Hz. The latter is way easier on the eye.


Only 1-monitor support seems like a market segmentation decision, not a lack of capability.

(Much like how the iPhone Pro has faster USB-C data transfer speeds vs base iPhone)


I agree but the iPhone is a bad example. They used the previous year chip in the non-pro phone (or a binned version IIRC) which didn’t have the USB3 speed support. I guess we will see if next year’s base iPhone has the faster data speed support or not.


Give me a "pro" which is thinner than and without a fan.

or

Give me an "air" with two external monitors and 64GB ram.

The pros are clunky and heavy. I'm on an air, and will stay there for a long time because of this.


Aren't the pros and the airs very close to the same weight? The 15" Air is 3.3 lbs and the 14" MacBook Pro with the M3 is 3.4 lbs. The heaviest 16" with the M3 Max is 4.4.


The 15" Air is significantly heavier than the 13".


Yeah, you can actually use these dongles to connect more monitors.

https://m1displays.com/


A DisplayLink is not a real dock. It's basically compressing the display output using the CPU and decompressing it using an external GPU in a box to output it to more displays. The performance is terrible, you get weird artifacts, there's lag, you can't use it for gaming, etc.


Sure, don't use it for gaming, but I wouldn't call it terrible. See my comment here: https://news.ycombinator.com/item?id=38131364


The regular iPhone 15 has last-year's Pro chip, the A16, and the 15 Pro has the new A17, so it's not simply a matter of binning or disabling features.


Yeah it absolutely is and that’s why they sniff farts. Two monitors is such a strange thing to segment but it’s a smart business decision. It just sniffs farts


It sucks but at least DisplayLink works well enough. Driving 4 monitors this way with my M1 Mac Air with no issues.


Can you name or link any specific hardware or software you're using? Last time I looked into how to run 2 monitors on my M1 Air I gave up after seeing $400 docks that seemed to have performance issues for some people.


I have this for my m1 pro:

https://a.co/d/gD5q8G3

It uses the 6950 DisplayLink chip, which can do 4K@60Hz.

It's 90 dollars and drives 2 screens.

Software is less good. I had to enable Rosetta2 to get the pkg to install. Then you have to boot to recovery mode to allow signed 3rd party drivers. And you get a creepy notice that someone is watching your screen, which is how the display driver works.

Performance is pretty good, just a bit laggy on mouse movement.


Is that because the screen generation is software based? I noticed that when using DisplayLink and couldn't bear the input lag or, worse, the compression from moving a window too rapidly. Switched to a monitor with thunderbolt output and my coworkers thought I was being a diva.


My understanding is that yes, the external displays are both rendered by the CPU, so there are lower frame rates and dropped frames, which is a concern of mine since I am pushing this computer to its limits with some applications I run. I've heard that for casual use it isn't too noticeable.

Unfortunately I don't think the M1 Air supports TB daisy chaining unless you're ok with mirrored displays. I still can't decide if I want to ask my job for a new computer, a new dock, or new displays lol.


macOS doesn't support daisy chaining, and never has. You can't use daisy chaining with any Mac. It's weird, it's stupid, but it's true. That's the sole reason the dock market is so complicated.


Yeah, it's definitely not as good as native, but it's not bad. One other significant restriction for DisplayLink is that it's unable to display DRMed video content.


Thanks, I'll probably end up trying this!


I was using a secondhand Dell D6000, hoping to connect one 4k monitor natively (dp alt mode) and the rest via DisplayLink. That didn't work; now I'm using cheap docks I bought for ~€50 secondhand from a company called i-Tec (have one at home and one at work).

The way I have it set up: the i-tec dock is connected to a regular USB-C hub (from Ugreen). The USB-C hub is also connected to a 4k monitor. This way I connect 1 cable to my Mac and get everything: power delivery, Ethernet, peripherals, 4k monitor running natively and 2 2K monitors running on displaylink. It is… perfect. I haven't touched my desktop computer for about a year.

Basically it is better to use the Displaylink monitors for static content - running 4K video on a 4K monitor on Displaylink would probably take about 40% CPU on my M1 chip. So everything dynamic goes onto my native 4k monitor, whereas the Displaylink monitors have easier content - browsers, vscode, notes etc.

But like, I regularly also play video on my 2K displaylink monitors and honestly I never ran into issues, or noticed a difference. Just dont do this on 4k monitors I guess.

Sorry for the delay, still haven't figured out a way to get notified about replies on HN.


Your solution seems like the best so far! I had not thought of doing a mix of native and DisplayLink outputs, that's definitely the best of both worlds.

Since I also don't get notified of replies I've been checking back on this comment in hopes that someone had a clever solution like yours. Thanks!


Glad to help! It's really good, especially considering how dirt cheap the M1 Macbook can be these days. It's what I recommend to most people around me shopping for a laptop.


Sadly it doesn’t work well enough with 4K displays for me.

At least, not the last time I tried.


The answer is pretty straight forward: it has no fan and limited ram.

It supports a 6K screen in addition to the built-in screen. It's reasonable to say that if you need 3 screens to get your job done, then that is not representative of the market for apple's lowest-end laptop.


My super old hp elitebook from 2015 was a piece of shit and yet had no trouble driving two external monitors

This is arbitrary market segmentation. I need two external monitors mostly because I share the same single cable to my dock with work and because macOS is so stupid about monitors it randomly connects to one or the other, not both. So I have to manually power off the wrong monitor each time and turn it on for work. Really annoying.


Then don’t buy the lowest-spec Apple laptop?

Btw, a quick check shows Elitebooks start at $1,760 and support at most a single 5k screen.


The basic office setup in most companies is two screens with a dock (lid closed). That's a super common way of working.

RAM and fan has nothing to do with it, the cheapest low-end Windows laptops can do this with ease, it doesn't stress the RAM, CPU or GPU at all.


> RAM and fan has nothing to do with it, the cheapest low-end Windows laptops can do this with ease, it doesn't stress the RAM, CPU or GPU at all.

- M-series uses unified memory, the ram does in fact matter.

- Heat also matters, because throttling. The Air uses a fanless, heatsink system.

- Whatever cheapest low-end PC you’re imagining is not fanless and not running an external 6k monitor at 60Hz.

- None of those systems are optimised to maintain performance while possessing an 18 hour battery life. None of those even have an 18 hour battery life.

The Air serves a clear market segment. It’s their entry level laptop, the problem isn’t the laptop.


Dude my old galaxy phone from yesterday could drive a 4k display


Filing this under irrelevant. My dude.


I have the Air, it does support its own monitor + a 6k monitor.


It's not hard. The pro can do it. It's simply market positioning to make you spend more.


It's intentional. Apple is going down the path of SKUs to segment their price points.

It may also be they have chips that have bad display controllers and they are using this as one way of offloading those chips with the bad controllers lasered off.


I mean, the support for even one monitor is by far worse than windows or linux IMO.

- Monitor randomly resets/readjusts for no apparent reason. Like multiple times every hour.

- Windows disappear, become inaccessible after moving between monitors, even though they're still open and active according to the dock/task list.

- Why can't I move a window to a monitor/workspace that has a maximized window? Like, you can do it by un-maximizing the window in question, moving the other window over and then re-maximizing the window again. But why is this nonsense needed? What problem could this restriction possibly be solving?

- Lots of other monitor/workspace related problems (quick google will show dozens related problems without any obvious solutions or explanations for the completely nonsensical behaviour).

I really don't understand how people can say macos has a good and consistent overall UI? I mean, yeah - it's consistently buggy and un-intuitive:

- Dock constantly breaks/becomes inaccessible. This is a known problem for at least 5+ years - the solution is to manually kill and restart the process???

- Text cursor/caret randomly disappears when editing text, so you can't see where the cursor is, and you can't fix this unless you restart the app (happens to pretty much all apps).

- Was working with unicode characters recently, now whenever I press command+s to save something, it gives me a visual unicode character selector popup, with no apparent way to stop/cancel this behaviour from the popup itself.

- Bad defaults in terms of keypress repeat times and rate. No apparent way to change this from settings - need to run commands and re-login to test new behaviour (the usual killall/defaults-write workarounds are sketched below).

Just an overall crap OS and UI imo.
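
For anyone hitting the same things, the commonly circulated workarounds (not fixes - and the exact defaults keys are the usual folklore, so double-check before relying on them) are:

    killall Dock                                   # restart a wedged Dock
    defaults write -g KeyRepeat -int 2             # faster repeat than the settings UI allows
    defaults write -g InitialKeyRepeat -int 15     # shorter delay before repeat starts
    defaults write -g ApplePressAndHoldEnabled -bool false   # disable the press-and-hold character popup
    # log out and back in for the keyboard settings to take effect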


I’ve been using macOS in a multimonitor setup for years without much issue. A good bit of it boils down to macOS expecting monitors to be well-behaved, e.g. each having unique EDIDs (many don’t, instead sharing one across all units of a particular model) and initializing in a timely fashion.

> Why can't I move a window to a monitor/workspace that has a maximized window?

Because it’s fullscreened, not maximized. macOS doesn’t really have window maximization in the traditional sense out of the box, you need a utility like Magnet or Moom for that.

That fullscreen mode was introduced in 10.7 Lion and a lot of long time mac users have found it silly from day one. Personally I never use it.

> Text cursor/caret randomly disappears when editing text, so you can't see where the cursor is, and you can't fix this unless you restart the app (happens to pretty much all apps).

I’ve seen this, but only in Chromium browsers and Electron apps. Seems like it might be a Blink bug.

Regarding other OSes, multimonitor on Linux is mostly fine (so long as you’re using Wayland; X11 is another matter especially if you’re doing something slightly uncommon like using two GPUs, in which case xorg.conf mucking will likely be necessary).

By far the most frustration I’ve had with multimonitor is in Windows, which is generally weak there. IIRC it only recently gained the ability to set per-display wallpaper; before you had to glue wallpapers together into a single image that spanned across them, which is silly.


Interesting - it's one of the killer features of macos for me.

I use it to compartmentalize stuff I'm working on, and 4 finger swipe between them.

I have a couple instances of VSCode, some app/debugging browser windows, some chrome profile windows etc... all running full screen and I spend my day brush swiping between them.

It may be different for me because I work on a 14" macbook pro, mostly at coffee shops or unusual work locations, so I don't have a large monitor setup. I don't think I'd use it much if I had more of a traditional desk/multimonitor config.


But you can have workspaces/maximized windows without having the restriction of preventing moving windows to another monitor/workspace that has a maximized window.

I use workspaces on my linux laptops all the time - this 'feature' has existed for ~20 years, and is not hard to implement. The difference being that macos seems to force the restriction of having only 1 window per workspace and not being able to drag a window from one workspace to another. If that's the behaviour you want (one window per workspace), you can easily operate in this way without having the forced restriction.


You can have multiple monitors and multiple workspaces with multiple windows that can be freely moved between monitors and spaces. The only restriction is that when you make a window full screen (not the same as maximizing), it becomes its own new single-occupancy space. You seem to think that making a window fullscreen is the only way to make a new space, but it's just a special case of a larger system that already has the functionality you're asking for.


You're one of the few. Most "pro"-users I know immediately go for 3rd party applications like Rectangle to have a better window-management setup. I never understood how Apple thinks this kind of window management is still OK. It's one of those stupid little Apple quirks like not adding calculator on iOS for so long.


I could definitely see it being more useful on a small screen, especially something like the 12” Macbook.

Generally I’m working at my desk with at least 2x 27” displays. If I’m out somewhere it’s instead 16” MBP + 12.9” iPad with Sidecar.


> A good bit of it boils down to macOS expecting monitors to be well-behaved, e.g. each having unique EDIDs (many don’t, instead sharing one across all units of a particular model) and initializing in a timely fashion.

I don't really buy this explanation. I'm talking about using a single external monitor here, a monitor which I regularly also use with linux (intel/amd) laptops without a single issue.

> Because it’s fullscreened, not maximized. macOS doesn’t really have window maximization in the traditional sense out of the box, you need a utility like Magnet or Moom for that.

That's not really a good reason though. It still doesn't explain why the restriction exists. Why can't a non-maximized/fullscreen window be displayed on top of a fullscreen/maximized window? What problem does that solve?

> That fullscreen mode was introduced in 10.7 Lion and a lot of long time mac users have found it silly from day one. Personally I never use it.

It's the default though.


> I don't really buy this explanation. I'm talking about using a single external monitor here, a monitor which I regularly also use with linux (intel/amd) laptops without a single issue.

Might be model-specific then. It’s not something I’ve seen with displays from Asus, Alienware, and Apple. I briefly owned a Dell monitor that would periodically flicker but it got returned.

> That's not really a good reason though. It still doesn't explain why the restriction exists. Why can't a non-maximized/fullscreen window be displayed on top of a fullscreen/maximized window? What problem does that solve?

Probably because it’d be easy for windows to get “lost” if the fullscreen window were focused and non-fullscreened windows fell behind it, with there being no obvious indicator that those windows exist.


In my experience, MacOS monitor support is very hit or miss. For an Asus Gaming monitor, I had to write a custom plist file to get it to even show a picture and getting it to run at native resolution took a lot of experimenting.

I would not buy any new monitor for MacOS that does not explicitly advertise MacOS support.


That’s kinda wild. Is there anything remotely unusual about the monitor, like high rez and refresh rate?

I was pleasantly surprised when I plugged the AW2721D that’s part of my gaming setup into my daily driver Mac. Everything worked great, including 240hz and variable refresh.


> That fullscreen mode was introduced in 10.7 Lion and a lot of long time mac users have found it silly from day one. Personally I never use it.

Agreed, however it's arguably for a different mental model. If people think of it as "focus mode" or "workspace mode" it makes more sense. Four finger swipe to slide between the workspaces or focuses.

More importantly, you don't need MOOM.

- - -

To maximize a window to the dimension of the screen without entering full screen mode, either:

1) hold shift option ⌥ and click the green maximize button on the top left of the window

- or -

2) hold the Option key and double click a corner of a window to maximize without full screen focus, and doing that again will size it back to where it was


> To maximize a window to the dimension of the screen without entering full screen mode, either:

This is a handy trick to know, but has caveats. Option-green-functionality is actually defined by apps, not the OS, and so in some cases it will for example act as a “fit window to content” button.

Option-double-clicking a corner appears to be consistent across windows however.


Double click anywhere on the title bar works for me.


I've both used Macs for decades and at one time worked in enterprise Mac technical support, and I've never seen this stuff happen.


I've used multi-monitor dev setups and large screen setups in mac laptops for, christ, 15 years.

I've seen all of those things listed.

It would be one thing if OSX/Linux/Windows UIs would just stay at their basic usability from around 2010. They haven't. They just get steadily worse.

OSX has been the most stable of them, quirks and all, even with a goddamn architecture change, but it comes with closed hardware, a forced upgrade cycle, and bad backwards compatibility.

Linux once would just rewrite window managers every 5 years as soon as they got stable, now Linux is rewriting with Wayland and Vulkan, and ... about 5 years in they'll probably start rewriting those once they get a little stable.

Windows? Committed UI suicide with the tiles and two desktops thing in Windows 8. Now there's ARM in the future and a break with the only thing it has going for it: backwards compatibility.

But yes, everything the author complains about happens in OSX with laptop + monitor / dual monitors, and even without with disappearing mouse cursors.


I haven't used Mac for a while, but I definitely remember the part about windows freezing (and requiring killing the process) when connecting/disconnecting an external monitor. It's very frustrating.

(And I for one haven't bought a Macbook since 2020 because I am not going to spend at least $1,999 just to get dual monitor support. A $500 asus laptop can do dual monitor without any problem, and it turns out that machine is good enough for my productivity needs. That money is better spent elsewhere)


Do you honestly believe all of those things happen to everyone? You think everyone who likes macOS is just ignoring those kinds of severe issues?

Obviously you have some kind of problem with your system.


Well, most of what he is saying are actually easily reproducible macOS quirks.

- Windows disappear, become inaccessible after moving between monitors, even though they're still open and active according to the dock/task list.

This indeed happens relatively often if you have multiple monitors and switch between them for any reason. e.g. in my case I have two machines and two monitors, sometimes I switch the primary monitor to a specific machine and this almost always fucks up macOS. Solution is to disconnect and reconnect the monitor.

- Why can't I move a window to a monitor/workspace that has a maximized window? Like, you can do it by un-maximizing the window in question, moving the other window over and then re-maximizing the window again. But why is this nonsense needed? What problem could this restriction possibly be solving?

A lot of people who use macOS agree that the fullscreen window thing is needless and makes for quirky behaviour.

- Dock constantly breaks/becomes inaccessible. This is a known problem for at least 5+ years - the solution is to manually kill and restart the process???

- Text cursor/caret randomly disappears when editing text, so you can't see where the cursor is, and you can't fix this unless you restart the app (happens to pretty much all apps).

Yep, happens quite frequently to multiple people I know and in multiple apps.

- Bad defaults in terms of keypress repeat times and rate. No apparent way to change this from settings - need to run commands and re-login to test new behaviour.

Indeed, this is very painful for people who are non-developers and are used to being able to set higher repeat rates.


The Air is not the computer for people who use a lot of external monitors. 99.9%+ of MBAir users will never connect even one external monitor.

They make a small MacBook Pro that is suited for that task.


You need to spend a minimum of $2000 though, the new $1600 MacBook Pro also only supports one external display.


Yeah, the new MacBook Pro with a non Pro or Max chip is essentially an Air with a couple extra ports. I don't think they should have released it.


It's better than the 13" MacBook Pro with touchbar that it replaced.


The screen is substantially better, and the addition of the ports will be useful for a lot of people. Honestly, the base Apple Silicon chips are more than fast enough for most people, if they supported more than one external display I would have no problem recommending the M3 MacBook Pro with a RAM upgrade.


If you're referring to the 13-inch model, that's been discontinued.


No, the 14. The Air is 13, and the “normal” MBP is now 16. The 14 MBP is not onerously large if you are one of the people who want an Air-sized computer with lots of IO.

It would be annoying if you had to get a 16” laptop to be able to hook up to multiple external displays. I love my 16” MBP but my Air is the one that goes everywhere in my handbag.


The plain M3 version of the MBP doesn’t support more than one external display either[1].

[1] https://www.apple.com/macbook-pro/specs/ (scroll down to “Display Support”)


It's not that it's a computer for people who don't use external monitors - it's that they're not supported. Connecting more than 1 display is not a "pro" task, it's pretty normal especially for professional purposes.

Also, the base MBP 14" doesn't support more than 1 display either.


And yet it's a dick move from Apple to artificially limit number of external screens supported.


They are not artificially limiting the number of external screens. To support more screens, they would have to reserve die area for an additional controller. That could mean one less GPU core or something. This is about engineering tradeoffs.


Considering the M3 has 55% more transistors than the M1, I'm sure they could have found some room for a third display controller, or a more capable one (one with MST support for instance), or just a more flexible one.

They're selling a $1600 machine which can't drive two external displays, it's sad. Intel's HD Graphics have been doing better since 2012 and they're trash.


It’s a business decision. You don’t think Apple knows the demand for multiple external displays? If they thought it would sell more Macs, they would do it.


> It’s a business decision.

That doesn't make it less frustrating.

> If they thought it would sell more Macs, they would do it.

It's short term thinking; it generates a bad feeling of nickel-and-diming and unnecessary upsells: on the M3 generation you need to pay $2000 to be able to use two external displays, and $3200 for 3, even if you have no need for the rest of the processing power.

Not only that but they managed to create more range confusion with the expensive but gimped entry-level 14".


What you describe is only considered by a vanishingly small number of their customers. The vast majority don’t care or even own an external display. You could be right about range confusion, but I doubt it simply because, like everything else, Apple is so disciplined about market research. Time will tell


> The vast majority don’t care or even own an external display.

Or more than 1 external display. The vast majority own none or a single. Few people have multiples, and those people are generally enthusiasts who will also tend to pay more.


I'm curious where people are getting this data point?


Fair point. It's a bit anecdotal and things I've read over the years. Multi-monitor was growing quickly for desktops until the big wide screens came out. Then it seemed to slow. As many people shifted to laptops, the one larger monitor seemed to be more common or no monitor at all. There's also the challenge of defining multi-monitor as I use 3, but one of those is my laptop. In that sense nearly everyone can run 2 screens.

I'd love to see some modern stats that take into account these nuances.


You can say, "it's a business decision," about nearly anything that a business does. That's not really an argument for or against something.


Of course it's a business decision, but it's one a lot of people disagree with. How can you keep repeating that 99.9% of customers don't care when it's one of the most repeated complaints in places like this, over and over again? It's a stupid decision that doesn't make sense.


Hi, in case you forgot, you’re on an HN forum. That’s why 99.999% of people do not care about most things people here care about. Get some perspective.


> considering the M3 has 55% more transistors than the M1

And that's why everyone loves node shrinks. You can stuff more transistors for the same cost.


Engineering tradeoffs to support something intel/amd PCs have supported for 15+ years? Maybe apple is just shit at engineering?


Have you ever actually tried driving decent large monitors from those crappy intel chips?

I have.

Engineering tradeoffs indeed.

For all its flaws, Apple focuses on making sure there is a good experience overall. In general, they'd rather have a more limited hardware ecosystem that works better than a broad ecosystem that's flaky.

There are tradeoffs to both approaches. I go back and forth between windows and mac, and have for decades.

I was a surfacebook/win10/WSL guy for a few years, and liked it a lot. Win 11? Not so much.

Currently the M series chips blows any PC laptop away for the combination of portability/performance/battery.

The Snapdragon X might change that in the next couple years, and I'll take a look at switching back then, and see what win 12 offers. If MS decides to take a step back from being an ad-serving platform and goes back to focusing on productivity, I may switch back.

For now though, I'm planning on accepting the gut punch to my wallet and picking up a 14" M3 Max in a few weeks.


> Have you ever actually tried driving decent large monitors from those crappy intel chips?

Yes, I regularly drive my monitors with 1 intel laptop and/or 1 amd laptop (modern ones, linux). Not a single issue ever. I plug in my work m2 pro mac to these same monitors - constant issues.

> For all its flaws, Apple focuses on making sure there is a good experience overall.

MacOS has had dozens of well-known, well-documented bugs for years that haven't been addressed, many of them mentioned in this thread. I don't see any focus from Apple on these whatsoever - their main focus seems to be advertising and visual design.


I have done dual external monitor setup for years on Intel and AMD laptops without any issues.

The standard setup at my company is a ThinkPad connected to dual 27" monitors. My company has thousands of employees. (A Mac setup is also available for those who request it.) Many, many companies offer a similar setup for employees.

Maybe you need a reality check first.


> Maybe you need a reality check first.

Thanks for the kindness, I had my first laptop (PC) in the mid 80s, and have been writing software on various hardware configurations for 30 years, so maybe I just need a bit more reality I guess?

> The standard setup at my company is a ThinkPad connected to dual 27" monitors

No one claimed that dual monitors isn't possible.

Running two large, high-quality 4k monitors at 120hz off a low-end Intel integrated video controller is going to be harder.

Apple is focused on higher-end experience, always has been. They're not a great choice if you're looking for standard enterprise arrangements.

You can go into pretty much any bank or office and see the kinds of setups that you're talking about.

You can also see the flicker and wince at the eye-strain.

This is a silly discussion.


> I had my first laptop (PC) in the mid 80s, and have been writing software on various hardware configurations for 30 years, so maybe I just need a bit more reality I guess?

Yes, sounds like you've been drinking apple koolaid and fanboying over bullshit for 40 years.

> This is a silly discussion.

What's silly is reading through comments of multiple people discussing actual issues they face with macs, completely ignoring what's said, and then claiming 'apple is focused on higher-end experience' based on nothing but your own anecdotal experience and imagination.


> Yes, sounds like you've been drinking apple koolaid and fanboying over bullshit for 40 years

I mentioned that I switch back and forth between PC and Mac, depending on what's better at the time - so you're pretty far off the mark.

Regardless, why are you responding with such anger and venom?

> What's silly is ...

I'd like to suggest that you take a moment to reflect on how you treat others. Even a stranger online.

I notice that your account is 59 days old, so maybe you're new to the HN community.

This community frowns on personal attacks, we try to focus on comments that are informative and add to the discussion.

I recommend reviewing the comment guidelines [1]

[1] https://news.ycombinator.com/newsguidelines.html


> Regardless, why are you responding with such anger and venom?

Any 'anger and venom' is completely imagined by you.

> This community frowns on personal attacks,

Really? What personal attacks? Please point out exactly where I've personally attacked you.

> we try to focus on comments that are informative and add to the discussion.

You mean like your comments in this chain? Where you dismiss the entire conversation as 'silly', based on nothing but your own personal, anecdotal experience? If it's silly, no need to participate. Your comments are rude, condescending and add nothing of substance to the discussion.


@dang - wubrr is a newish account, I've reviewed their other comments/threads and they all seem fine, if a bit inflammatory. This one is off the rails though. I'll stop participating. Probably best if this whole thread is deleted (starting with my original comment), it adds no value.


Are you like an aspiring moderator? Nobody asked you to review anyone's comments or comment on anything.

Save this petty nonsense for reddit.


Intel/AMD chips don't approach the efficiency of what Apple makes.


I don't understand this pretence that laptops are plugged into power the vast majority of the time anyway.


There's always one reason after another. "It won't fit"; you say sure it can, others do it. "Well, they don't do it efficiently"; you say okay, how many times are you driving multiple monitors (most of which support power delivery these days) on battery? Then it'll be "Apple just understands this better".


You're effectively saying that you want Apple to design a chip without any compromises in features, cost, performance, or efficiency.

Apple sure could design a chip that drives more monitors, but maybe they decided that making the chip x% more performant/efficient, or having y feature, was more important.

Or, maybe this really truly is trivial and they decided to segment their chips by not including this feature. There is absolutely nothing wrong with that.


Perhaps because until recently, battery life has sucked? I used to plug in my laptops all the time. Now with the M1, it only gets plugged in to charge, then I don't plug it in until I charge it again. I have been enjoying sitting on my couch when I want to, to get work done rather than at my desk.


You can sit on the couch and have your laptop plugged in. I also don't keep my Lenovo plugged in all the time and don't notice any battery issues. Realistically, you're not sitting on a couch working for 6 hours straight either.


It’s not just battery life, it’s heat and noise too. Yeah I know not a lot of people actually use laptops on laps (though I do), but you typically can’t hide it under a desk like you can with a desktop.

My job provided me with a Dell laptop with an i7 and it can keep my coffee warm just idling. Sometimes, when it’s decided it doesn’t want to sleep, I can hear the fans from the next room. While the screen is off, not being touched.

I could go on at length about just how shitty the Dell is compared to my MacBook Air (despite costing more and performing worse!), but Apple Silicon Macs are just so much nicer when it comes to the practicalities of using a portable computer.


So this conversation goes from "Apple shouldn't need more display controllers since most Air users don't use more displays" to "most people don't use their laptop on battery, but power usage should still be a major concern".

It's almost like we should focus on what Apple does best and ignore what Apple does worse.


True. But when I was going to customer sites, it was really nice to plug in my MacBook Pro the night before and not have to think about battery life. It's also nice not to have the fans going constantly, and to be able to work with it on my lap without the fear of never being able to have little Scarfaces.


Lots of compromises are made to get the Air to the size, weight, and most importantly, price point (ie sub-$1k) that it is.

It is Apple’s best selling computer by quite a bit. They sell metric assloads of the $999 base model each fall when school starts.

IMO the $999 one (or $899 with education discount) is literally the best price:perf ratio of any computer available today or at any time in my recent memory. It’s astounding how good it is for that price.


> Lots of compromises are made to get the Air to the size, weight, and most importantly, price point (ie sub-$1k) that it is.

> IMO the $999 one (or $899 with education discount) is literally the best price:perf ratio of any computer available today or at any time in my recent memory. It’s astounding how good it is for that price.

It is possible for your comment to be correct while the poster above is also correct. The M chips are amazing, but let's not pretend that Apple doesn't knowingly gimp the hardware to upsell the more expensive versions.

"But that would make it more expensive."

Yes it would... by pennies. It is a bit ridiculous to sell phones as expensive as the current iPhones and limit them to USB 2.


Generally speaking, people don’t use the port on an iPhone for anything other than charging. The fact that it supports USB v-anything is mostly unnecessary cost.

If you are thinking in any way whatsoever about the non-wireless data transfer speeds on your phone, you are not the market or user it was designed for. Same goes for external monitors on an MBA.


I agree that fast USB isn't commonly used, but that's kinda the point: it doesn't hurt Apple much to include it (and they're not saving anything remotely significant by omitting it), yet they still do it to push the few who do want fast wired transfers toward pricier models. A more common example, where nearly everyone would notice, would be the abysmal RAM in their phones until very recently.


> knowingly gimp the hardware to upsell the more expensive versions.

I'm skeptical. A ton of money is dumped into optimizing everything and then they just throw it down the drain? That would drive more people to the competition than to higher-priced Macs.


I meant that bit in general and not specifically for the Mac, but for example they sold iPads with 32GB storage as recently as 2020, iirc. Even now the base-level Macs have much less RAM and storage than similarly priced Windows counterparts. I think part of it is just so they can say "MacBooks start at $xx!". There are more examples I can't think of right now.


The 2000-euro M3 MacBook Pro also supports only one external display, and it's the same size and weight as the ones that support more.


Come off it.

- the 14" M3 is $1600, but limited to a single external display, and it actually only has two TB ports; it's just sad

- the M3 Pro is limited to two external displays, despite an entry price of $2000

$3199 is the baseline to be able to use 3 external displays (although at that price you can plug up to 4).


If you can afford multiple monitors, surely you can afford a slightly more expensive laptop.


Given what appears to be a reduced performance focus for the M3 Pro chip, and the fact that the M3 Max has taken off into the stratosphere in both performance and, unfortunately, price, I'm wondering whether, for a lot of professional users, a refurbished M2 Max machine is actually the best price-to-performance move right now.

Really looking forward to some detailed M3 Pro analysis.


I think I'll hold out for the M4 and make sure memory bandwidth and latency are restored to their previous values.


Given that Intel and AMD tend to choke the memory bandwidth on anything below their server chips, this seems short sighted.

The M3 Pro's "reduced memory bandwidth" is still double the memory bandwidth you see on Intel and AMD's HEDT chips.

Step up to the M3 Max and you're looking at five times the memory bandwidth of Intel and AMD's chips.
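
For what it's worth, here's a rough way to see how much of those headline numbers a single CPU core actually sustains. This is an illustrative Swift sketch, nothing rigorous like STREAM, and the spec-sheet figures people quote are theoretical peaks shared across the whole SoC (CPU + GPU), so don't expect one core to reach them:

  import Foundation

  // Rough, single-threaded memory-bandwidth probe (illustrative only).
  // It streams a buffer much larger than the last-level cache through one
  // copy and reports GB/s. Real sustained bandwidth depends on thread count,
  // access pattern, and the memory controller.
  let bytes = 512 * 1024 * 1024                 // 512 MB, well past any cache
  let count = bytes / MemoryLayout<Double>.size
  let src = [Double](repeating: 1.0, count: count)
  var dst = [Double](repeating: 0.0, count: count)

  let start = Date()
  dst.withUnsafeMutableBytes { d in
      src.withUnsafeBytes { s in
          d.copyMemory(from: s)                 // one read stream + one write stream
      }
  }
  let seconds = Date().timeIntervalSince(start)
  print(String(format: "copy bandwidth: %.1f GB/s", Double(2 * bytes) / seconds / 1e9))

Running something like this on both machines is a more honest comparison than quoting the two spec sheets at each other.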


Fair point. I think what I'm saying is that I'm currently on an M1, and I think waiting for the M4 will be another 10x leap. Apple is hell-bent on releasing something new every year, so I think my M1 should last me quite a long time.


You're comparing strictly CPU memory bandwidth to CPU + GPU. If you add CPU + GPU bandwidth for a PC you'll get similar numbers.


Your PC can't use the GPU memory bandwidth for the CPU whatsoever. So why would you add the bandwidth?


This is good to know. I worry about the longevity of these new M-series processors from Apple since they're so (relatively) new. What is the usable lifespan of a base M1-MacBook Air for example? I guess I'll find out since I just bought one for my kid who is a junior in HS and I'm hoping this will last until they graduate college.


From my experience with M1, M1 Max, and M2 Max laptops, I think the only real limitation to useful longevity of the M1 machines is configured RAM and Apple's willingness to provide software updates.

A point in its favor is that Apple is still selling the M1 MacBook Air new today as their entry level laptop, which is now 3 years old. And it's still a great machine with better performance and battery life than a lot of competitors and it's completely fanless. I expect Apple to push for dropping new OS support for Intel machines fairly quickly, but the countdown clock on M1 support probably doesn't even start until they stop selling them as new.

My previous Intel MacBook Pro was still perfectly serviceable 8 years after purchase when I traded it in, although it wasn't going to receive major OS updates going forward.

The only time I've had issues with my M1 MacBook Air 8GB is when I temporarily tried using it as a real dev machine while waiting for a backordered company laptop. As soon as you really hit that RAM ceiling due to running docker and big IDEs you really feel the performance drop, but until that point it was perfectly competent, and again this is a fully fanless machine.


I bought an M1 Air, 256GB with 16GB RAM, at release for light web development and professional 2D gamedev in Unity. I've been using it daily for 3+ years now, and it's really durable hardware.

With so little RAM I had to sacrifice the ability to properly run VMs, but even today there's just no comparable hardware on the market.

The 8GB versions though... Certainly Apple should be damned for selling laptops with 8GB RAM in 2023 when my x220i ran 16GB in 2011. At some point they will all become e-waste, either from having too little RAM or from a dead soldered SSD caused by extreme swapping.


8GB of RAM ought to be enough for anybody right? I'm hoping that's true for someone who spends most of their time in Google Docs or Word and the occasional light web browsing. This MBA replaced a Chromebook which really was on its last legs.


I'm using an M2 Air with 16GB and M1 Mini with 8GB. On the 8GB machine I have to restart Firefox every few days because otherwise it completely fills up RAM + Swap and the entire system starts getting sluggish. It's a very noticeable bottleneck while on the 16GB Macbook I don't really think about memory usage.

I would not recommend the 8GB variant to anyone who plans on using it for more than a couple browser tabs and some light tasks.


Check about:memory in Firefox. It's not reasonable to run out of swap; that means it's trying to keep something like 40GB in there after compression. You might have an extension or something leaking.


> I would not recommend the 8GB variant to anyone who plans on using it for more than a couple browser tabs and some light tasks.

Which is basically all my kids do. Less than 10 tabs, nothing else.


I'm a heavy user; I often have 50 tabs open and lots of apps in the background. The 8GB M1 Air is really fast for me.


For a while now, the lifespan of Apple devices has mostly been limited by Apple stopping to provide software updates. Most Macs from 20 years ago still work fine today. If some component died, it was usually the hard drive or the power supply in my experience.

Apple Silicon will definitely be the architecture best supported by Apple for the foreseeable future. There is no downside to it in terms of reliability or longevity.


To make a somewhat exaggerated comparison: horses still work fine today, but it really doesn't make sense for people to be using them anymore outside of niche use cases or as a hobby.

Sure, macs from 20 years ago still work "fine", but the reason they aren't widely used anymore is not the lack of software updates.


For context my M1 Air feels as good as the day I bought it and it's just about to hit 3 years old.

I kept my previous 2014 MBP for 6 years and I can't see this one being any different.


Same. The only issue I have w/ my original M1 Air is that I was impatient and got the 8GB model because I could get it into my hands a month faster. But that's not something that's changed over time.

Every new release I drool over getting the new shiny but the reality is that for my personal laptop it's AOK for almost everything I do. Sometimes a bit slow and frustrating due to the RAM but again, that's on me.


I'm running a Windows (ARM) virtual machine on an M1 Air, and it's constantly thermally throttled; it was sometimes hitting 95 degrees Celsius. I even bought one of those cooling pads with three fans to make up for the lack of cooling in the laptop itself, so it is now down to 65 degrees Celsius. That makes my life a little more bearable, but I'm switching to an M3 MacBook Pro (with proper cooling) ASAP.


Add a thermal pad to connect the heatsink to the back cover and the thermal throttling will go away. Tons of tutorials on Youtube and it takes 5 minutes. I did it on my M1 Air and it doesn't go above 85-90C now.


For me at least the longevity is already far better than the Intel MacBooks Pro. I bought the 2017 and 2019 models and for my use those felt obsolete from day 1, with 2 hour battery life and permanent full-speed fans and CPU throttling.

I have a M1 Pro since release day and I don't see myself wanting to replace this until probably the M5 is out.


My experience with buying lower end intel based Macbook Airs is that they generally last 10 years before we replace them. Usually this comes down to asking if it is worth replacing the battery or better to invest in a new machine. I'm assuming the M processors will be similar.


I've got a 16-inch Pro with 32GB of RAM and I can see myself using this for development work for at least a decade.

I tend to run it with a large external 4k 120hz screen and don’t see the need to upgrade for better IO for a long time and the thing is very, very fast.


It’s ridiculous that the base and Pro only support 1 and 2 external displays, respectively.

Want 3 monitors? Have to pay for the Max.

https://www.macrumors.com/2023/11/02/m3-chip-still-supports-...


From what I know, this isn't some artificial limitation that Apple imposes. Adding support has a cost that most customers don't need to pay.

Regardless, most people aren't going to be plugging in three monitors into their laptops, so most people aren't going to care about this.


Intel supports 4 displays on a bottom-end i5-1235U.

Even my crappy little Lenovo M600 with an N-series Celeron supports 3 displays; two 4K ones worked fine. I didn't have a third to test it with.

It's either an architectural fuck up or intentional segmentation.


> It's either an architectural fuck up or intentional segmentation.

Do you think there is any chance that Apple would have had to make some engineering compromise (cost, performance, efficiency) for a feature that very few people would use?


No


> Adding support has a cost that most customers don't need to pay.

It also adds heat, battery consumption, and space. For a chip that will also be used in iPads. Still, I wish the base M3 chip supported at least two external displays. Or at least let you use two by disabling the internal display (eg. clamshell mode or whatever).

> most people aren't going to be plugging in three monitors into their laptops, so most people aren't going to care about this.

Last summer (2022), or thereabouts, I got a survey request from Apple where they asked how much I cared about external monitor support, followed by questions about the number of external displays that was important to me. So, Apple's looking at it.


It's entirely Apple's choice to use the M chips in iPads. And if that means they need to compromise on laptop performance or features, they should change course.


Do you want Apple to drop some of their existing products, or do you want them to make more chip designs each generation with fewer economies of scale for each?


Apple still uses iPhone chips in the iPad and iPad Mini, and at this point I'd say they're more than fast enough. The iPad Air and Pro with M chips seem to me to be unnecessarily powerful for a device that runs iPadOS and therefore can only run apps from the App Store, etc. IMO, people don't buy iPad Airs and Pros over the standard iPad or Mini because they need the performance of the M chip. It's the rest of what's in the package.


If I ever have to try to plug more than two monitors into a laptop at that point I'm going to start asking myself why I'm not just buying a tower instead. The whole point of laptops is portability.


> The whole point of laptops is portability.

Many people want portability but also want to use external displays at home or at the office.


Really, you're complaining you can't plug a 4th display into your computer? The fraction of the population who does that can afford the Max, together with the 3 extra monitors.

I’m not trying to justify Apple here, but this usage seems quite niche and it’s understandable to me to need a non-base setup. It doesn’t sound ridiculous at all. As a matter of fact, the vast majority of notebook users never even plug a single monitor in.


I might argue that what you are describing is exactly why it's a bit absurd. It would be surprising to me as a user to discover such a limitation. If I didn't read about it here, I would probably find out after the return window and then be pissed that they took a stand on this particular artificial market segmentation.


The vast majority of people wouldn't return something because it lacks a feature they weren't even aware it didn't have, because they don't need it. That's just weird.


You're forgetting another group of people: those who assume it's there because it's basic functionality.


Perhaps they should also only allow users to plug in an external mouse if you buy the pro or Max chip.


I’m using the same Dell 1920x1200 monitors I’ve had for 10 years. And have been using them across many desktops and laptops. Including my current 2018 MBP that happily supports them (via 3 native USB-C ports, with one left over for a hub).

I say it’s ridiculous because the computers I’ve been using over the past 10 years — most of them much less expensive than a Max — have had no issues with them. Not because it’s a common need.


Apple may be setting up for this to be a killer feature with Apple Vision headsets.

They are saying a single 4K feed, but if you could break that into segments it would change the game. That would be 4 x 1080p screens + the laptop.

If you could slice the rectangles any way you wanted and rotate them individually, then you'd have the perfect environment. Which you would be able to change instantly, presumably, with Mission Control.

Personally I would pay lots of dollars for that.


Speaking of...

Can I sidecar to two or more iPads with this yet?

--

"Use an iPad as a second display for a Mac"

https://support.apple.com/en-us/HT210380


Why on earth would you want this when you can get much nicer screens for fractions of the price?

One I can understand as I used to travel with my iPad Pro and used it as a second screen. But two? I’ve switched to a much lighter and much nicer UHD HDR usb-c monitor that is amazing for the price and the weight difference.

You also don’t get the random disconnections when sidecar weirds out for no reason.


It's called Command-Right Arrow. Try it. I think someone did a study and found people cannot use more than two. For everything else you're better off using desktop management.


As a user of many Apple Silicon generations, in different and mostly top configurations, I'm obviously a big fan by now. One thing that I have "observed" though (with no data to back it up): it seems to me that under heavy load / resource over-subscription, the Intel systems from before recovered more gracefully. I wonder if it's just me, the particular OS version I had been using at the time, or some other thing I'm missing.

Again, no idea if this is actually the case, or whether it holds for workflows other than mine. I would be curious to know if anyone else has made the same observation, potentially with actual "high load performance" data to back it up.


Right now I have an Intel laptop for work and an Apple Silicon laptop for personal use. The workloads I run on these machines are different, so comparisons are a bit hard to do, but I've seen the Intel laptop exhibit poor behavior under load that I've never seen from the Apple Silicon machine.


By “load” you could easily mean “running Microsoft Teams”.


These devices get a fail for repairability. So yes, there is more to performance than cores: the entire Mac lineup is mostly unrepairable e-waste in the making, with excessively wasteful design choices behind it.

The company has about as much ethics as a wet tea towel: a facade of good intentions masking greed. I couldn't care less if their laptops get an hour of extra battery or are a few seconds faster. The company is scum from an ethics and waste standpoint. It surprises me that so many can turn a blind eye to it for 3 months of chart-topping numbers.


Are efficiency cores just lower-clocked, or how could 6 efficiency and 6 performance cores be worse? Is it possible it could even be better, if the cores are more efficient while also being fast and they're just called efficiency cores? After all, isn't that an arbitrary semantic term?


They take up a smaller area of the chip.

Which can be read as an indication that those cores have smaller caches and less hardware for prefetching, branch prediction, and the like.

An efficiency core does less per cycle but also uses less power per cycle. A performance core might retire several instructions in a cycle where an efficiency core manages far fewer, and it typically runs at a lower clock as well.
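
Worth adding: on macOS you generally don't pick efficiency or performance cores yourself. You tag work with a quality-of-service class and the scheduler decides placement, steering low-QoS work toward the efficiency cores when it can. A minimal Swift sketch, just to illustrate the shape of that hinting:

  import Foundation
  import Dispatch

  // Illustrative sketch: apps hint the scheduler with QoS classes rather than
  // pinning threads to efficiency or performance cores directly.

  // Utility/background QoS: a candidate for the efficiency cores.
  DispatchQueue.global(qos: .utility).async {
      let sum = (1...1_000_000).reduce(0, &+)   // stand-in for indexing, syncing, etc.
      print("low-priority work done: \(sum)")
  }

  // User-interactive QoS: prioritized onto performance cores under contention.
  DispatchQueue.global(qos: .userInteractive).async {
      print("latency-sensitive work done")
  }

  // Keep this command-line sketch alive long enough for both blocks to run.
  Thread.sleep(forTimeInterval: 1)

(The same idea applies with Swift concurrency's Task(priority:); either way the final placement call belongs to the scheduler, not the app.)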


I asked Claude.ai to summarise this discussion for me, and below is its expanded summary of the discussions related to Apple Silicon and the M1/M2/M3 chip generations.

ON M3 PERFORMANCE GAINS:

The jump from M2 to M3 seems smaller than M1 to M2, especially for the regular M3 chip. Some speculate this shows limits in further scaling down process nodes.

The M3 Pro reduced performance cores from 8 to 6 compared to M2 Pro, suggesting a focus on efficiency, yields and costs rather than maximum performance.

But the M3 Max increased performance cores from 8 to 12 versus M2 Max. It matches or exceeds the M2 Ultra, indicating significant gains at the high end.

The M3 line overall shows Apple is diversifying core configurations across the range. This makes comparing performance gains more complex than simple core counts.

There is debate around how much benchmarks like Geekbench reflect real-world usage. Things like frequency management and cluster configurations affect multicore performance.

The 20-30% gains over M2 may seem small yearly, but comparing to M1 shows solid doubling of performance over 2 years.

ON EFFICIENCY CORES:

Some argue the M3 wasted the process shrink on just frequency gains rather than wider cores or gains in other areas.

But others note efficiency is still improved. The regular M3 efficiency cores clock higher while using the same power as M1's.

Efficiency cores handle lightweight threads well, not just background tasks. Their performance can be significant when frequencies are maximized.

ON THERMAL MANAGEMENT AND BATTERY LIFE:

The M3 line generally favors efficiency and thermal management over peak performance. This suits laptop use cases well.

But it's unclear if a future M4 will tune for performance over efficiency in desktop models, or just add more cores.

Either way, the 22 hour battery life on M3 Air is seen as a major achievement showing efficiency strides.


For context, a desktop Intel 13900 or 14900 combined with at least an NVIDIA 4060 beats it; a System76 Thelio Mira, for example.

Both in single-core and multi-core performance, and in GPU memory bandwidth.


They are basically equal while the Intel CPU uses significantly more power.

https://browser.geekbench.com/processors/intel-core-i9-13900... https://browser.geekbench.com/v6/cpu/3364975


This is my favorite genre of Apple Silicon posts. Someone who can’t tell the difference between a laptop and a 30lb desktop.


Yes, but only just, and it used 5-10x more power.


I don't see Apple Silicon in Macs having been the right decision a decade from now, because competing toe to toe with the entire semiconductor industry on architecture has never worked before, except for the iPhone, where they had a huge first-mover advantage.


I think the argument that this might not be great in the long run is worth considering, but in the short run I think that their results over the last few years are nothing short of astonishing. Specifically their laptops, especially when not plugged into a power source, have both stunning performance and battery life at the same time.

Yes, there are competitor chips from AMD and the like that are faster when plugged in, but at a power budget that means they are severely throttled when unplugged. This is what Apple cares about, and at this point they are unmatched. Apple likes to have fast desktop machines (and they do), but they don't care about that nearly as much as they care about laptops-in-laptop-mode. We are a couple of years into this; Intel is making noises about trying to match it but has not yet, and Apple seems to be pulling further ahead in specifically this area.

And they have a huge overlap in this focus with the development of the related processors for their iPhones/iPads/AppleTV products, and the upcoming headset. A lot of the development work gets reused in that other space, making Apple's ROI even better.

So long as there is a big enough market that cares about that (and from the look of sales, there is), Apple's strategy seems to be a winning one.


They are making themselves more different from the rest of the market instead of "the same but better", which is what brought people back to the Mac in the first place. It used to be that as a normal person you bought the latest Mac, it came with the latest i7, and you didn't even think about it; none of your friends on PC could say anything bad about it because it was the same chip they had. Now it's "wow, the M chip is so different and fast, look at my new Mac", and then your friends who have PCs say "actually it's not faster, it's just more power efficient", and now the seed of wondering whether Apple is really better has been planted, whether or not it's totally true. I've already started seeing this happen on forums. The average person buying a Mac doesn't want to have to think about the nuances of specs, and once they start digging into specs they might find themselves buying a PC instead. But we'll see; maybe the past won't repeat itself this time.


>It used to be as a normal person you bought the latest Mac and it came with the latest i7 and you didn't even think about it and none of your friends on PC could say anything bad about it because it was the same chip they had.

Only a tiny, tiny fraction of people who buy computers have conversations like this with their friends. I am a software engineer and could not care less which chips my friends have in their laptops.


Then why is their marketing right now all about the chip, not the product?


Why are you arguing about PC performance with your friends? Don't you have something better to do?

The biggest advantage of ARM over x86 is security; efficiency is second. Performance is nice to have but is kind of coincidental (depending on workload), and I'd expect everyone else to catch up.


Are you lost? This is HN, I talk with people about computers a lot.


Yeah but you probably shouldn't. In my experience enthusiasts/power users aren't actually any more accurate on this subject than, say, your grandma. They just know bigger words they can use in conversations.

So there's got to be something either more productive or more grass-touching to do.


I always understood the move to Apple Silicon was also about supply chain concerns, not just technical ones. I remember reading that Apple would often express frustration at having to align their products' schedules with Intel's, and that there was a lack of support from Intel in their collaboration. I can see this as a valid bottleneck for Apple, and this isn't their first rodeo in processor transitions. Of course, now their bottleneck moves down the supply chain to TSMC, but TSMC seems happy to provide whatever Apple asks for, as seen with their large 3nm orders.


So why work only with Intel and not also with AMD? I never understood that. They chose vendor lock-in, so of course the vendor is going to sit on it, same as Motorola did and later IBM. Now they still have vendor lock-in, except the vendor is internal. From a distance it looks super dumb, betting that you can do better with an internal project than by playing off the established duopoly. I'm sure I'm too out of the know to understand.


One thing to remember is that hardware designs have a much longer lead time than software and significant support commitments since they need to maintain a business relationship for a decade. When AMD started gaining on speed versus Intel, especially for the thermally-constrained systems Apple sells, Apple already had an in-house CPU design team which was doing good work and they were quite familiar with the benefits in terms of roadmap coordination, pricing, and support.


Apple used the AMD chips that didn't absolutely suck: the GPUs (and they still got plenty of criticism for that). AMD's CPUs were not at all suitable for Apple's most important products until the past few years. The first Zen mobile processors were only a year before Intel's 10nm Ice Lake.


I certainly thought this back in 2012 when Apple started designing their own mobile CPU cores. How could Apple compete with ARM's designs which would have much greater economies of scale or with Qualcomm who would be shipping way more units than Apple? More than a decade later and it's proven to be a durable competitive advantage.

I would point out that Apple had no first mover advantage with the iPhone. Apple was using standard ARM cores for the first 5 years. It's not like ARM didn't have a multi-decade head start on Apple designing ARM cores.

There are certainly risks, but so far it looks like Apple is making the right decision. Intel and AMD are mostly still targeting a higher power/thermal level than Apple is looking for. Apple likes being able to run things without relying on an external GPU. Having the ability to design things themselves means they're able to make what is most important to their users.

With Intel, Apple was always at the whims of what Intel wanted to produce. It took forever to get the 1038NG7 28W part from Intel which meant that Apple couldn't offer a 13" laptop with the speed that a lot of users wanted. It also meant that they were beholden to crappy Intel graphics - even as they paid up for the Iris graphics.

With their own CPUs, their destiny is in their hands. It's worked extremely well for the iPhone and it's working well for their Macs too.

One of the things to remember: Apple already needs to design these cores for their iPhones and iPads. Once you're doing that, it makes a certain amount of sense to re-use those core designs for the Macs. It isn't zero effort, but they aren't starting from scratch.

I'd also note that Apple tends to do less than a lot of companies and the same applies to their processors. Intel, AMD, and Qualcomm are trying to make parts for a huge range of devices. Apple has a much narrower field of products. While Qualcomm is trying to create processors for $100 cheap phones, Apple just goes with a single CPU for its mobile devices (sometimes using last year's CPU for some devices). With the Mac, Apple has a little more variety, but it's still pretty constrained. An M Ultra is basically just two M Maxes put together. Most of the time, the design is basically the same and Apple has just changed the core count or something like that.

Will Apple compete with Nvidia's GPUs? Probably not, but an M2 Ultra is competing with an Nvidia RTX 3070 desktop GPU. Sure, that's not the current generation or Nvidia's highest spec, but it's still good - and OpenCL isn't great on Mac so it isn't even a fair test. The new M3s have much upgraded graphics and it'll be interesting to see how well they do.

You can say it has never worked before, but I think that might ignore a few things. First, one of the reasons that Intel came to dominate things is because everyone who tried to vertically integrate their processors tried to charge too much. Regardless of what you think of Apple's pricing, they aren't charging more for the privilege of their M processors than they were for their Intel ones. I think it also ignores the fact that Apple has so much money. Finally, in terms of cores shipped in the segments that Apple is shipping them, they're huge. ARM isn't shipping many X-series cores (their top performance cores). They're mostly shipping lower spec'd cores. There aren't a ton of flagship Android phones being sold. Intel is mostly shipping lower-end laptop CPUs destined for machines in the $400-800 range, server CPUs, etc. Apple has a lot of scale for the cores it is designing.

There is risk, but Apple took that risk a decade ago with their iPhone CPUs and they've had a great advantage there. While Intel and others are looking to revitalize their CPU game, it seems like they're doing it at a higher thermal/power level than Apple is looking for - and trying to benchmark themselves against Apple parts at a fraction of the power. Apple is getting to design parts that do what they need and they've proven that they can beat the industry for more than a decade. I'd say it was the right decision to move to Apple Silicon for Macs.


It's kind of funny seeing the Intel iGPU being bashed even though the basic GPU in the low-end M chips is barely better. In fact it can drive fewer displays, and since it is Metal-only, many games will run worse. But I guess Intel iGPUs are bad, yes; the point is they were never supposed to be anything other than display adapters... Intel did eventually make proper GPUs, and even though there were driver issues, they are pretty decent for the price, everything considered.

As for the rest, Apple enjoyed a lead in phone chips for a while, but a lot of it is related to the cash at hand and the ability to sell most of your products at very high margins. Android vendors didn't invest as much money, because there are significantly fewer people willing to put that much cash into a phone if it doesn't come with the "social status" thing Apple has. And now, even though Qualcomm's chips are somewhat worse than Apple's, it doesn't matter for the vast majority of people. In the end Apple chased premium margins instead of volume, and I think it's going to bite them in the ass in the long term, because in technology volume and network effects are what matter. There are many "superior" technologies that disappeared or never really took off just because of that, so I doubt Apple will fare much better. I think it's all downhill from here, and by the time they notice they actually have to compete on price they will have lost the war, since there will be fewer people willing to go for the non-standard option. But then we will hear about how their competitors are villainous monopolists oppressing them, even though they had the money and means to pursue a more broadly competitive approach.

Apple fans seem to believe they are winning with their efficient chips, but as far as I am concerned they are losing everywhere it matters: high-performance desktop, device convergence, cloud, AI, gaming, 3D-focused software... They still have three OSes and laughed at Microsoft's unification strategy, but now a Surface can run both desktop software and more mobile-optimized software gracefully, while you still have to run a half-assed "big" version of iOS on a $1k+ iPad Pro.

Time will tell...


Apple needs to stop explaining stats in vague weeb speak and just give us what the actual compile times in GCC are.

>Then again, that takes courage.


This guy is becoming too much of an apple apologist for me. I've seen him defending pretty bad decisions like John Gruber (who always was) so I've stopped following him like I have Gruber.

Also, because I've disagreed so much with Apple's decisions in the past years I've abandoned their ecosystem altogether so I'm also much less invested in the topic. Though I still use a Mac for work as a "least bad" option.

Too bad because he did have good technical insights.


PS: I know this is kind of a hot take, but everything Apple has done over the past 10 years has rubbed me the wrong way, and I'm less and less aligned with Apple fans. I know this doesn't apply to everyone.

In 2004 I moved to Mac because it was a powerful and configurable Unix OS with the benefit of a consistent UI (nothing on Linux was there then, it was a mess) and major commercial apps like Office and Photoshop.

The latter are still true but the platform is so locked-in that most of its features are useless to me as a multi-OS person. And the hardware is quite locked down as well which simply doesn't work for me. A lot of the reasons for this are not user-centric but commercial.


I agree. There are more of us by the day. Don't worry, Apple is going to feel it soon enough. In 10 years, Apple's strategy will look very stupid, because you will be able to get what is today a $2.5K powerful gaming computer for about $1K. Apple computers will be like Porsches: very nice and all, but unless you have free money, really not something most people would consider buying. In other words, a niche; very pretty, but still a tiny market.



