tagrun's comments

Ferromagnetism has nothing to do with currents; it is due to the aligned spins of electrons in partially filled shells. Below a certain temperature (the Curie temperature of the material), the exchange interaction between electrons (which, in the ferromagnetic case, penalizes any misalignment) leads to this alignment.
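
For concreteness, the standard way the exchange interaction is written (a textbook form, quoted here only for illustration) is the Heisenberg term

    H = -J \sum_{\langle i,j \rangle} \mathbf{S}_i \cdot \mathbf{S}_j ,

and in this sign convention J > 0 (ferromagnetic exchange) means every misaligned pair of spins costs energy, so below the Curie temperature the spins order.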

Spin is a type of intrinsic angular momentum that is not associated with any spatial motion.

The Feynman lecture you linked to is an explanation of why currents fail to explain ferromagnetism. You need to read the next chapter, but being a lecture for undergrads, it doesn't go deep into the subject anyway. If you're really interested, any modern book on magnetism would be much more helpful.


You said,

> Ferromagnetism has nothing to do with currents

This is why I said ferromagnetism is circulating current in the sense of "to a first approximation" and "heuristically". Wiktionary defines "heuristic" to be:

> a practical method [...] not following or derived from any theory, or based on an advisedly oversimplified one.

I think that if you ask Feynman, he would probably agree or sympathize with the naive idea of "atomic currents" as a heuristic argument in the introduction of this topic... which is nothing new anyway; it has been a heuristic argument in electromagnetism for a long time, since well before QM.

In Feynman's own words,

> These days, however, we know that the magnetization of materials comes from circulating currents within the atoms—either from the spinning electrons or from the motion of the electrons in the atom. It is therefore nicer from a physical point of view to describe things realistically in terms of the atomic currents [...] sometimes called “Ampèrian” currents, because Ampère first suggested that the magnetism of matter came from circulating atomic currents.

You said,

> Spin is a type of intrinsic angular momentum that is not associated with any spatial motion.

Yet the concept of spin in quantum mechanics was originally developed using macroscopic rotations as an analogy, although today we know that spin is an intrinsic property of subatomic particles (thus the joke, "Imagine a ball that is spinning, except it is not a ball and it is not spinning"). This is the same sense in which Ampère's concept of "atomic currents" was developed using circulating electric current as an analogy.

> The Feynman lecture you linked to is an explanation of why currents fail to explain ferromagnetism. You need to read the next chapter.

Of course, "The actual microscopic current density in magnetized matter is, of course, very complicated." This is surely explained in the next chapter. I could've mentioned "atomic currents" without citing any link, but I included it to allow anyone who's interested to read the whole thing in context.


To the parent and its sibling comments: There is no atomic or subatomic current that can explain ferromagnetism in any approximation.

You read some Wikipedia pages and the Feynman Lectures on Physics. I'm a physicist who has done well over a decade of research in magnetic materials.

In the history of understanding ferromagnetism, many incorrect theories have been proposed. By connecting ferromagnetism to circulating currents (i.e., what actually underlies paramagnetism and diamagnetism), you just repeated the same mistake.

You're trying to bend the words to avoid being wrong. Physics is not philosophy or a debate club. There is no approximation in physics in which the electron is a ball with some radius, or in which its spin is due to a circulating current. Any such explanation attempt fails spectacularly if you actually try to do the math (which gives an electron surface that is moving faster than the speed of light, as Uhlenbeck and Goudsmit, who proposed this incorrect idea, quickly found out), so it doesn't even work as an approximation of any kind.
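
(For the record, a rough order-of-magnitude version of that calculation, with my illustrative numbers rather than Uhlenbeck and Goudsmit's: take the classical electron radius r_e ≈ 2.8 × 10^-15 m and crudely demand m_e v r_e ≈ ħ/2 for the spin angular momentum. Then

    v \approx \frac{\hbar}{2 m_e r_e} \approx \frac{1.05\times10^{-34}}{2 \times (9.1\times10^{-31}) \times (2.8\times10^{-15})}\ \mathrm{m/s} \approx 2\times10^{10}\ \mathrm{m/s} \approx 70\,c ,

i.e. the "surface" would have to move at roughly seventy times the speed of light.)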

> Yet the concept of spin in quantum mechanics was originally developed using macroscopic rotations as an analogy,

Who developed this theory in quantum mechanics, where and when? Pauli, who first introduced it into quantum mechanics and is the namesake of the spin-1/2 matrices, insisted that it is purely quantum mechanical with no classical analogue. And regardless of who said what over 100 years ago, today it is well understood that spin has nothing to do with electric charges that move or rotate in space.

More importantly, the reason ferromagnetism develops in the first place is the exchange interaction (as I wrote above) between magnetic moments, which arises from the Pauli exclusion principle and also has nothing to do with the movement of charges.

Furthermore, such magnetic moments (called magnetic impurities in that context) ruin the superconducting order by breaking the time-reversal symmetry, so trying to make a connection to ferromagnetism in the context of superconductivity is even worse.


> You read some Wikipedia pages and the Feynman Lectures on Physics. I'm a physicist who has done well over a decade of research in magnetic materials.

In the same way, a geodesist navigates using the reference ellipsoid defined by WGS-84, while a city commuter uses Cartesian coordinates on a flat map. The commuter's navigational tool will never work for geophysics research, and it doesn't need to.

> To the parent and its sibling comments: There is no atomic or subatomic current that can explain ferromagnetism in any approximation. [...] Any such explanation attempt fails spectacularly if you actually try to do the math (which gives an electron surface that is moving faster than the speed of light, as Uhlenbeck and Goudsmit, who proposed this incorrect idea, quickly found out), so it doesn't even work as an approximation of any kind.

I consider "circulating currents create ferromagnetism" to be as true as "an atom's structure is similar to a solar system." Both concepts break down when it's examined in details, so its use by research physicists is obviously unacceptable, but I consider it's nevertheless as an useful mental image in introductory discussions among non-physicists.

Would you consider Rutherford's original atom model to be a first approximation? Can it be considered a very oversimplified but useful heuristic, at least when people who don't know anything about atoms are first introduced to the concept? Alternatively, would you consider Rutherford's atom to be "an explanation attempt that fails spectacularly if you actually try to do the math (which gives an electron that collapses into the nucleus in picoseconds, as Rutherford's colleagues quickly found out)"?

If you believe the latter, everyone can stop this conversation right now. Because it means the entire disagreement comes down to what kinds of "mental images" are acceptable, rather than anything factual, like "whether a full quantum treatment of ferromagnetism is necessary to completely explain ferromagnetism (of course it is)." The rest of us who don't solve research problems believe a toy model is still interesting, but don't deny (nor mention) better models. You, as a professional physicist, believe many "what if?" mental models from history are just not legitimate physics, and should not be mentioned under any circumstances to avoid poisoning the minds of youths - an approach known as Whig history, in which scientific progress marches from one victory to another, and all losers be damned - a perfectly valid approach for teaching physics to students who only care about the pure science, instead of "who said what."

As a side note, I know some engineers who really hate the idea that electric circuits work due to an electron flow. The most extreme one I've seen wanted to ban this concept in introductory textbooks, calling it a big lie (an explanation attempt that fails spectacularly if you actually try to do the math, which gives an electron speed 30 billion times slower than the speed of light in free space). As we all know, the steady-state electron flow is only a result of the transient propagation and reflection of electromagnetic waves in free space or dielectric materials. Thus, they believe the wave model should be the only interpretation in a science textbook, since "they're high-school teachers, I'm a design engineer who works with high-speed digital systems with 20 years of experience, and I know for sure that high-speed circuits and computers can't even be made functional if you ignore fields and transmission line effects." Meanwhile, I believe the electron flow model still works as an introductory mental image (although the field view perhaps needs to be mentioned earlier).

> Who developed this theory in quantum mechanics, where and when? Pauli, who first introduced it into quantum mechanics and is the namesake of the spin-1/2 matrices, insisted that it is purely quantum mechanical with no classical analogue.

The earlier "electron as a rotating ball" idea was considered by Ralph Kronig and Uhlenbeck-Goudsmit in 1925. Pauli personally never accepted it due to its unphysical flaws. Only in 1927 did Pauli publish a rigorous QM treatment. Thus, "electron spin using classical rotation as analogue" was still an intermediate step before establishing this concept in QM. It was a footnote in history since Pauli was a great physicist and already considered the problem himself earlier and found the solution before everyone else. Otherwise this intermediate step may last longer than 2 years.

> Furthermore, such magnetic moments (called magnetic impurities in that context) ruin the superconducting order by breaking the time-reversal symmetry, so trying to make a connection to ferromagnetism in the context of superconductivity is even worse.

This, in comparison, is a more interesting criticism.


What you say is correct only when you adopt certain specific narrow definitions of the words, which you have not explained.

In its original sense, an electric current is any kind of movement of electric charge. In this wide sense, it also applies to the source of ferromagnetism.

Its meaning can be restricted to refer to the translational movement of electrically charged particles. With this narrower sense, there is still no need to use quantum mechanics to explain ferromagnetism. Even in classical electromagnetism, with the narrowly defined current, the sources of magnetic fields are decomposed into distributions of electric current densities and of magnetic moment densities, where the latter are the source of ferromagnetism. If necessary, it is also possible to use distributions of higher-order moment densities, and the series of moments obtained when "electric current" is used in the narrow sense (as a first-order moment) corresponds to the "electric current" used in its original, wide sense.
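
(For reference, this is just the standard magnetostatic bookkeeping: a magnetization density M acts as a bound current density,

    \nabla \times \mathbf{H} = \mathbf{J}_{\mathrm{free}}, \qquad \mathbf{B} = \mu_0 (\mathbf{H} + \mathbf{M}), \qquad \mathbf{J}_{\mathrm{bound}} = \nabla \times \mathbf{M} ,

so the "current density" and "moment density" descriptions are two accountings of the same sources.)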

The isolated sentence "Spin is a type of intrinsic angular momentum that is not associated with any spatial motion" is logically contradictory (because, by definition, angular momentum is a characteristic of moving bodies). It can be correct only when you first specify that by "spatial motion" you mean only a certain kind of spatial motion.

The joke mentioned by another poster "Imagine a ball that is spinning, except it is not a ball and it is not spinning" is just a joke, because there is no doubt that the elementary particles are spinning.

Even when you model the elementary particles in the standard way, as point-like bodies (and it is debatable whether this is a good model), you cannot say that they are not rotating, because this would be the same mistake as saying that a delta distribution has a null value at the origin.

On the contrary, while you cannot say other things about the value of a delta distribution at the origin, what you can say with certainty is that it is not null.

In the same way, while you cannot say anything about characteristics of an electron like radius, mass density, angular velocity, electric current density and so on, you can say with certainty the values of various integral quantities (which integrate the corresponding delta distributions), like mass, electric charge, angular momentum and magnetic moment, so you can say with certainty that any electron is rotating (i.e. it has a non-null angular momentum).


As other commenters have said, whether or not an electron’s magnetic moment is “to do with currents” is a little open to interpretation.

I’ll add that the Dirac equation (governing the electron field) correctly predicts the magnetic moment given the inputs of charge and mass. * I interpret this as indicating that the magnetic moment is a derived phenomenon, just as it would be in the classical picture of a spinning ball of charge; i.e. the quantum picture refines but does not totally discard the classical understanding.

* Well, technically, sympathetic vibrations with all the other standard model fields also make tiny contributions to the magnetic moment.
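
(Concretely, and quoted here only as the standard textbook statement: the Dirac equation fixes the electron's gyromagnetic ratio at g = 2,

    \boldsymbol{\mu} = -\,g\,\frac{e}{2 m_e}\,\mathbf{S}, \qquad g_{\mathrm{Dirac}} = 2, \qquad g_{\mathrm{measured}} \approx 2.00232 ,

with the small excess over 2 coming from exactly the corrections the footnote alludes to.)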


Raw performance per dollar (after adjusting for inflation) has stagnated with the 40 Series. A similar thing happened with the 20 Series.

The SUPER series has been a response to rival products offering better raw performance per dollar that were released afterwards.

Power consumption is a separate issue which may or may not be a concern depending on where you live.


> They have perfectly great existing hardware decoder offloading APIs via the various OS' native APIs for videos

One vendor specific API, not "various OS' native APIs".

Firefox currently supports hardware video decoding with Intel's vendor specific VA-API only on Linux, which is not supported by NVIDIA. (A third-party VA-API to NVDEC translation layer for Linux does exist on GitHub, nvidia-vaapi-driver, but it's not yet as reliable as the officially supported VDPAU or NVDEC, and it is not included in official Linux package repositories.)

Intel has VA-API, AMD has AMF, and NVIDIA has VDPAU, which is being replaced by NVDEC/NVENC.

The idea behind Vulkan Video Extensions is to have a vendor independent and cross-platform video API.


> No, Firefox currently supports hardware video decoding with Intel's vendor specific VA-API only, which is not supported by NVIDIA.

Intel was originally behind VA-API, but I don't think it's fair to say it's a vendor specific API anymore. It's supported by the open source drivers for GPUs from all 3 vendors. It's just that the open source drivers for Nvidia cards are not very practical and the proprietary drivers only support vdpau and nvdec/nvenc.


You can make the same argument for VDPAU. AMD officially supports it, and there is an unofficial translation layer with limited capabilities for some Intel GPUs. Is VDPAU not a vendor specific API anymore then?

Intel, AMD and NVIDIA have their own vendor-specific video APIs, and even when they provide official support for the API of another vendor, it tends to expose a limited subset of the full functionality (like the list of available codecs and encoding features).

You are free to call these vendor specific APIs what they are or call them something else, but the reality has been that there is no single video API officially supported by Intel, AMD and NVIDIA. This changed with Vulkan Video.

But Vulkan Video isn't just about desktop: mobile devices, Raspberry Pi, etc. are expected to get on board with it eventually, just like they did with Vulkan.

> It's supported by the open source drivers for GPUs from all 3 vendors.

Which 3 vendors are you referring to? Intel, AMD, and who?

> It's just that the open source drivers for Nvidia cards are not very practical and the proprietary drivers only support vdpau and nvdec/nvenc

Why are you bringing up open source drivers, and what is not practical? Both the official open source drivers (open-gpu-kernel-module) and the unofficial open source drivers (nouveau, through binary firmware) support VDPAU. However, NVIDIA's drivers (open source or binary) do not support VA-API.


Nouveau supports va-api on Nvidia. Nouveau is not supported by Nvidia of course.


What of it? So does nvidia-vaapi-driver. They're third-party projects; that's very different from "supported by the vendor", and it doesn't change the fact that NVIDIA as a vendor offers no support for VA-API.

By the way, nouveau's support is currently limited and not useful: https://nouveau.freedesktop.org/VideoAcceleration.html (see the "Video engine support status" table: only old GPUs, and no H.265 or AV1 support).


> What of it?

It was an answer to this question specifically

> Which 3 vendors are you referring to? Intel, AMD, and who?

I either missed some of the other text in your post or it was added after I started to reply.

> Both official open source drivers (open-gpu-kernel-module)

This is not remotely close to being a complete graphics driver. Most of a GPU driver on Linux is in userspace and there is no official open source user space component.

> Why are you bringing up open source drivers, and what is not practical?

nouveau has never been practical for serious use due to poor performance and mediocre hardware support (as you noted). open-gpu-kernel-module is only practical when paired with a proprietary userspace driver.

Anyway, my original point in all this is that describing VA-API as an Intel vendor specific API is unfair given it has been well supported on AMD GPUs for a long time now and on nouveau it's supported as well as VDPAU (i.e. not very well as you note). I did not intend to imply that it was universal. I didn't even intend to imply that VDPAU is a vendor specific API (though as a decode-only API it's not really a complete replacement).

Intel tried to make va-api the standard for hardware encode and decode on Linux, and Nvidia tried to make VDPAU the standard for hardware decode on Linux. Neither was entirely successful. By contrast, NVENC/NVDEC, AMF and the Intel Media SDK (and whatever replaced it) never had such ambitions.


Right, this is yet another instance of Nvidia not wanting to play nice with the other kids.


> One vendor specific API, not "various OS' native APIs".

Incorrect. Firefox uses Windows Media Foundation, which is cross-vendor, on Windows. It uses MediaCodec on Android which is again cross-vendor. Presumably it uses whatever iOS' equivalent is as well.

It only uses VA-API on a single OS, Linux, and that's probably more a reflection on the media qualities (or lack thereof) of Linux as a whole. Maybe Vulkan video extensions will be the savior on Linux. Or maybe they won't, because it won't be anyone's focus of investment since it's largely a Linux-only problem in the first place.


What is "incorrect"? The full sentence that you conveniently chose to cut in the middle before quoting (apparently to fit into some pessimistic forecast about the significance of Linux desktop) reads

> Firefox currently supports hardware video decoding with Intel's vendor specific VA-API only on Linux, which is not supported by NVIDIA.

(emphasis added)

You further wrote:

> Firefox uses Windows Media Foundation, which is cross-vendor, on Windows. It uses MediaCodec on Android which is again cross-vendor.

And? None of those APIs are cross-platform. Vulkan Video will eventually allow developers (including Firefox developers) to write a single code path for video to cover a wide range of platforms and vendors (likely with the exception of walled gardens like Apple-land, although someone might find a way to support it via a wrapper like MoltenVK for Vulkan).


> The full sentence that you conveniently chose to cut in the middle before quoting (apparently to fit into some pessimistic forecast about the significance of Linux desktop) reads

What are you talking about? They didn't quote that sentence at all, and didn't cut in the middle of the sentence they quoted.

> And? None of those APIs are cross-platform.

Your original objection, the thing that got quoted, was about whether things are cross-vendor. That question is completely unrelated to whether things are cross-platform.


> What are you talking about? They didn't quote that sentence at all, and didn't cut in the middle of the sentence they quoted.

Obviously, I meant to say statement, not sentence, but I can't edit it anymore.


My original statement above about what the point of Vulkan Video is

> The idea behind Vulkan Video Extensions is to have a vendor independent and cross-platform video API.

(emphasis added)


You did say that, but it's not the part of your post they were responding to.


> You did say that, but it's not the part of your post they were responding to.

So if someone criticizes a portion of your statement which is already countered by your original full statement, you're not allowed to point back to your full statement? What kind of logic is that?

My original post says the point of Vulkan Video is that it will be cross-platform and cross-vendor, and gives one example of the cross-vendor side of things on Linux.

Someone criticizes me by essentially saying "you are incorrect, that's only on Linux. Windows, Android and iOS have their own video APIs...". This "correction" is incorrect because I already said on Linux, and it goes on to actually reinforce the post it is responding to by highlighting the cross-platform side, which is also in the post he is responding to.

So, if you look at the full conversation, the criticism is self-contradictory. This is what I'm pointing out, but you are implying I'm not allowed to do that.

I disagree. When you fragment a statement in a way that changes its meaning and make a straw man out of it, people are justified in responding to it.


> So if someone criticizes a portion of your statement which is already countered by your original full statement, you're not allowed to point back to your full statement? What kind of logic is that?

The other stuff in your comment did not "counter" what they said. You made statements about cross-vendor and cross-platform. They chose to only respond to one of those statements. That's not incorrect.

> This "correction" is incorrect because I already said on Linux

The first part of your comment specifically said "not "various OS' native APIs"". That goes beyond Linux. The later part of your comment was about Linux in particular, but your introduction was an overall statement that wasn't true.

> When you fragment a statement in a way that changes its meaning

They didn't. You misspoke and they didn't know what you actually meant.

And from your other post:

> Obviously, I meant to say statement, not sentence, but I can't edit it anymore.

That was not obvious. They quoted an entire paragraph, and the subsequent paragraph does not change its meaning the way you're claiming it does.


I'm always annoyed how any Linux media player or encoder needs to bring its own entire media operating system, down to each individual nut and bolt.

On Windows there's Windows Media Foundation and DirectShow that centrally manage everything and also support the "individual nut and bolt" approach. Android has its own central thing (MediaCodec?) that must be used. MacOS and iOS presumably have their own central manager (Quicktime?) too.

But Linux? It doesn't serve as an operating system for media. It's tremendously inconvenient as an admin/user rather than an evangelist.


You don't need to implement every nut and bolt in the application. Lots of useful things can do the heavy lifting (PipeWire, ffmpeg, libplacebo, Mesa and so on). Linux just isn't into calling it all by some uniform "DirectFoo" naming scheme, but the tools are there.

The comparison is also invalid. Linux as a whole (not the kernel, but the OS experience) isn't controlled by some Big Brother who single-mindedly decides what gets done and how. So this kind of composite result is somewhat expected.


Hence why it will never be embraced by desktop application developers, and Electron it is.


Yeah, keep complaining about everything not being proprietary enough, while everyone who needs it just makes it work (OBS, mpv, etc.).


Those 2% will appreciate their efforts.


Those who use it appreciate their efforts. You aren't using it, so why are you even complaining, especially with complete nonsense comments? Anti-Linux shilling should be getting old.


Unfortunately I still have to, from time to time.

Luckily, Android, ChromeOS and WebOS as proper Linux distributions have replaced most of it.


Android hasn't displaced Linux on phones; there wasn't any.

ChromeOS hasn't displaced Linux on school laptops; there wasn't any.

LG WebOS hasn't displaced "Linux"; it competes with Google TV (formerly Android TV).


Oh boy, another Linux advocate on the loose.

So here goes a children's explanation.

> Unfortunately I still have to, from time to time.

I, pjmlp, still have to use GNU/Linux desktop from time to time.

> Luckily, Android, ChromeOS and WebOS as proper Linux distributions have replaced most of it.

Android, ChromeOS and WebOS have replaced most of my needs, pjmlp, for GNU/Linux on the desktop and similar devices.


> Android, ChromeOS and WebOS have replaced most of my needs

Great! I really hope you're happy with that setup! It's your personal computer and by all means, do what works well for you. In the end that's always a personal thing that's different for everyone. Who am I to judge how you use your computer?

But maybe ... stop complaining about the Linux desktop then? If you don't like it? This must be like the 3rd time I've seen these types of single-line dismissive "Linux will never win the desktop" comments from you in the last few days. Just one line, little or no context or explanation, and IMHO also zero value, and an entire discussion derailed.

This is just becoming disruptive. You don't need to say anything, you know. Personally, I rather dislike a number of things, but you don't see me complaining about them with one-liners every chance I get – and when I do say something, at least I make sure it's something of some substance, when I feel it actually contributes. And I sure as hell don't go around calling people "children" for disagreeing.


You can ignore my comments you know.


Stop wasting everyone's time.


Likewise, go do some kernel contributions.


Considering how user hostile most app developers are, I don't miss them.


Install ffmpeg and you have all the codec support you need. How is this a real problem?

Yeah, binary software will have to ship its own copy of ffmpeg... This isn't unique to media codecs though.


And when someone installs some obscure or outdated and vulnerable codec on these systems, it's then automatically exposed to all sorts of applications to exploit. Maybe Windows sandboxes that these days(?) It was definitely a problem in the past.

No perfect solutions here; both "system-wide codecs" and "every application brings their own codecs" have their own up- and downsides.

Besides, with ffmpeg and gstreamer the system-wide codecs paradigm also works on Linux.

This is one of those "it's different but it doesn't really matter much" type of things. Most people "just" install vlc or mpv or whatnot and things will "just work" for them, not really different from Windows. That it's technically slightly different is almost entirely transparent to the user.


Yeah, on the UNIX side, NeXTSTEP, IRIX and Solaris had their own thing as graphical workstation UNIXes, and were great.

Ideally that kind of thing would be part of GNOME or KDE, but then there are those who would rather keep using a twm-like experience, making GNU/Linux really only good for headless use; at least the UNIX/POSIX part is always there.


AMD AMF is not open-source; only the SDK part of it is. The runtime part is closed, bundled with the "pro" drivers.

It is also intended as a multi-platform abstraction.

This makes it a no-go as a platform API. The open drivers for AMD use VA-API.


It says in older adults.

This is a meta-study, touching on that "contrast" already: there is a subsection in the paper dedicated to this, where they claim that

"The major factor in cerebral bleeding however is hypertension, and in an RCT of aspirin based on more than 18,000 hypertensive patients—all of whom were receiving ‘optimal’ antihypertensive treatment—there were no additional cerebral bleeds in patients randomised to aspirin" (Refs 46 and 47).

which seems to be in contradiction with the article corresponding to the news you linked to: https://jamanetwork.com/journals/jamanetworkopen/fullarticle... and which, strangely, doesn't cite or comment on Refs 46 and 47 from the paper of the main thread, possibly because they don't seem to be focusing on older adults.

There is also a subsection on gastrointestinal bleeding.

Intracranial bleeding isn't necessarily something that causes permanent damage or is lethal, by the way.


It depends on the encoder; this website provides easy-to-visualize data sets for various encoders at various settings: https://arewecompressedyet.com/ AV1 encoders tend to have a better VMAF score at a given bits-per-pixel.
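
(For reference, the bits-per-pixel figure used there is just the bitrate normalized by resolution and frame rate,

    \mathrm{bpp} = \frac{\text{bitrate [bits/s]}}{\text{width} \times \text{height} \times \text{fps}} ,

which lets curves for different resolutions and frame rates sit on a common axis.)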


Quaternions that represent 3D rotations have exactly 3 parameters, because they are normalized: https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotati...

Quaternions that are not normalized do not form a representation of the spin group Spin(3).
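
Spelled out, since the parameter counting is the whole point: a rotation by angle θ about a unit axis n corresponds to the unit quaternion

    q = \cos\tfrac{\theta}{2} + \sin\tfrac{\theta}{2}\,(n_x i + n_y j + n_z k), \qquad |q| = 1 ,

so the four components carry one normalization constraint, leaving 3 free parameters (θ plus the direction of n).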


I'm assuming Spin(3) is equivalent to SO(3)?


It's a double cover of SO(3) (because q and -q correspond to the same element of SO(3)), and is isomorphic to SU(2)
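
The 2-to-1 nature is easy to see from how a unit quaternion acts on a vector v (written as a pure-imaginary quaternion):

    v' = q\, v\, q^{-1} = (-q)\, v\, (-q)^{-1} ,

so q and -q produce exactly the same rotation.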


Oh wow thanks I understand


That becomes relevant for frequencies that are high enough to break Cooper pairs. But this material is claimed to be in the superconducting phase up to 400 K, which corresponds to a superconducting gap of 8.3 THz.
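
(Presumably that figure is just the critical temperature expressed as a frequency; the gap itself would carry an additional BCS-type factor of order one:

    \nu = \frac{k_B T}{h} = \frac{(1.38\times10^{-23}\ \mathrm{J/K})(400\ \mathrm{K})}{6.63\times10^{-34}\ \mathrm{J\,s}} \approx 8.3\ \mathrm{THz} .)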


People buying desktop computers care about performance and price, not battery life or the fruit logo on the case.

You'll find it difficult to convince many people who are not already locked into Apple's walled garden and the consumerism around its branding to pay $4000 for such performance, with limited RAM/disk and no real path to upgrades or user repairs.

> most of these machines can provide better price/performance than most anything else on the market,

I agree; for better performance per dollar, though, you wouldn't buy from Apple.

(A meta comment: I find it difficult to comprehend how people can become so emotionally invested in a company. Finding an excuse to every single shortcoming of Apple, not leaving any criticism unanswered for them.)


Look at the price of the MacBook Air. Then tell me you can get anything that remotely approaches that level of performance from any other major vendor, at anything remotely close to the same price. Or the same battery life.

You can't, because it doesn't exist. Here's a hint -- the price is much closer to $1000 than $4000.

I've been a MacFanatic since December of 1983, when I saw a by-invitation-only demo of a prototype of the 128k Mac, a month before the Super Bowl commercial where Apple officially introduced the thing. Yeah, that famous one.

It took only five minutes of playing around with the prototypes of what came to be called MacWrite and MacPaint (because those were the only two pieces of prototype software available yet) to convince me that this was the future of computing, and that all computers should work like this.

But I'm also a Mac realist. I fully recognize that there are markets that Mac doesn't serve well, like Enterprise. Apple has always understood the workgroup level and served that market well, but as you scale up, they do worse and worse. The fact that the Mac works well in an Enterprise is in spite of Apple design philosophy, not because of it.

And I also understand that there are markets that Apple makes no attempt to serve, because they don't consider it worthwhile -- whether I agree with them or not.

I even did a short six month contract working for Apple Retail Software Engineering, where I worked with the team that developed all the proprietary software that the Apple Retail personnel use, whether that's everyone in the stores, or the people back at HQ that are coordinating with the people in the stores.

I delivered a CI/CD system for them, so that they would no longer have to do all the building of all their software on the laptops of the individual developers.

After six months, I had contributed code to each of their main code bases (iOS, macOS, and Linux for their back-end servers) to get them to compile on the CI/CD system, and the rest of my job was done. And in the process, I learned that Apple is a company I am happy to be a customer of, but I don't ever want to work there again.


> Look at the price of the MacBook Air. Then tell me you can get anything that remotely approaches that level of performance from any other major vendor, at anything remotely close to the same price. Or the same battery life.

I will happily be the one to tell you.

For between $600 and $700 you can get a gaming laptop from MSI, HP, Acer, Lenovo, and the like with a 12th or 13th gen Intel CPU (e.g. an i5-13420H), 8-16 GB of RAM, and an RTX 3050 or better, along with a 512GB-1TB SSD. Single-core performance is just shy of the M2, multi-core performance beats the M2, GPU performance is significantly better, and the only major shortfall is battery life.

As others have said, the M-series is great at power efficiency, but if performance is your only criteria there are significantly cheaper alternatives. Sure, you're going to have to keep your laptop plugged in most of the time and it will probably sound like a jet engine at times, however Apple is NOT the leader in value for performance as you believe.

Regarding battery life, you could approach Apple levels of usage time by undervolting your CPU. Depending on your configuration and hardware, you could gain a couple more hours of usage time by undervolting as well as by disabling the discrete GPU in favor of integrated graphics.


So, what is the battery life on that device? One hour? Two hours?

Could you get twelve hours out of it? Eighteen?

And what is the resolution on that screen? Anywhere close to 2880 by 1864?

You have failed to deliver a system that actually meets the specs of the MacBook Air 15" that you are comparing against, and then declared yourself the victor because your machine is marginally faster.

You fail to grasp that you have actually proven my point for me.

Apple is designing their machines for real people in the real world who may not be anywhere close to a power outlet for a long time. Or, who may not want to tie themselves down to needing to be close to a power outlet all the time.


> So, what is the battery life on that device? One hour? Two hours?

I'd put the average battery life of a $600-700 gaming laptop at ~4-6hr, although if you undervolt the CPU (getting maybe 80% of the performance) you could achieve 8+ hours on a charge. Screens vary, but they're typically 1280x1080 -- however at 120/144Hz. I'd say both are acceptable for most people.

And I am not claiming to compete against the MacBook Air 15" for $600-700. I am just pointing out that performance-wise, a $600-700 gaming laptop can equal a $1200 Macbook, and hence is a much better value.

If however you want me to find a challenger to your MacBook Air 15", I will.

With a $1200 base 15" M2 MacBook Air budget, you can get a Dell XPS 15 for $1,149 (https://www.dell.com/en-us/shop/laptops/12th-gen-intel/spd/x...). The i7-12700H equals the M2 in single-core and beats the M2 in multi-core workloads. It comes with an RTX 3050, 16 GB of DDR5, a 512GB SSD, and a 3456x2160 OLED touch display. You get 8GB more RAM, twice the storage, a better GPU, and better multi-core performance than the base model MacBook Air, for $50 less. Battery lasts up to 13 hours.

I am admittedly also an Apple fan, but that doesn't mean I don't keep my eyes on the competition. While Apple has had its share of advances, and while it is still my go-to for my laptops, I will be the first to admit that there are plenty of alternatives depending on what you're looking for. And some of those alternatives are hands-down a better value depending on your criteria.


If you're looking for a gaming laptop, then I agree that Apple doesn't have good solutions in that space. That's primarily because there's not much in the way of good games that run on macOS. Hopefully, that will change with the new game porting kit that Apple has announced, but there's no way to tell right now.

But while 13 hours of battery runtime is quite good by Intel/Windows standards, it's still not the 18 hours you can get with the MacBook Air 15".

It's very clear that Apple has decided to optimize for certain things in their designs, and battery lifetime is one of them. They'll take a slight hit on CPU performance to get that. And most people won't notice the minor loss in CPU performance, but they will notice the significant increase in battery runtime.


There are Windows contenders that do offer comparable battery life to Macbooks, if that's your criteria. The $549 Acer Swift 3 (i7-1165G7, 8GB RAM, 256GB SSD) gets up to 16 hours, and the $799 Asus ZenBook 13 (i7-8565U, 8GB RAM, 512GB SSD) gets up to 15 hours.

Performance is close enough to Apple silicon that both laptops would be acceptable competitors for a general consumer's uses, RAM and storage are equal to if not better than Apple's base model Air's, and the price is unquestionably better.

Apple does strike a good balance, but it most definitely isn't leagues ahead of any other company no matter which way you look at it.


If you want a display comparable to an MBA, the Dell XPS starts at $1449 (according to your link).


Indeed, although that extra display is optional. For a MacBook Air with 16GB of RAM and a 512GB SSD (with the XPS still beating it for CPU & GPU), you'll pay $1,499, which again is a $50 savings. I personally don't place too much value in a screen, but that's due to my workload; I'm sure that people who do video/graphics work would want the premium screen.


I wouldn’t buy a 1X resolution screen in 2023. I’m surprised they are actually still sold; the quicker we move up to modern resolutions being standard, the quicker we can just call them the new 1X and stop all the resolution hackery we have to do these days to support both at the same time.


> Look at the price of the MacBook Air. Then tell me you can get anything that remotely approaches that level of performance from any other major vendor, at anything remotely close to the same price.

> You can't, because it doesn't exist.

Your tunnel vision is appalling. You're obviously not aware of what the market is offering.

From Apple, at $1000, you get an overpriced laptop with weak specs: a 256GB SSD, 8GB RAM and an M1 CPU (which is weaker than a Core i5-1240P or Core i5-1335U), with no path to upgrade: https://www.apple.com/macbook-air-m1/ That's essentially a mid-range tablet in laptop form, poised to become e-waste in a few years.

You can get a laptop with such specs for around half of that price from virtually any vendor.

However, it's not just the base price: want 2TB storage? I can just get a Samsung 990 Pro 2TB for $160-$170 and install it, but Apple won't allow you to do that and will charge you $800 for their soldered 2TB SSD. Want more RAM? You can get 64GB DDR5 for less than $150, whereas with the MacBook Air 16GB is the max you can get (which makes it pretty useless for science or engineering applications, and 16GB RAM will probably become very limiting even for casual users in the near future) and it costs an extra $200.

I will also have the option to upgrade it to an 8TB (or 16TB by then, who knows) SSD and 96GB DDR5 when their prices come further down in the future, adding more to its life. I will also be able to easily replace the battery myself when the time comes. Whereas your option to upgrade will be to buy the brand new M6 or whatever Mac laptop, perhaps for another $2000 with hopefully half the RAM and SSD of what I will have.

Like most people who use a laptop for work, I don't have a real need that 18 hours of battery life addresses. People have mostly moved on to smartphones and tablets for watching video, casual browsing, etc. in the absence of an electrical outlet. Nonetheless, for those who actually need such long battery life, it is possible to buy such laptops from other vendors for still half the price at those specs (like the Acer Swift 3 mentioned in another comment).

But nonetheless, you still feel you need to run to defend this company? It sounds like you've invested so much in Apple and just don't want to be wrong about it, even if it means living in a world of alternative facts in which any laptop that "remotely approaches that level of performance from any other major vendor" for $1000 simply "doesn't exist".


Which $500 (or $1000) laptop lets you install 64GB RAM?

(The Acer Swift 3 supports max 16GB)


As an example, pretty much any recent Dell Inspiron, which happen to be quite serviceable too.

Such as https://www.dell.com/en-us/shop/dell-laptops/inspiron-15-lap... which is $530 at the moment but you can get it for cheaper during Black Friday. The manual may say up to 32GB but installing 64GB works just fine, as the CPU supports up to 64GB.

For $330, you can get an i5 Lenovo IdeaPad 14" today, but it only goes up to 36GB because 4GB of its RAM is soldered: https://www.microcenter.com/product/666315/lenovo-ideapad-3-...

You can install 64GB of RAM in i3 laptops as well, which can be as low as ~$300; for example, https://www.dell.com/en-us/shop/dell-laptops/inspiron-15-lap... is currently $350, but you can get a similar one for ~$200 during Black Friday.

64GB RAM in a laptop may cost a fortune in Apple-land, but it's pretty cheap to get for "normal" laptops.

(You can't upgrade the RAM on the Acer Swift 3; it's soldered.)


Yes, you can upgrade a lot of Dells, but the maximums in their manuals are their officially supported maximums, so if you run into problems, Dell support probably won't be of much use.

I find people trying to compare to a Dell (especially to anything which isn't trying to be a competitor, i.e. anything other than the XPS) come off a bit naive. Once you start talking about the XPS, the price difference shrinks. In the UK the latest XPS 13 with an i7, 32GB, 1TB and the better screen is around £2,000. The M2 MacBook Air 13 with 24GB, 1TB and the fast charger is £1,979. The 13" Pro with the same specs is £2,149.

Having owned an XPS before (with its flexing chassis, coil whine, overheating, CPU throttling, noisy hot fans and smelly plastic), you couldn't pay me to have one.


The worse single-core performance is typical for server CPUs, which are intentionally underclocked compared to their desktop counterparts for the stability expected under non-stop server workloads.

Scroll to the right for the corresponding consumer CPU: the Core i9-13900K has a higher score in almost all rows.

From the article:

> benchmark result shows Apple's M2 Ultra cannot beat Intel's Core i9-13900K in single-thread workloads and even fall behind in multi-core workloads in Geekbench 5.


I have no doubt that Apple is capable of delivering a product that can beat Intel's i9-13900K, especially since Apple is using a much more modern 5nm process vs Intel's 10nm process for the i9.

As you point out, server CPUs are intentionally engineered for a target market. Apple's processors are very intentionally engineered as well. When engineering their product, they had a certain market in mind, and design and performance trade-offs were made accordingly. As others have noted, power efficiency seems to be a big priority. The M2 Ultra has a TDP of 60W. The i9-13900K has a base consumption of 125W, and draws up to 253W under stress. So do the math. Intel achieves 32% better single-core performance and 41% better multi-core performance (Cinebench) for 422% of Apple's power consumption. If there's something impressive here, it's that Apple is able to do so much with so little. If Apple wanted to, they could probably conjoin two M2 Ultras and soundly beat the i9-13900K by a considerable margin and still use about half the power to do so. The real question is why any consumer would need that much compute, and the target market for such a niche is probably very small, which is why Apple didn't do that.
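
(Taking those quoted figures at face value, the arithmetic is

    \frac{253\ \mathrm{W}}{60\ \mathrm{W}} \approx 4.2 \;(= 422\%), \qquad \frac{\text{perf/W, Apple}}{\text{perf/W, Intel}} \approx \frac{4.22}{1.41} \approx 3 \ \text{(multi-core)} ,

i.e. roughly three times the multi-core performance per watt, under the numbers above.)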


Your fantastic statements about the potential of Apple border on religion.

It's not just Apple, any big player can do what you said, but there are connectivity trade-offs in chiplet design. It's been open to any TSMC customer, including AMD (which has been doing it since well before Apple). Intel has also fabricated its own chiplet CPUs already and is currently sampling them.

> The M2 Ultra has a TDP of 60W. The i9-13900K has a base consumption of 125W, and draws up to 253W under stress.

What of it? The RTX 4090 TDP is 600W. Just get a better PSU. It's a desktop computer that is always plugged into the wall, not a laptop on battery. It's irrelevant.

What people care about in a desktop computer is performance/price and upgradability/serviceability. That's the math you need to do. In terms of what currently exists in the real world, with Apple you get a CPU that is weaker than a 13900K or 7950X3D, with 128GB of soldered RAM and a soldered 4TB SSD, for about $7000. I can build a stronger computer for less than $1500, and it will be both upgradable and serviceable. But the Apple device isn't going to pay back the $5000+ difference with lower electricity bills any time during my lifetime: it will instead become very expensive e-waste in less than 10 years.


> Your fantastic statements about the potential of Apple border on religion.

Could you elaborate?

> It's not just Apple, any big player can do what you said

You're fighting a strawman. I never said that it was only Apple. My statements compared Apple and Intel because that was the topic of this whole thread. I would be happy to discuss any other major player in the appropriate place.

> What of it? The RTX 4090 TDP is 600W. Just get a better PSU. It's a desktop computer that is always plugged into the wall, not a laptop on battery. It's irrelevant.

I don't think you understood my post at all. *My point was that Apple very clearly optimized for power efficiency.* That is what my statements regarding power usage served to illustrate.

In fact, looking into the promotional materials for the M2, there is very explicit verbiage regarding "new levels of power-efficient performance", "industry-leading performance per watt", and "using very little power."[0] This just proves my point that power efficiency was Apple's foremost goal with the M2.

My point was that, if Apple had optimized for all out performance regardless of anything else, they could have very well done so. My point was that they didn't do that but that they could if they wanted to.

E.g. as we saw with the M1 Ultra, they can very easily conjoin two M1 Max dies using an "UltraFusion" technique.[1] Apple no doubt could use the same technology to conjoin two M2 Ultras.

[0]: https://www.apple.com/newsroom/2022/06/apple-unveils-m2-with...

[1]: https://www.apple.com/newsroom/2022/03/apple-unveils-m1-ultr...


I'm guessing you don't record videos then.

In terms of video capabilities, the Canon 5D Mk II is limited to 8-bit 4:2:0 1080p H.264 recording at 30fps, maxing out at 12 minutes of recording. That is a far cry from the 10-bit (or 12-bit) 4:2:2 4K, 6K or even 8K RAW or ProRes at 120fps or higher, with unlimited recording, that you get from a similarly priced camera in today's money.

(It's also limited in terms of RAW photos as well, though: the best recording option is 8-bit 10MP RAW.)

No phone comes anywhere near that either, not to mention the lenses for phones can't compete with the real interchangeable lenses. The difference probably doesn't matter to someone who is just going to record his baby walking around and watch it on a 7 inch screen, but of course that's not the target audience for those cameras.


10-bit and 4:2:2 is only for editing in post; the majority of consumers don't need that.


If you read the last sentence of the post that you're replying to, I already said that it won't matter to most people.

That being said, "is only for editing in post" (which is not really true, banding is an issue in scenes with high dynamic range with 8-bit, not limited to sky but also with strong lights or deep shadows) doesn't mean people won't want it. Around 10-15 years ago, in the age of single-digit-GB slow SD cards and weaker camera/phone processors, that's what people used to say about RAW photos repeatedly. Now it is mainstream in even in phones, with built-in editing apps and easy to use desktop programs with few knobs. This means editing itself in post isn't a barrier for mainstream adoption, the issue is video editing currently has a high barrier as it is essentially impossible on portable devices, the programs have their learning curves, and the whole stack requires some financial investment.


The majority of displays are 8-bit, and that will probably stay the standard for a while.

Btw, the over-10-year-old Canon 5D Mk III can shoot RAW video with Magic Lantern. Manufacturers should open/update the code for their old cameras that are capable of this. It's really disappointing when marketing ruins a whole product. No wonder the camera market is dying.


Even when targeting 8-bit displays, recording in 10-bit is still beneficial. Besides the obvious benefits in editing and encoding, even simply playing a 10-bit video file straight from the camera on an 8-bit screen benefits when applying any common "effects" (brightness, contrast, LUTs, colorspace transformations, gamma correction, tonemapping, etc.).
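
To make that concrete, here is a quick illustrative sketch of my own (not from the discussion above): quantize a smooth gradient as an 8-bit and a 10-bit "recording", apply the same grade to both, deliver both in 8-bit, and count how many distinct output levels survive. The gamma value and sample count are just illustrative assumptions.

    import numpy as np

    ramp = np.linspace(0.0, 1.0, 4096)  # "scene" luminance, 0..1

    def record_and_grade(signal, bits, gamma=2.2):
        levels = 2**bits - 1
        recorded = np.round(signal * levels) / levels   # camera quantization
        graded = recorded ** (1.0 / gamma)              # a strong shadow-lifting grade in post
        return np.round(graded * 255).astype(np.uint8)  # final 8-bit delivery

    out8 = record_and_grade(ramp, 8)
    out10 = record_and_grade(ramp, 10)

    print("distinct output levels from an 8-bit source: ", len(np.unique(out8)))
    print("distinct output levels from a 10-bit source:", len(np.unique(out10)))

The 10-bit source leaves noticeably smaller gaps between output levels in the shadows after the grade, which is exactly where banding becomes visible.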

> Btw, the over-10-year-old Canon 5D Mk III can shoot RAW video with Magic Lantern

Not sure why that is relevant in this context, but any digital camera would be capable of shooting RAW video with hacks: they all have photo-sensors, and RAW simply means dumping the digitized signal data in a suitable format. It's a matter of hacking the device. But that doesn't mean you should do it, especially when that's not what the camera is designed for. Unsurprisingly, in the case of the Canon 5D Mark III (which is a photo-oriented camera lacking a stabilizer; you can read about further limitations, such as the under-utilized sensor in video mode [which typically happens due to hardware limitations], here: https://www.dpreview.com/reviews/canon-eos-5d-mark-iii/25), a lot of potential problems apparently await: https://www.cined.com/consider-this-before-you-shoot-raw-on-... For RAW video, at the very least, you need more reliable storage hardware hooked to your device with sufficient capacity for recording (meaning CFexpress or NVMe via USB, not SDXC), and possibly active cooling, both of which are missing from that camera, so it would require some hardware modding.

That being said, modern video cameras can also do more than the trivial task of recording RAW: they can handle the processing and encoding of higher-quality video (resolution, bpp, frame rate) in real time, which requires specialized silicon missing from the Canon 5D Mark III.

> Manufacturers should open/update code to their old cameras that are capable do this. Its really disappointing when marketing ruin whole product. No wonder that camera market dying.

1. The camera hardware isn't actually designed for it (by the way, even with new video cameras, there are usually trade-offs: you turn one feature on and another becomes inaccessible), 2. that's not the reason the consumer camera market is shrinking, and 3. doing that would shrink the market volume even further.


Temporal dithering, also known as Frame Rate Control, is very often used in 8-bit panels to allow them to display almost as many colours as a 10-bit panel.

From the input perspective, you're running it as a 10-bit panel.
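
A quick illustration of my own of how the averaging works: alternating two adjacent 8-bit codes L and L+1 over a 4-frame pattern yields intermediate perceived levels,

    \text{perceived} = \frac{k\,(L+1) + (4-k)\,L}{4} = L + \frac{k}{4}, \qquad k = 0, 1, 2, 3 ,

i.e. two extra bits of effective depth, which is how an 8-bit + FRC panel approximates a 10-bit input signal.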


Videos are not what most people mean by "photography", 10-bit is mostly a gimmick (there are situations where it gives a real advantage, but they're niche), and higher-than-1080p resolutions are honestly pretty marginal a lot of the time. 30fps is pretty awful though.

