I’ve noticed similar “mini” news stories trickle out after Apple’s announcements. Does this happen organically, or does PR drop tidbits like this to select sources?
It seems like a very specific thing for a reporter to ask and find out about.
Look at the Tweet (X? Blurp? What do we call them now?) - it's got the spectrum of the panel, comparing previous and newer panels.
If you know what you're looking for in those, you can identify a lot of different phosphor configurations just by the particular shape of the RGB peaks - the older ones have a distinctive multi-peaked red emission that I've seen in various LED bulbs as well over the years.
I doubt Apple mentioned it to anyone. Applying a spectrometer to any new light emitting device is just the sort of thing some people enjoy doing.
They're officially just called "Posts" now. It's a hell of a downgrade from how distinctive the old terms were, no wonder people still call them Tweets.
Community Notes was also set to be called Birdwatch originally, continuing the bird pun theme.
Many people remain fascinated by Apple and the small choices that (traditionally) give their products a sense of careful and attentive design and engineering.
So there's both a supply of people eager to pick their products apart and a market of people eager to hear about all the little details and secrets.
While Apple probably does seed some stories intentionally, as their PR teams are sharp, they don't need to be doing so for swarms of these reports to pop up after announcements and first shipments.
It could simply be that people are now getting their hands on them and testing for things that Apple didn't specifically mention in their announcements.
The M3 line with 256 GB storage had a single SSD NAND chip, which made it measurably slower than the M2 series with the same capacity. Although irrelevant for most daily work, it was a regression, and one that seems to be fixed in the M4 line. Even then, I'd presume a bit of bad news like that would push people looking for 'the best spec' to buy the storage upgrade.
I'm pretty sure Apple is just a marketing machine. They have pro-Apple posts and smear campaigns on all the Samsung forums. Mainstream media marketing, but also guerrilla marketing on forums, social media, even newspaper comments sections. I only see this kind of thing from Russian propaganda.
Interesting. As I understand it, shifting the red curve to shorter wavelengths, even by a seemingly small amount, would improve visibility. And something I've learned is that red vision varies by a fair amount from person to person.
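For a rough sense of how big that effect is: the eye's photopic sensitivity V(λ) falls off steeply in the deep red, so even a 10 nm shift matters. This sketch uses a crude Gaussian stand-in for V(λ); the 555 nm peak is real, but the 45 nm width is my own illustrative number, not the tabulated curve:

```python
import math

def photopic_sensitivity(wavelength_nm: float) -> float:
    """Very rough Gaussian approximation of the CIE photopic
    luminosity function V(lambda), peaking at 555 nm.
    Illustrative only; the real curve is tabulated, not Gaussian."""
    return math.exp(-0.5 * ((wavelength_nm - 555.0) / 45.0) ** 2)

# A red primary shifted from 630 nm toward 620 nm lands closer to the
# peak of V(lambda), so the same radiant power looks brighter.
gain = photopic_sensitivity(620) / photopic_sensitivity(630)
print(f"perceived-brightness gain from 630 -> 620 nm: ~{gain:.2f}x")
```

Even with this crude model the gain comes out well above 1x, which matches the intuition that small red-wavelength shifts are perceptually significant.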
> Are there vision tests similar to audio tests, where they figure out one's individual responses to different wavelengths of light?

Super neat.
Unlike consumer audio equipment where you can easily do a frequency sweep to test hearing, you'd need a specialist light source to do the same. Something like a tunable laser. You could probably use a prism to do a similar sweep from a white light source.
You don't need a frequency sweep. You take three broad spectrum lights and ask people to adjust their brightnesses to match a selection of reference lights. The tool that does this is called an anomaloscope and it was invented before things like lasers in order to study how color vision worked. That work became the basis of CIE (and other standards) that now define how your screen renders accurate colors.
This setup is straightforward to adjust to different types of color vision too. They use 3 lights because that's how many opsins normal humans use for color vision. If you're testing di- or tetrachromats you can use 2 or 4 lights respectively, or 12 if you're testing intelligent mantis shrimp.
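As a minimal sketch of the matching idea: a color match is just a linear system, where the three adjustable primaries must reproduce the reference light's effect on the three cone types. All the numbers below are invented for illustration; real cone fundamentals are tabulated in the CIE standards mentioned above:

```python
# Trichromatic matching as a linear system: a match is achieved when the
# three primaries drive the viewer's three cone types (L, M, S) exactly
# as the reference light does. The matrix entries are made up.

# Rows: cone types (L, M, S); columns: the three adjustable primaries.
P = [
    [0.70, 0.30, 0.02],  # L-cone response to each primary
    [0.30, 0.60, 0.05],  # M-cone response
    [0.02, 0.10, 0.90],  # S-cone response
]
reference = [0.5, 0.4, 0.3]  # cone responses evoked by the reference light

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(A, b):
    """Cramer's rule: intensities of the 3 primaries that match b."""
    d = det3(A)
    weights = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        weights.append(det3(Ai) / d)
    return weights

weights = solve3(P, reference)
# Verify: the mixture evokes the same cone responses as the reference,
# so the two lights look identical even if their spectra differ.
mix = [sum(P[r][c] * weights[c] for c in range(3)) for r in range(3)]
print(weights, mix)
```

A dichromat's version of the same experiment shrinks this to a 2x2 system, which is exactly the adjustment described above, and different observers' cone sensitivities (different `P` matrices) yield different match settings, which is what the anomaloscope measures.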
Does this mean better motion response times? The M-series MacBook Pro displays are notoriously smeary while displaying high-motion content, so this would be a welcome addition.
It shouldn't make a difference. The film is illuminated by a blue LED and glows uniformly yellow at all times, which is the same mechanism as the white LEDs in a traditional display (a blue emitter illuminating a yellow phosphor coating). The LCD filters this to make specific pixels and would be the part responsible for any smearing. I worked for a now-defunct QD company.
The way I thought LCD/LED displays worked was by RGB-filtering a uniform white backlight. Is it only this design that does phosphorescence per subpixel? That sounds way more energy efficient.
Sorry, the film's yellow and LED's blue lights combine to make white (or, more accurately, a color that makes white when the RGB filters are open 100%).
Oh, phosphorescence per subpixel (instead of flooding all in "white" that was created through phosphorescence from blue in a central place) sounds like an awesome power optimization for where you still want/need subtractive LCD instead of some *LED with additive per-(sub)pixel emitter.
For those following outdoor sports tech: I wonder if this might be the secret sauce that allowed Garmin to abandon transflective screens in the Edge 1050, which, unlike their post-transflective watches, is still technically an LCD. (The not-at-all-secret meat of the change is a depressingly massive battery and a big loss in runtime, but I suspect the battery alone doesn't explain why the loss isn't even bigger.)
So far, the answer anecdotally is no, at least not in situations where lit pixels are moved quickly into black areas. In practice, my obnoxious green-text-on-black-background terminal was kind of gross to scroll, but I haven't experimented much with other setups yet. Playing games has thus far been fine; scrolling in other contexts is fine for practical purposes. Happy to update this if you want, after I ruin my new MacBook by experimenting more.
Certainly a clever way to test it. In terms of response, I meant that, at least on a fully black background, you should see ghost lines of text between where the line of text was and where it's going, almost like a mouse cursor with a trail.
LCD backlight units (BLUs) have a uniformly glowing background which is filtered by the LCD to make pixels. If there is a delay in pixels updating, it's the LCD causing it.
Who manufactures their displays? I'm guessing they have more influence in the design or manufacturing than most players, but is this just a matter of them telling Samsung/LG/etc "ok, we're going to use your quantum dot displays now"?
They source from a combination of Samsung, LG, and BOE (Chinese display manufacturer). The way the arrangement typically works is that manufacturers will send Apple preproduction samples and Apple will decide which are worth using for upcoming SKUs. The manufacturer will build out production facilities to meet that demand and whatever specs Apple wants. Apple may also help with investment or R&D to develop products to meet feature roadmap targets and increase supplier competition. It's a very dangerous game for the manufacturers.
Is there more to the thread or just this one tweet/X thing? Response times notoriously suck on MacBooks, it would be nice to see that remedied, anecdotally it doesn't seem like that's happened yet.
Edit: Nevermind, same tweet seems to have been quoted across a bunch of different other news sites. Apparently Blur Busters claims an improvement, I'll try it out and see how it is in some other contexts.
If you're not logged in to Xitter, navigating to a Xeet allows you to view the Xeet, but not the Xomments. Fortunately, there are open-source, self-hostable, privacy-preserving front-ends for Xitter, such as Nitter.
If setting it up yourself is too much work, you can use other public instances. One such instance is called xcancel. Load the Xeet as normal, then simply append "cancel" to the domain name before the period in your URL bar and hit enter :)
In the context of a boycott, using/promoting Nitter is at best neutral - you’re not directly engaging with the most hostile parts of the site, but still engaging with it and making it easier for others to do so, unnecessarily promoting the idea of X being a good platform to post on.
I suggest not making any effort to use the site - rather just ask people to primarily share content from X by copying rather than linking. This removes the need to interact with it both for yourself and (more importantly) for others.
To be clear, I don't mean to boycott the site entirely. There is a lot of good content from good people mixed in with the bad.
You cannot judge an entire cohort of people by what the worst among them do. Assuming all Xitter users are evil just because there are some genuinely evil Xitter users is no different from a bigot assuming all <insert group here> are <insert bad property here> just because there are some members of that group that genuinely are <insert bad property here>. Being exclusionary and prejudiced against all in response to the actions of a few isn't morally justified just because your tribe agrees that the entire cohort of people you're excluding are bad.
The issue I take with it is that the site is effectively paywalled, just one where you pay with your privacy and personal data by signing up for an account. Without an account, Xitter will not show Xomments on a Xeet, nor will it even allow you to view Xeets in chronological order.
No need for workarounds. Only the flagship nitter.net was blocked. Nitter is an open source project and maintains a list of working instances on its wiki: https://github.com/zedeus/nitter/wiki/Instances
I remember hearing a while ago that the mechanism Nitter relied on, guest accounts, was shut down, and that the 3 instances that remain work via selfbotting. It's plausible, although I haven't verified it.
What’s selfbotting? Based on the name it sounds like something that requires me to surrender my own authentication token to some automation service… but that’s definitely not the case for these alternative Nitter frontends.
I’m not sure how Twitter ultimately blocked them. It would be pretty embarrassing (for Twitter) if it were a simple IP block of the Nitter.net servers, but that doesn’t seem too out of whack with Musk’s history of litigating bot behavior…
Also, from my limited experience with a single OLED screen, it seems that most stuff was created for a certain kind of screen without as much colour fidelity, and now that stuff seems far more...obnoxiously "saturated"?...on an OLED screen.
> I've heard that there are screen lifetime issues?
This has gotten much, much better, especially with "tandem OLED" where you just stack two of 'em on top of each other. It should be fine these days.
> Also, from my limited experience with a single OLED screen, it seems that most stuff was created for a certain kind of screen without as much colour fidelity, and now that stuff seems far more...obnoxiously "saturated"?...on an OLED screen.
That's up to the display manufacturer to calibrate the screen. The content should just be what it is and specify its colorspace properly. (Note, "properly" depends on the environment around you, so if you really care about this you have to participate too.)
Lots of devices that come with OLED displays come with a "vibrancy" mode turned on by default that oversaturates colors until you turn it off. It does look great at a glance tho!
Conversely, lots of content is produced on/for less-than-stellar displays and, gamma and color profiles be damned, overcompensates with more saturated colours at the data level because it's going to show up toned down.
When a display is actually able to put out the colours it then looks gaudily oversaturated. I've had such problems already with non-OLED "somewhat† calibrated" good quality screens as well.
† I mean I did not calibrate them, they were factory calibrated with a good enough test curve slip in the package.
Pixel aperture ratio has increased drastically since the early displays. This drops current density for a given amount of light output, and there's a nonlinear relationship between current density and segregation so that helps a ton.
Deuterium (deuterated emitter materials) helps make more light per unit current, which lowers the needed current density and improves lifetime.
Microlensing, if your customers will accept a narrower viewing angle, improves brightness and lifetime in the same way.
There was a time when OLED problems were so huge that Lenovo dropped the panels from its laptops for many years (e.g. the X1 Yoga series). It was so bad that I got the next-generation laptop for free when it was released.
OLED has had lower peak brightness than IPS. It may not be perceptually so because of no-backlight absolute blacks and higher contrast, but the difference starts to matter in broad daylight where OLED may not be bright enough, irrespective of matte vs glossy.
The pixel response and contrast absolutely are. Battery life is a little worse (especially in bright mode). OLED pixel response is around 100 microseconds compared to ~5 ms for IPS, and each pixel dims individually, allowing for actually good HDR.
If you buy a MacBook it's supposed to last a long time, so I'm kind of skeptical of getting one right at release instead of a tried-and-tested IPS MBP.
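For a back-of-envelope sense of why those response-time numbers matter, here's a toy calculation using the figures above (the scroll speed is an arbitrary illustrative number, and persistence blur from sample-and-hold is ignored entirely):

```python
# Back-of-envelope: how much smear a slow pixel transition adds while
# scrolling. Only transition blur is modeled; sample-and-hold
# persistence, which also contributes, is ignored here.

def smear_px(response_time_s: float, scroll_speed_px_per_s: float) -> float:
    """Pixels traversed by content while a pixel is still transitioning."""
    return response_time_s * scroll_speed_px_per_s

speed = 3000.0  # px/s, a brisk scroll (illustrative)
print(f"IPS  (~5 ms):   {smear_px(0.005, speed):.1f} px of smear")
print(f"OLED (~0.1 ms): {smear_px(0.0001, speed):.1f} px of smear")
```

At that scroll speed the ~5 ms IPS transition smears content across roughly 15 pixels, while the OLED transition smears under half a pixel, which is why the remaining blur on OLEDs comes mostly from persistence rather than pixel response.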
Doesn’t OLED pixel layout fail to line up with modern text rendering engines? At least that’s what I believe I’ve read from reports of color fringing around text, on Windows in particular, that makes long stretches of text work a problem.
Shouldn’t be an issue under macOS for the most part, which has used grayscale antialiasing for several years since subpixel AA isn’t of much benefit with HiDPI displays and complicates text rendering considerably.
If there are any problems, it’ll probably be with cross platform software that doesn't use native text rendering and assumes RGB subpixel arrangements instead of obeying the system.
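To make that layout assumption concrete, here's a toy sketch of what goes wrong (the function and the BGR example are hypothetical stand-ins, not any real renderer's code):

```python
# Why subpixel AA breaks on unexpected layouts: the renderer emits
# per-channel coverage assuming horizontal R,G,B stripes. If the panel's
# physical order differs (BGR here, standing in for any nonstandard
# OLED arrangement), the intended coverage lands on the wrong emitters.

def subpixel_pixels(coverage, assumed_order="RGB"):
    """Group 3x horizontally oversampled coverage samples into
    (R, G, B) pixel values under the renderer's assumed
    left-to-right subpixel order."""
    pixels = []
    for i in range(0, len(coverage), 3):
        triple = coverage[i:i + 3]
        channels = dict(zip(assumed_order, triple))
        pixels.append((channels["R"], channels["G"], channels["B"]))
    return pixels

# A glyph edge: coverage ramps up across one pixel (3 samples).
edge = [0.0, 0.5, 1.0]

print(subpixel_pixels(edge, "RGB"))  # the renderer's intent
print(subpixel_pixels(edge, "BGR"))  # what a BGR panel effectively shows
```

On the mismatched panel the red and blue channels swap, which the eye sees as colored fringes on glyph edges. Grayscale AA sidesteps this entirely by averaging the three samples into one layout-independent value per pixel.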
FreeType also doesn't do the right thing on RGBW, at least.
Reading code in the repo suggests it's possible to reconfigure it to work properly with the Harmony algorithm but I haven't worked out how yet.
If anyone knows how to sponsor efforts to fix this I would totally contribute to that.
A separate problem is that I don't think there is a standard way for monitors to communicate the subpixel layout in such a way the font rendering engine will have access to it. That seems like a pretty big oversight when introducing these in the first place.
Interesting. We have a failed Samsung QD TV, and when we called for technical support the guy correctly guessed the screen issue, as if he had seen it many times before. What makes them unreliable? The problem isn't even in the QD film itself but in the LED array: an LED fails and shorts, then the other LEDs start working at higher voltage and overheat, causing a cascading effect where the problem starts small and develops into an unusable screen.