Hacker News | LorenDB's comments

Why not just use something like Qt to build a native cross platform app? As others have mentioned, advertising a device as open source and yet Mac only is going to turn away a lot of people.

Really appreciate the feedback.

The honest answer is I'm a Mac guy, and a terrible businessman.

FWIW I love Linux and used it extensively to develop Photon. (In fact the "MDCUtil" tool for debugging Photon already supports Linux.) I'd love to add Linux support for the app if there's enough interest in this thing.


Qt apps, even in 2024, still don't feel native when used on Mac.

Personally I'd much rather just drop the custom Mac app and replace it with:

  - Firmware that can read/write a proper filesystem
  - Firmware that can create e.g. DNG files that can be handled by regular photo editors like Lightroom, RawTherapee, Darktable, etc.
  - A webapp that lets you generate a configuration file and does the battery life estimation.

Can he afford Qt? He'd need a lawyer on retainer.

Groq is an insane company. SambaNova (discussed yesterday[0]) is also very promising. However, what I really want to see is local AI accelerator chips a la Tenstorrent Grayskull that can boost local generation to hundreds of tokens per second while being more efficient than GPUs.

[0]: https://news.ycombinator.com/item?id=40508797


SambaNova is on gen 4 silicon and still lagging; somebody over there is doing something wrong.

This looks like a really cool project! I do have two questions, though.

1. Do you plan to add AMD support?

2. What advantages does Metrics have over leaving a PC with Steam always on and then using Steam Remote Play?



See also JerryRigEverything's teardown/durability test video: https://youtube.com/watch?v=GN6ZlssqNAE


GPT-3 still costs nearly twice as much as Groq's reported $0.79 per million tokens. Of course, Groq doesn't have self-serve paid plans yet, so there's still a reason that people find it easier to use OpenAI.


It's interesting to see that Google takes this so seriously they're backporting it to Android 6. I guess they probably have metrics on what Android versions are still in active use, but I'm a little surprised that Android 6.0 would still be used heavily enough to warrant the backport. Regardless, it's good to see this sort of industry cooperation from companies who would normally be at each other's throats.


Google made a deliberate decision long ago to detach library and feature support from the operating system due to manufacturer fragmentation. So most of their new stuff automatically works with old versions of Android.


Same for security updates. A lot of new features and security patches get delivered via Google Play Services to older versions of Android as well.


Google Play System updates provide very few security patches, and they only apply to devices running Android 10 and higher; most APEX modules are only updatable in later versions.

I track security patch counts of monthly Android Security Bulletin vs available APEX vs my aftermarket backports for A7 through A13 here: https://divestos.org/pages/patch_counts#aggregatePatchCounts


And the annoying thing is that on some phones (like Samsung phones recently, twice), the Play System updates get blocked for months on end. Worse, people who buy a new device are often stuck on ancient versions.

I recently tried an Android phone again for a few months, and the update/security situation is still a mess. E.g. Samsung does monthly updates on more premium phones. But a former flagship like the S22 would sometimes only get the update near the end of the month, even before the S24 is out. Having a phone with known CVEs for the better part of a month is… meh.

For some vendors, like Samsung, things are much better than a decade ago, but it’s still a far cry from Apple rolling out updates to all models simultaneously.


> Samsung does monthly updates on more premium phones. But a former flagship like the S22 would sometimes only get the update near the end of the month, even before the S24 is out.

This is complicated by their rolling updates per country: it can be a few weeks between the first CSC (the three-letter identifier for a country and carrier variant) receiving an update and it being rolled out to the final one.

I was towards the end of that update cycle, so the Android security patch level could become quite detached from the actual month.


>Having a phone with known CVEs for the better part of a month is… meh.

Out of curiosity, have you ever encountered any malware that exploits said CVEs? If a month's delay were so dangerous, Android users, even those with new devices, would be getting pwned left and right, let alone Android users of devices no longer getting patches.

Source: Android user of an old phone who hasn't been hacked yet, so I'm not sure where exactly the dangers are. The attack surface is mostly the web browser and the apps, both of which are scanned and covered by up-to-date patches from the Google Play Store/Services even on my ageing phone. So as long as you don't browse extremely dodgy websites and don't download shady apps, you should be good, since nothing else can get to the kernel CVEs on your unpatched phone.

Yeah, I'm sure some crafty malware dev could whip up a targeted virus that exploits the chain of open CVEs on my particular phone through an MMS message or something. But I'm not sure that targeting the 100 or so users still on this old OnePlus model, which is worth less than 20 euros used (pointing to users without much income), is a good use of their skills and time, when they could be frying much bigger fish with that know-how, like going after Microsoft's Azure or something.

Nor am I being targeted by state actors who have those means. And if you are being targeted by state actors, they have access to zero-days that even Apple or Google haven't patched yet, so you're not safe anyway no matter what phone you have.


> If a month's delay were so dangerous, Android users, even those with new devices, would be getting pwned left and right, let alone Android users of devices no longer getting patches...Android user of an old phone who hasn't been hacked yet, so I'm not sure where exactly the dangers are

What are the odds that you'd ever know if you were hacked? If you have root access on your device your odds of being able to see something amiss are probably somewhat better than if you don't, but even then I wouldn't count on it. How many people detected Predator/Pegasus? It isn't just state actors taking advantage of zero days. A zero day gets that malware on your system, but once it's infected how would you know? There have been reports that millions of android phones are infected at the factory. (https://www.theregister.com/2023/05/11/bh_asia_mobile_phones...)

We know mass infections happen (see https://www.bleepingcomputer.com/news/security/over-nine-mil... and https://www.wired.com/story/android-gooligan-ghost-push-hack...), and there have been some bold (if unverified) claims that most Android devices are/were infected with something (https://www.zdnet.com/article/bt-almost-every-android-device...).

I don't know how you could possibly be confident that your device isn't infected with something. The devices are designed to keep you from having the ability to poke around too much at their internals and the radios make it difficult to monitor exactly what's being sent/received to the device.


YSK: that ZDNet article was (thankfully) corrected in this subsequent article, in which BT refuses to release their data: https://www.zdnet.com/article/bt-backpedals-on-claims-almost...


>What are the odds that you'd ever know if you were hacked?

Would you know?

>I don't know how you could possibly be confident that your device isn't infected with something.

Easy, my bank account is still full.

How are you confident your phone isn't infected? Being up to date is no guarantee. Until you can poke around with root access to inspect everything, it's still Schrödinger's cat in a black box you trust not to be dead inside.

Because how would malware ever make it onto my phone? It doesn't just magic itself onto your device once it stops receiving updates. It needs an entry point off the attack surface. And what's my attack surface, since none of your examples apply to me?

I never download shady apps from the likes of Huawei AppGallery (lol) or even off the Play Store, and I don't use Android 5. The only apps I use are WhatsApp and Google Chrome, and I also don't browse shady websites on my phone.


> Would you know?

I very much doubt it.

> Easy, my bank account is still full.

That assumes the malware is intended to take your money instead of your data, or even just your internet connection. Malware can be used to attack/infect other devices or even just click ads. What kind of harm could someone who had full access to your device, including your internet activity, texts, location, camera, and microphone, do to you without telling you about it (blackmail)?

> Because how would malware ever make it into my phone?

Maybe it was installed at the factory. Maybe it came from literally any one of the many many vulnerabilities that made it possible to infect your device without any indication. Android phones have been compromised via text message, via Bluetooth, via QR code, and via apps.

It's great that you aren't doing anything obviously risky, but that isn't a requirement to get infected and the problem is you just can't know. You aren't the admin of the device. You don't have the authority to control what it does. You aren't allowed to see what it's doing. You can't see who it's communicating with or when.


I've also used phones which haven't received any updates for years without any obvious problems, just maintaining basic digital hygiene like you do. In theory, one could use a zero-day in a web browser (like the recent libwebp vulnerability), then exploit one of the numerous CVEs in one of the system libraries or the kernel, and own the phone that way without you doing anything worse than visiting a random website. For example, that's how one of the first methods of jailbreaking the PlayStation 4 operated.

Your average Joe six-pack like myself probably shouldn't really worry about it though, it seems more likely to be used against really high value targets.

You might want to try out another web browser that has aggressive ad blocking (Firefox, Brave, or Vivaldi should do it) since ads are one of the major methods of spreading malware.


>You might want to try out another web browser that has aggressive ad blocking (Firefox, Brave, or Vivaldi should do it) since ads are one of the major methods of spreading malware.

Underrated advice. Too bad said Joe Six-Pack doesn't follow it because he thinks other browsers "have viruses".


> It needs an entry point off the attack surface.

Considering how many vulnerabilities have been in the media stack of Android, all that would take would be an image or an autoplay video on a website.


So why haven't I or anyone else here with Androids been hacked already, if it's so "simple"?


Are you suggesting we've got a statistically valid sample of Android users here to suggest that drive-by RCE exploits that have been published and patched on modern Android devices are essentially just a fantasy and aren't actually concerns at all for unpatched devices? And that the people in this sample would always clearly know their device was compromised?

FWIW, while I don't think I personally had an Android device hacked, I do think I've had family members with devices potentially hacked. One family member running an older version of Android continued to have lots of accounts constantly getting stolen despite using a password manager and unique complicated passwords. Pretty much any time they'd be logged in to an app there would begin to be fraudulent orders or other mischief on that service. They never installed shady apps. Rotate the password on another device, no problems for a while. Log in on the phone again and within a day or two have the mischief start again. All that stopped after replacing the device.

Another family member started getting popover ads on their device despite not having any odd apps installed that would be the cause. Even after a "factory reset" the popover ads continued to plague the device, as if it was embedded in the ROM.


>Easy, my bank account is still full.

That one is probably not valid.

My bank app is shit: the 2FA is EMBEDDED in the app (in the beginning it was a separate app), and it also needs Google Play Services. And I live in a third-world country with little accountability. On top of all that, most people use very cheap Chinese phones which never get any updates.

Still, bank accounts here have never been depleted through the hacking of phones.


>Easy, my bank account is still full.

To be fair, it's non-trivial to convert hacked bank credentials to actual cash, due to anti-fraud measures, KYC rules, and reversibility built into the finance system. A better indicator would be $1000 worth of BTC on an unencrypted wallet not being hacked.


True. Given that the average Android phone probably needs hundreds of CVEs patched, it's kinda weird that there's no obvious epidemic of phones being hacked.


I think you are mistaken.

If something is actively exploited, they definitely don't wait until month's end. Also, a Google update isn't urgent if the vulnerability isn't really exploitable on Samsung devices (they have some additional security) or a fix has already been backported.


Does this mean system-wide libraries can be shipped via APKs?


Yes, that's what Google Play Services is.


And/or Google Play System updates, which are a separate thing and usually need to be triggered by hand to arrive in a timely manner.


Is this the reason that my old mid/low-end Android devices become unusable even though they no longer get updates?


That's probably more related to storage performance (I'd guess eMMC) decreasing over time.


>Is this the reason that my old mid/low-end Android devices become unusable even though they no longer get updates?

More recent versions of apps (like WhatsApp, which requires regular updates) are unnecessarily more demanding. Try disabling Play Services, all bloatware, and all quasi-bloatware (calendar, contacts), all the way down to the default keyboard (pm disable-user --user 0).

Install FOSS replacements with no internet connectivity, e.g. from F-Droid and such (not affiliated).

It is easier to buy a new device, but I get attached to my pal ;)
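For anyone wanting to script the debloating step above, here's a minimal sketch that only prints the adb commands rather than executing them. The package names are illustrative examples, not a vetted debloat list; check what each package does before disabling it.

```python
# Sketch: generate "pm disable-user" commands for a list of packages.
# Dry-run only: it prints the commands instead of running them over adb.

def disable_commands(packages, user=0):
    """Build the adb shell commands that would disable each package."""
    return [f"adb shell pm disable-user --user {user} {pkg}" for pkg in packages]

# Illustrative examples only, not a recommendation:
PACKAGES = [
    "com.google.android.gms",  # Play Services
    "com.android.calendar",    # stock calendar
]

for cmd in disable_commands(PACKAGES):
    print(cmd)
```

Note that disabling Play Services breaks anything that depends on it (push notifications, maps, etc.); a disabled package can be restored with pm enable.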


> all quasi-bloatware (calendar, contacts)

Has Android gotten that bad? I last used it in 2018, and haven't exactly missed it since, but even back then I didn't think of Calendar or Contacts as quasi-bloatware.


Most of the memory in a phone is flash. All flash memory has limited write cycles, and there's a speed/lifetime tradeoff as well. Flash devices normally have more memory than stated: writes go to empty pages that then replace the original, which is wiped. Every cycle the cell grows slightly weaker, until one day it's too weak.

In the best case, when a page write fails the memory controller discards the page from the pool; when the pool becomes exactly equal to the official size, the device goes into permanent read-only mode. (That would brick any device it's built into: in a PC you could pop in another drive; in a phone you can't.) Many devices have failure modes worse than this.

The better the quality of the memory, the more margin it has, and how much you write to a drive has little to do with how big it is. From a practical standpoint, life expectancy is roughly linear with size.
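The spare-pool mechanism described above is easy to picture with a toy model. This is a sketch with made-up numbers (uniform endurance, naive random wear leveling), not how a real controller works:

```python
import random

class ToyFlash:
    """Toy model of flash overprovisioning: the device ships with spare
    pages, worn-out pages are retired from the pool, and once the pool
    shrinks to the advertised size the device goes permanently read-only."""

    def __init__(self, advertised_pages, spare_pages, endurance=1000):
        self.advertised = advertised_pages
        # Real cells vary in endurance; uniform here for simplicity.
        self.pool = [endurance] * (advertised_pages + spare_pages)
        self.read_only = False

    def write_page(self):
        if self.read_only:
            return False  # writes are refused once the spares run out
        i = random.randrange(len(self.pool))  # naive wear leveling
        self.pool[i] -= 1
        if self.pool[i] == 0:
            self.pool.pop(i)  # controller retires the worn-out page
            if len(self.pool) <= self.advertised:
                self.read_only = True
        return True
```

Once `write_page` starts returning False, that's the "brick" scenario for a device with soldered, non-replaceable storage.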


>from a practical standpoint life expectancy is roughly linear with size.

For the flash itself (the giant mass of NAND or NOR cells), in terms of TBW it's straight-up linear with size. A 2 TB flash device has twice as many cells as a 1 TB one (of the same make) and can therefore eat twice as many write cycles (unless the manufacturer does something on the sly, like changing the overprovisioning ratio based on capacity or switching from TLC to QLC on higher-capacity units and "forgetting" to mention it, but those are factors beyond the basic cell arrays).

However, I don't think size affects MTBF, as that looks at factors unrelated to write cycles: things like catastrophic failure of a chip, or a short that catches fire and burns down the server farm.

Realistically speaking, MTBF has little bearing when considering flash storage lifetime; TBW is where it's at. If the specs only give MTBF, I tend to assume the TBW is bad enough to be worth hiding, and I'll avoid those. If not that, then it's either straight incompetence, a recycled-flash scam, or the manufacturer just doesn't give a shit (all of which are way worse than choosing to omit a low TBW rating).

It may seem like I'm nitpicking the word 'roughly', but I don't disagree with the sentiment. Depending on how you want to measure lifetime (jfc, don't use MTBF), it isn't exactly linear, but that's not the flash's fault.
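The "linear with size" point is simple arithmetic. A sketch with illustrative numbers (the P/E cycle count and write amplification are assumptions, not from any datasheet):

```python
def rated_tbw(capacity_tb, pe_cycles=600, write_amplification=2.0):
    """Rough TBW estimate: capacity times P/E cycles per cell, divided by
    write amplification (physical writes incurred per logical write)."""
    return capacity_tb * pe_cycles / write_amplification

# Same make, twice the cells, twice the rated write endurance:
print(rated_tbw(1.0))  # prints 300.0
print(rated_tbw(2.0))  # prints 600.0
```

MTBF, by contrast, models random component failure and says nothing about wear, which is why TBW is the number to look for on a spec sheet.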


Supporting Android 6 would be like supporting iOS 9.

Apple is only supporting the latest version of iOS (17.5).


> Supporting Android 6 would be like supporting iOS 9.

1. Your point still stands, but this is because this update is probably shipping as a Google Play Framework update, which works on >= 6.0. Google is not (to my knowledge) releasing a new firmware.

Apple would do well to decouple certain software components from iOS, IMHO.

2. In case others are curious about iOS version market share, statcounter's stats for April 2024:

17.4 (current until today): 51%

17.3: 22%

16.6: 4.3%

16.7: 3.18%

16.1: 2.8%

16.3: 2.12%

https://gs.statcounter.com/os-version-market-share/ios/mobil...


As a developer, that long tail of folks on the previous major version really sucks with fast-moving frameworks like SwiftUI. There's no way my company (a banking app) would drop 10% of customers, so we typically do N-1 for iOS support.

Our Android app shipped almost 3 years ago with minSdk=23 (Android 6.0) and we haven't had to update it.


The web browser is a big issue with this, too. A Safari release broke IndexedDB and they didn’t release a fix for over two months because browser updates are tied to the OS.


If there's a critical security update they can release an update within days. So it's got nothing to do with the complexity of releasing a new OS, it's just that they found IndexedDB not important enough to warrant an out of cycle update.


I write apps for a large regional grocery store chain, and we have to support N-2. Even then we get support emails every time we bump it up.


I work on Android apps for a food company turned media network.

We, to this day, with no talks of changing it any time soon, support all the way back to Android 5.0 (released in November 2014).


I mostly work on Windows apps. People still complain when software drops support for Windows 7, released in 2009, or Windows 8, released in 2012, despite neither being supported by Microsoft.


It'd be a fairer comparison to take into account Windows 7's EOL, which was just a year ago IIRC if you coughed up Extended Support money, its broad install base, and it being the last actually truly decent Windows that had a consistent UI across the board and no ads.


Windows 7 didn't have a consistent UI, though. In multiple places you could find Vista and XP-decorated panels. Your nostalgia is wrong.


That only continued to degrade with Windows 8 and 10 though.



> Apple would do well to decouple certain software components from iOS

I'm not really convinced of this. When you control the entire hardware and software stack, and already provide updates for 7+(?) years per model, would doing this really change much? Google needed to do that because some cheapo manufacturers weren't really providing any updates to speak of, and Google has limited ability to force manufacturers to provide updates on any particular schedule.

It's funny, because to me this is really just the Linux model vs. BSD model, in a way. Linux base systems are cobbled together from various bits of software maintained by different people and groups. Many of those components can be updated independently of one another, including the kernel. The BSD model is to ship the entire base system as a versioned unit, and only update the entire thing monolithically. Android does the Linux model, iOS does the BSD model. Both models work just fine, depending on what your goals are and your distribution model.

I think the one thing I'd argue Apple should decouple from the OS is Safari. A web browser (and the underlying OS web view) should be independently updateable, even after a device stops getting OS-level security updates.


Android 6: 0.79%


So this has the downside of not being supported on Android devices that don't use Google Play services, right? Like the Amazon tablets?

Still a big win overall I’m just curious what number of devices basically don’t get important upgrades due to this method. (Surely a tiny amount compared to what you’d have if you relied on device manufacturers to update the OS.)


Google and Apple have fairly different platform software update strategies, so I was curious how the support windows line up. Android 6 was the last supported release on devices like the Nexus 5 and Xperia Z3 from 2013 and 2014. iOS 9 was the last version supported for devices like the iPhone 4s and 3rd-gen iPad that shipped in 2011 or 2012. Going forward to 2013 iPhones, you have the 5s, which was last supported in iOS 12. It sounds like Google is able to ship this directly rather than as an OS update that would need to go through manufacturers, while Apple typically deploys these types of fixes by pushing an OS update. I'm curious how far back they'll go; they rarely ship security fixes more than two major versions behind the current release (so maybe down to iOS 15, supporting devices released in 2015).


> they rarely ship security fixes more than two major versions

Every flagship iPhone since 2011 has gotten at least five years of OS updates, with additional years of security updates after that.

For instance, the original version of the iPhone SE is currently in its eighth year of support, and just got another security update a couple of weeks ago.


One issue with iOS vs Android is that iOS has the browser tied to the OS à la Internet Explorer. Once iOS is no longer supported, you're now using an unsupported, insecure browser. And all other iOS browsers sit atop that unsupported, insecure browser engine in iOS.

On Android, the browser is separate and you can install alternative browsers. The current release of Chrome for Android works with Android 8, which was released on August 21, 2017 and dropped from support in January 2021. Android System WebView (the browser engine that other apps can use so they don't have to ship their own) works the same way and is independently updated from the OS.


Updating an app doesn't fix an issue that requires a security update, or a replacement driver.

> Virtually any Android, Linux, or Windows device that hasn't been recently patched and has Bluetooth turned on can be compromised by an attacking device within 32 feet. It doesn't require device users to click on any links, connect to a rogue Bluetooth device, or take any other action, short of leaving Bluetooth on. The exploit process is generally very fast, requiring no more than 10 seconds to complete

https://arstechnica.com/information-technology/2017/09/bluet...

A browser update doesn't fix that, and BlueBorne was disclosed during the era when an Android device only had a support window of two or three years.

iPhones from over a decade ago were getting security updates, including for the browser, for seven or eight years.


This was from 2017, and nothing came out of it.

Every month or so someone (usually a security company that wants to sell something) finds a doomsday exploit for Android that is unpatchable, and then it gets patched or turns out to be a non-issue.


> the browser tied to the OS ala Internet Explorer.

Internet Explorer was usually only loosely tied to the OS. Yes, many versions of Windows came with some version of IE, but you could usually install a newer version if you wanted. Every once in a while, a newer version of IE required a newer version of Windows, but that wasn't typical.

As I understand it, on Apple systems, Safari comes with the OS and is updated together --- Safari updates come in OS updates. Mobile IE was very similar though, at least on Windows Phone 7 and newer.


> Safari comes with the OS and is updated together --- Safari updates come in OS updates.

That’s true on iOS but not exactly true on macOS. Upgrading macOS also upgrades Safari, but you can also install newer versions of Safari on older versions of macOS. You just can’t have older versions of Safari on newer versions of macOS.


Two different things, Apple provides a pretty long window of OS updates for each device but doesn't ship updates for older OS versions. For example the 1st gen SE shipped with iOS 9 which has not been getting updates for a few years, but it can run iOS 15 which still seems to be getting security updates.


It probably seems odd when you're coming from a history of devices with a very short support window, but you don't need to worry about the years when you only get security updates until you've first run out of the years when you get the OS AND security updates.

For instance, that original iPhone SE came out in 2016, the same year as the original Pixel phone. The iPhone is still supported by Apple today, while the Pixel phone was dropped from support by Google five years ago.

Google had to come up with a way to backport features to older unsupported versions of the OS because their support window was so abysmally short.

Hopefully, this won't be an issue going forward, with Google promising a comparable support window in the future.


You’re right, iOS 15 got an update earlier this year.

https://en.m.wikipedia.org/wiki/IOS_15


iOS 17.5 supports devices back to the iPhone XS released in 2018.

iOS 17 runs on 80% of current iPhones in use

https://telemetrydeck.com/blog/ios-market-share-03-24/#:~:te....


Here in Asia, people are still on the older Android versions. So maybe try switching the filter on the website besides US.


Where in Asia? Does it vary much by country?


Philippines. I am not sure how accurate the link below is, but it is almost certainly the case that people are mostly on older devices.

https://deviceatlas.com/blog/mobile-os-versions-by-country#p...


This is a 2019 article with figures from that time. So it's wildly out of date.


FWIW, WhatsApp claims support back to Android 5.0, and if they haven't changed their support decisions since I left, that means there's a significant amount of users in the wild on Android 5.0. I'm not surprised Google only goes back to Android 6, they were always dropping versions from support before WA did; their threshold must be higher.


Yep, my company would lose a significant amount of our users if we updated from Android 5.0.

A lot of the early Fire TV devices are still out there running Android 5.0, and they are actively used.


WhatsApp's majority userbase sits outside of the US, and it's deeply embedded in many third-world countries. I'm not sure how their European userbase compares with the African/Asian userbase.

I reckon the WhatsApp userbase's OS distribution skews much more to older Android versions compared to an app that mostly enjoys a US/first-world userbase.


Certainly, the WhatsApp userbase is skewed. But most WhatsApp Android users are also Google Play Android users (certainly not all; WhatsApp publishes the apk directly), so if WhatsApp doesn't want to cut off X million users on Android 5, but Google is OK with not supporting it, Google almost certainly has a higher threshold value required to keep support. I'd guess the Android 5 users with Google Play and not WhatsApp outnumber the Android 5 users with WhatsApp and not Google Play, but none of those statistics are public.

With the number of Android devices out there, it really makes more sense to think about it in actual user/device count rather than percentages.


10 years support seems pretty reasonable to me for a major operating system version, regardless of usage numbers.


It would be nice, but I wouldn't call it reasonable. That's a LOOOOONG time.


Isn't 10 years pretty standard for windows OS versions?


And 20 years for Linux :P MacOS and iOS really are the major outliers that convince people otherwise here.


Unless I missed a memo, Linux LTS is 2 years of support these days. Many distributions offer 10 year LTS releases, but I’m not sure of any off the top of my head that currently offer more.


My point was more so that you can upgrade 20 year old devices to Linux 6.X! You're correct though.


I feel like a couple things were mixed that aren't the same. The commenter upthread said 10 years for Windows OS versions. I took that to mean e.g. "Windows 11 will be supported for 10 years from initial release".

Sure, you can often install a modern Linux distro on a 20-year-old device (though just as often, you cannot), but that's not the same thing as a single version of Windows being supported for 10 years. The analogous situation is indeed LTS versions of Linux distros, which certainly don't have 10-year support lifetimes, let alone 20.

But the support lifetimes make sense for the various vendors. Apple doesn't need to support a particular major version of macOS or iOS for all that long, because they make sure new versions of their OSes will run on fairly old devices (all of which are devices they've built, and have full control over), and they aggressively push people to upgrade to new major versions as they come out.

Microsoft has a lot of customers who value stability and consistency above all else, and on top of that, they have to support a wide variety of hardware that they don't and can't control. Supporting a major version of Windows for many years makes sense for them.

As for Linux, there's no one single source, so a rolling-release distro can decide to only support the bleeding edge, whereas a cloud provider might roll their own distro for server use and decide to support that for a decade, if they think that's what their customers want.


Phones have a far shorter shelf life.


I just recently pulled out an old phone that originally ran Android 6 for a project, hardware wise it still runs perfectly. The only thing wrong with it is that I can't upgrade it to a newer Android version.


This is powered by Google Play Services (aka: gmscore). https://support.google.com/googleplay/answer/9037938?hl=en


From what I understand, backporting won't make a difference, unless vendors integrate it into their custom OS installs, and, from what I hear, they aren't really giving legacy support much love.


They are taking it seriously because of the legal liability issues. Their lawyers are clearly worried about the legal implications of their devices being used to track people and things for illegal purposes, and want to make sure they have a level of protection against lawsuits stemming from such use. There are already cases of women being stalked using these devices.


> Their lawyers are clearly worried about the legal implications of their devices being used to track people and things for illegal purposes

Just that Android devices are not involved in the tracking of AirTags; as of today, only iOS devices actually share the location of AirTags back to Apple.

They may want to change that, but considering the huge volume disparity between AirTags and Google's tags, I assume Apple would have to pay Google for the service of extending their tracking network...


> Google takes this so seriously they're backporting it to Android 6

Or for contractual reasons, or for some technical reason it was easy enough to be "why not"


Or they are pursuing a gov related project, and this was mentioned as a showstopper...


Compression describes how it is physically attached. Similar to desktop CPUs, LPCAMM maintains contact with the motherboard via screws that apply consistent pressure to the module, ensuring a completely solid connection.


LPCAMM is the best thing to happen in the laptop space since Framework. Forget Apple's ridiculously expensive soldered storage and memory; LPCAMM is a step in the right direction.


That was true when they had Intel CPUs and soldered the RAM chips onto the board.

Now they have the memory chips "inside" the "CPU", aka a system in a package.

Big Tim is way ahead of the game when it comes to ruthless product targeting and segmentation.


Apple's ridiculously expensive fast & wide on-package memory is a big reason why the M series chips are so fast. Expect AMD & Intel to adopt a similar strategy in their high end chips soon.


Intel is already there at the ultra-high-end, the Xeon Max is a hybrid with 64GB of HBM2 on-package and an external DDR5 bus supporting up to 4TB of socketed RAM. The HBM2 can be configured to work as L4 cache for the DDR5, or it can boot without any DDR5 at all and run entirely from the HBM2.

https://www.servethehome.com/intel-xeon-max-9480-deep-dive-i...


I remember seeing this from STH as well, and my first thought was "I want one of these in my laptop!" It'd be perfect to then max out the RAM in a laptop with two SODIMM slots, or with one of these 64 GB LPCAMMs. Ideally with a thinner laptop in mind, but probably not with the heatsink for THAT massive Xeon.

I really hope the on-die HBM memory trickles down into prosumer processor chips eventually too.


MI300X also has on-package RAM, but that's even further from a laptop chip than Xeon Max. :)


Intel's Lakefield also comes to mind; pretty underrated chip, imo. The first x86 big/little heterogeneous chip, using Intel's Foveros to stack memory, CPU, and platform chip into one package. It made for a very simple design that just needed power. LPDDR4X-4266 doesn't sound crazy fast, but in 2020 it was pretty swift. https://www.anandtech.com/show/15877/intel-hybrid-cpu-lakefi...

Alas, the 1+4 Skylake+Atom cores were all pretty meh, and Intel hasn't tried again. Conceptually I really liked it, and it was doing great against the Qualcomm 8cx competitor for the tablets/2-in-1s it was in, but people wanted more.


> Apple's ridiculously expensive fast & wide on-package memory is a big reason why the M series chips are so fast

Increasing memory speed has had only a marginal impact on common workloads (not AI) in the PC world - this has been a well-discussed topic. Why would it be different on the Mac?


Memory speed usually refers to memory bandwidth, which has a limited impact on day-to-day apps, as they don't usually require a lot of bandwidth.

Memory latency, on the other hand, can have a huge impact on performance. That's the reason chips have had steadily increasing cache sizes over the decades, along with the introduction of additional caching layers.

What Apple's M series improves by putting the memory on chip is both bandwidth and latency. The latency, however, is what will impact app performance more than anything else.

Latency improvements are the reason memory controllers were moved from motherboard northbridges into the CPU IC. Shortening that distance goes a long way toward reducing latency.


Amusingly the top-line latency number is higher than for X86, but I agree with you because that number hides the better performance in other latency metrics that are probably more important.


probably something to do with rounded rectangles


It's not about memory bandwidth. [1] shows that the M1 is 66x faster at pointer chasing than an equivalent AMD chip. You can look at [2] and [3] to see that the TLB penalty is also lower at any given test depth, which is important because TLB misses aren't that infrequent, and they serialize your ability to access any memory that's not in your cache.

It's not an unreasonable hypothesis that this is because of on-package memory, which means you can run with tighter thresholds since your memory traces are shorter. The unified memory architecture also means the GPU is way more effective, since you never need to copy memory from the CPU to the GPU. That's probably not a huge deal for normal apps, but having screen rendering not compete with apps for memory resources probably isn't nothing.

Neither the CPU nor the GPU can fully utilize the available memory bandwidth [4]. The insane memory bandwidth is there to also support the other coprocessors, like the media engine, and to ensure that you can do a bunch of tasks in parallel without them contending with each other for resources. Those workloads do benefit greatly from memory bandwidth, and the "artist" community is hugely important to Apple.

The reason the M series is overall faster is more because of things like better ILP (e.g. fixed-width RISC encoding vs. compression-like CISC means it's easier to keep the pipeline fed). Apple software is for the most part better engineered for performance, and the M chip is optimized for that performance profile (e.g. C# and Java vs. Swift/ObjC: RC is pretty conclusively faster and requires less memory overall, and the M chip has specific optimizations to further reduce the traditional cost of RC, which Swift has to pay because it doesn't typically distinguish Rc from Arc as Rust does). The final truth is that Apple throws a lot of money at this problem as well, and buys up the latest process node to stay ahead of the competition.

[1] https://www.linkedin.com/pulse/apple-m-cpus-probably-much-fa...

[2] https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...

[3] https://www.anandtech.com/show/17047/the-intel-12th-gen-core...

[4] https://www.anandtech.com/show/17024/apple-m1-max-performanc...
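The pointer-chasing pattern being measured in [1] can be sketched as a tiny microbenchmark. This is a conceptual illustration only (Python, so it demonstrates the access pattern rather than raw hardware latency): following a randomly shuffled chain makes every load depend on the previous one, which is exactly the case where memory latency, not bandwidth, dominates.

```python
import random
import time

def make_chain(n, shuffled=True):
    """Build a 'next index' array whose links visit every slot exactly once."""
    order = list(range(n))
    if shuffled:
        random.shuffle(order)
    nxt = [0] * n
    for i in range(n - 1):
        nxt[order[i]] = order[i + 1]
    nxt[order[-1]] = order[0]  # close the cycle
    return nxt, order[0]

def chase(nxt, start):
    """Each load depends on the previous one, so nothing can be prefetched."""
    i = start
    for _ in range(len(nxt)):
        i = nxt[i]
    return i

n = 1 << 16
nxt, start = make_chain(n)
t0 = time.perf_counter()
end = chase(nxt, start)
elapsed = time.perf_counter() - t0
assert end == start  # a full trip around the cycle returns to the start
print(f"{n} dependent loads in {elapsed * 1e3:.1f} ms")
```

On real hardware, the same loop written in C over a working set larger than the caches is dominated almost entirely by DRAM round-trip latency, which is what the 66x figure is probing.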


No. The M1/2/3 has a data memory-dependent prefetcher[0] which can speculatively chase pointers in contiguous memory ranges, reducing indirection cost provided the data there is homogeneous, looks like pointers, and prefetch attempts succeed per cache line. It also has an extra-good indirect load predictor.

These have little to do with memory packaging and a lot to do with cache and prefetching architecture.

It does have one of the lowest latencies for atomics[1][2], which is something Swift cares about, but the overall core design just as much benefits other, less indirection- and synchronization-heavy languages. (Side note: Swift and Java are closer to each other in defaulting to virtual dispatch, whereas C# sits in the middle of the road with static dispatch by default but some code heavily using interfaces and virtual methods, which do make such calls virtual to an extent; the JIT has DPGO to devirtualize, and AOT can unconditionally devirtualize in certain scenarios too, similar to what Swift's WMO does.)

[0] https://gofetch.fail/files/gofetch.pdf

[1] https://dougallj.github.io/applecpu/measurements/firestorm/D...

[2] https://dougallj.github.io/applecpu/measurements/firestorm/C...
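The Rc/Arc distinction mentioned above is easy to illustrate: Swift-style reference counting bumps a counter on every retain/release, and doing that atomically (what Rust calls Arc) costs more than a plain increment (Rc). A toy sketch of the two disciplines, purely illustrative (a lock stands in for the hardware atomic instruction, and CPython of course has its own refcounting underneath):

```python
import threading

class Rc:
    """Non-atomic refcount: cheap, but only safe on a single thread."""
    def __init__(self):
        self.count = 1
    def retain(self):
        self.count += 1
    def release(self):
        self.count -= 1
        return self.count == 0  # True when the object should be freed

class Arc:
    """Atomic refcount: safe to share across threads, at extra cost per op."""
    def __init__(self):
        self.count = 1
        self._lock = threading.Lock()
    def retain(self):
        with self._lock:
            self.count += 1
    def release(self):
        with self._lock:
            self.count -= 1
            return self.count == 0

rc = Rc()
rc.retain()
assert rc.release() is False
assert rc.release() is True  # count hit zero: time to deallocate
```

Because Swift doesn't statically separate the two cases the way Rust does, every retain/release pays the atomic price, which is why a core with unusually cheap uncontended atomics helps Swift code disproportionately.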


Yeah. But even as a NUMA area, upgradability would win a lot of brownie points here (of course modern Apple does not give a flapping bird about it)

Not sure how it works on the iMac/MacPro


Would love to see a Mac Pro with 8 of these…


Soldered? It's not soldered. It's part of the actual SoC. Could we please update our research to more recent than 20 years ago?


It's soldered to the package. The actual RAM components are the very same discrete BGA ICs that would normally be soldered to a motherboard, but they've soldered them right next to the SoC die instead for tighter signaling.

https://i.imgur.com/Y3PLp33.jpeg

Anyway results are what matters, and the module in the OP has bandwidth figures half way between the M3 and M3 Pro while retaining upgradability.


The switch to ARM Macs only happened 3.5 years ago, plenty of Intel models still in service that would benefit from a RAM upgrade if it weren’t soldered


The ram is soldered into an Intel Mac for the exact same reason that it's soldered into an ARM Mac.


To make upgrades incredibly expensive, allowing them to have the same product generate revenue as both an expensive mid-range machine as well as a ridiculously expensive "professional" machine. Why make different products when you can sell the same product at different prices?

In the case of Intel Macs, the soldered RAM should have made the device and its RAM options even cheaper, as the BOM is simpler. The only real argument against soldered RAM was DDR5 SODIMMs being bad - which LPCAMM2 fixes.

If they didn't use it specifically to price gouge, the soldered RAM wouldn't have been a problem. 64 GB of RAM costs practically nothing at market prices, so they could honestly have had a single SKU with max RAM without a notable price change over the entry level - heck, maybe the BOM reduction would even pay for it.

... But then they'd lose their mechanism for draining companies with deep pockets via necessary upgrades. Being sensible and fair is not profitable.


[flagged]


What if you can't afford a top-specced model at the time of purchase? What if your memory chips fail after your warranty expired?

I pulled up Apple's pricing to check the exact numbers. On the 14" Macbook Pro, upgrading from 8 to 16 GB of RAM costs $200. I could get a 64 GB DDR5 SODIMM kit for only a little more from Newegg[0]. Again: upgrading the MBP from 512 GB of storage to 1 TB costs $200; meanwhile, even a high-end 2 TB NVMe SSD from Samsung costs less than $200 on Newegg[1].

There's no point in sacrificing repairability and upgradeability in that way, especially in cases where it doesn't make a huge difference in performance (from what I can find, Apple's storage speeds aren't faster than non-soldered storage, and LPCAMM's gimmick is that it brings soldered speeds in a removable package).

[0]: https://www.newegg.com/g-skill-64gb-262-pin-ddr5-so-dimm/p/N... [1]: https://www.newegg.com/samsung-2tb-980-pro/p/N82E16820147796


[flagged]


And there are people who'll buy a $10 watch for $10k and then come to forums to comment on how brilliant they are.


I'm sorry, but if you have to resort to labeling reasonably-priced non-Apple products as "what poor people buy", I can't accept your arguments. Not everybody can afford to spend $3k+ on a top-specced MacBook even if they wanted a MacBook. I would much rather take my personal route of buying a nice $1400 Thinkpad and then upgrading the storage to 2 TB later on for $100 over Black Friday. Sure, it's still pricey, but I get great specs for less than half the price of a MacBook.


I’m not letting a Lenovo product on my home network, even if I was given one for free.


The Superfish controversy was from years ago. Plus, that didn't even affect ThinkPads! This is just some tilting at windmills for the sake of it.


All the same, I’d rather not take that chance and prefer MacOS. No need to downvote me just because you don’t think it’s an issue to be concerned about (security).


Hey, you pays your money and you takes your choice. I certainly am not critical of those that prefer Apple products.

But wipe the SSD on the ThinkPad and install some Linux-based OS on it, and you're rocking and rolling. No weird phoning back to the mothership behaviors either. Again, I'm simply pointing out that you can be just as secure with a ThinkPad.


Apple charges you 3x or 4x what RAM is worth for upgrades.


Indeed, Apple currently charges $400 to add an extra 32GB of LPDDR5 to a machine.


Not only Apple. I bought an HP laptop 10 years ago. The 16 GB version cost at least 100 euros more than the 8 GB version plus a separate 16 GB RAM module from another vendor (I can't remember which one, but it was one of the usual companies).


Yeah, but with other vendors you still have plenty of options that you can upgrade yourself. That's what the OP is about, this new standard allows for fast, upgradable memory in very thin form factors that previously might have needed to compromise on upgradability.


Which is exactly how much they should charge if you want to prove that you are successful and don't care about money.

Apple is a status symbol, being priced at cost would destroy their brand.


What kind of first world insane fanboyism is this?

Cellphones have got us deep into the 'use once and throw away' world of electronics, but the idea of just throwing away things like laptops once the software world has bloated itself yet again is insane to me. I've rescued tons of computers by tossing more memory or a new SSD into them, keeping them out of the growing pile of e-waste and letting someone who otherwise couldn't afford them put them to use.


https://mrmr.io/apple seems to be relevant here.

