I stopped buying new laptops (2020) (lowtechmagazine.com)
374 points by danradunchev on July 8, 2023 | hide | past | favorite | 530 comments



> Most of the 160-200 million laptops sold each year are replacement purchases. The average laptop is replaced every 3 years (in business) to five years (elsewhere). 3 My 5.7 years per laptop experience is not exceptional.

This is one of the many aspects I don't like about how laptops are designed today. Even an OEM desktop like my Dell Precision T1700 is still serviceable and fixable today without a whole lot of effort. For both my Dell and Supermicro workstations, they basically last 10+ years, and I've got a lot of options to keep them going; I wish I could say the same about laptops. I really wish I could get a new 2k screen for the old Dell Latitude E6530 I have -- it's built a lot more robustly than my newer latitude. I really would like to see our government mandate user-serviceable parts (kind of like how the EU apparently mandated replaceable batteries, which is a huge win), as there are so many older designs that would continue to chug along with a few fixes, like IBM-era ThinkPads or some of the more robustly designed Latitudes. Lenovo has been going the wrong direction lately... I have sworn them off after having two P1s and a P53, which were gently used and taken care of, die in under 2 years.


This is what Framework set out to do (and is pulling it off so far...I'm typing this out on my 12th gen framework). They have an entire marketplace of replacement parts you can buy, including screens.

They've even got battery upgrades (increased capacity), better speakers, new screen type (matte) that you can retrofit into your existing Framework.

They support upgrading from their first gen laptop (admittedly they are a very young company, but their promise has held up thus far).

Hopefully more folks that are interested in upgradability and repairability are purchasing frameworks to show other companies that it's worth it to do (I'm unaffiliated with Framework, just a customer).


I just upgraded the RAM (4 -> 12 GB) and swapped the HDD for an SSD in an 8-year-old laptop, and it became a lot more responsive. Laptops nowadays are not upgradable and companies claim they are doing so to build lighter laptops, but Framework is proving all of them wrong.


> and companies claim they are doing so to build lighter laptops

A Framework 13" is 15.9mm thick and weighs 2.9 lbs. A MacBook Air M1 is 16.1mm thick and weighs 2.8 lbs. So it's pretty darn close.

I'm sure that thinner and lighter laptops exist, but yeah, the framework is pretty thin. And at a certain point, it stops mattering. Being thinner wouldn't really provide much utility at this point.

The fact is that repairability is just not profitable for companies like Apple or Dell. They would rather you replace your entire computer when your screen cracks. But I very much hope that enough people do care about repairability to keep Framework afloat (and profitable). As an owner, I was astonished how easy it is to take this thing apart and replace things; I would guess that most of my non-technical friends can probably replace a monitor or a touchpad on one of these laptops with very little difficulty.


I’ve got a framework 13, and am in the “I wish I just bought the MacBook Air” camp.

The battery life is terrible, the speakers are terrible, the thermals are terrible. It may be thin and light, but the different levels of R&D really show.


FWIW, I'm a recent owner of a Framework Laptop 13, 13th-gen 1370P, and I am extremely happy with it. With regard to battery life, I'm drawing ~4.5 watts while idling/text-editing, screen on, wifi on. That is more efficient than what I got out of the 5850U I had before this. I get about 9ish hours of battery life. You do need to run things like thermald and tlp, and set the lower TDP setting in the BIOS (the max-battery setting lowers the TDP, afaik). The fan barely turns on. I could probably take it further with ectool tweaks. Edit: to add, powertop is your friend! Measure. Anything above 6 watts usually means something is doing something (this ballpark 6 watts comes from experience with my last 3ish devices -- anything lower than that whilst doing things == great). I draw about 8ish watts when using VA-API to play video, for example, which I find very reasonable and similar to my previous 5850U.
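For context on how far those watt figures stretch: runtime is just battery capacity divided by average draw. A quick sanity-check sketch (the ~55 Wh capacity is my assumption for the stock Framework 13 battery, not a figure from the comment):

```python
def battery_hours(capacity_wh: float, draw_w: float) -> float:
    """Rough runtime estimate: battery watt-hours divided by average draw in watts."""
    return capacity_wh / draw_w

# ~55 Wh pack at a measured ~4.5 W idle/text-editing draw:
print(round(battery_hours(55, 4.5), 1))  # -> 12.2 hours, best case
```

Real-world use mixes in heavier loads, which is how a ~12-hour ceiling turns into the ~9 hours reported above.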


We’re talking about 9 hours with a lot of faffing here. The M2 MBA hits 15-18 hours real world. The FW feels like a laptop from 7 years ago to me.


Is it running windows?


It is currently - tried a couple of distros (arch, Ubuntu) with different combinations of the recommended (and other) tweaks. Windows seems to sip battery compared to both.


It’s scale, too, isn’t it? At Apple scale you get better materials, better engineering at every level, and better testing processes—for the same or less money.

I admire Framework for achieving what they’ve done, modest though the success is in absolute terms. That said, they’re also probably still burning venture funding. Hardware disruption is hard.


The tradeoff is that the Framework's battery life is garbage. I had laptops a decade ago with 2x the battery life. I'm lucky to get 3 hours of battery doing web browsing, and standby/hibernate on the Intel processor in it is so bad that it'll fully discharge in less than 12 hours doing literally nothing.

I really want to like it, but my Framework is literally a worse user experience than my old 2016 Macbook Air and it's probably 3x the price at this point. Playing even a lightweight game makes the fans ramp to a frustratingly loud level and I essentially have to keep it plugged in at all times.

It's also hard to express how oddly problematic having a 4x3 display in 2023 is. Almost literally nothing expects that.


Yeah, I'm with you on the battery life. Though I don't understand where you get the 12-hour figure for a full discharge while in suspend. In that time period I lose about 20%. (If you're running Linux, try changing from 's2idle' to 'deep' sleep; not sure if it's possible to change this on Windows.)
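For anyone trying that s2idle/deep switch: the kernel lists the supported suspend modes in /sys/power/mem_sleep and brackets the active one. A small sketch of checking and switching it (the sysfs path is standard on Linux, but writing to it needs root and not all firmware exposes 'deep'):

```python
def active_sleep_mode(mem_sleep: str) -> str:
    """Parse /sys/power/mem_sleep contents, e.g. 's2idle [deep]' -> 'deep'."""
    for token in mem_sleep.split():
        if token.startswith("[") and token.endswith("]"):
            return token.strip("[]")
    return ""

# Reading the real file:  mode = active_sleep_mode(open("/sys/power/mem_sleep").read())
# Switching (as root):    open("/sys/power/mem_sleep", "w").write("deep")
print(active_sleep_mode("s2idle [deep]"))  # -> deep
```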

The thermals are indeed terrible. My 12th-gen has this issue where it will sometimes overheat, and the firmware will lock all CPU cores at 400MHz (yes, MHz) for anywhere between 1 and 20 minutes, even though temperatures drop down to reasonable levels within seconds. Support has been completely unhelpful, and Framework doesn't even seem to be acknowledging the problem (which has been reported by quite a few people on their community forum).

The display is actually 3:2, which is just bonkers. At least 4:3 would be somewhat reasonable (if very mid-00s), but who has ever had a 3:2 display for anything? I kinda get why they decided against a widescreen display: the mainboard needs to be 'taller' to accommodate the RAM and storage slots and the spacing for the expansion ports, and a 16:9 or 16:10 screen would mean a 'shorter' chassis, which would lead us to a much smaller battery. But man, the screen is just weird.


3:2 is used by Microsoft on the Surface.

16:9 is plain disgusting for a computer.



Are you experiencing degraded performance when running on battery or is it happening at all times?

You might want to look into BDProcHot throttling.

https://pastebin.com/JB3R2CdM

I returned a HP laptop because of this crap.
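For reference, BD PROCHOT is bit 0 of MSR 0x1FC (MSR_POWER_CTL) on Intel parts, per Intel's SDM; reading it needs root and the msr kernel module. A hedged sketch of the bit test:

```python
def bd_prochot_enabled(power_ctl: int) -> bool:
    """Bit 0 of MSR 0x1FC (MSR_POWER_CTL) is the bi-directional PROCHOT enable."""
    return bool(power_ctl & 1)

# Reading the raw register on Linux (root, after `modprobe msr`):
#   with open("/dev/cpu/0/msr", "rb") as f:
#       f.seek(0x1FC)
#       power_ctl = int.from_bytes(f.read(8), "little")
print(bd_prochot_enabled(0x4005D))  # -> True; with this set, a stuck external PROCHOT signal can throttle
```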


The Framework 13's screen is not 4:3, it's 3:2 (2256x1504), just like a Microsoft Surface Pro 9 (and others). 3:2 is widescreen-ier than 4:3. I had a Surface from work and found no issues with using a 3:2 aspect ratio, so I'd be baffled if the user experience with the Framework differs. Most (all? I kinda stopped caring) iPads, incidentally, are 4:3.


A cursory search shows that various testing websites have measured battery life to be 7–8h on the framework which is decent I’d think?

I have no firsthand experience but 3h in comparison seems low.

Also, web browsing can be intensive if you use JS-heavy sites like LinkedIn or even the GitHub homepage (when not logged in).


We’re seeing other thin&lights get double that nowadays.


> They would rather you replace your entire computer when your screen cracks.

The fact that a company can dictate things to the customer is proof in itself that the market is not competitive.


I wish something like Framework could be more popular so it could be easily available in other countries. As is, the impact of something like it is small. Hopefully this changes at some point.


It's still very new and working out the kinks in the concept. I wouldn't expect it to take off in the current state. The videos they've done on Linus Tech Tips (Linus is an investor) keep me optimistic about their prospects.


What I really appreciate with Framework is that they have little kits that let you repurpose the old insides of your laptop when you upgrade the processor or SSD.

That's a nice touch that means you're not just throwing that stuff away.


Yes, Frameworks are amazing on paper, and yet I still can't purchase one in New Zealand but I can easily purchase Apple hardware.


It’s straightforward to use a shipping forwarder, just adds a bit to the cost.


NZ Post’s forwarder - YouShop - has worked well for me in the past.

https://www.nzpost.co.nz/tools/youshop


It adds a lot to the cost, and I am not protected by the New Zealand consumer laws in case something isn't right with the product. Way too risky for my liking.


I'm a Kiwi (now in the UK) and you're absolutely right. I always avoided forwarders because: no company warranty, no consumer protection, etc.

Forwarders are just grey market. Good for cheap items, but not so much for devices.

I miss the Kiwi attitude & the paradise on Earth environment, but I certainly don't miss being left out of the rest of the world, no easy access to products & services and long shipping times besides.

Have an L&P for me bra.


How does it compare cost-wise to just buying normal used or even new laptops?


Some of the laptop issues are a chain of dependencies that end up with mechanical and power-envelope constraints. That doesn't mean it's impossible, but say you want to go from 720p to 2k, that also requires more power, a bit for the panel and a bit for the backlight. The mainboard might not have a TCON and eDP or LVDS transceiver to drive such a panel, and the GPU might not be able to handle it either. So now that new panel needs three more things:

  - Power supply
  - Display transmission
  - GPU
And those things in turn need more (or well, 'different') space, which in turn might not fit in the existing chassis. And if it does, you might still have the thermal envelope issue where the chassis can't support the right airflow in the right places, or the cooling solution might not.

None of those are impossible to solve, but they also aren't as simple as "just give me a different panel", because it's not really just about the single component, it almost never is.

We have had a lot of luck in the past with a single 'family' of computers (usually largely designed by Compal, Foxconn, etc.) having interchangeable parts, and the base models essentially being downgraded versions of the top-tier model, which means that as long as you stay within that family, you get to upgrade parts along the way.

Framework does an excellent job of keeping everything going that still fits within the same envelope. But if at any point the current eDP interface needs to be replaced to support higher bandwidth, the possible combinations of panels, cables and mainboards become a whole lot more complicated. Some panels might only work with some mainboards, which might mean you have to replace both at the same time, greatly reducing the usefulness of what remains. Currently we don't have a huge gap between previous generations and current generations, but I'm very curious what's going to happen when we do make another significant jump. Only time will tell.


I imagine by that time you've used their marketplace often enough. Their used-parts shop might be more awesome than the laptops. I can only imagine what it will look like in a few decades. Should be awesome.

In contrast, I'm the 4th owner of a "free" Asus laptop that broke shortly after it was purchased (by the first owner). They never bothered to return it. I have a pretty good idea what is wrong with it, but there is about 4 cm of space when I open it. I need some long, strong pliers to pull some distant plugs. In a repair video from someone who really knows what they are doing, he reaches deep inside, pulls the plug with great force, then smashes the pliers against the PCB, making the laptop jump on his desk. "This is normal," he said, and proceeded to pull the rest the same way. I've been gathering the courage to do it for about a year now.


> And those things in turn need more (or well, 'different') space

But this is the source of the problem. The lack of modularity and standardization.

The first generation mobile Core i7 from 2008 had a TDP up to 55W. The base TDP of the current generation is the same, Ryzen is even less. It shouldn't need a different cooling solution to dissipate the same amount of power. The same is true of mobile GPUs.

Don't lay out the new board in a different configuration. Don't change the chassis. Just put the newer chips in it, in the same places they were before. Which means the newer boards and chips will be compatible with the older chassis.

> say you want to go from 720p to 2k, that also requires more power

If the new screen uses 20W instead of 15W, that shouldn't really matter. Screens typically aren't actively cooled and are also hinged into their own section of the chassis, so if the newer screen needs different passive cooling, it can come with that and still slot into the same base chassis.

And if the new screen uses 20W instead of 15W, you should still have enough power unless the entire rest of the laptop is at max spec. If you have a battery and power supply that could support a 55W CPU and a 55W GPU, but you actually have a 25W CPU and a 30W GPU, the screen using an extra 5W should be irrelevant.
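The budget argument there is just subtraction; sketched with the comment's hypothetical numbers (not a real laptop's spec):

```python
def headroom_w(budget_w: int, loads_w: list) -> int:
    """Watts left over after subtracting each component's draw from the budget."""
    return budget_w - sum(loads_w)

# Power supply sized for a 55 W CPU + 55 W GPU, actually fitted with a
# 25 W CPU, a 30 W GPU, and a screen that went from 15 W to 20 W:
print(headroom_w(55 + 55, [25, 30, 20]))  # -> 35, so the screen's extra 5 W is irrelevant
```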

> The mainboard might not have a TCON and eDP or LVDS transceiver to drive such a panel

On the other hand, it might, because 2k displays have been around for more than a decade. And if it doesn't, you should be able to upgrade the GPU to one that does, and plug the display directly into the new GPU, which in turn plugs into a standard slot using PCIe that doesn't require anything else to be changed.


>For both my Dell and Supermicro workstations, they basically last 10+ years, and I've got a lot of options to keep them going; I wish I could say the same about laptops. I really wish I could get a new 2k screen for the old Dell Latitude E6530 I have -- it's built a lot more robustly than my newer latitude.

Agreed. When I was checking out new laptops to buy one, in shops, about a year ago, I was shocked to see the lower quality and sturdiness of the cases and keyboards. For no apparent gain except possible lightness.

I think laptop makers have started pulling an auto industry in terms of built-in or planned obsolescence.

Edit: just like mobile phone makers.


It's been like this for at least 2 decades now.


I swore off those Chinese manufacturers (Lenovo, Asus, Acer, etc) many years ago for similar reasons and haven't looked back. I just don't buy any products from them, laptops included. Surprised people are still having to learn the hard way that their products are crap.


Asus and Acer are Taiwanese, not Chinese, and none of them actually manufacture their own laptops in-house; that is outsourced to manufacturing partners.

Nearly all laptops on the market, regardless of brand, are manufactured by 6 Taiwanese companies: Quanta, Compal, Wistron, Inventec, Pegatron, and Foxconn.

And yes, that includes Framework, who use Compal as their manufacturing partner.


Is there any way to link up which designer works with which contract manufacturer (or at least the models)? And are there any exceptions? I seem to recall Alienware stuff was made in Florida at some point.


Search for schematics, or look for identifying clues on the motherboard. All the OEMs have more-or-less distinguishable styles and part numbering schemes. E.g. a part number beginning with "LA-" followed by a few digits, then possibly another character, indicates Compal (LA-8331P, LA-4921P, LA-A994P.) Inventec likes to use 6050Annnnnnn (6050A2266501, 6050A2493101.) Quanta uses a few characters/digits code (e.g. ZHY, LX89, ZE6.)
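Those part-numbering styles are regular enough to pattern-match. A toy classifier built only from the examples above (the patterns are heuristics inferred from those examples, not official formats, and certainly not exhaustive):

```python
import re

# Heuristic patterns inferred from known board part numbers.
OEM_PATTERNS = [
    (re.compile(r"^LA-[A-Z0-9]{4}[A-Z]?$"), "Compal"),           # LA-8331P, LA-A994P
    (re.compile(r"^6050A\d{7}$"), "Inventec"),                   # 6050A2266501
    (re.compile(r"^[A-Z]{2}[A-Z0-9]{1,2}$"), "Quanta (maybe)"),  # ZHY, LX89, ZE6
]

def guess_oem(part_number: str) -> str:
    for pattern, oem in OEM_PATTERNS:
        if pattern.match(part_number):
            return oem
    return "unknown"

print(guess_oem("LA-8331P"))      # -> Compal
print(guess_oem("6050A2266501"))  # -> Inventec
```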


Alienware is just owned by Dell now and seems to put out overpriced poorly built chunky machines that target gamers with too much money. Razer and Falcon NW might be better from the chassis design side, or Sager and Clevo from the OEM value side.

The Alienware laptop I had for a week or two a few years ago was the worst laptop I've ever used. Noisy and hot and expensive and poor ergonomics and just all around unimpressive, poor hardware masked by a once respected brand name.


To be fair, Wistron actually used to be the manufacturing division of Acer, until it was spun out ~25 years ago.

My first job out of college was at a US-based networking hardware company, and I was really surprised to learn that, even though most of the manufacturing & assembly is done in China, most of the electronics OEMs & ODMs are actually Taiwanese.


TSMC gets a lot of attention being as prominent as they are, but most people don't realize that the entire computing supply chain is Taiwanese. It's not just the silicon, that's just the tip of the Taiwanese iceberg.

Combine that with China having some manufacturing capacity of its own, and China stands to lose far less than the West in a hypothetical war over Taiwan, as far as computers go.


Also Pegatron was spun out from Asus.


This response does not actually address the thrust of my comment. How is Apple able to put out very high quality hardware, even as it uses the same few factories that others use?

It is the customers of these factories that provide the designs and dictate the quality of manufacturing and tolerances that they want, no?


Isn't that because (modern Apple Silicon) Macs are vertically integrated? The Wintel stack is a Frankenstein and nobody has particularly good quality. Dell, HP, Alienware, Razer, Samsung, Sony etc. are generally all crap these days too. The Lenovo X series is quite a bit better than most American lines, but still not particularly impressive in terms of build. Microsoft's Surface Book line looks well designed but had a bunch of heating and power problems. Even my Intel Mac frequently has problems. My M1 one, on the other hand, is amazing, so far ahead of every other computer I've ever had.

On the Wintel side, I've had better luck with small biz machines (ThinkPad, not Lenovo, or Latitude) plus upgrades in home warranty, but honestly those break down too. They're just easier to fix.


Common American L.


My T420 is still running strong 12 years later. I don't use it for much since I'm not on Linux anymore, but I occasionally open it up and play with it. I'm using a Dell Inspiron now and I'm hoping it lasts a while.


Yup. I have two working T420s units (the T420s is thinner than the T420, and I think that puts it in the realm of "acceptable" by modern standards; the T420 I find is mostly a stationary laptop). They run upgraded SSDs and RAM.

I also recently stocked up on 3 used T480s, as it's the last generation with an easily removable battery as well as accessible drive and memory. Those should last me a while! My primary daily driver is the 6-year-old T25 - basically a T470 but with the good keyboard.

But I think... that's kinda the point. These laptops were upgradeable, but modern ones are not. The T490 and onward don't have an externally replaceable battery. A lot of modern ThinkPads have soldered RAM. And none of them have the elegance of the UltraBay of yore.


Most of the newer business class laptops I've seen just have a few screws to pull the back panel and replace the battery. It takes about 5-10 minutes.


The T480 and older have two batteries: one fixed internal and one removable external. It transparently switches between them, so you can hot-swap the external without any interruption. Fantastic feature!


Right, but that's a) compared to 5 seconds with an external battery, b) requires a certain comfort and skill level, and c) as you point out, only for specific laptops.


Regarding c), I personally never get anything but business laptops. The consumer ones are just too crappy, plastic everywhere, won’t last more than 3–4 years, underspecced hinges, and plastic clips.

The business ones are the same price or cheaper past the three years mark when IT departments all get rid of them at the same time and spare parts are plentiful for the next decade.


My Inspiron made it 7 years before a chip in the battery that identifies it as Dell-authentic died. The battery itself was still 95% good (I mostly used it plugged in, and set the BIOS accordingly).

Still, a laptop without a battery loses many conveniences, like moving to another room without shutting down.

Now I bought a M2 max and am mostly pleased. Still can’t say I’m a huge fan of macOS, but it’s serviceable.

I try to get 10 years out of my devices. Fingers crossed.


For a lot of what I do, the CPU on an older laptop is just fine. So, I too use really old laptops.

That said, I am upgrading laptops that sold with 2-4 GB of RAM to 16 GB, or at least 8 GB. Often I am replacing an old HDD (maybe as slow as 5400 RPM) with a faster SSD. That SSD also juices the battery life. So, what I am using is a lot nicer than what the original owner had to deal with.

Many laptops today will not age as well; in ten years, that soldered-on 8 GB is really going to be a bummer.

I use Linux a fair bit, which makes a difference. Linux is fantastic on older hardware, and the software itself is totally up to date. I use old Mac laptops a fair bit. The version of macOS they support would be unusable, as would the applications that would still run on it. The same is becoming true of Windows.


That soldered ram is indeed a killer, and the transition can be quite sudden. We have a 2014 macbook air with 4gb of soldered ram, and my kid was fine using it for office and browsing, until at one point he suddenly wasn’t. The ram footprint of that software together with macOS updates passed some threshold where it just became too slow, so now he’s using one of my other old laptops, a windows machine with an even slower cpu, but with 8 gb ram, and that one is still ok, for now.

Apple’s talk about sustainability is just talk as long as they bake in poison pills like soldered ram and soldered storage. Those machines are unupgradable and unrepairable.

The thin and light market is almost entirely soldered ram now, so now I just price in the 16 gb upgrade as the base config price when looking at those.


The real scandal, though, is that you need more than 4GB of RAM to do some office work and browsing. I mean, you update the software and it just becomes worse?


Windows needs 4 GB of RAM, so that's the operating system...


Honestly I feel bad for most people who ended up with a 4gb macbook, and Apple should feel some shame for continuing to sell them for so long, not terribly unlike the Vista debacle. 4gb was an awful choice already in 2014, and since 16gb was (I think) the max and still kinda mid, that was the obvious choice; 8GB would have still been fine for an average user. I have no idea if 64gb is the obvious choice now, but the upgrade is so obscenely expensive that I'll be holding onto my intel thing for a while longer. Not as long as I can, but until the cost to value ratio becomes more compelling, or something like a cellular modem comes around.


It's about price discrimination. If you offer several versions of a product to discover the maximum each consumer is willing to pay, then the cheapest versions need to be meaningfully worse to drive those who can pay to the more expensive versions.


True, but at a certain point you're just selling a shitty product with no longevity and justifying it with behavioral manipulation for profit. Same with the 16GB iPad or the meager iCloud storage tiers. They were fine, I guess, when they first came out, whatever, but the legitimacy of that tier shouldn't have lasted more than a small number of years.


They finally catered to devs in the last few years but it’s already too late for many of us because macOS is garbage now. The ARM transition makes it worse. I use Linux now.


> The ARM transition makes it worse.

No idea what you mean. The M1 MacBook Air is the best machine Apple has produced in years, and it finally convinced me to update my old personal laptop. The battery life is insane despite the performance being better than any of the high-end Windows laptops I have used for work during the past decade. Plus, it was correctly priced at launch, which is very rare for a company that usually overprices everything.


Ya, I'm not sure where people are coming up with this extreme opinion. If they go to Linux, maybe they're just trying to follow a path that's been restricted through layers of system protection or something. One of my only gripes with the new MacBook Pros is that the RAM feels insultingly expensive, and to go past 32GB you have to upgrade the CPU (all in, it's over $1000 to get there).


Yeah it’s top-tier hardware, but the software limits it. My main issue has been with Docker, which has been discussed to death here on HN in various threads.


Very much disagree, there's not a chance in hell I'd choose linux over macos and imo—with a few exceptions—it's better than ever, and gradually becoming a more approachable platform to build things for.

Although I'm fine with linux for server, when something breaks it can be a massive pain in the ass to track down why, until you've spent years day-in-day-out troubleshooting particular software to the point you know immediately what the issue is. This just doesn't happen at all for me personally on macOS in everyday use.


I was listening to a Python podcast a few months ago where they were hating on macOS/Macs and it really resonated with me. They were saying it was becoming a common sentiment to move away from Macs, and this is after their throttling issues seen in Intel macs.

Anyway they’re just opinions. I can list the reasons macOS makes my life harder, or doesn’t do things it used to do, for example display scaling on 1440p displays, but ultimately it doesn’t matter, because it works for you!


What do you mean by display scaling on 1440p displays? I'm using a 2560x1600 display atm but it's just running at native resolution.

I'm not here to vehemently defend macOS, and ultimately it is just my opinion and personal experience, but I see a lot of "macOS isn't what it used to be" or complaints about some specific gripe. They do happen rarely, but I've used other OSes and don't see how they'd be more compelling. What's on the list for you?


So you won’t be able to scale the display, meaning 2x all UI elements. You used to be able to, but Apple removed that feature in the last couple of years. There was a sort of hackey workaround you could do on x86 macs, but it doesn’t work on ARM macs. It’s just dumb man. All I can fathom is they want to sell more “5K” displays.

I find it harder than ever to just do stuff. SIP makes things harder, and I get that it improves security, but software engineers don’t care for stuff like that that just gets in our way. Gatekeeper used to be just a mild inconvenience.

If I want to downgrade a Mac that’s on a beta version of macOS, I have to have another Mac, connect that Mac to it, and run software to “restore” it. I can’t just plug in a flash drive with the macOS version I want.

Night Light doesn’t work on DisplayLink displays. No real reason for that.

It’s 2023 and I still can’t adjust the brightness or volume of external displays (sometimes it works with something like Lunar). This is shit that Windows and Linux have been doing for 15+ years.

Not a lot of people know this but DisplayPort Daisy Chaining doesn’t work on macOS. Not supported. Never will be.

Screen recording doesn’t include audio. I’m sure there’s a way around it but come on.


Why is macOS garbage now? I am considering buying a Mac, can I reliably use Linux on it without hitting hardware related issues all the time?


You should just buy the mac and find out, then return it if not. They have a good 14-day return window that I use occasionally for a similar purpose.

I think Asahi has come a pretty long way but idk quite how far.

Why do you feel macOS is garbage?


GP said macOS is garbage, not me. I can't believe it is garbage for the average consumer. It's Apple after all.


Yeah it seems decent enough for non-devs but iCloud has always been buggy for me and everyone I know that uses it. I don’t have a Mac anymore but I do have an iPhone.


Yeah I'm so glad I paid top dollar for the 8GB version of the OG X1 Carbon back in 2012 - and got the i7 with the biggest NVME, a 256gb.

I also have a newer Nano for work, but prefer the keyboard of the old. If I had picked the 4GB model then, it would have been pretty useless today.

Today I wouldn't buy a personal laptop with less than 32GB of memory. The X1 Nano was hard locked at 16GB, and I guess I didn't quite anticipate how much I'd be using Docker. :(


The nano looks cool but to me there’s no point if I can’t get 32GB of RAM.


Yep, 16GB today is truly the 4GB of 2012.

(It's super lightweight though, to the point where it's impossible to know if I've packed the laptop in my bag by weight alone. Hopefully it'll see an upgrade soon.)


Unfortunately soldered RAM consumes less energy and space.


And it sells more systems when the user needs to upgrade for more RAM.


And gives some users on HN something to complain about


And $$$ for hot air workstation operators.


And when you have great resale value, that device might get a second and third user as well.


And they operate trade-in programs specifically to destroy the second-hand market.


Waiting for the data to be swapped in consumes more energy.


Compute power has never been an issue for me, but I generally try to max out the RAM as much as my budget allows. This is why I still have a 2010 MacBook that is more than fine. It started with 4GB and was barely usable for heavier tasks even when it came out, but after dropping 16GB into it, it has been smooth sailing for over a decade now.

That said, my daily runner is now a T420 (2011-ish) in near-mint condition that came with 6GB of RAM and that I bought for $100. I swapped the HDD for an SSD and just threw Linux Mint on here - I disable the swap file so that it doesn't hammer the drive. So 6GB of RAM is all it can work with, but it has never been an issue.

But the original HDD was running Windows 7, and oh my gosh, the bloat they had installed! This poor thing was in pain with that junk on board! It is amazing how many times I have had an old laptop come my way, unusable because of how the OS treats the system, and the users just keep dumping more on it than is reasonable. But swap the disk drive and OS and you would never recognize some of these things.

I once came across some netbook, a Toshiba Presario (?) - a 1.4GHz Core Duo with 3GB of RAM running Windows Vista. The poor thing! But do the old one-two on it and it was a brilliant little machine for plucking away on; it could easily get 6-7 hours on battery as well.


> I swapped the HDD for an SSD and just threw Linux Mint on here - I disable the swap file so that it doesn't hammer the drive.

If that's for the longevity of the drive, I don't think that's really even necessary on a personal device, based on my limited experience. Unless you're aiming for a lifespan measured in decades or would expect to be swapping heavily if you had swap.

I have a basic consumer-grade Samsung 850 EVO SSD that I've been using daily in my laptop since ~2015 or 2016 or so. I've had zero problems with write performance so far, and the SSD reports a wear leveling count of ~140 which seems to indicate the (average?) number of writes per block so far. That value normalizes to a SMART value of 93 out of 100. While SMART may not be much of a reliable indicator of anything, if the number of blocks written is anywhere near realistic, that value seems to make approximate sense if TLC NAND lifetime expectancy is rated at ~1000 writes per block.

Samsung also seems to have a warranty of up to 5 years or 150 TB written for this SSD (depends on drive capacity). The drive reports a total of 23.7 TB written so far.
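A rough back-of-the-envelope check of those figures (all inputs are the numbers quoted above; the years-in-use value is my approximation of "since ~2015 or 2016"):

```python
# All inputs come from the comment above; nothing here is drive-specific
# beyond those reported figures.
tb_written = 23.7        # total TB written, as reported by SMART
tbw_warranty = 150       # Samsung's stated TBW limit for this capacity
years_in_use = 8         # roughly 2015/2016 to now

print(f"warranty budget used: {tb_written / tbw_warranty:.0%}")    # 16%
gb_per_day = tb_written * 1000 / (years_in_use * 365)
print(f"average write rate: ~{gb_per_day:.1f} GB/day")             # ~8.1 GB/day
years_to_tbw = tbw_warranty * 1000 / gb_per_day / 365
print(f"years to hit the TBW limit at that rate: ~{years_to_tbw:.0f}")  # ~51
```

In other words, at this usage pattern the warranty write budget would take about half a century to exhaust, which supports the point that swap-avoidance is unnecessary for longevity here.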

I haven't been doing a lot of data-heavy work on the device but I haven't really been particularly careful with the SSD either. I've got a swap that's seen some actual use (8 GB RAM), and I also hibernate semi-regularly. I've been trying to keep at least ~15 percent of the capacity free to help with wear leveling but that's about the only conserving I've been doing.

Of course it might be that my drive is just waiting to suddenly start failing writes but I'm not really expecting that.

And your disabling swap might be for some other reason, of course.

But if this is par for the course for consumer-grade SSDs in general, I wouldn't really be worried about the effects of swapping on life spans unless there's some particularly heavy hammering planned.


> Unless you're aiming for a lifespan measured in decades or would expect to be swapping heavily if you had swap.

This one. My VERTEX3 is running fine, though it's not TLC, of course.

And with auto-leveling, all you really need (if you care that much) is to... increase the swap size, so there would be fewer evictions from the swap => fewer writes.

But anyway, even modern TLC drives would be fine for a decade if this is just a machine used for a couple of hours every day. You would need a constant 3MB/s 24/7 to even come close to their rated TBW.
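Spelling out that arithmetic (the 3MB/s figure is the commenter's; rated TBW varies per drive, so this is only an order-of-magnitude comparison):

```python
mb_per_s = 3
gb_per_day = mb_per_s * 3600 * 24 / 1000        # sustained, around the clock
print(f"~{gb_per_day:.0f} GB written per day")  # ~259 GB/day
tb_3_years = gb_per_day * 365 * 3 / 1000
print(f"~{tb_3_years:.0f} TB over 3 years")     # ~284 TB, in the ballpark of typical TBW ratings
```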


> And with auto-leveling, all you really need (if you care that much) is to... increase the swap size, so there would be fewer evictions from the swap => fewer writes.

How does that work out? To where would data be evicted from swap?

Increased swap size might of course help with wear leveling if that means there's just more unallocated space on the device.

> But anyway, even modern TLC drives would be fine for a decade if this is just a machine used for a couple of hours every day. You would need a constant 3MB/s 24/7 to even come close to their rated TBW.

Mine hasn't even been just a couple of hours every day. Sure, during a regular work week it might not be used a whole lot, but I've used it for entire days e.g. when I was doing my master's full-time.

Again, not a whole lot of memory intensive work, but not intentionally conservative for a general-purpose laptop either. Running just about any kind of a game often means some of the memory of the umpteen browser processes get swapped out.


> To where would data be evicted from swap?

This is actually an interesting question, but it's quite obvious once you know how it works.

Consider this scenario:

Some app loads data (it goes into RAM); some time later this data is moved to the swap. Now the app tries to access it again, and there are two ways this can go:

- there is no free RAM (e.g. other apps have locked their physical memory), so the access goes through the swap

- there are free memory blocks, so the OS copies the accessed data back into RAM and the app works fast... but until the data is modified, there is no need to mark the data in the swap as stale, so the OS can wipe that RAM and redirect the app back to the swap at any time.

Now consider another app being pressured into the swap at the same time:

If there is not enough swap space, then the first app, which has [a valid, synced] copy of its data in both RAM and the swap, is just left with the copy in RAM, and the swap space is freed to accommodate the second app's data. If turntables (sic), the process repeats, just with the two apps trading places.

But if there is enough swap space available, the second app's data is just written to spare swap blocks, and both apps' data can be read from the swap at any time.

Quite simple.
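A tiny model of that mechanism (made-up names, not a real VMM) - the key point being that a page with a valid, synced copy in both RAM and swap can have its RAM copy dropped without any disk write:

```python
# Toy page model: dropping a RAM copy is free as long as the swap copy
# is still valid (i.e. the page hasn't been modified since swap-out).
class Page:
    def __init__(self):
        self.in_ram = False
        self.in_swap = False
        self.dirty_since_swap = False

swap_writes = 0

def swap_out(page):
    """Release a page's RAM copy, writing to swap only if needed."""
    global swap_writes
    if not page.in_swap or page.dirty_since_swap:
        swap_writes += 1            # actual disk write
        page.in_swap = True
        page.dirty_since_swap = False
    page.in_ram = False             # RAM can be reused either way

def swap_in(page):
    page.in_ram = True              # disk copy is kept, not invalidated

def modify(page):
    page.dirty_since_swap = True    # disk copy is now stale

p = Page()
swap_out(p)             # first eviction: 1 write
swap_in(p)              # read back, the swap copy stays valid
swap_out(p)             # unmodified: no new write
swap_in(p); modify(p)   # the app writes to the page
swap_out(p)             # dirty: must be written again
print(swap_writes)      # 2
```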

> Increased swap size might of course help with wear leveling if that means there's just more unallocated space on the device.

This is a thing too, ofc, but the main point is having fewer 'overwrites' per unit of usable space.

> Mine hasn't even been just a couple of hours every day.

It's even 'worse' than that. I've seen a server with 850/860 Evos and LUKS (so the worst case: cheap drives and all writes are new); even after 3 years they were still at 50% 'SSD health'.

TLC is the worst thing that happened to SSDs as a technology, but people forget that 3MB/s * 3600 * 24 is a whopping 259GB a day, and you would need to write ~259GB a day for 3 years to totally deplete the [official] write endurance (YssdMV).

Wanna hear about 5 WD10JUCT?


Right. So, when swapping pages back into RAM, a typical OS will not actually remove the pages from swap despite them now being in RAM as well? So if the memory needs to be freed again for another purpose before the pages in memory have been modified in the interim, the OS can just drop the pages from RAM without having to write them into swap again?

And thus thrashing causes more swap writes when there's limited swap space.

That's the only scenario I can think of. I didn't think about pages not being removed from swap when swapping in.

> I've seen a server with 850/860 Evos

Well, that's desktop-grade hardware in a server. Might work, but if it doesn't, you get what you asked for.

> and LUKS (so the worst case: cheap drives and all writes are new)

How does LUKS affect that?

> people forget that 3MB/s * 3600 * 24 is a whopping 259GB a day, and you would need to write ~259GB a day for 3 years to totally deplete the [official] write endurance (YssdMV).

Right, this was my point. That's not an impossible amount of writes but it is a lot more than almost anybody does on a laptop/desktop.


> OS will not actually remove the pages from swap despite them now being in RAM as well

Now the hard part: I don't have a definite answer, because I'm not that versed in OS VMM internals, but it does make sense even if it is 1993-era design. But these are the basics and I doubt it works some other way. I would be happy if someone more familiar would chime in (with the reason why or why not), but in my experience this is what is happening.

> Well, that's desktop-grade hardware in a server. Might work, but if it doesn't, you get what you asked for.

It happens extremely often, and it works not just 'quite well' but well enough. Similar servers (literally the same, just without LUKS) were fine, with > 90% 'SSD health'.

> How does LUKS affect that

Every write is different, because the bytes on the storage are already encrypted. I.e. you write the same bytes to the same block on the FS, but the underlying, encrypted bytes are not the same => a new write.

> Right, this was my point.

Yep.


If the OS didn't keep the pages around in swap after they were swapped back into RAM, I don't see how having more space in swap could reduce writes to it.

> Every write is different, because the bytes on the storage are already encrypted. I.e. you write the same bytes to the same block on the FS, but the underlying, encrypted bytes are not the same => a new write.

How are they not the same as the previous encrypted version of the same bytes if the encryption key stays the same?

Are you sure TRIM/discard just wasn't enabled on the LUKS?


> If the OS didn't keep the pages around in swap after they were swapped back into RAM

Well, why shouldn't it? Don't forget, while RAM can be prepared (zeroed) fast, the on-disk swap can't be prepared that fast (compared to RAM, ofc), and you are already under memory and disk constraints (that's how you got to be using the swap in the first place), so adding another workload for cleaning up the swap is... not a good thing.

If you only did 'zero on allocate' then sooner or later you would be in a position where you would need to stall the whole system until enough pages in the swap are available. And neither users nor programs like that.

> I don't see how having more space in swap could reduce writes to it.

Swap is just the 'slow memory' part of the overall virtual memory of the whole OS. If you have a small swap, you are pressured to evict data from it more often. If you have a big swap, there is less pressure => fewer evictions => less overwriting of the same LBAs assigned to the swap file/partition => fewer writes overall.

> How are they not the same as the previous encrypted version of the same bytes if the encryption key stays the same?

Even if you write 00000000 to a filesystem block (a typical situation would be deleting a file; on flash/SMR drives you would just call TRIM on those bytes^W LBAs), it's not 00000000 down there, it's some ciphertext.
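A toy illustration of that point, using a throwaway hashlib-based keystream (NOT dm-crypt's actual cipher; the key and sector number here are made up): an all-zero filesystem block never ends up as zeroes on the physical medium.

```python
import hashlib

def toy_encrypt(key: bytes, sector: int, data: bytes) -> bytes:
    """Toy per-sector XOR cipher, purely for illustration."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + sector.to_bytes(8, "big")
                                 + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

zeros = bytes(512)                      # an all-zero filesystem block
ct = toy_encrypt(b"disk key", 7, zeros)
print(ct == zeros)                      # False: the drive never sees the zeros
```

Note, though, that with a deterministic per-sector cipher and an unchanged key, rewriting the *same* plaintext to the *same* sector produces the same ciphertext again - which is exactly the objection raised further down the thread.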

> Are you sure TRIM/discard just wasn't enabled on the LUKS?

See above, no such thing as TRIM on an encrypted storage device.


> Well, why shouldn't it?

Yeah, maybe it should. I don't know how exactly that's typically implemented, hence the question mark.

> Don't forget, while RAM can be prepared (zeroed) fast, the on-disk swap can't be prepared that fast (compared to RAM, ofc), and you are already under memory and disk constraints (that's how you got to be using the swap in the first place), so adding another workload for cleaning up the swap is... not a good thing.

I doubt any of that would typically involve the OS specifically zeroing anything apart from just marking the corresponding parts of the swap space as free. And that's probably in the in-memory data structures keeping track of pages in swap. No need to zero any of the page contents, just mark those areas of swap as unallocated.

An OS might want to keep the pages around for the sake of having them still available, but unless otherwise shown, I kind of doubt there's a great cost to the deallocation itself.

> If you have a small swap, you are pressured to evict data from it more often.

I get that if indeed the OS keeps pages available (and in the books) in swap after swapping them back into RAM, having a larger swap can reduce writes in case those same pages end up getting swapped back out again before they've been modified. It may be that's how it works.

I'm not sure I follow the logic if that's not the case. I don't understand why the OS would "evict" pages from swap just because swap is getting full -- there's nowhere to evict them to except RAM.

If you've got thing A currently in swap and you're needing to "evict" it to make space in the swap for thing B, that means you're wanting to get rid of B in RAM. That means you're already needing more space in RAM for some third thing C, so why would you solve that by swapping pages of thing A from swap into RAM?

My understanding is that swapping pages back in would be initiated by needing those pages back in RAM, so that's not really a case of eviction.

Anyway, since it doesn't seem like either of us knows for an actual fact how it works in any particular OS, I doubt speculation will lead to anything better.

I believe what you've seen as a phenomenon, I'm just not sure the explanation of that phenomenon makes quite enough sense to me.

> See above, no such thing as TRIM on an encrypted storage device.

TRIM may have security implications for encrypted drives, e.g. in terms of plausible deniability for forensics, that's true, and I didn't think of that. Though I'm not sure if that's what you meant.


> Though I'm not sure if that's what you meant.

No TRIM - no info about which free blocks could be safely and fully reused - more overall wear for the SSD.

> I doubt any of that would typically involve the OS specifically zeroing anything apart from just marking the corresponding parts of the swap space as free

I'm not sure about the swap, but memory nowadays is definitely zeroed before being allocated[0]; not only is it safer security-wise, it's way more stable. Imagine a buffer overrun bug which wouldn't manifest itself if the PC was freshly booted (i.e. all memory is zeroes) but, after a couple of hours, writes 0x10 blocks, reads 0x100 and executes them -- with all the garbage that was left there by previous processes...
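That zero-on-allocate behaviour is easy to observe from userspace: a fresh anonymous mapping comes back already zeroed (this only shows the result, not when the kernel actually does the zeroing):

```python
import mmap

# Anonymous private mappings are demand-zero pages: the OS guarantees
# they read as zeroes, never as another process's leftover data.
m = mmap.mmap(-1, 4096)
print(m[:16] == bytes(16))   # True
m.close()
```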

Obviously[1], zeroing the swap is an expensive and questionable practice; if you are swapping out from RAM then you overwrite the corresponding parts anyway...

Guess I was over(under?)-thinking the process.

> Anyway, since it doesn't seem like either of us knows for an actual fact how it works in any particular OS, I doubt speculation will lead to anything better.

Of course, but I do remember how things worked back in the day (when memory was constantly constrained because 16MB isn't enough) and you definitely see the process with A and B swapping in and out, without any C, because you just used A and B, occasionally switching between them.

[0] https://stackoverflow.com/questions/18385556/does-windows-cl...

[1] well, now, after a fresh look and at the desktop


> No TRIM - no info about which free blocks could be safely and fully reused - more overall wear for the SSD.

It's definitely possible to enable TRIM on an encrypted device, at least on LUKS. It requires specific support from the LUKS layer, though, and that requires software versions from the last decade or so and still isn't enabled by default:

https://askubuntu.com/questions/115823/trim-on-an-encrypted-...

https://askubuntu.com/questions/59519/do-you-recommend-luks-...

https://wiki.archlinux.org/title/Dm-crypt/Specialties#Discar...

I originally hadn't remembered about the security implications or that it's not enabled by default, I just remembered it was possible, so "LUKS, therefore no TRIM" didn't seem to make sense. But yeah, it's not enabled by default.

> I'm not sure about the swap, but memory nowadays is definitely zeroed before being allocated

Yeah, good point about switching between processes. Not sure it applies to swap, though, and I doubt that's zeroed, exactly because of what you said.

> Of course, but I do remember how things worked back in the day (when memory was constantly constrained because 16MB isn't enough) and you definitely see the process with A and B swapping in and out, without any C, because you just used A and B, occasionally switching between them.

Yes, absolutely, that can happen. But if it's just between A and B without any third C, the entire need to move stuff between RAM and swap is initiated by the need to get A's pages into RAM in the first place (because A is trying to access them). Not by needing more space in swap for B's pages.

Having to write B's pages into swap is caused by memory pressure and the need to fit A's pages into constrained RAM rather than A's pages being "evicted" from swap being caused by a need to get B's pages into swap. You're already needing to get A out of the swap and into the RAM -- that's the initiating reason for the whole deal.

Once the OS switches between the processes again and some of B's pages need to be accessed in RAM, the OS may then need to write A's pages into swap again in order to make room for B's pages in RAM. But again it's due to memory pressure.

I don't see how swap size affects the amount of writes required in that scenario.

Unless, of course, the OS actually also keeps A's originally swapped copy of a page around in swap after reading it back into RAM. In that case, having a large enough swap to fully contain all the dirty pages of both A and B might reduce the amount of swap writes. If A, in the original scenario, only reads the accessed pages between the switches and doesn't modify them, and the OS has kept a copy of A's pages in swap even after swapping them back into RAM, then when B runs again and its pages need to be read from swap back into RAM, the OS might be able to skip physically writing A's unmodified pages into swap again because their copies are already/still there.

If, in that scenario, the swap is not large enough to contain both A's and B's dirty pages, the OS had to drop the copies of A's pages from the swap to make room for B's, thus causing the need to physically write A's pages into swap again when switching back.

That's exactly what I was speculating about. Maybe that's what you meant by swap evictions?

I don't know if that's how OSes do it, though. It's possible but it would require some additional tracking of pages and their statuses.

If the OS does do that, then the size of the swap may potentially affect the amount of writes required to swap over time. Although it's only beneficial for pages that aren't modified in RAM between the swaps in and out.

If there is no such mechanism of keeping copies of pages around in swap even after reading them back into RAM, then I don't see how the size of the swap affects the amount of reads and writes.
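One way to make that concrete: a toy write counter for the A/B thrashing above, under the assumption that the OS does keep a page's swap copy after swapping it in (Linux's "swap cache" behaves roughly like this, but everything below is a simplification with made-up names):

```python
# Count swap writes as two apps A and B alternate, with their pages only
# read (never modified) between switches, so swap copies stay valid.
def thrash(swap_slots, switches, pages_per_app=4):
    swap = {}          # page id -> valid on-disk copy exists
    writes = 0
    apps = {"A": [f"A{i}" for i in range(pages_per_app)],
            "B": [f"B{i}" for i in range(pages_per_app)]}
    for n in range(switches):
        incoming = apps["B" if n % 2 == 0 else "A"]
        outgoing = apps["A" if n % 2 == 0 else "B"]
        for page in outgoing:
            if page not in swap:                 # no valid disk copy: must write
                if len(swap) >= swap_slots:      # swap full: drop the other app's copy
                    swap.pop(next(p for p in list(swap) if p in incoming))
                swap[page] = True
                writes += 1
        # incoming pages are read back into RAM; their swap copies are kept
    return writes

print(thrash(swap_slots=8, switches=10))   # big swap: 8 writes total
print(thrash(swap_slots=4, switches=10))   # small swap: 40 writes
```

With a swap big enough for both apps' pages, each page is written once and every later switch is read-only; with a small swap, every switch has to rewrite the evicted copies.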


My main issue with Lenovos, especially from 2010-2015, is that the LCD panels are poor quality (dim, loads of backlight bleed, etc.); everything else is really good.


You could buy them with either crappy panels or good high-resolution panels. Obviously most companies opted for the cheaper, crappier version; that's why most second-hand ThinkPads have them.

But you can often buy a LCD upgrade kit for cheap on amazon/aliexpress


I have a T420s that really struggles with video, e.g. YouTube. Watching any H.265 via VLC or mplayer is impossible. Also the battery is in a bad state and I haven't been able to find a good replacement that isn't equally poor or worse right from the start. It actually has two batteries, one in what used to be the CD tray. Still, usage time is 3 hours tops.

It had been my daily driver for years, running Linux, but I had to replace it with something newer.

How is your T420 doing with all this? Did you update the CPU? Does it still have the stock battery?


It is still running a 2.4GHz 2nd gen i5. The battery is in decent condition, but because it was a second-hand machine, I am not sure what its life was like before it got to me.

I do find it odd that it struggles with video content; that doesn't seem like it should really push this thing to the limit even at 1080p, but it does drive the CPUs a fair bit. It is fun looking at the state of video decoding on Intel chips during that time period.

2nd gen is still fairly CPU heavy. 3rd gen, the load is cut in half. 4th gen, it almost looks like your system is idle running that stuff. I have a Dell Optiplex Mini that runs a 2GHz 4th gen i5 and you can barely tell the CPU is doing anything due to the media acceleration. Cool to see in action.


On AliExpress, there is a USB-C adapter for the T420 power cable that will allow it to take a charge from a 65W power bank (maybe even 20W). No need to search for battery replacements anymore, and you can also throw away the original power brick.


Yeah, I replaced the battery on my T540p but it still gets sub-1-hour life.

Replaced the hard drive and upgraded the RAM, so now it works pretty well as a desktop PC running Win 10.

It was running super slow for ages; that was fixed after I banished Google Drive backup. Maybe a compatibility issue with the newest Google software thrashing constantly on older OS / hardware.


I have a T420 in my drawer I've been considering doing this with. Years ago, before it was old, I used to dual boot it with Mint and the power management wasn't great. How's your experience been? Mind you, that was with the original spinning drive. What spec SSD did you install? Sorry for all the questions.


The only thing that has aged, other than the HDD (which can be replaced), is the video card, which you can't upgrade. 4K monitors have become common and you need something fairly recent to drive one. 4K videos too; my laptop struggles with certain 4K HEVC iPhone videos.


Actually my ZBook has an upgradable graphics card, but I can only upgrade it to a better model of the same age, so it's basically useless for compatibility with newer drivers. Nvidia and Nouveau are going to end support for cards from 2014 sooner or later.


>Linux is fantastic on older hardware

I've been having a hard time getting Linux to work on older hardware ever since the majority of distros dropped 32-bit builds.

Then again, "older" for me means stuff from the Pentium through the Pentium 4 era. Seeing as "older" today means Sandy Bridge and the like, the moral of this little tale is I am a fucking old man angry at the kids on my lawn.


The x86-64 architecture was announced in 1999, with the first processor introduced in 2003. Comparatively, the 8088, introduced in 1979, was barely older at the time than the 64-bit Opteron is now.

It might be time to let i386 go.


Not as long as i386 devices still exist that are superior in at least one way to any amd64 or arm alternative. And what "superior" means depends on the use case. Some Mega Ryzen Uber Speed CPU with a Googol FLOPS that can't connect to anything is worthless to me.


What’s an example of something you’d prefer a 32-bit CPU for over a 64-bit CPU?


The CPU? Nothing. The device isn't just the CPU.


That doesn’t answer my question


I think you have graduated to retro computing enthusiast.


No kidding. Not to mention that the Pentium 4 was just about the worst line of Intel CPUs, culminating in junk like the Pentium D 820 and its siblings.

People ought to be paid to use this crap


People were definitely paid to use this crap at that time


But there must be some distros with 32-bit support?


Debian still supports 32 bit for the latest version and there are several small distros that are explicitly aimed at 32 bit.


Seems like Debian is all one would ever need.


Slackware (http://www.slackware.com/) still provides a 32-bit version as well.


OpenSUSE still builds i686 ISOs. Of course if you want to go crazy there's always TinyCore.


I won't take any distro seriously that doesn't support at least i686, amd64, arm and arm64.

Compiling for four architectures can't be too much to ask. Shit software that can't be written portably enough to run on more than amd64 must be kicked off the repo. We need this pressure on developers if we're to maintain a modicum of code quality.


> Shit software that can't be written portably enough to run on more than amd64 must be kicked off the repo.

Convince me: why should I spend my time supporting i686? Just to accommodate a handful of people still running 32-bit hardware?

Calling other people's work "shit" just because they don't spend their free time supporting the wishes (not needs!) of 0.1% of their users is rude and extremely entitled.


FWIW, I agree with you. But I think there is an argument to be made: the OpenBSD team has indicated that cross-architecture ports help them find bugs that otherwise might not be noticed if they were just targeting the usual suspects.


Why should you spend your time writing it in the first place?


What a needlessly dismissive and frankly rude take.


I often buy a ~one generation old phone for this same reason. It's just so much better value to buy someone else's phone. The folks who keep buying the newest generation tech often seem to take really good care of their gear - perhaps because they plan to resell it later.

I haven't done this as much with laptops because I do tend to keep those for such a long time that I can reasonably amortize a bigger investment over ~5+ years.

Curated marketplaces like "swappa" are a good way to purchase used gear. But anywhere you've got an escrow/guarantee should suffice.


Came here to comment this. A mid-range or high-end product from a gen or two ago is often the same price as this year's budget option, and way, way better in every sense.


And in a favourable position on the bathtub curve of reliability.


Right, which is why replacing a laptop after 3 years isn't that bad, so long as the old one finds a home. Unfortunately it seems like Windows machines are much less resellable/reusable after they're no longer wanted.


I often replace phones because the battery starts holding less charge and I find it annoying to charge all the time.

I tried a battery replacement service once and didn't like the result on my sealed and hard to open by design phone.


Get the battery replaced at an official repair location rather than a middle-of-the-food-court business. Apple has a whole setup of gear that will get a perfect result pretty much every time. And they will just give you a new phone if they break it.


I replaced it myself, perfectly happy with results. Cost me $20.


I do wonder how, considering the iPhone battery on iFixit costs $64 AUD, which doesn't even include the adhesive to reapply the screen. If you got one off eBay or similar, it's not a new battery; it's one that's been salvaged from a stolen phone and is likely pre-worn.


Not an iPhone; on the LG G8s ThinQ it was surprisingly easy to remove the battery cover - you just need a blade and a hair dryer. The replacement battery is not original (I could not actually find the genuine part on sale), but it is a new battery, and they don't need to cost $50.

So it was £15 for the battery and £5 for adhesive.


But the biggest reason (for me) to replace a phone is to get security updates; buying used is counter-productive to that end.


I think that is more of an issue with the manufacturer.

Apple supports their phones for a very long time. They updated iOS 12 this year with security fixes, so that covers iPhone 6 and up.

https://www.macworld.com/article/675021/how-long-does-apple-...


Then buy Apple. The 6s is up to 8 years of updates.


Just replace it with another one generation old used phone when it stops getting updates. Still better value.


What are you talking about? I have an iPhone X that is several generations old at this point and it's still getting security updates, and will for a couple more years.


Yeah. And since Reddit decided to be mega bitches and forced me to quit from their shenanigans, my battery lasts forever too.

I should have quit when they instituted their very obvious alt-right agenda pushing blocking bullshit, but whatever. Better late than never, and forcing third parties out was also a good boycott reason.

I did not realize how much time I spent in Apollo. Damn man.


If you buy a two-year-old phone then there's plenty of support left for Apple and Samsung devices. At least 3-4 years for Apple, 2 for Samsung.


Because the time left to get updates is shorter on an older phone, or because a used phone is less secure since someone could have installed malware on it?


> Laptops don’t change

> New laptops may be more energy-efficient per computational power, but these gains are offset by more computational power

Guess this was written before the M1 :)

I also felt the Zen 2 Laptop chips marked a huge step forward on the Win/Linux side at that time: generationally more power efficient than the best laptop chips from Intel, same core/thread count as the desktop equivalents, and cheap enough to be in a lot of budget laptops.


Anecdotal, but my M1 MacBook Pro doesn’t have amazing battery life. The other day I went to a coffee shop with 60% charge and only made it about 2.5 hours doing iOS development.


My M1 Max [14 inch] was suffering from really bad battery life. It turned out to be a sync plugin for VSCode.


Could you expand on that? You discovered vs code takes a lot of power when settings sync is turned on or something?



Same here. My MacBook Air could easily go 6+ hours and it feels like the Pro goes 2-3. Same workload.


I have an M1 MacBook Air and use an app called CoconutBattery to monitor power usage.

What I’ve noticed is that while the laptop is usually super efficient (say, 2.5-4W while web browsing giving 8+ hours battery life), occasionally the power draw climbs up to 6, 7, 8W or more with no obvious explanation, draining the battery in 3 or 4 hours.

Activity Monitor will show no apps using unusual energy yet the power is going somewhere? I’m suspicious that there’s something with maybe the WiFi hardware or the SSD which sometimes causes elevated power use, unseen by Activity Monitor?

Even a reboot won’t solve the phantom power drain while it’s happening, but it does seem to go away on its own eventually…


Watch your backlight level, which auto adjusts to the room light level.

"When comparing battery life, remember that screen brightness makes a huge difference!

On an M1 MacBook Air sitting idle on the KDE desktop, you will get around 7 hours of runtime at 100% brightness, and 23 hours of runtime at the minimum (not zero) brightness.

Even keyboard brightness matters! Those numbers are with the keyboard backlight turned off. If you turn it on at max, even with the screen left at minimum, your runtime goes from 23 hours down to just 14 hours.

So please don't ask us what battery lifetime to expect while doing a certain task and expect a single answer!"

https://social.treehouse.systems/@AsahiLinux/110548137464042...
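For a rough sense of what those runtimes imply in watts, dividing the M1 MacBook Air's ~49.9 Wh battery (Apple's spec) by the quoted runtimes (the runtimes are from the Asahi post above; treating average draw as simply capacity divided by runtime is a simplification):

```python
battery_wh = 49.9    # M1 MacBook Air battery capacity
for label, hours in [("100% screen brightness", 7),
                     ("minimum brightness", 23),
                     ("minimum brightness + keyboard light at max", 14)]:
    print(f"{label}: ~{battery_wh / hours:.1f} W average draw")
```

That puts the backlight swing at roughly 5 W (7.1 W vs 2.2 W) and the keyboard backlight at well over 1 W, which is consistent with the 2.5-4 W browsing figures mentioned upthread.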


Yes, I should have mentioned I always have keyboard backlight turned off [1]

I’ve suspected something to do with screen brightness in the past also. System log was often full of messages from the brightness adjustment/monitoring daemon, which raised my suspicions, but I think it might just have been normal behaviour at the time.

However the mystery power draw seems to happen independent of screen backlight setting. ie: when the problem is happening, turning backlight right down will reduce power draw a bit, but it’ll still be high. On other occasions, even with the screen brightness at 100% I’ll still get “normal” power draw. I’ve experimented with turning off brightness auto-adjustment too, with no effect.

[1] Keyboard backlight is very annoying at night when trying to watch a movie or something, and Apple made it difficult to toggle on and off on recent Macs by removing the hot key for it. So I’ve learned to just live without it.


Keyboard brightness - you can put the control into control center and then it’s two clicks to get to a slider for it.

Not awesome, but for the relatively rare times I want to change it manually, it’s been acceptable.


Yeah I do have that, but I usually find it too slow/annoying to have to use the mouse to turn it on and off. It was so much quicker and easier when there was an F-key to instantly do it.

Maybe one day I’ll get around to making an app to map it to an F-key again, which has been on my todo list for a long time…


Have you got the automatic adjustment off? The night time thing hasn't bothered me.

https://support.apple.com/en-au/guide/mac-help/mchlp2265/mac


You are correct. Even plugging in my gaming mechanical keyboard makes a noticeable difference.


Metadata indexing for Spotlight?

I've had very stubborn runs of the md* indexer (not at the laptop, can't recall the name -- mdimport?) decide that yes, this coffee shop is a great place to crank it up.

A similar one is a large initial indexing by the Dropbox app. I've had my M1 Air on the desktop Thunderbolt dock for the past month so it's gotten way past that initialization stage.


Spotlight’s energy use shows up in Activity Monitor, though.

Whatever drains my battery is invisible to Activity Monitor (but visible to apps that monitor the power management hardware itself), so presumably it must be some hardware device?


I'm no expert but I definitely can't help but wonder if it's WiFi or Bluetooth. Maybe something like radio interference making it work 50x harder?

Around a decade ago I had a MacBook that was super sensitive to interference issues. Although I'm not sure if that results in a big power draw (the way it does with cell phones for example when you're constantly going in and out of service).


I can fairly confidently rule out Bluetooth, as turning it off in control Center doesn’t seem to affect the power drain. Wifi is certainly a possibility though.


On my (2020 intel) MBP, SSD IO is VERY power hungry. Usually the culprit is Firefox, but my company is on the MS office suite, so sometimes it’s Outlook or Teams. It’s frustrating.


Do you have a 13" MacBook Pro with M1 or a 14+" MacBook Pro with M1 Pro/Max? If the latter, it has decent battery life, but it's nowhere near as efficient as the regular M1 MacBook Pro. If it's the former, I guess I'm surprised, I own one (I'm writing this comment on one) and I've loved how the battery seems to just last forever on mine.


I have an M1 Max and while I admit that I find myself sometimes disappointed by the battery life, it’s still often getting more than three hours while I expend precisely zero effort to ensure I’m getting the most out of my limited battery life.

It’s a far cry from the 20+ hour runtimes shown during reviews of the new machines, but still good enough to cover about as long as I’d want to sit in one spot.


I’ve gone 4 work days without charging my M1 MBP


I've never gotten a full day


I've got the M1 2020 and find it almost lasts a full day of usage.


If you live on Jupiter, with its 10-hour days, then maybe.


Oh, this was meant for the 4-day guy. Not sure how I responded to GigaChad


> Guess this was written before the M1 :)

Depends on what you do, my girlfriend uses her M1 heavily for video calls and the battery lasts the same as any other average laptop.


OP's claim still holds, I guess. There are secondhand M1s on the market.


I'll also point out that the quoted argument doesn't even make any sense.

The vast majority of people are not going to run their computers at full tilt all day every day all year long. Most of the time is spent in idle cycles or very low usage, so efficiency gains matter a lot.

If you are one of the few people who run their computers full tilt, it still doesn't make sense, because you will finish your task sooner, meaning you will spend less energy and time overall completing the same task as before.


I can't say I stick to this 100%, but I actually like using underpowered laptops. Optimisation is an important step in the development process, but it's often rushed through even by those who consider it most important (myself included). However, if you NEED to optimise your code just to get the thing to work on your computer, a lot more attention is put into it. If I'm optimising for a lower-spec machine than my users are likely to have, that gives me plenty of leg room.

For the record, I'm mostly talking about personal projects, which are game dev, where performance is more of a factor than in something like a web app.


You can easily underpower your better laptop on demand for a specific task. And you can do it in more controlled ways: what happens if the process gets less memory, fewer cores, lower CPU speed, a worse network, etc.? Getting an underpowered laptop to do both development and testing is not the only, or even the best, way to do this.


You can, but you won't. Not regularly anyway; that's not the way you'll run your computer daily. Maybe for a while, but then you'll forget, or you'll want the extra power for something and get too used to it. Instead, you'll do it occasionally at specific times.

You've just recreated the problems I was talking about with an optimisation step but in a new environment.


You can be more specific. Don't lower the performance globally, just for the app you're developing. It's easy to automate in your ide configuration / run script / app itself when running in debug mode, etc. It's a one-off setup per project you're developing.
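As a sketch of that per-app approach (a Linux-only illustration; `./myapp` and the limit values are placeholder assumptions, not anything from the thread), a run script can constrain just the child process while leaving the rest of the machine alone:

```python
# Launch only the app under test with constrained resources (Linux-only sketch;
# "./myapp" and the limit values are placeholders). preexec_fn runs in the
# child after fork(), so the caps never apply to your IDE or shell.
import os
import resource
import subprocess

def run_constrained(cmd, mem_bytes=512 * 1024 * 1024, cores=(0,)):
    def apply_limits():
        # Cap the child's address space to simulate a low-memory machine.
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
        # Pin the child to a single core to simulate fewer cores.
        os.sched_setaffinity(0, cores)
        # Lowest scheduling priority, so it yields to everything else.
        os.nice(19)
    return subprocess.run(cmd, preexec_fn=apply_limits)

# Example: run_constrained(["./myapp", "--debug"])
```

Wiring this into a debug-mode run configuration makes the "underpowered" environment automatic for that one project without slowing down anything else you do.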


To expand on that, one of the main reasons I upgrade my laptop is to make development tasks better: faster compilation times, and less sluggishness when I need to work in bloated IDEs like IntelliJ.

Sure, that may put me a bit out of touch computing-power-wise from many of my users, but, as you point out, there are ways to compensate for that.


I'd like to defend the T430 from a T430. Yes, it was a bad purchase when I made it (and when the author made it), but you can buy them for $100 a piece now on Ebay. Keyboards you can probably buy 20 for $50. Since they are from the dawn of Thinkpad's bullshit, they hardcoded the hardware choices you could make for upgrades into the firmware. Awful. But 1vyrain (https://1vyra.in/) came along and fixed all that.

A T430 doesn't even feel underpowered these days, and has a (horrible) webcam (that can be improved a bit by judicious software postprocessing.) With a dining room table, a router running OpenWRT, 4 T430s, and a 5-10 year-old gaming PC as a server to compile things on/serve things from/etc., you could start a little software company and your biggest expense would be the table.


The 35 watt processor in the T430 is just way too easy to get toasty.

The T440 or above with a 1080p screen upgrade and trackpad replacement is so much nicer to use since it cooks your legs less, has a nicer screen, is lighter, thinner and still able to take hard falls without issue.


> The 35 watt processor in the T430 is just way too easy to get toasty.

Definitely true, but I don't use laptops on my lap very often. I also have a fear that

> lighter, thinner

may mean harder to get into and repair/replace things.

-----

edit: let me let these people argue in favor of the T430...

https://www.reddit.com/r/thinkpad/comments/r88wag/my_almost_...


I really enjoyed my T460s from a job back in 2017. My main complaint was the rather anemic 2c/4t CPU.

I have an LG Gram 15 that is pretty decent, but very wobbly/flexy. Its saving grace is its light weight.

My 16" MBP M1 Max is phenomenal, but very heavy, and I move it 5-10x daily.

I plan on replacing it with the ~2025 MBA 15". My wife's 13" M1 MBA is great, but the 13" is not nearly as comfortable for a single display as the 16".


I have a T420 gathering dust in a drawer. It was my everyday laptop in college for a couple years but then I switched to buying used Thinkpad tablet PCs for taking notes. I eventually went back to it last year and upgraded a few parts but regretted how hot the thing got with such poor battery life. It was fine running in a docking station but wasn't great on the move which is annoying if I want to use it in bed or on the couch. I remembered how much I disliked the thing so I bought a used X280 and am overjoyed with the purchase.


I am running a T420 and 99% of the time it is so cool the fan doesn't even kick in. But yeah when you start to push it - it can get toasty. Work load really determines a lot here.


I just bought a new Macbook Pro and have been using it for ~1 week.

Even compared to my previous 2019 Macbook Pro, it's a huge step up in build quality and speed.

If you spend any time waiting for code to compile, web pages to load, or dependencies to install, I'd say buy the fastest laptop you can find. I'd guess this new machine has already saved me around 20 minutes of waiting. At that rate it will pay for itself in around 6 months. If you're just using the laptop to write and browse the web, maybe it's ok to try to save some money.
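A hedged back-of-envelope version of that payback claim: only the ~20 minutes of waiting saved per week comes from the comment above; the laptop price and hourly rate are assumptions picked for illustration.

```python
# Break-even estimate for a faster laptop. Only the ~20 minutes/week of waiting
# saved comes from the comment; the price and hourly rate are assumed numbers.
laptop_cost = 1800             # USD, assumed
hourly_rate = 200              # USD per working hour, assumed
minutes_saved_per_week = 20    # from the comment

weekly_saving = hourly_rate * minutes_saved_per_week / 60
weeks_to_break_even = laptop_cost / weekly_saving
print(round(weeks_to_break_even))  # 27 weeks, i.e. roughly six months
```

Plug in your own cost and rate; the break-even point moves linearly with both.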

Even ignoring economics, I spend like 12 hours a day staring at this thing. It's really important to me psychologically that the screen be high quality so I don't lose my mind.


+1 on the importance of a high-quality screen. I never quite understand why PC manufacturers skimp on what is arguably the most important interface with the device. And when they do, they of course jump into spec chasing, with resolution and frame rate taking the focus instead of contrast, color reproduction, and PWM (for people sensitive to it).

What's the delta in BOM cost for a high-quality display? Maybe someone with industry knowledge can comment.


Because the screen is pretty close to the most expensive part and isn't one that shows up well on spec sheets other than resolution. Windows laptop consumers almost all go by price per spec sheet item so those shitty screens boost the value way up.


> What's the delta in BOM cost for a high quality display ? Maybe someone with industry knowledge can comment.

Higher quality screens are more expensive than shitty ones. It's not just the cost of the panel itself but more pixels take more power. That means more expensive driver circuitry and likely a larger battery to be able to offer a marketable battery life.

That higher power may not work in your current design's thermal envelope. A different panel may also need a redesign of the display shell, bezel, and even hinge(s). Each component might not have a significant increase in cost on the BOM but all of them together add up as does any redesigns to support the extra power usage.


The high quality screen is so important. Every time I go looking for cheap laptops, it's the screen that steers me towards higher prices.


Related:

How and why I stopped buying new laptops - https://news.ycombinator.com/item?id=32674830 - Sept 2022 (111 comments)

How and why I stopped buying new laptops - https://news.ycombinator.com/item?id=26188282 - Feb 2021 (100 comments)

How and why I stopped buying new laptops - https://news.ycombinator.com/item?id=25486191 - Dec 2020 (279 comments)


Nice. Now try to do the same as (some types of) programmer or 3d modeller or video editor. The newest MacBook with an M2 chip compiles and runs my projects drastically faster than the 2015 Dell XPS Skylake-i7 laptop I used to have and the GPU on the M2 is so ridiculously more powerful it's like going from 3dfx voodoo to a modern GPU.


Did you read the article? The author is, well, an author. They don't program or 3d model. Computation speed wasn't even mentioned in the point they were making.

I guess I don't see the reason to read an article about a specific use-case/point of view, only to follow up and say, "yeah well it wouldn't work for this completely different thing you weren't talking about". It's not the 'gotcha' you think it is.


I'm not trying to make any gotchas. Yes, if all you need is MS Word and a web browser you can use a 10-15 year old laptop; I don't think this surprises anyone here (in fact it's why most people these days use phones instead of PCs to do most of their internet browsing).


I guess your superfast laptop wasn't very good at rendering the webpage of the article


Are we not allowed to use the article as a springboard for conversation? Am I obligated to only restrict myself to what the author wanted to talk about without expanding the realm of discussion with the people on HN?


I just don't think it's that interesting to take an article that says "I and other people with similar needs to me can do X", and reply, "well I have different needs and can't do X". Well... duh? There isn't really much conversation to spring into.

Many people who buy new laptops every few years really don't need to do that. Some who buy new laptops every few years actually do see a benefit to doing so. The latter group isn't really all that interesting to talk about, at least not in the context of the article. The former group is, because making old laptops more useful for those types of people could have a huge positive environmental impact.


Yes, but you should start by explaining that you are switching the context, to make it clear that you're aware of it. The author took care to explain that his stance applies to office applications and may not apply to other use cases, even specifically mentioning audio-visual work.


I feel that, for instance, game developers would fall into this category.


I'm an author, sometimes. My 2015 MacBook Air has slowed down so much, due to software bloat, that it's painful to use. You need to constantly buy new hardware to run people's literally exponentially less efficient software.


Yes! It's frustrating. I have a perfectly good 2015 MBP 13" that runs as fast as the day I bought it but is now unusable for much of the web, especially video or high-Javascript situations. To use YouTube without it becoming a jet engine you need to force YouTube to return you mp4s instead of VP8 or whatever they use now. Twitch is just unusable if chat moves slightly faster than not at all. Everything in general is a frustrating, sluggish mess. My phone is going the same way as well, becoming less and less usable.

But it's not the laptop or phone that's changed, it's the rising tide of tech enthusiasts constantly buying new computers, having their performance QC be "feels OK to me", and that raising everyone's minimum spec.


The worst offender imo is VS Code. It won't run smoothly on my 2010-era Thinkpad. Yeah, it's old, but we're talking about a text editor here. Without any plugins it should still be able to do at least that much smoothly.

I don't need AAA games to run on it, but text editing!


Yes, jesus.

I am forced to use VSCode because some disability plugin[1] I use only (for now!) exists for it, but having to switch to it over SublimeText + Vim was annoying. If you don't need to use it definitely try either of those options

[1] https://github.com/cursorless-dev/cursorless


I want an "easy" button for vim. I want to upgrade to a new version, be given new decoration and a new colorscheme to be angry over, but then I want it to have all the modern features: LLM code-complete that I can choose to pay a monthly SaaS fee for, a bevy of plugins like one to help me program my ESP32 for my little IoT project, super special handling for language-of-the-month.

but I also don't.


I use my laptop (Macbook 12") to write too. It's fantastic for that... until you start playing YouTube Music in the background or try to use Google Maps. It's getting to a point where the delays on many tasks become noticeable and affect my workflow.

That's unfortunate because I love that little laptop.


It would be silly if this article were trying to recommend a 15 year old computer for intense numerical processing, which is why nobody, including the author, made the attempt.


yeah, but "author realizes they have no substantial computing needs because they only word-process" isn't a particularly interesting article. like boss you can hook a bluetooth keyboard up to your phone and go to town at that point, if you want, there's really no need for a laptop at all.

and since it's not applicable to the userbase here you're gonna get discussion on why it's not applicable.


> "author realizes they have no substantial computing needs because they only word-process" isn't a particularly interesting article

That's not really the point of the article, though. The point is that there are a lot of people with similarly non-computationally-intensive needs who still nonetheless buy a new laptop every few years. If more people would realize what the author has realized, they'd not only save money, but reduce e-waste and the need for quite as much new laptop manufacturing.

> and since it's not applicable to the userbase here you're gonna get discussion on why it's not applicable.

I think that says more about the bubbles HN users live in, and their inability to think outside those bubbles, than anything else. Kinda unfortunate. I would be much more interested in discussing why so many people with modest computing needs think they need a new laptop every few years (advertising, probably), and how this might be changed.


My M1 Mac is the best laptop I ever owned.

But it is also a big step up; I expect my machine to stay viable for quite some time, and I recommend my noob friends buy used M1/M2 Macbooks (if available).


Is it also (inflation/time adjusted) the most expensive?


I hadn't thought of this too much but I bought an M1 14" Macbook Pro, mid-spec, for around $2300 I think back in 2021 right after it launched.

My previous laptop that I had used for 8+ years was a 2013 Macbook Pro 15" with the i7, and I bought it as a factory refurb 6 months after launch for -- get this -- I think it was $2400. New it would have been $2600 I think, in 2013 dollars.

So, especially adjusted for inflation, a similar level of machine is actually about 30%-35% more affordable now than it was a decade ago and computationally it's anywhere from 4x to 10x faster and something like 2x as energy efficient.

All that combined makes my M1 Macbook Pro one of the best values in computing I've ever purchased, IMO. And considering my 2013 MBP lasted more than 2x as long as any previous laptop I had ever owned, even though at the time it was about 50% more expensive than any previous laptop I owned, it paid for that run- and so if I get the same use time out of this current M1, the value over the lifespan of the machine will trounce any previous computer I've owned.


Base m1 macbook air sells for 800 bucks routinely. It's a great value in my opinion. I agree it's the best machine I have ever owned.


If you don't mind wearing out your SSD with the insane swap use, that is. You really need 16+ GB of RAM on Macs; otherwise you're damaging your hardware.


That depends on what you're doing. If you're running a bunch of bloated garbage like Discord and Slack and whatever then yeah, you'll need a lot of memory. If you're running mostly native apps you might be surprised how much you can run with 8GB. I would still recommend 16GB to people for future headroom, but 8GB is usable.


Doesn't answer my question. I wasn't trying to imply that M1 are a bad value...


My old Dell XPS was actually slightly more expensive (even before adjusting to inflation - I was researching online and at the time it was the most recommended high-end pc laptop). And macs retain value a lot better than PCs - you can likely resell it for a significant fraction of its new price in a few years (unless you broke it).


Still doesn't answer my question; not sure why people feel the need to jump into this... maybe something is wrong with the tone of my answer.

But while I have you here... which Dell XPS (year/model/spec) was it, and which M1 Mac (year/model/spec) was it?


How does this not answer your question? If the XPS was more expensive even before adjusting for inflation, then the answer is no: the Macbook was not the most expensive computer I ever bought (both adjusting and not adjusting for inflation).

The Dell was an XPS 13 (9350) bought January 2016 for €2,105. The Macbook is an M1 Macbook Air (the highest-end model), bought June 2021 for €1,598. In both cases I wasn't trying to save money and was happily paying a premium for (within reason) the best laptop I could get, as far as I could tell from online reviews. I bought both of these directly from Dell's and Apple's respective online shops.

The latter is radically higher performance while also being much better in many other ways (build quality, battery life, reliability, etc). I promise you I'm not getting a commission from Apple, in fact there's a lot of stuff they do that I deeply dislike :)


> How does this not answer your question?

Because I was interested in OP's point of view, not someone else's...

> The Dell was XPS 13 (9350) bought Janaury 2016 for €2,105. The Macbook is M1 Macbook Air (the highest end model of which), bought June 2021 for €1,598.

You are comparing the experiences of two laptops 5 years apart ...

> The latter is radically higher performance while also being much better in many other ways (build quality, battery life, reliability, etc). I promise you I'm not getting a commission from Apple, in fact there's a lot of stuff they do that I deeply dislike :)

I agree; I have no Apple hardware myself. I was just curious what kind of machine OP had before his Macbook Air. I also recommend the Macbook Air to anyone who can use macOS.


At that point, I guess I don't understand why you wouldn't just step up to a desktop workstation?


Err, mobility? M1 / M2 Pro MBPs pack a serious punch... and they don't have to sacrifice battery life to do so - they smoke virtually all other laptops in terms of battery life.


Battery life gets mentioned quite often with laptops, but is it still really that important?

I run my business from an X1 Carbon, but it's more about portability than battery life. Trains, planes, Starbucks all have power available so it's constantly plugged in and I can't remember the last time I used it on battery for more than an hour.


I recently got my first macbook (coming from a thinkpad) and it’s honestly more about not having to give a shit about a charger. It charges incredibly fast when I plug it into its monitor, but otherwise I just throw it in a bag without even thinking about what my battery percentage is. I’ll leave the house with it and not even bring the charger.


I think this is the biggest selling point for me. Fewer cables around.


I've been using my X1C on battery for the last 4 hours on a transpacific flight (hello from the middle of nowhere) and am looking forward to about 4 more hours. The seat has a convenient power plug, but it's awfully nice to leave the charger tucked away in my bag to be one less thing to trip over when I have to wriggle free from my cattle class seat to run to the restroom.

So, to address your question (which I would normally also ask), it is that important though this is also a somewhat narrow use case.


It's just one of many great aspects for such a capable device.

They also run very cool and are effectively silent even when the fan DOES eventually turn on for a sustained max load. By comparison, every Windows/Intel laptop I've had over the last decade exhibited obvious fan noise every time there was some short-lived load on the CPU - that includes several nicer business/pro grade laptops including Dell XPS, Latitudes, and a Thinkpad.

...and NONE of those other laptops came remotely close to reliably lasting 10+ hours on battery like the MBPs can/do.


It's pretty much impossible to find an outlet in Germany in public. Certain types of trains have them, but many don't. Like, the only guaranteed outlets I can rely on are at home or at work, and even at work, I'm constantly moving between meetings and simply don't want to futz around with cables all day. I never have to worry about charging my laptop between meetings. That kind of convenience is hard to beat.

Battery life matters. Maybe not to you, but it matters.


In Europe, nope, only some trains have power available. Never saw it on planes, and I fly regularly.

Even on trains, one must be lucky and even if lucky, it is usually one socket per pair of seats.


Mobility is such an antifeature. Taking your work home with you is something you really shouldn't have to do. Worse even, taking your computer which potentially holds sensitive data with you could result in it getting stolen, at which point you've got a big problem on your hands.

Laptops for work make no sense unless you're a field researcher.


Disagree. Mobility is absolutely a feature. When the pandemic sent everyone at my employer (several thousand workers) home in 2020, we were at 90% productivity within a few days because virtually everyone had laptops. Getting everyone settled in with effective home office setups took a week or two, but we were connected and working the whole time.

Then, when we went home a couple years later because of a cyberattack, essential workers were able to log in and work with the systems because we all had laptops.

As time goes on, less and less of our data is actually sitting on the laptops themselves anyway.


Mobility is not about taking work home. Disk encryption is almost standard these days. (Windows home edition is the last holdout) People travel for / with work, do on-calls, work at customer sites, work from home and don't want to be stuck in one room, etc. Laptop may not make sense for you, but it seems you're a bit out of your element here.


You do know that a lot of people travel for work, right?


Journalists and plenty of other fields benefit. Though I agree with the sentiment. A Mac mini is probably good enough for most folks who work in largely the same place.


My office is everywhere there is an Internet connection.


Mobility lets me work on the balcony and in parks


A laptop can be easily transported for use in a variety of locations.


You can even put them on top of your lap.


Most PC laptops I've used weren't very comfortable on the lap after a few minutes of number crunching or kernel building...


yes but have you tried a desktop computer on your lap


Honestly, the new MacBooks are faster than any desktop I ever had - I don't feel a need to sacrifice mobility for more than that.


I don't think anyone is denying that a few people need the newest hardware they can get. However, if you're working on webpages, cloud apps, project management, most phone apps, etc., you don't need a brand new laptop to get that done. And that's just developers; your typical student or home user certainly doesn't need the latest and greatest. I recently bought an M2 32GB machine, but that was to replace my 2015 macbook lol.


Quick aside: This should have a (2020) tag.

The keyboard and trackpad on my Thinkpad have died twice. I've replaced them twice for 30 dollars each time. I would have had to buy a new laptop if I had something else. The conclusion about this being a hack and not a new economic model is why I'm hoping Framework sees success. Unfortunately, I think my Thinkpad is on the way out for good, and I plan to replace it with the Framework 16 when it is available, which I hope will last me just as long as my Thinkpad has.


If the keyboard or trackpad died on my laptop I would simply take it to be serviced. The company that made it has storefronts around the country where I can bring it in. They have an app that makes the scheduling easy.

In fact I just had the battery replaced after 5-6 years (I don’t quite remember). Works for me.


That's great that whoever makes your device (assuming Apple) has stores where they can do repairs. Having a repairable device doesn't mean you have to repair it yourself; it just means you can.


ThinkPads are absolutely magic. Here's one I built for travelling: https://maxrozen.com/replacing-my-macbook-m1-with-thinkpad-t...


They used to be. I am writing this from the above-mentioned T480. It is 3 years old, has had both USB-C ports replaced, has a small crack on the backside of the screen, and the keyboard's ZIF connector is broken, so the ribbon cable is... well, taped to the connector.

On the other hand, my X200, T420, and T450 are still in mint condition. To be completely fair, modern Macs are also way off from well built. Shiny, yes, but sturdy is the last thing I'd call them.

BTW, in older ThinkPads, the drainpipes under the keyboards actually worked - you could pour water on them just for kicks. Now they are there only for decoration.


It really is a shame that the USB C ports are not modular on newer ThinkPads, you basically have to send the laptop for board work to get them replaced once the USB C ports go.


Not really; I had mine replaced at revolt.bg for about 60 EUR, both labour and spares. Miraculously, they had both of them in stock, so it was a Thursday-to-Wednesday job. Massive respect to those guys.


Anecdata, but before you go all-in on ThinkPads I thought I'd share my experience.

For the ThinkPad x13 2nd gen Ryzen I received for a customer project (with the customer not interested in getting it back after the project was completed) it's too early to say anything about reliability, but somehow the touchpad isn't centered properly and I always hit left click when I want right click - it's unbearable and surprisingly bad for a ThinkPad.

Before that, I had a ThinkPad E495 Ryzen that worked OK, but after two years the touchpad got stuck - the pointless (for me!) construction with the TrackPoint and extra mechanical buttons (only acting as a barrier) certainly didn't help.

Before that, I had my beloved 2016 Dell XPS, which however ended up with a swollen battery. Still, I consider it one of the best notebooks I've had; GNOME 2 and Unity also just got it right.

In the last four years I got five (!) ThinkPad and Dell Precision notebooks for customer projects having battery or memory problems OOTB (the x13 I mentioned above was a notebook that finally worked).

The notebook I used for the longest time (7 years) was a 2003 PowerBook G4. I hope my current MacBook M1 will last me about as long, but at least 5 years. In terms of power management, display, and usability, a Mac notebook is in another league compared to Dells and ThinkPads with Linux; it's not even funny. Linux desktops have regressed to the point that they have ceased to be usable for me. With the newer MacBooks, I think Apple listened to customers and nailed it; I'd just prefer lighter (non-alu) materials.


They really are. For example, I have a P53s (basically a renamed T590) that will randomly clock its i7 CPU down to 400MHz under active use for extended periods, and I can't run anything that actually uses the i7's cores for more than a minute without hitting the 95-degree PROCHOT limit. The dedicated GPU will also randomly clock itself down to 100MHz and shift the thermal throttling threshold from 75 degrees to 65 degrees, on a system with a single-fan cooling solution.

Just magical, and I can't wait to see Lenovo's next trick.


I unsubscribed from Lenovo at their Superfish debacle. Intercepting HTTPS communication with their own certificate? Now ain't that somethin'.

And the next up was when they had a special place in the laptop for their bundled bullshit, so that they persist EVEN IF you reinstall Windows on the lappy.

https://en.wikipedia.org/wiki/Lenovo#Security_and_privacy_in...


What did the support say about it? Did you get a replacement / refund?


Have you changed your thermal paste?


Yes, but it only helps so much when the actual cooling solution is as unsuitable as it is.


Which websites or stores in Europe and the UK would you suggest for getting refurbished laptops and mobiles?


eBay or Back Market - look for IT-refurbished laptops rather than individuals selling their old laptops.


I work for a non profit org.

We sometimes purchase reconditioned laptops as they are good value for some use cases.

We have a lot of WfH users so need to courier laptops to some staff.

We recently had an insurance claim denied for a damaged reconditioned laptop, as TNT/FedEx used a "no second hand goods" clause in their insurance to wriggle out of paying up.

This has not stopped me from purchasing reconditioned laptops, but it's something to be aware of and take into account.

In fact just buying another reconditioned laptop was not a large expense and damaged laptops in transit are relatively rare.


You sorta hint at this in your post, but I assume that eating the cost of the rare damaged-in-transit laptop is still cheaper than paying full price for all-new hardware.

And while I certainly understand that environmental impact might not be the chief concern for a company (even a non-profit), continuing to give secondhand hardware a new life is certainly way better for the environment.

Regardless, it seems bonkers to me that a courier can get away with such a clause. You're paying them to ship something safely from point A to point B; it should be completely irrelevant what that thing is when it comes to paying you out for their mistakes.


If you look at it from the courier's side it makes more sense though.

It's obvious if an item was new and you can safely assume it was the delivery company that damaged it.

Whereas for secondhand items there's no way of knowing the original shipping condition. And it would open up massive amounts of fraud where people would buy a cheap broken item on eBay (sold for parts), ship it to a nearby friend with full insurance, and then claim the shipper broke it.

Insurance should still cover the case of an item lost in shipping however, regardless of new/secondhand, since there's no opportunity for fraud there.


Good to know, but in practice this just means more and cheaper second hand devices to be available to the public.


“Laptops don’t change”

Really? Because a few years ago battery life was maybe 7 hours and now it's 18. Add faster processors, Thunderbolt 4 and USB-C charging, WiFi 6, better screens, and lighter builds.

I keep hearing this argument for not buying a new laptop every year and it just doesn’t hold water. I say buy the latest MacBook, expense it, factor it in as a monthly cost. These are expendable and essential tools. They pay for themselves

And these Framework laptops are pretty awful. They run hot and are poorly designed. The modular USB add-ons are a joke, and exchangeable GPUs? Heavy, power hungry, not for laptops. 3-hour battery life. The main product feature AFAICS is virtue signaling. These products are not open hardware either. Then there's this modularity argument, as if this is the last laptop you'll ever own and you'll forever be replacing it piece by piece, disassembling it over and over. But that's not realistic. You'll have to upgrade to faster processors, the motherboard form factor will change regardless of what they say, and you're counting on the company being around in a few years, which is a poor bet. They can't compete with Apple or even Lenovo.


So if framework can let you reuse the "shell" for 2-3 upgrade cycles until they have to change the motherboard, that's a lot better than having to buy an entire new laptop.

It's not about keeping the same form factor forever, that's unrealistic. It's about reducing the amount of stuff you have to buy for each upgrade.

Opening and closing stuff is realistic, if it's easy.

I've opened and upgraded various components of my steam deck half a dozen times in the last 6 months.

It's reduce, reuse, recycle.

I don't own a framework.


>> So if framework can let you reuse the "shell" for 2-3 upgrade cycles

I badly want the framework laptop idea to succeed - the idea of re-using your old laptop battery as a battery bank when you upgrade, that's just brilliant. The idea of taking your old motherboard and sticking it in a slim case that bolts to the vesa mount on the back of your monitor, beautiful.

However, it's far more expensive than just buying a MacBook. I bought the base model M1 Air for £999 at launch. I just sold it, what's this, 3 years later, with 90% battery life remaining, for £600 plus shipping; after eBay & PayPal's cut I'm at £537. That's £462 for 3 years of having faster single-core / interactive responsiveness than anything else at a comparable price on the market.

How much more would I have had to pay over a similar time frame for a slower, heavier Framework with much, much less battery life?

I considered the Framework for this laptop cycle but instead I just traded up to a second-hand 16" M1 Pro/16GB/1TB with a 16-cycle count on the battery - i.e. brand new. Sure, it's 25% slower than the equivalent £2800 M2 Pro version, but this cost me just £1500 delivered in pristine condition. I'm feeling very confident I'm going to come out ahead again in total cost of ownership.


My framework (latest revision i5 DIY edition) is lighter than my wife’s maxed out M1 MBA. She has 16gb of RAM and 2tb of storage, and it cost $2500 with taxes and shipping. I have the latest i5 DIY edition and I forwent everything (charger, RAM, storage). I got the fastest 2tb m2 drive supported and the fastest 64gb of RAM supported on Amazon. Total cost with taxes and shipping for everything was $1250.

My Framework compiles the Linux kernel faster than her MBA and lasts all day on a single charge. I did spend an enjoyable afternoon dialing things in, although I know not everyone would enjoy that. I wouldn’t recommend a Linux laptop to anyone who doesn’t understand init systems and how to manage config files.

They’re both great laptops, but with different target markets. Let people enjoy things!


Those comparisons work against any Windows business laptop though, which Framework competes well with on price and features.

No doubt Apple Silicon has really exposed AMD/Intel and x86/64 in general


Too bad Apple's hardware doesn't play nice with Linux.


It's not doing too badly with Asahi Linux. There are still shortcomings to work out, but there are people using Asahi on their Apple Silicon machines as their daily driver.


Yeah but it's nowhere near the choice you get with a typical x86_64 laptop with an AMD/Intel chipset.

I have minor problems with my Lenovos and Framework, but by and large they do work out of the box.


Asahi Linux basically works. The main issue is that Linux software, especially proprietary, does not play nice with ARM.


That can only be proprietary software. Almost every program I know works on arm64, thanks to the availability of the platform through the Raspberry Pi.


Discord, Spotify, and MS Teams desktop apps are not usable. Some of them you can use the web apps, but for things like Spotify, not even the web app works since the web DRM doesn't work on ARM.

My point is that this isn't Apple's doing. The hardware works fine and is mostly unrestricted. It's the end user software that causes major issues.


Linus released the last kernel from an ARM MBA


>It's not about keeping the same form factor forever, that's unrealistic.

It's not that unrealistic, if we constrain "forever" to something like 10-15 years.

In a lot of laptops I'd rather they keep most of the ports (and just e.g. update USB 2 to USB 3, and add a few new ones) and keep the form factor the same.

Often new keyboards (Apple is notorious, but this holds for others too) are worse than in the previous form factor.

And when they go and "make it thinner" beyond some point, I'd often rather they added more battery, added space for an extra user-installable SSD, put in bigger speaker cones, or just left the space inside for better thermal dissipation...


I love that they’re doing this. However, in my use-case I keep a laptop 5+ years now. At that point, it’s pretty worn out and usually warrants a total replacement.

All the individual components keep me happy for so darn long now. After 5+ years, there have usually been pretty big improvements in every area.


Upgradeable hardware sounds great in theory but doesn’t work in real life, just like PCs you configure it once and it will be that HW until it is obsolete for your purpose.


> just like PCs you configure it once and it will be that HW until it is obsolete for your purpose.

Nearly every PC I own has had its disk replaced at least once, and many have more RAM than they started with; both of these massively extend their useful lifespan.


RAM, storage, and GPU upgrades are great.

Unfortunately CPU upgrades often require a new socket/motherboard.


CPU upgrades on desktops did not make much sense for several years, until Zen came out.


> Upgradeable hardware sounds great in theory but doesn’t work in real life, just like PCs you configure it once and it will be that HW until it is obsolete for your purpose.

This isn't really the case for many folks out there.

If I get a better CPU, the previous one might go into one of my homelab servers (consumer hardware). The same goes for motherboards; that's how I got a second homelab server (I bought a mobo with more RAM slots for my main PC). I occasionally swap out drives, and now my homelab servers also have SSDs like my desktop. Once I bought faster RAM for my desktop, I moved some of the old sticks to the servers.

You can easily replace homelab servers with a family PC or another machine that you have (or just spare parts in case something fails) in the example and it will still make sense.

Actually a lot of that hardware is also second hand - since it was easier to just get affordable first gen Ryzens for my desktop, instead of saving up money for a while. Those homelab servers both also use 200GEs because of the low TDP, which others might consider obsolete, but which have a second life here.


I think you can often make a similar hand-me-down argument for non-upgradable (or minimally upgradable) hardware as well.

For example, obsolete thin client PCs can be repurposed as home servers or control systems. With a USB GPIO interface they can even do Raspberry Pi-like things.

Apple makes it harder since you may fall out of the 7-ish year macOS security patch window, but you can often install Linux or NetBSD if you plan to connect to the internet.


> I think you can often make a similar hand-me-down argument for non-upgradable (or minimally upgradable) hardware as well.

That's fair and I'd probably also make that argument as well: as long as the hardware/software isn't locked down and utterly unsupported, then even older pieces of kit can be utilized well, instead of being thrown in a landfill somewhere.

I do have a netbook with a N4000 CPU and 4 GB of RAM that is still good enough for web browsing, note taking, some light development and even using as a stream dashboard when I'm streaming on the main PC. As far as I'm concerned, that is only possible due to good drivers and support for Linux distros that are lightweight.

But at the same time, if it had more RAM slots, it'd last me years longer than a soldered offering - it's easy to imagine an old netbook being repurposed as a homelab node, for an internal Wiki, maybe some project management software, internal file repository, or some other simple goal like that.

That's also why I'm upset at some Android phones: they go out of support in a few years, with no way to easily install an up-to-date release, a weird driver situation, locked-down bootloaders, and sometimes even batteries that cannot be replaced! It's like they're the ultimate form of planned obsolescence, even though the same hardware could last me close to a decade.


Terrible example; a large proportion of gamers (including myself and almost everyone I know with a PC) build their own.

Upgrades are piecemeal and old components are variously sold, given to friends, or re-used in home servers.


I built a desktop 5 years ago. I recently upgraded it. I was able to reuse a lot of the components including SSD, cooling, case, power, etc. And that computer had been upgraded a few times over its life before it was retired too.


I'm a huge Framework fan but I worry that the upgradeability produces more e-waste rather than less. I already see Framework laptop owners tossing mainboards every generation. I think the e-waste angle is the weakest- I like the upgradeability because it saves me money. As a cheap bastard I almost see it as a challenge to use my devices for as long as possible.


> I already see Framework laptop owners tossing mainboards every generation.

That would be extremely stupid, especially after they released the case to turn the Framework mainboard into a mini PC.

https://frame.work/products/cooler-master-mainboard-case


You know people who are literally tossing old mainboards? I’d be happy to take one off of someone’s hands if they’d otherwise be throwing it away.


> The main product feature AFAICS is virtue signaling.

Friendly reminder that people do care about things you don't care about. Claims of virtue signaling say more about you than them, quite frankly. And really, you should care about the massive amount of e-waste generated every year.


This.

I lose any hope for us as a species being able to reverse climate change if the top comment on an article that goes quite deep into meaningful ways to "reduce, reuse, recycle" is that this is virtue signaling, and that we should buy MacBooks every year instead because they are "expendable tools".

Guess what, their impact on the environment is very much noticeable. There is no way we are going to reverse or even meaningfully delay climate change with this kind of collective attitude.


Laptops are expendable tools, no question.

The problem is that their price does not include the expenses of the global warming accelerated due to their production and transportation. So they are sold artificially less expensively than they will turn out to be.

When laptop production, transportation, and recycling become largely carbon-neutral, it would be easier to argue for throwing them away once a slightly better model is available.


Worth remembering this is HN, where all too often, people hide behind screens and make less-than-considered comments for points. This is just one of many echo chambers. Apple worship in particular is quite prevalent in this thread -- but in reality, way more people use Windows.

The folks who spearhead the real fight against climate change aren't likely to spend much time on threads like this. They're in meetings with supply chain and logistics folks trying to figure out how to hit an ambitious decarbonization quota.

My point: there is hope, and usually, it's not found online, but out in the real world.


Who is buying macbooks every year though? I'm using the M1 2020 for work and it still feels like a brand new laptop.


That issue can’t be solved by reusing laptops or sorting cans from cardboard. The argument that every little helps is totally flawed. Fantasy, make-believe. The other 8 billion people simply don’t care, and why would they? 50 years from now they’ll all be dead. Since when did people care about future generations and the good of humanity? It’s a very recent Western fashion. The only thing that truly motivates the vast majority of people is sex and money, or objectives closely descended from that tree. To modify mass human behavior you have to incentivize based on that leverage. But even modifying the behavior of the crowd won’t be sufficient. The only way to fix this is with new technology.


You're falling prey to the fallacy that just because 1 person's actions can't make a world of difference on their own, those actions aren't worth taking. By your logic, you may as well not vote either, because everyone else is voting. Every little bit DOES count, because the more people believe they can make a difference, the more of a difference they will make collectively. You have to lead by example. Rolling over and dying because you don't think you can make a difference makes you part of the problem.

New tech certainly would help, and it isn't/shouldn't be on the shoulders of just individuals to make all the change. But I'm not going to use that as an excuse not to change my own behavior for the better.


> By your logic, you may as well not vote either, because everyone else is voting.

Correct. Voting is the worst possible example, because while e.g. individual recycling has a negligible but technically non-zero impact, voting is overwhelmingly* likely to have exactly zero impact.

* unless the voting base is either extremely small, or extremely close. Your vote only matters as much as the likelihood that at least one result is decided by exactly one vote, which is only realistic in something like a local town election. Which are the only elections most people should actually pay attention to.

> Every little bit DOES count, because the more people believe they can make a difference, the more of a difference they will make collectively.

That's pure wishful thinking.

What actually happens is that people believe they are making a difference, even when they actually aren't, but since they feel personally good about having put in an effort, they stop worrying about the actual problem, and stop even thinking (let alone acting) towards solutions that might actually work.

Why do you think the "carbon footprint" idea was actively pushed by BP and other super-polluters? Out of the goodness of their hearts? Or because it drew attention away from other, less corporate-friendly CO2 reduction measures?


> Your vote only matters as much as the likelihood that at least one result is decided by exactly one vote

This is entirely wrong. The value of your vote is the relative ratio of the fraction of alternate universes in which you vote in the election and your side wins, over the fraction of alternate universes in which you don't vote in the election and your side wins.

The value of your vote, as a result, is emphatically and enormously greater than the likelihood of the election being decided by a single vote. That's because there is an enormous amount of correlation between you not voting and people in the same voting demographic as you not voting. The question of whether you will vote should be seen as largely a consequence of the turnout characteristics of your voting block, not a free choice of the "uncaused cause" variety.

In fact, the real danger of your argument is that by taking it seriously, you greatly reduce your voting power by making your demographic that of the tiny group of people who decide whether to vote based on meta-arguments about the value of their single vote. And this particular demographic basically never swings elections, so by putting yourself in that demographic, you effectively make the value of your vote zero.


You're basically describing superrationality, which is a model I don't buy.

Defecting still wins the one-shot PD, which in this case maps to "not wasting dozens of hours researching political candidates" and everyone who votes like you is a prisoner.

If you want to be safe against network effects, just lie and tell everyone that you voted when asked.


I mean, you’ve got to appreciate the scale of the problem here and be pragmatic. The danger of recycling is that it gives the illusion of progress, but I’m not against it otherwise. Philosophically you might be correct, but voting blue in a non-swing red state is completely meaningless.


We need not be locked in a two party system forever. If enough of us demand same day primaries and ranked choice then other parties have a chance.

Only voting in blue strongholds certainly isn't going to break the duopoly.


There's 8 billion people so my actions don't matter. I guess I'll pour all my old paint and paint thinner down the storm drains. Maybe my motor oil and used coolants as well. After all I'm just one person out of 8 billion, my actions don't matter.

I'll just litter all my stuff as well. Single use plastics? Sign me up, I'll just throw it out in the street. I'm only one of a few million in my city, my actions don't really matter.

Or maybe my actions really do matter.


Your actions do not, indeed, matter. Neither does your vote, or your boycotting of $global_evil_corp, precisely because there are millions or billions of people involved. (To be pedantic: while not literally zero, your actions matter several orders of magnitude less than would be necessary to be able to perceive their effects.)

But that's a really hard pill to swallow. Your post is a beautiful example of that difficulty: you spell out examples of how your actions don't matter [as much as you would like], and then wildly swing into open denial with "Or maybe my actions really do matter".

My advice - if you focus on the consequences of your actions that you can actually perceive, instead of the ones you only wish you could, you'll be less frustrated. For example, helping your local community will typically give you far better returns than trying to improve any global issue.


So you're saying my neighbors and I should feel fine throwing old car batteries in the lake and switching back to leaded gas?


I'm saying that lakes don't get saved when people virtuously choose to individually stop throwing trash into it: they get saved when the entity in control of the lake forbids throwing trash into them, and enforces that ban.

If leaded gas was still being legally sold alongside unleaded, people like you would be passionately arguing for decades about why it's everyone's moral duty to pay extra for unleaded gas, and you would feel so proud every time you stopped at the green pump instead of the black one.

And meanwhile we would all be breathing lead.


So your stance is: feel free to keep doing things you know are overall bad for society until we bother passing a law and pushing enforcement to eliminate the activity?

I don't know about you, but if pumps had leaded gas and unleaded gas, I'd still be going for the unleaded every time along with trying to lobby for change. I'd absolutely be telling everyone I know not to use the leaded gas and try and change everyone around me to get rid of it. I would not just shrug and say "lots of other people are using leaded gas, I might as well as well, my use doesn't matter!" Meanwhile I guess your stance is to just use the leaded gas despite knowing how bad it is and telling everyone else to keep using it anyways until they finally tear the pumps out of the ground.

It's people thinking "my use doesn't really matter!" that is the massive problem. Imagine how much easier things would be if people just did the right thing instead of assuming their actions don't matter.

Each rain drop isn't much water, they don't really matter. Yet floods destroy cities. Where does the flood come from?


Japan stays free of litter because residents keep it that way (despite not having many trash cans in public), not because someone's there watching over everyone and enforcing it: https://www.bbc.com/travel/article/20191006-what-japan-can-t...

Individual actions may not make much of a difference on their own but in aggregate they do.


This is lazy defeatism and an immediate disqualifier I use in the selection of both friends and employees.


I agree, all branches of science should invent ways to recycle almost anything. We know we have a waste problem, yet society doesn't demand enough to have ways created to deal with recycling. Forget Mars and looking at the Titanic till we find ways to deal with real world problems.

Burying our heads in the sand will hurt future generations!


The evidence indicates that recycling is burying our heads in the sand.

It is reduce>>>>reuse>>>>>>>>>>>>>>>recycle.

As far as I can tell, very little recycling actually ever happened or does happen. It was always just poorer countries willing to look the other way and trash it and claim it was recycled.


I'm guessing you're assuming I meant throw everything into the landfill? That's the problem with assuming; it's almost always dead wrong. And besides, I didn't say that. Perhaps you thought I said "throw away"?

I meant we have to quit adding to landfills and reuse materials from discarded objects, or in other words "recycle".


How much climate change do you think has been allayed by sorting garbage into 3 kinds of trash cans and drinking from paper straws?


Owning a MacBook vs owning a Framework laptop isn't going to affect climate change in any way. In fact, Apple's track record is that their products generally outlast the competition, even when they do otherwise stupid things like software-DRMed screen and battery replacements and soldered SSDs.

It is purely virtue signaling. If you want to do good by the climate and still want a powerful tool there is no better option than a used macbook.


I don't think Framework is expecting to compete with Apple. They're competing with Dell, HP, Lenovo, Acer, ASUS, etc. Apple's laptop market share is what, still under 10%?

Assuming Framework's thesis bears out, that their customers will actually upgrade their laptops piece by piece when given the opportunity to do so, rather than buy a whole new laptop, then it's a matter of increasing their market share, especially through corporate IT contracts.

Every new company/concept has to start from zero. Maybe Framework has a sub-1% market share right now, but what if, over the next decade, they can grow that to 10%, maybe beyond? While there will certainly be some e-waste (likely most users will not find a use for their old Framework mainboard when they upgrade, and some mainboard upgrades will presumably require new RAM), it'll still be a lot less than people chucking complete laptops, including chassis, keyboard, touchpad, and screen, once every few years.

> Apple's track record is that their products generally outlast the competition

Not sure I agree with this, though my personal experience is pre-Apple-Silicon, which may have changed things for the better. When I used Apple laptops for software development, I only got around three years out of them before they felt uncomfortably sluggish, given the general bloat increases in software development tools (and the increasing number of lines of code we need to compile) over time. Apple certainly does fairly well by their users who have more modest computational needs; my partner has a ~10 year old MacBook Air that's still usable for light web browsing and video playback, though not much more.

At any rate, telling people that their desire to own an upgradeable laptop is mere virtue signaling is just lazy thinking. I might venture to suggest it's a defensive reaction to justify their own purchasing habits, but perhaps that goes a little too far.


> Assuming Framework's thesis bears out, that their customers will actually upgrade their laptops piece by piece when given the opportunity to do so, rather than buy a whole new laptop, then it's a matter of increasing their market share, especially through corporate IT contracts.

You can't upgrade a Framework laptop piece by piece, unless you mean swapping in more RAM or a bigger/faster SSD. Not really any different from other mainstream laptop vendors in that regard. The CPU is soldered, so you can't upgrade within the CPU platform generation; there are no drop-in replacement options for the screen except the IPS panel they ship with (so no high-DPI options or OLED/micro-LED, no high-refresh-rate option, etc.), and only proprietary recessed USB-C dongles for changing ports. It's a joke product.

> When I used Apple laptops for software development, I only got around three years out of them before they felt uncomfortably sluggish

Sounds like your use case calls for a desktop system then. I know plenty of people still using 2013-2015 Retina MacBook Pros for daily development work and they do just fine. There wasn't much performance improvement after those years until the Apple Silicon transition; you could get 6- and 8-core Coffee Lake CPUs in the last year of the Intel MacBook Pros, but they were essentially unusable for laptop use cases since they had to aggressively thermal throttle and nuked battery life as a result (something that is an even worse problem now, with Intel and AMD both juicing core counts and clock speeds to chase benchmark wins rather than improving perf per watt and overall system usability).


That is not what he means by "Laptops don't change". He means

> A 2015 research paper discovered that the embodied energy of laptops is static over time.

> Improvements in functionality balance the efficiency gains obtained in the manufacturing process. Battery mass, memory, and hard disk drive mass decreased per unit of functionality but showed roughly constant totals per year.

> New laptops may be more energy-efficient per computational power, but these gains are offset by more computational power. Jevon’s paradox is nowhere as evident as it is in computing.


2015 was the nadir of processor improvements. At that point, Intel had effectively used the same processor in every laptop since 2008. I would like to see this analysis post-Ryzen and Apple Silicon.


Yes, I previously owned a pre-touchbar 15" MacBook Pro (the last "good" Intel model) and though it was a great laptop, my M1 Max MBP is dramatically better across the board. Way more power, far better battery life, much less heat and fan noise, and a significantly better screen and speakers.

Granted, leaps like that aren't a given but they can and do happen sometimes.


Well, go to Geekbench and compare CPUs from 2015 with CPUs of today.


So… you can do more, faster, for longer, then?

Sounds like a good argument for upgrading


The article author is a journalist; I bet his speed depends on his typing speed, not the speed of his laptop.

He could do more, faster, and for longer, but if he doesn't, the extra cost is wasted.


I suspect the next big laptop update I have will be whatever lets me run GPT-4 size models natively.
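Some back-of-the-envelope arithmetic shows why that's a stretch for now. GPT-4's actual parameter count is unpublished, so the ~1.8T figure and the `weights_gb` helper below are purely illustrative assumptions, not claims about the real model:

```python
# Memory needed just to hold model weights (ignores activations and KV cache).
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    # params_billions * 1e9 params * bytes/param, expressed in GB (1e9 bytes)
    return params_billions * bytes_per_param

# A hypothetical ~1.8T-parameter model at 4-bit quantization (0.5 bytes/param):
print(weights_gb(1800, 0.5))  # -> 900.0 GB, far beyond any laptop's RAM
# By contrast, a 70B model at 4-bit already fits on a high-RAM laptop:
print(weights_gb(70, 0.5))    # -> 35.0 GB
```

So under these assumptions, laptops would need roughly an order of magnitude more unified memory (or models would need to shrink) before that upgrade arrives.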


> more, faster

Not in the world we live in: https://jmmv.dev/2023/06/fast-machines-slow-machines.html


Maybe not your world, but mine? Definitely true.


Until software causes performance improvements not to be worthwhile.

Though nobody can argue that battery life hasn't improved.


The author is choosing to optimize for environmental impact over raw performance.


Given that newer MacBooks are also both highly recyclable AND made of mostly recycled material… that also seems an incorrect statement, then.


You have a laptop x that already exists. At what point does buying laptop y, which needs to be manufactured, have the lower environmental footprint?


I didn't find that point all that interesting. Once laptops got as small as is practical (full-size keyboard, just thick enough for a USB port), we continued cramming in as much tech and battery as would fit without getting too hot. This shouldn't be a surprise.


Do you actually buy a new laptop every year? I feel that's a bit delusional. Year to year most upgrades are pretty minor.

An argument can be made for maybe every 3 years if you're doing some incredibly resource-intensive stuff.


> Year to year most upgrades are pretty minor.

Really this depends on the year. We're in a more chunked paradigm now, where significant progress happens more intermittently.

That doesn't mean buy new. Wait it out a year or two and buy used.

Some years matter. Most don't. You can exploit this.


Fair, and I agree. Especially if you're out of sync with the CPU/GPU upgrade cycle.

But even then the odds it will have a major impact on most workloads is very unlikely.

I think as tech people we tend to try to maximize when there is no need.


What’s delusional is highly paid engineers not investing money in essential tools to get their work done. A new laptop is 1-2% of the annual salary of many people here.


If productivity is so important, why are people using laptops at all? Desktop processors are faster.


If you just work in one location, sure. I work from home, the office, a coffee shop. When I'm at home I often like to code from the couch rather than the home office. Laptops add flexibility. I like their middle ground of flexibility and performance.

I’m also one of those “weird” people that don’t use an external monitor. And yet I’m still just as productive, probably more so, than most of my coworkers.


I like having two large monitors and a desktop-style "mechanical" keyboard, but there's absolutely a benefit to having a full productive work environment with you, with your data and work in progress, anywhere you might want to work - even if you're not online. And when you're in an office (or home office) environment, you just plug in and continue from where you left off.

A high-end MacBook Pro may not be cheap, but it packs some serious mobile compute power along with good battery life.


List of places I worked from last month:

- my desk at home

- my couch at home

- my dinner table at home (in a boring call while preparing and eating lunch)

- my desk at work

- like 15 different meeting rooms in offices across two continents

- couch on terrace at work

- beanbag on terrace at work

- hotel room desk

- airport lounge

- airplane (if you count watching conference talks as working)

Flexibility is great!


And cheaper! I personally run on a M1 Mac Mini (and iMac before that)

I paid less than $700 for the Mac Mini right when the M1 chip came out. It’s hands down the fastest computer I’ve ever owned.


  > I paid less than $700 for the Mac Mini right when the M1 chip came out. It’s hands down the fastest computer I’ve ever owned.
Same, though I did kind of regret getting only 8 GB of RAM (for what I'm doing).


Is BYOC that common at software companies these days? Even if I wanted to "invest money in essential tools", I wouldn't be allowed to connect them to any company networks.

In any case, the most resource-intensive thing my work laptop does is have Outlook and Firefox open at the same time. The "heavy lifting" is done on a cloud instance that can be easily scaled up or down depending on what I'm currently working on.


I haven't used a company-issued machine since 2017-18, and prior to that, literally never. So 13 months or so out of a nearly decade-long career. It helps, I guess, that I'm upfront with companies about being picky about hardware and Linux being a strict requirement (that one job was the lone exception); they just shrug and go "sure, go for it dude".


I think there is nothing wrong with investing in equipment. I think there is something wasteful with replacing a laptop every year.

Unless it's some crazy jump in hardware between CPU/GPU generations that you weren't in sync for.


You can sell your Mac every year and then upgrade; it doesn't cost full price.


If everyone insists on new hardware, then there is no one to sell your used Mac to.


I do a lot of Android and iOS app compilation (Lots of flavors/schemes of apps). So lately the upgrades in cores has meant a faster upgrade cycle for my work dev machine than for the intel based MacBook Pro era. I’m sure Apple can’t keep up this pace of the last few years though.

Back in the early 2000s we used distcc for distributed compilation of Qt apps and reused old workstations in a server room. But with Xcode and Android Studio it isn’t pragmatic.
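For reference, the distcc flow was roughly this; hostnames and the subnet below are placeholders, not anything from the parent's setup:

```shell
# On each spare workstation, run the distcc daemon and allow the LAN:
#   distccd --daemon --allow 192.168.1.0/24

# On the build machine, list the helpers and fan compile jobs out to them:
export DISTCC_HOSTS="localhost box1 box2"
make -j12 CC="distcc gcc" CXX="distcc g++"
```

Preprocessing still happens locally; only the compile step is farmed out, which is part of why it maps poorly onto IDE-driven toolchains like Xcode.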


Why don't you buy an EPYC/Threadripper workstation then? Even the best M2 laptops still can't compete with a hackintosh on a dated 3970X.


Because it’s not faster. Well at least the 3990x wasn’t.

Compilation rapidly drops from parallel to single thread bound for most workloads.

https://twitter.com/kiratpandya/status/1457438725680480257
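The flattening the linked thread describes follows from Amdahl's law. A minimal sketch, assuming (purely for illustration, not measured from any real build) that 90% of the work parallelizes:

```python
# Amdahl's law: overall speedup with n cores when only a fraction p
# of the work can run in parallel; the serial remainder caps the gain.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# With p = 0.9, doubling cores past ~16 barely moves the needle:
for n in (8, 16, 32, 64):
    print(n, round(speedup(0.9, n), 2))
```

Even at infinite cores the speedup is capped at 1/(1-p), i.e. 10x here, which is why a 64-core part can lose to a fast 12-core one on link-heavy builds.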


Depends on your workload. For me 3970x is faster in all my workflows.


I also have a maxed out Mac Studio when I need the extra cores and don’t need the portability. But I haven’t looked into hackintosh options for a long time. Perhaps I’ll check it out again some time. Thanks for the tip.


I think you severely underestimate how fast the M2 series is.


I compared. They are better on bang per watt but I'm still getting significantly shorter compilation times on 3970x.


> When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

> Be kind. Don't be snarky. Edit out swipes.

https://news.ycombinator.com/newsguidelines.html


Yikes. Don’t be like that.


I had young kids. I took my laptop to conferences and events. Performances.

Things happen in these circumstances. Drinks spill. Laptops fall from heights. Cords get pulled.

You know what happens when an MBP encounters a liquid? It depends. But one thing almost every device will have in common in such an encounter: it cannot be repaired.

You have to replace the entire mainboard. And the cost is enough that you may as well buy a new one. The rest ends up in an unlicensed dump somewhere in the global south where a kid is going to get exposed to cancer causing chemicals when they drag it out and bring it to some back alley disassembly shop.

The cost of a laptop is pretty high. Building them to be bench-repairable and upgradeable helps extend their useful life. As the article points out, the best we can do is use them for as long as possible.


This is part of why Apple is pushing trade in when you buy a new device. They recycle it.


They claim to. I would love to read an independent audit of their recycling process. I suspect that for the vast majority of devices they extract the battery, the screen if it’s viable, and the rest ends up in a dump. “Recycled,” in a sense, but not really.


If you read the rest of the site, you would see that the argument for not buying a new laptop every year is to find ways to lower the insane amount of electronics waste we're generating every single year.

It's not just about a lack of features; it's also just ridiculously wasteful, even for an industry already known for it. At least with my PC, I'm just swapping out parts for upgrades. I'm not dumping my entire PC tower in the garbage every few years.

We're so goddamn lucky that PCs haven't become these disposable machines like every other piece of tech out there. Video game consoles, phones and laptops are just the worst for this and while it's cheaper in the short term, the costs are there we just don't see them.


And this is another reason I avoid Nvidia whenever possible.

At least with laptops, I have perfectly serviceable Intel and AMD systems, but I have old laptops with Nvidia that are no longer supported and basically can't be used anymore for productive purposes.


> buy the newest MacBook, expense it

Obviously if the cost (to you, because your company will pay for it) drops to zero, you should consume a new laptop every year. It costs nothing and if nothing else the battery is newer!

Of course, that logic applies to almost anything (take free stuff) even if it is objectively stupid. Your company will pay for your laptop but not part of your electric bill for your home office, just buy a new MacBook every two days with a fully charged battery!


> a few years ago battery life was maybe 7 hours and now it’s 18

At what point do battery improvements become completely unnecessary? Who's out here using their laptops for 18 hours continuously? It's like an electric car that can go 2,000 miles on a single charge; YAGNI. The vast majority of people would be perfectly well-served by a smaller, lighter, less expensive laptop that cuts down the battery to just 10 hours.


18 hours advertised is usually running a locally stored video file on loop at medium to low screen brightness and some light web browsing.

My 18-hour laptop is more like 8-10 hours of real work usage, which is still fantastic compared to the 2-3 hours I would get at similar tasks 6-7 years ago.

Another big change is improvements to sleep and standby time. I can use half the battery, close the lid, and be reasonably confident that the other half will be ready and waiting the next time I need it.

EVs may actually be an apt comparison, where an EPA rating of 300 miles is more likely to give you 240 miles at highway speeds, or 150-200mi between charging stops since the optimal charge speeds fall off above a 70% charge.


10 years ago I could write software and take notes and browse the Internet for 8h a day on a MacBook air.


I recall the same - the first and second gen airs were to me the epitome of what a laptop should be. Solid feature delivery in a portable light package that was still humming after a flight from LA to DFW.


I want my EV to last a month on a single charge and yet some people are arguing against battery improvements. Crazy.


This is a straw man. The argument is not against improving batteries. It's about engineering products for what people actually need, and not over-engineering them for the fantasies that we construct regarding what their needs are. To continue the EV example, the majority of trips by car in the US are less than 3 miles. You could comfortably reduce an EV's range from 300 miles to 30 and sell a car that still suffices for the majority of people, and what you gain from that is decreased costs and increased efficiency, which disproportionately benefits both overall power consumption and charge time (remember the tyranny of the rocket fuel equation). Most people don't need a car that charges once a month, they need a car that lasts just long enough for the time between when it's convenient to charge.

The truth is that almost nobody actually needs a computer that can last more than a single eight-hour workday on a single charge. In fact, most people need even less than that; if you're working somewhere with wifi, you're working somewhere with electricity. It's rare that I need my laptop to last more than two hours, let alone eight.


> if you're working somewhere with wifi, you're working somewhere with electricity

I agree with most of what you said, but I think this isn't necessarily true. Many people who want to work from a cafe have access to wifi but not electricity. Sometimes those outlets are in high demand and most people don't get one.

Or maybe you're on a plane, and your laptop is trying to draw more current than the plane's crappy socket will give you (this just happened to me on a flight a week ago), so it only intermittently charges, not enough to keep the charge percentage from steadily dropping.

Not wifi, but you might be tethering to your phone, away from electricity entirely.

Or maybe you just are offline entirely, without wifi or electricity.

Granted, I do agree with your overall point: the vast majority of the time my laptop is plugged in, and when it isn't, I expect I'll be near an outlet again within 2-3 hours at the most. Though I wonder how much of that is inherent to my usage patterns, and how much of that is me modifying my behavior because my laptop gets pretty bad battery life.


Most people don't need a lot of things that they have.

Most people don't need laptops, or nice monitors, or ergonomic desks and chairs.

The issue with arguing about need in this way is deciding who is the arbiter of what people need versus letting them buy things that they find useful or convenient or just nice.


I'm pretty sure you're not ready to pay a premium for that.


> My 18-hour laptop is more like 8-10 hours of real work usage, which is still fantastic compared to the 2-3 hours I would get at similar tasks 6-7 years ago.

I’m not sure if you’re using an M1/2 MacBook but if you’re using windows laptops - 2 to 3 hours is pretty poor even for an old laptop (assuming it didn’t have an H series processor which is a whole another story).

My (dad’s) HP Probook with its 6th gen i5 processor could get between 6-10 hours of battery life when I got it around 2020, after some years of use. Sure, old batteries degrade, but 2-3 hours for a laptop less than 3-4 years old (or older, but with a new battery) is kinda poor to start with.


I have a fairly recent Dell XPS, maybe 2 years old. i9 and mobile RTX 3070. Realistically I get about 3-4 hours on it before it's dead. Pretty similar battery life if I'm on my linux partition too.


  > 2 to 3 hours is pretty poor even for an old laptop
i get this on my macbook pro m1, but it happens when i have some webpages that suck battery life (google docs, jira etc) and/or xcode/android studio open (just sitting there), oh and corporate mandated anti-virus...

i think it might be highly dependent on work patterns/environments


Never had a windows laptop that would last more than three hours...brand new or remanufactured. And I have used a LOT of windows laptops in my time, both used and new.

Except for the Lenovo Thinkpad X, or whatever they called it. I think I got ~8 solid hours of use out of them.

My 2022 MacBook Pro easily goes all day and into the evening under heavy use.


I’m comparing my experience with Macbooks with M1/M2 Max vs Dell XPS with Intel H-series and DGPUs.

Back in 2017-2019 3 hours of battery with visual studio, docker, etc. running was pretty standard for those as I remember it.


> My 18-hour laptop is more like 8-10 hours of real work usage,

And in which context do you find yourself needing to be even 8 hours in a row on your computer without being close to a power plug? You can even charge your computer in planes now!


> which context do you find yourself needing to be even 8 hours in a row on your computer without being close to a power plug?

It's summer. I was writing a paper in an outdoor pub. No plug available, unless I wanted to go indoors (I didn't). I used the entire battery. Granted, it was mostly VS Code being a power hog.

> You can even charge your computer in planes now!

Not if you're flying economy on British Airways.


Need? Fairly rarely. Not having to watch the battery percent tick down like a clock is nice though.

It charges when docked to my monitor at my desk, and I basically never have to worry about running low in any normal situation.

I can be in meetings all morning, plug it in over lunch, and be in meetings all afternoon without issue.

When working at home I can use it on the couch, or the patio, and when I'm done put it back in the bag for tomorrow without running to the charger.

Do I need it? No. I also don't need a nice keyboard, or a good monitor, or an ergonomic desk setup.


Part of the appeal is the convenience. Not having to pull out the brick and charging cable, sit next to an outlet, and then pack it all back up is nice.

Longer battery life also means that for trips, you don't necessarily need to bring a full size laptop brick at all and instead can charge overnight off of a tiny phone charger.


> Not having to pull out the brick and charging cable, sit next to an outlet, and then pack it all back up is nice.

But then even 18h isn't enough, and at some point you end up running to the plug in the middle of a meeting, because of course in the end you need to charge it (this happens to a different co-worker almost every week in remote meetings, and I don't even spend that much time in meetings).

> Longer battery life also means that for trips, you don't necessarily need to bring a full size laptop brick at all and instead can charge overnight off of a tiny phone charger.

Given that a small charger is already enough to give you 30W, I'm not sure it's going to make too much of a difference: if you keep your computer plugged in while working, the battery will slowly discharge, but under most loads I suspect it will still survive the day without issues.
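A back-of-envelope check of that claim; the 70 Wh pack and 40 W sustained load are illustrative assumptions, not measurements:

```python
# Hours of runway when the sustained load exceeds the charger's output.
def hours_to_empty(battery_wh: float, draw_w: float, charger_w: float) -> float:
    net_drain_w = draw_w - charger_w
    if net_drain_w <= 0:
        return float("inf")  # the charger keeps up; the battery never drains
    return battery_wh / net_drain_w

# 70 Wh pack, 40 W sustained load, 30 W phone charger:
print(hours_to_empty(70, 40, 30))  # 7.0 hours before the battery is flat
```

So even a charger well below the laptop's peak draw buys a full workday, which is the parent's point.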


> At what point do battery improvements become completely unnecessary?

A week of heavy CPU/GPU use, on full brightness. We should strive for better. I want to go on a trip without carrying my charger.


This is asking for a battery with hundreds of times more capacity, which, regardless of any future advancements in battery tech, would result in laptops with batteries which are hundreds of times heavier, hundreds of times more expensive, and using hundreds of times more materials than they could otherwise be (to say nothing of being hundreds of times more dangerous in the event of a battery failure).
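A rough sanity check on "hundreds of times"; the 1.5 h of full-load runtime for today's laptops is an assumed figure:

```python
# Capacity multiplier needed for a week of continuous heavy use.
hours_in_week = 7 * 24                # 168 h of runtime wanted
full_load_runtime_today = 1.5         # assumed hours under heavy CPU/GPU load
multiplier = hours_in_week / full_load_runtime_today
print(multiplier)  # 112.0 -> roughly a hundredfold increase
```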


When I first got my 2020 MacBook Air, I could use it from getting up in the morning until going to bed at night...close to 18 hours on a weekend. Some of that was sleep time, but most was browsing, writing, and watching video.

Now, almost three years later, I get noticeably shorter time, but I can still open my laptop at 7 in the morning, unplug it, and use it until mid evening without plugging it in. I've never had a laptop that lasted 18 hours, or even 12-14 hours as it does now.

When I got my MBA, my goal was to use it like an iPad...don't plug it in unless required to. I still don't have to worry much about it, even with battery wear.


>Who's out here using their laptops for 18 hours continuously?

"We should stop battery improvements because no one I know uses it" (to put it with a bit exaggeration) is certainly a novel take. Sometimes I put my laptop on the kitchen counter after doing stuff and not have to worry it'll be dead next afternoon when I need it for something.

You're forgetting one important point: batteries degrade. In 3 years your "10 hour battery" will be more like 3 hours.
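That fade is easy to sketch; the ~33%/year rate below is just what the 10-hour-to-3-hour figure implies, not a measured constant (real fade depends on chemistry, cycle count, and charge habits):

```python
# Remaining runtime after n years of geometric capacity fade.
def runtime_after(years: int, hours_new: float = 10.0,
                  fade_per_year: float = 0.33) -> float:
    return hours_new * (1.0 - fade_per_year) ** years

print(round(runtime_after(3), 1))  # a "10 hour" battery is down to ~3.0 h
```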


You need an 18 hour battery to get a solid, reliable 10 hours of use.


> At what point do battery improvements become completely unnecessary?

I find when I can swap in a charged battery I care less about any given battery's life.

Similarly when the battery is soldered to the laptop I would rather it have a life longer than I am likely to ever use just in case.


Or even the older setups where you had a relatively small battery in the device and a bigger wedge shaped battery that could clip on the bottom.

Or there were a couple of devices with dual battery slots so you could hot swap.

Then on odd occasions when you need it just take the extra battery/ies.


They change, but my 2015 ThinkPad 13 still delivers on performance, battery, screen, etc.

I'm a grown up now. I've got better uses for £2k right now. I'll get a new laptop when I need one.


You sound like you've fallen for some marketing. They want you to think that your laptop is "old" and you must upgrade. They want you to chase specs regardless of the benefit to your workflow because specs will improve forever.

I don't upgrade unless my computer can't do a thing that I need it to do.


"ThinkPad® X260 will go as long as you do with over 21 hours of continuous battery life". Thats a quote from Lenovo website from 2018. I can confirm this was in practice the case with extended battery pack.


Doing what kind of work? And with what kind of screen brightness?

I like Lenovo laptops, but I never got that kind of battery life on them; I got maybe half, or sometimes only a third, of what they advertised.


So you buy a new laptop every year?

What do you do with your old one? Throw it away? Or sell it? If you're selling it who do you think is buying it?

You do realise most people don't need the fastest processor, an 18 hour battery life, thunderbolt 4 or WiFi 6

And better screens and lighter weight are relative. If I upgrade my 10 year old laptop to a 5 year old laptop I'm still realising those gains. Next year a better screen will be out. Presumably we can agree that today's best screen is adequate, despite the knowledge that in the future it won't be the best screen, so if it's adequate today, why would it cease to be adequate in 1/3/5 years?


Well, in 5 years I expect most dev work will be done in VR, collaboratively with remote teams, with multiple virtual screens and AI assistance. Another world. Laptops and cell phones will be gathering dust with the desktops. Things change fast. Remember the world without the iPhone? That was 16 years ago. So when you talk about using a 5-year-old laptop to save a few dollars, it just doesn't make sense. You can listen to music on a CD player; sounds great, doesn't make sense. Using a heavy, hot i7 MBP with 6 hours of battery life from 2017 also doesn't make sense, to save what, $1200, rather than buying a 15-inch MBA? Not today - this machine is your everything. Spend some money already


>this machine is your everything

Philosophically I disagree.

Further, I spend more time in my bed, but I didn't spend anywhere near that much on it, and I haven't replaced it in 10 years.

>5 years I expect most dev work will be done in VR collaboratively with remote teams

Really?

You know that people lives are different to yours. I'd guess that most people on HN aren't employed as Devs, myself included. Of those that are Devs I'd guess that most aren't using the latest and greatest processes and languages.

There's still production COBOL code; do you think those devs will be moving to VR anytime soon? And that's assuming your prediction is correct, which I highly, highly doubt.


VR sucks for work. And multiple virtual screens... FVWM did virtual desktops since... 1994?


How about we save that $1,200 and donate it to something like Rainforest Alliance instead?


Tbf the whole point of the article is to use less stuff. If we all did that then we'd hopefully not be in the situation of having to chop down the rainforest.

So logically as it's consumption that's causing that destruction, you should really be donating the $1200 when you buy the new laptop, not when you don't.


It is well-known that planar transistors are more reliable than FinFET.

It is usually agreed that the end of planar transistors was the last process node of 28 nanometers. Everything below this (14, 10, 7, 5, 3, and 2 nanometers so far) is less reliable.

The fastest, lowest power, and most reliable CPU in a laptop will be on a 28nm process node. Everything beyond this sacrifices reliability for speed and lower power.

(And if you really care about lower power, then you are on ARM.)


The failure rate of CPUs could go up 10x and it would still be a rounding error when quantifying what causes laptops to be decommissioned.


A quick check of Google shows that TSMC does have a 16nm finfet process that is tested and approved for use in automobiles, which is a much more punishing environment than laptops or phones.

As a measure of redundancy, multiple finfet transistors can be wired together to act as one. It would not surprise me if this was heavily exploited in automobile applications.

https://electronics360.globalspec.com/article/16795/2-automo...


I like my Framework and the 2 mini PCs made from upgrading the internals. The slots are pass-through Thunderbolt 4, just like laptops with only USB-C ports. Not sure why they offended you. Get 4 USB-C inserts and pretend they aren't changeable if that'll calm you down. Battery life does suck though, I'll give you that. It's good overall at the cost though


> I say buy the latest MacBook, expense it, factor it in as a monthly cost. These are expendable and essential tools. They pay for themselves

Yay, another top comment that steamrolls the posted article. I say don't buy the latest Macbook: instead repair, refurbish, and get true lifetime use out of these expensive devices. They're expensive to manufacture, expensive to buy, and expensive for our planet and people to just throw them away every year.

They're not expendable at all, here is just one example of the human sacrifice involved in their manufacturing: https://en.wikipedia.org/wiki/Protests_against_Samsung


If you're trading it in then it's going to see more life, refurbished.


>> “Laptops don’t change”

> because a few years ago battery life was maybe 7 hours and now it’s 18

seconded for linux as well. a few years ago battery life was maybe 4 hours, and now it's 1.


For OP's use-case, which is mainly writing articles, reading and such, an older laptop will work fine. Especially when powered from AC.

My MacBook Pro 2014 base model can still hold a candle to MacBook Air M1 for general use (even dev work to an extent), and that’s saying something. It’s slower, but doesn’t feel 9-years-old slow - if you know what I mean. It doesn’t drive me nuts to use it and I can be productive on it (my daily driver is a MacBook Pro M1 with 64GB RAM and I own a MacBook Air M1, so I speak from experience).

Also, we already do this for cars (the used-car market is huge), so why not laptops? In fact, with cars there are arguably stronger reasons to go with newer tech (safety, automatic cruise control) than with a newer laptop.


Always assumed it was because laptops start at $300 and cars start at $30,000.

Don’t call me out on the actual numbers, I am only really paying attention to the zeros.


The Framework laptops definitely run hot. I gave up on mine because it would overheat several times a day, throttling down to 200MHz. It sits in a box unused now and I consider it ewaste; I don't have the time or patience to bother with it anymore. I reached out to support maybe 6+ months ago, but I never finished following their instructions because the laptop was just so bad and I'd wasted so much time dealing with it that I don't care anymore.


> I consider it ewaste

Leave it in a box in a cupboard and it is. Give it to somebody who will use it (the next time you are hanging out with geeks) and it isn't e-waste. Offer it for free pickup in the next Framework thread - somebody wants it enough that it won't cost you more time than a drive format.


Yeah I would suggest you do this instead of keeping it in the cupboard. Lots of people looking to pick one up for cheap.


Just being able to replace the battery is a huge win.


> I keep hearing this argument for not buying a new laptop every year and it just doesn’t hold water. I say buy the latest MacBook, expense it, factor it in as a monthly cost.

this is why Apple is a trillion dollar company. because people mindlessly buy any new thing they put out, regardless if it makes sense or not.

https://youtube.com/watch?v=Ydje1ivynIM


  > this is why Apple is a trillion dollar company. because people mindlessly buy any new thing they put out, regardless if it makes sense or not.
not to defend apple or anything but whats the complaint here?... that apple is really good at capitalism?


> whats the complaint here

It's: don't mindlessly buy any new thing Apple puts out


why? whats wrong with people buying things that they like?


you seem to keep ignoring this key word:

mindlessly


yes, because you haven't provided any evidence for these "mindless" consumers, so i'm not going to try and debate that point (besides, calling apple customers mindless and then saying that's why they have a 3 trillion dollar market cap is nearing troll territory)

my point is:

if i (mindlessly perhaps?) concede the point that those consumers are mindlessly spending money on useless things... then so what? its their money, let them do what they want with it. why make value judgments?


because we aren't talking about candy here. we are talking about purchases that are hundreds or thousands of dollars. and people are normalizing making these purchases yearly. sure if you're dumb and rich go ahead buy what you want. but the culture pushes this mindset on people who cant afford it as well.

unless you have the money to blow, critical thinking is an important skill: to ask yourself "is this a waste of money?", "should I really be buying one of these every year?"


>we are talking about purchases that are hundreds or thousands of dollars

...with their own money. For items they see a use for. What they consider to be worth it.

----

>to ask yourself "is this a waste of money?", "should I really be buying one of these every year?"

"Hm, is this worth it to me? Yes." There you go.

---

I agree every year is a bit much, but I'm not trying to stop people from spending their money in way they find useful. It's not my money. I'm ok with that.


  > but the culture pushes this mindset on people who cant afford it as well.
so the problem is with consumerism isn't it?


at the moment the problem is with you.


there's no actual need to be an ass. If you're unable to have a discussion, don't comment.


follow your own advice


> 3 hour battery life.

Maybe if "laptops don't change", sure: https://www.youtube.com/watch?v=tuw-YpbFkkM


Using the modular GPU


That sounds like batteries change


Every year is a lot

But I think the article is more about impulse control taken to an extreme because they lacked it in the past


I got tired of laptops being: non-upgradable, poorly designed, overheating, having cheap components, cheap materials, so I'm switching to a backpackable mini-ITX build. For a journalist, even a first gen iPad should do the trick, but for quick compilation and development in general with all those heavy apps and daemons even a relatively fresh laptop CPU just doesn't keep up. The only two downsides are: no battery, space.


> the only two downsides are: no battery, space

And not being able to use it on the couch, on the toilet, outdoors in your backyard, on a plane, on a train and so on.


It's not that big of a problem compared to a performance oriented laptop, which is most probably on the cord 95% of the time anyways.


> on the toilet

sure you can, there's even a little shelf to hold the keyboard while you poop


There are also dedicated backpack PCs that are designed to run in the backpack, usually for virtual reality. I think those are super expensive though... (search "VR backpack")

Also they probably take up the whole backpack instead of being "backpackable". And they might be built like laptops (proprietary form factor, mobile/ultrabook CPU, etc.) depending on the model.


Nah, they are based on laptop components and cost twice as much.


I have not directly bought a computer in 13 years. I’ve been a programmer for 10.

My daughter uses the one I was issued at coding bootcamp in 2013. My home recording studio uses a 2010 mac mini. My wife uses an out-of-warranty one from my work, as do I for my home dev machine. My son uses a really old cheap laptop that was laying around.

I installed SSDs and maxed out the RAM on all of them.

Maybe people are attracted to the latest shiny thing, or give up early on a computer that would go to scrap without a little TLC.


I feel like 2010-2020 was a stagnant time for hardware, which was most visible in the lack of huge leaps in gaming console quality. AI is changing that. Having tons of RAM and VRAM really does open up options that would simply be impossible before.

Running large AI models locally is a bit niche right now but I'm doubtful it will stay that way, so I say it's a good time to value bleeding edge machines


2010-2020 includes the launches of Intel's Sandy Bridge (and to a lesser degree Haswell), AMD's Zen (and with it Threadripper), and many great products from Nvidia (980, 1080, RTX, Titan). The 2010s were also the decade of Apple Silicon, mostly on mobile, culminating in the M1 launch in 2020. In 2010 SSDs were just becoming a thing, but were still pretty anemic small SATA ones with notoriously buggy firmwares; no comparison to modern-day NVMe drives. And finally, after decades of stagnation, in the 2010s we saw great improvements in displays, which got HiDPI, HDR, and larger gamuts.

Sure, it might pale in comparison to the absolute wild ride that was the 90s, but I would not call the 2010s stagnant as a whole.


> Furthermore, Linux operating systems do not steal your personal data and do not try to lock you in, like the newest operating systems from both Microsoft and Apple do.

Is recent macOS accused of stealing or shown to steal personal data? Platform lock-in is not news, but that would be. This is an earnest question; maybe I'm in a bubble and missed a big story.


Note that the only data tracking Apple was caught doing was in iOS, not macOS, but there are many signs that Apple collects aggregated usage data in macOS (which probably cannot be later reduced to expose the behavior of any single customer).

2021 Jun 7:

   > "Anyone opting out of tracking right now is basically having the
   > same level of data collected as they were before. Apple hasn't
   > actually deterred the behavior that they have called out as being
   > so reprehensible, so they are kind of complicit in it happening,"
   > Seufert explained.
* https://www.macrumors.com/2021/06/07/apps-continuing-to-trac...

Apple did finally close this loophole in iOS 14.5, causing Facebook to lose what some analysts think was ~$10 billion. https://en.wikipedia.org/wiki/Identifier_for_Advertisers - however, by March 18, 2021, Mark Zuckerberg stated that it could actually encourage advertisers to use the explicit targeting and tracking happening on Facebook, and he made it sound like a win-win.

2022 Nov 8:

   > The recent changes that Apple has made to App Store ads should
   > raise many #privacy concerns. It seems that the #AppStore app
   > on iOS 14.6 sends every tap you make in the app to Apple. This
   > data is sent in one request: (data usage & personalized ads are off)
* https://gizmodo.com/apple-iphone-analytics-tracking-even-whe...

* https://techcrunch.com/2022/11/14/apple-faces-new-lawsuit-ov...


Thanks for all the details. The App Store app bit is a very bad look. The rest seems on the up and up, or at least isn't the OS phoning home your personal data.

I'm inclined to think the quote only meant to claim that Windows and macOS try to lock you in, and that Linux doesn't steal your personal data. The sentence structure is ambiguous.


One really good reason to buy new gear: "Discovery of new UEFI rootkit exposes an ugly truth: The attacks are invisible to us. Turns out they're not all that rare. We just don't know how to find them." https://arstechnica.com/information-technology/2022/07/resea...


Oh how I miss the form-factor+keyboard of my X40/X60. Seeing that tent of an X60s has me feeling very nostalgic. One of the few devices capable of doing that for me.

It's too bad they had so many flimsy plastic parts until the X220 era, which still tends to crack palmrests and bezels with use.

If someone packaged a modern SoC beneath an X40/X60 keyboard in a rugged light chassis with all the modern display/battery junk I'd be all over it.


Same. It really was the best screen size/ratio for me. I was basically carrying it everywhere.


IBM used to make some genuinely great end-user stuff.


I used to think that as long as all you needed to do was some web browsing, older or underspecced machines were fine. But in the last few years websites have become insanely resource intensive. Even using something as seemingly simple as Gmail is a frustrating experience on older laptops (speaking as someone who just upgraded from a 2017 dual-core MacBook Pro).

I also have a hard time believing that the same hardware could handle that task much better if it used some lightweight Linux distro. The browser engines are the same after all. You could swap Gmail for a lightweight email client, but I find that this approach of choosing software compatible with the capabilities of these low-powered machines runs into limitations sooner or later, e.g. when someone sends you a Microsoft Office file, asks you to collaborate on a Google Doc, or insists on doing a meeting via Google Meet or Zoom.

I may also want some of the other amenities modern laptops have to offer, like bright screens, long-lasting batteries, and the possibility of using a modern IDE or IDE-like editor instead of having to learn Vim.


> The browser engines are the same after all.

I find Firefox and derivatives are overwhelmingly preferred to Chromium and derivatives on older hardware.

Chrome has a history of disrespecting users' machines to make itself seem more performant.

https://arstechnica.com/gadgets/2014/07/why-google-took-year...


I have the same problem on my 2015 Macbook Pro. I can do all the usual work I want, but websites are getting slower and slower, burdened by heaps of JavaScript, autoplaying video and what not. Sometimes I even have input lag when typing in a search bar(!). The Ikea website is a good example of a bloated website.

Adblock only helps a bit, and most websites don't work without javascript.


>Even using something as seemingly simple as Gmail is a frustrating experience on older laptops

I don't think Gmail is simple. It runs a lot of Javascript in background.


Huh, I have a Thinkpad t450s from 2015 which I’m sure has less power than a 2017 MBP, and I use it for email and streaming YouTube just fine. It runs Linux and I use an adblocker, so maybe that’s a difference? Or maybe I don’t mind the decreased performance as much.

I don’t do my programming on it, so I can’t speak to that.


I have an older laptop which runs Debian 12 with xfce rather slow. Browsing the web with Firefox goes fairly slow. Clicking a button can take a second before something happens. I changed the OS for GhostBSD (=FreeBSD) and now have a major improvement in speed. Buttons and links in Firefox feel snappy again.

I was not aware that FreeBSD would be this much more resource-friendly.


I last bought a laptop (in fact any PC at all) in 2015.

It was pretty beefy for the time. A desktop 4790K CPU, 32gb RAM and a GTX 980M. It was a long time ago, but I think I paid around £1500 for it.

Shout out to https://www.pcspecialist.co.uk/ for a great long lasting machine that is very open for maintenance, long before Framework was around. I will absolutely buy from them again next time. Removable battery, replaceable hard drive, ram, cpu, graphics card. Easily open-able case (screws, but I've had it open). They sold me a replacement keyboard for something like £30 when I smashed a few keys by dropping my camera on it.

My inner tech geek is keen to buy something new, but this one is still going strong, and I can't justify spending money when this one works so well. It still plays all the latest games. It's only had two problems over the years. 1) Obviously, the original battery doesn't last very long now. 2) The CPU started overheating after about 5 years when pushed, so I popped the cover open, took the heatsink off and reapplied fresh thermal paste. Has worked perfectly since.

I mean honestly, the single threaded CPU performance only seems to be about 50% behind modern chips. I don't really see many machines even today with 32gb ram.

Things it struggles with these days that I suppose would require a new machine if it bothered you:

- 4k gaming.
- VR games are a bit too choppy unless you turn the settings right down. (HL:Alyx is playable at pretty much lowest settings)
- Probably going to struggle with any generative AI stuff given the older GPU, but I've not tried it.

I code on it, but I don't do high intensive stuff like video editing, 3d work, rendering, etc, so perhaps it would struggle with that kind of stuff.


This making sense depends on your career. It makes sense for writers, but not for everyone. Why? Because silicon is cheaper than carbon ie the hourly costs for most machines tends to be far cheaper than my salary. In my case it’s more expensive to buy something cheap and slow, than it is to buy something that allows me to maximize my throughput per hour.


I really like the idea of using an SD card for the home directory, because that's the thing that changes constantly, and it's a pain trying to synchronise the home directory over several machines. With one instance, that's backed up regularly, it makes life fairly simple. Now I just have to work out the mechanics of implementing it smoothly.

I have an archive disk/directory that holds all the 'write once, keep forever' stuff like photos, music, ebooks, etc. That can be rsync'd periodically and that won't take much time because the archive grows fairly slowly over time, and rsync will only need to transfer those relatively few additions.

I'm also somebody who prefers the feel of my old Lenovo ThinkPad T410 from back in 2010 to my much later P53. Perhaps I should resurrect it with the upgrading of its 2TB hard drive to a 2 or 4 TB SSD. I must look into that.


A portable home dir works if you have the same os and software on all the laptops.

When I bought a macbook the first time (ostensibly to develop for iphone), I hated the UI, so ran linux in a virtualbox as my main os.

I think that would work really well from an external drive - but an sdcard is way too slow, it should be on a usb c nvme.


> but an sdcard is way too slow, it should be on a usb c nvme.

Hmm, yes, that would probably be better and faster. And not only that, it could be possible to include a grub boot partition as well. I wonder if that would easily be doable, or would be just another can of worms.


I have tried this, and also using e-sata, though not as a daily driver, and it does work, but as you say it can be a bit wormy to set up; mbr vs gpt partition table, efi vs legacy bios support, disk numbers can change depending what other disks are on the machine, etc.
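One way around the shifting disk numbers mentioned here is to reference the partition by its filesystem UUID instead of a device name. A sketch (the device and UUID below are made up for illustration):

```shell
# Look up the filesystem UUID of the external drive's partition;
# unlike /dev/sdX names, the UUID survives renumbering across machines.
blkid /dev/sdb1

# Example /etc/fstab line mounting that partition as /home by UUID.
# 'nofail' keeps the machine bootable when the drive isn't plugged in:
#
#   UUID=0f3a9c2e-1111-2222-3333-444455556666  /home  ext4  defaults,nofail  0  2
```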

One other problem is that boot-from-usb is usually not enabled by default so you have to fiddle with bios boot order every time.


> One other problem is that boot-from-usb is usually not enabled by default so you have to fiddle with bios boot order every time.

No, leave the normal boot order as-is. To boot from the USB, just hit the 'boot-menu' key on boot-up. The 'boot-menu' key varies according to brand of computer. In a Lenovo, it's F12; in an ASUS, IIRC, it's ESC. Just select the correct boot-device and away you go. It's the same method as when you install the OS on the computer from a USB.


"The 'boot-menu' key varies according to brand of computer."

Right I have a hard time remembering the bios keys, let alone the boot menu ones too, but yes this is better on foreign machines, and if you can't or don't want to permanently set the boot order.


> I have a hard time remembering the bios keys

For what it's worth, I often put a tiny sticker on the bottom of the monitor that just gives the two keys for getting into the BIOS menu and the Boot menu. It doesn't need to remain there for long till it soaks in. Then again, some brands of machines will display a prompt at boot time with those keys, if the BIOS setting is enabled (Lenovo, etc).


I knew a guy who would buy a laptop, use it for like 3 months, then flip it on eBay and buy another. He said this is how you get good value for money -- flip it before it loses much of its resale value and buy new with the proceeds.

I'm not sure why people would do this. I use computer equipment to the point of exhaustion. I still have laptops from the 90s that work fine; while they will never be a main they are fun to play with. My main laptop currently is a ThinkPad from 2014 -- it does just about all that I need it to do without a hitch or stutter.

My main PC is a Ryzen desktop I bought in 2018. It's probably due for some upgrades, but I cannot afford them currently.


These days you’d have to keep extensive tax records to do that scheme because eBay is going to report earnings over $600 to the IRS.

And you’ll get eaten alive by eBay fees.


You can use Facebook marketplace, Craigslist or another site.


I buy back my decommissioned work laptop for $50, which means I get a fairly new ThinkPad that works pretty perfectly with Linux.


It should be illegal to sell laptops where the dissipative power of the cooling system is less than the thermal power of the CPU/etc.

Also should be illegal to lock me out of fan control, like Dell does. As always, never buy Dell, people. Teach 'em a lesson for advertising "space age cooling" that slowly cooks the laptop because their default fan profile was apparently designed by a pretty face in a Versace suit.


I recently did the same - needed a laptop, but balked at the costs for an "occasional use" device. Even a non-shite Chromebook (so decent ram and non-awful CPU) was £500+.

I got a used ThinkPad X1 from 2017 for £200 from ebay and it has been superb so far. It is modern enough for me - USB-c charging, Wifi-5, win11, 3ghz quad-core i5 etc.

I feel like it is very much luck of the draw for what the condition is like when you buy used, but for the ThinkPad X1 I got it appears to have held up very well.

Highly recommended. Will do the same again when I need another.


>>Even a non-shite Chromebook

I don't have a shit chromebook exactly but I have a shit big-box retail HP laptop that was ~$200 new that I bought on clearance in 2016 (so in the box but not recent), wiped the windows install and put CentOS7 on it.

It got me through most of a CS minor and a geology (focus geophysics) major.

It still works pretty well for an "occasional use" device, didn't handle Office365 very well but I think that was as much firefox as it was the hardware.


HP ZBook 15 first gen here, 9.5 yo. I use it to develop with Django, Rails, Phoenix. Docker, psql, Mongo, mysql, whatever js frontend my customers choose. Debian 11. I upgraded it to a double SSD configuration, 32 GB RAM. I replaced worn out keyboards a few times, the battery once a few months ago. It works well and I hope that it keeps going for a while. It still sells second hand for about 600 Euros last time I checked, too much to buy two of them as mines of spare parts.


I've actually gone even farther. The last 4 laptops I've bought have all been 50 dollar used Chromebooks. Pull out the write-protect screw, flash new firmware and install linux. If it's got enough eMMC space Windows 10 LTSC runs too (depending on model) but I find xubuntu or Fedora XFCE plays nicer with 16 gigs of space.

I have a beefier desktop for serious things, but in terms of everyday machines they work fantastically.


Interesting. Any models you'd care to recommend? I'm looking for a cheap, functional Linux laptop to replace my ThinkPad, and this option sounds promising.


For the aftermarket firmware that allows non-ChromeOS OSs, there's a compatibility list at https://mrchromebox.tech/#devices that will show you if full UEFI firmware is available, and the write-protect method you need to disable.

I usually browse around Mercari looking for a system with specs that are acceptable, then check the list to see if it's supported. I haven't paid more than around 50 bucks for any of them.


I have a Thinkpad T460p myself, and its battery lasts as long as, or longer than, most new laptops I've seen! I was just searching for a new one for a friend, and noted that most new laptops are absolute garbage, total crap. Most have the same specs as mine, almost a decade later, WTF (yeah, "new gens", whatever). It's really sad; we are dropping all quality for mere quantity and consumerism...


I bought Lenovo 11e chromebook for $24 with free delivery on ebay. Disassembled the unit and unscrewed write protection screw, installed new BIOS and GalliumOS. Works pretty OK'ish. I use it to watch videos: youtube and video services I need (movies online) work well. Battery is pretty good as well.

There is no magic though. You get what you pay for - I can't do modern JavaScript development on it.


The original article makes a good point but neither it nor the comments here mention my main concern with used computers-- They don't have the same clean/new-system "security" properties as a new computer.

Who knows what malware/security rootkit was installed on some consumer machine that persisted in some corner of it and didn't get detected/wiped? It's kinda a crapshoot in my view.


I'm more or less in that boat. I did buy my laptop, a 2013 MacBook Pro, new, but 10 years later it's still fine for my purposes. For productivity purposes, the only real concessions to its age are that I did have to get the battery replaced about a year ago, and I do some light adblocking (no autoplaying videos or heavy scripting or anything like that, but I still let static ads through) to keep webpages usable.

For light gaming I recently bought a refurbished 2019 desktop for $250. I mostly only play older games (because I'm cheap), so that was more about Apple dropping support for 32-bit apps than it is about the MacBook not being powerful enough, though I am admittedly happy about having a computer that can play KSP without everything turned down to the lowest quality level. Overall, happy there, too - it's hard for me to imagine that a new computer would provide enough utility to justify the added cost. The latest generation of this computer retails for four times as much money as I paid for the used one, but the specs are very, very similar.


I use a Core2Duo laptop on a regular basis and all my needs are met, except for poor battery life. More cores do not give me such a boost as the migration from a single-core Pentium 4 did. I still use Windows 7 on half of my machines as the best OS of my life ever, but I am migrating to GNU/Systemd/Linux on all new laptops because Windows 7 is as much vendor presence as I tolerate.


The last 3 laptops I bought were a pair of Lenovo e570 machines and a Lenovo e580.

Two went to college kids and I keep the third.

When I configure a laptop, I use less RAM than I would otherwise want (16 GB lately) and make up for it with a fast NVME boot drive as swap space. Intel NVME drives are fast and power efficient. That gives me a better mix of performance and battery life than would a laptop with more RAM.
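A minimal sketch of the swap-on-fast-NVMe approach described above, run as root; the 16G size and file path are assumptions for illustration, not from the comment:

```shell
#!/bin/sh
# Put swap on a file on the fast NVMe filesystem instead of buying more RAM.
fallocate -l 16G /swapfile          # reserve space on the NVMe drive
chmod 600 /swapfile                 # swap must not be world-readable
mkswap /swapfile                    # write the swap signature
swapon /swapfile                    # enable it immediately
echo '/swapfile none swap sw 0 0' >> /etc/fstab   # persist across reboots

# With fast swap, a higher swappiness makes the kernel more willing to use it:
sysctl vm.swappiness=100
```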


Wouldn't the increased swap load reduce the lifetime of the nvme drive?


I'm not worried about service life when I buy Intel NVME drives.


My laptop runs on Linux Lite, one of several open-source operating systems specially designed to work on old computers.

That used to be a better tip than it is now. These days the thing that bogs the old computer down isn't a heavy OS, but all the heavy sites you visit with that OS. There's only so much you can do to get your old Bay Trail to play 4K Youtube.


> According to the most recent life cycle analysis, it takes 3,010 to 4,340 megajoules of primary energy to make a laptop

4340 MJ is around 135 litres of gasoline in terms of primary energy, or a little less than 36 US gallons.

Commuters blow through this much in 1-2 months, depending on distance.

My point is that if you're concerned about embedded energy, there's a lot of other, lower hanging fruit to pick.
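The conversion checks out, assuming roughly 32 MJ per litre of gasoline (exact figures vary a bit by source):

```python
# Back-of-the-envelope check of the embodied-energy comparison above.
MJ_PER_LITRE = 32.2           # assumed energy density of gasoline
LITRES_PER_US_GALLON = 3.785

embodied_mj = 4340            # upper bound quoted for one laptop
litres = embodied_mj / MJ_PER_LITRE
gallons = litres / LITRES_PER_US_GALLON

print(round(litres), round(gallons, 1))   # → 135 35.6
```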


Energy usage isn’t the only thing to be concerned about. There are resources that have to be extracted, and non-recyclable waste piling up in huge quantities.

Seems like a fine fruit to pick if you consider broader environmental impact.

(Also, we can do multiple things at once. Burning an extra 36 gallons of oil if you don’t have to is still bad.)


I want to agree, but fossil fuels are generally non-recyclable and extracted on a much larger scale.

I mean, global production of cobalt - a highly controversial material - is around 200k tonnes.

By weight this is approximately 1.6% that of daily oil production - 90mln barrels a day.
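The 1.6% figure is easy to verify, assuming the standard approximation of ~136 kg of crude per barrel:

```python
# Annual cobalt production vs. a single day of oil production, by weight.
KG_PER_BARREL = 136                    # assumed average mass of a barrel of crude

oil_tonnes_per_day = 90e6 * KG_PER_BARREL / 1000   # ~12.2 million tonnes/day
cobalt_tonnes_per_year = 200_000

ratio = cobalt_tonnes_per_year / oil_tonnes_per_day
print(f"{ratio:.1%}")   # → 1.6%
```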


I agree with TFA. Probably the best value for the buck is a 3 year old high-end Thinkpad (T or X series); you can often find one for €250-350, as many businesses will be replacing whatever is at the end of their 3 year cycle.

I've been getting a new one roughly every 3-5 years and they definitely stay good enough for another 5. My latest score is a T495 (replacing an 11 (!) year old X230); it has 4 cores (8 threads), 24GB of memory, two NVMe drives, 4-5 hours of battery (~270 cycles), full HD 400 nits screen, and a GPU that actually handles some games. It's definitely less powerful than my M1 Mac mini, but it's good enough to the point that I'm not missing the extra horsepower all that much, even when doing "Actual Work" (it can get a tad hot tho).

Also, developing software on less powerful hardware has another, indirect benefit: your software tends to suck less, because you have fewer cycles to waste.


Well, I'm retired, so I don't do much emails or zoom anymore. I do serious stuff now, like Cyberpunk, MFS and Hogwarts. So I need serious hardware, updated often. And I'm constantly on the move too, so serious hardware needs to fit into my backpack. Sorry, snowflake millennial applebooks not up for the task.


Believe it or not, Apple laptops are now quite capable of gaming, and Apple’s recent release of a game development porting toolkit shows that they’re getting serious about the space:

https://youtu.be/bIPl71ZeOYk

40fps in Cyberpunk 2077 on a Mac laptop including a translation layer.


1080p, no ray tracing ... long way to go.


Sure, but you’re really missing the point I’m making here.

It’s good enough to play any modern game you throw at it, even poorly optimized and demanding ones like Cyberpunk.

So all it needs is better software compatibility and you have a lot of Mac users who are no longer motivated to buy a second system. Apple sells 1/4 of all laptops in the US to the most affluent segment of the market.


I would still be able to use my 2013 Macbook Pro if I had someone around who understands that the GPU-related issue can be solved by "holding the u8900 chip down with rubber material"

https://www.youtube.com/watch?v=lumJ0cN5oyY

https://www.reddit.com/r/macbookpro/comments/ek0pmi/comment/...

However, Apple declares your device "vintage" and repair shops avoid it like the plague, because they only know how to replace the entire board. That's when you decide to buy another laptop.


Nice read. I do the same, but with 3-6 year old laptops. I recently bought an HP ProBook 440 for 450€, you can still buy it new for 1050. The “used” one was brand new, there are people that buy such stock from companies and when they say “No signs of usage” they mean it, I so far got 2 spotless machines.


I rely on laptops provided by corporations I work at, so here's the breakdown of the remaining electronics:

1) samsung s10 (2014) -> iphone 8 (2018) -> iphone 13 mini (2022) -> no replacement planned since Apple dropped mini lineup.

2) gaming pc: amd fx-8350 (2012) -> i7-6700k (2016) -> 7800x3d (2023) -> hopefully no replacement till 2027

3) tablet: nexus 7 (2012) -> galaxy tab s3 (2018) -> ipad pro 2021 (2022) -> unless battery completely dies earlier than in 4 years like it happened with nexus 7 (making it stationary device).

So overall, my update cycle is every 4 years on avg, with PC upgrades having longer cycles thanks to exceptional performance of the chosen platform at the time

I.e. I still have i7-6700k working at another location and delivering strong both for productivity and gaming use cases.


Somewhat meta, but having seen the breath-takingly shitty hinges on many HP laptops including mine, followed by all the inevitable hardware issues with nearly every laptop at home (choose between buggy/unreliable/disabled x touchpad/left speaker/audio jack/webcam/Bluetooth/WiFi/screen/hinge/chassis shell/keys) made me realise that the only “true” solution is to spin your own laptop or similar.

Frameworks may be good starting points for most, I prefer smaller devices and hope to build a cyberdeck with a 4”*4” SBC as soon as I have the necessary resources.

I am done with troubleshooting glitchy hardware/software. (Modern) HP can go burn in an eternal hell. (Oh, and Lenovo and Asus are at the gates as well.)


+1

At the high end, windows laptops are usually a coin flip. If you want any guarantee you need to basically sign on for a PhD level of research and review watching to make sure you don't get a turd.

At the low end/middle end... God damn it's rough... The average quality, from poor screens to poor keyboards to bad thermals, you name it... just plain sucks.


I'm still getting great mileage out of a T430 I bought in 2017, and I'm not particularly nice to my laptops. The one frustration I've had recently is that it can't drive my nice 38-inch monitor at its full resolution, so I've started eyeing a T470.


The T430 is a rugged laptop, but it’s getting pretty far up there in age. For example, I use mine in the shop to read PDFs, update the project’s Trello board, fetch software updates and drop them on SD cards, etc.

I used to think Trello kinda lagged. Great tool, but man it’s kinda slow. Then I opened it on a modern computer - it’s just the old hardware holding it back. Same with PDFs - these are graphical 11x17 pages, and it really does get bogged down easy. You’re gonna be amazed how much faster everything feels when you get to the newer machine.


Go for the T480, it has 4 cores and 8 threads compared to the T470 2 cores 4 threads. Big jump if you do anything like compiling.


my brother is fanatically opposed to buying anything new. when i wrecked my last new laptop, by spilling a gin & tonic into it, he got me a S/H HP Z15 workstation on ebay, and i must admit i have been pretty happy with it. i will think again before buying new.


Although capitalism could provide us with used laptops for decades to come, the strategy outlined above should be considered a hack, not an economical model. It’s a way to deal with or escape from an economic system that tries to force you and me to consume as much as possible. It’s an attempt to break that system, but it’s not a solution in itself. We need another economical model, in which we build all laptops like pre-2011 Thinkpads. As a consequence, laptop sales would go down, but that’s precisely what we need. Furthermore, with today’s computing efficiency, we could significantly reduce the operational and embodied energy use of a laptop if we reversed the trend towards ever higher functionality.


laptop sales would go down, but that’s precisely what we need. Furthermore, with today’s computing efficiency...

Somehow these conversations never seem to talk about how we would not have current levels of compute if everybody kept their computers for one or two decades before buying another. Good or bad, right or wrong, the present state of affairs only happened with the present replacement cycle.

I'm not arguing we have to keep the present replacement cycle, but I think it needs addressing.


> Good or bad, right or wrong, the present state of affairs only happened with the present replacement cycle.

Hmmmm... I am not sure that those two things are as correlated (much less causally linked) as it might first appear.

We could have the same "average" pace of innovation, just delivered between longer intervals.


Ok, you’re probably right as long as your total spend in dollars is the same, i.e. if everyone buys laptops half as often but pays twice as much. It’s the spend that funds the advancements.


Content aside for a second:

I have been trying to find your magazine again for so long. This scratches an itch. Thank goodness.


> This is facilitated by the fact that laptops are now a mature technology and have more than sufficient computational power.

Depends what you do with them, if you program, edit movies, do 3D rendering or play games, there will never be enough power.


depends on what you're programming tho. For web work just about any old laptop will work. AI, number crunching, 3d? yeah you need the best you can afford.


I'm the family IT guru so I always give this advice when buying a new laptop or desktop: Set a budget on how much you're willing to spend and don't try to save from that amount.

Buy the best you can for the amount you can afford and you won't feel bad about it. If you could only afford something really cheap and it dies in 2-3 years, you won't feel too bad about it and if you can afford something else, rinse and repeat. If not, try to get it fixed.


Apple is a business first and foremost. Software blocking the replacement of the same part because it has a different ID is trash. More money to them.

looking back what mac have is it closer to Linux than win (I know subsystem Linux exist) - Linux is but Programs is key for market You can't import ios apps to win It's monolith

Unix system and then modules for desktop etc (hmm it wouldn't work how I imagine) /

Animation is a problem Graphics 4k Before you click then need to be spectrum of color time to react to do a function it's weird you didn't speed this too much features bare bones isn't great either Hmm


You can get an i5 T430-T480 or X220 to X280 for around 200usd (excluding tax and shipping) depending on spec configuration. For journalism, you don't need a fancy laptop more expensive than 200usd.


Same here, among my personal computers, the most recent one is 5 years old.

Even for 3D programming, if the Web is the main target, the 3D standards are made for 2009 hardware (WebGL 2.0), or 2015 (WebGPU), so having the latest GPU model doesn't make any difference.

As much as I like to program graphic related stuff, I am not a one person AAA studio, nor going to create the next AI startup, so the NVidia MX150 is mostly on vacations.

Dealing with Docker and friends, is something I leave for work, so again, nothing that really impacts my private computing needs.


I used a 12" Apple PowerBook G4 for years, long after Apple switched to Intel and came out with Macbook Air and Macbook Pro lines. I loved that laptop. Long after Apple stopped providing updates to the OS or browser I used TenFourFox to run a reasonably up-to-date FireFox on it. I finally gave it up, the battery life had degraded a lot and while that could have been replaced I was growing increasingly nervous about using such an out-of-date OS on an internet-connected laptop. I still miss it.


For what it’s worth - I bought one of the “controversial” Alienware laptops in 2015. It won’t play the latest games but it still runs beautifully to everything I need including coding and Excel work.

While a few of these brands are a bit pricey, I’ve bought family members other brands like Dell and Acer laptops, and they’ve got a few good years at most before they become clunky and slow, even with a fresh operating system reinstall.


I can't use old laptops because I can't run what I need, but at least I won't be buying new cars, as old cars seem to have a better bang for the buck.

I like my current car because it is quite reliable and if something breaks, it's easy to fix. Also, being Diesel, it's cheap to run.

My next car will also be an old Diesel. Probably I should buy more cars at some point because they'll forbid new cars with internal combustion engines and it's good to have backups.


Ugh, diesels need to go, they’re an air quality nightmare.

I agree with you that new cars are a bad value, they’re a rapidly depreciating asset, but that has little to do with the drivetrain.

The truth is that if you give any shit about the environment or people breathing the air around you, EVs or at least a gasoline hybrid are the verified lesser evil. They’re just so much more efficient on energy consumption during operation that it easily negates their production impact.

But the best thing to do is reduce car trips and car dependency entirely, because cars themselves are not sustainable and waste all our hard-earned money on inefficient infrastructure.


I never buy new computers more than once every 5 years because it takes like 3 days to get local versions of openssl and node or whatever to run correctly enough to run an app and it's the hours of my life I most abhor. It's just lost time on earth.

That being said it obviously pays off in productivity at least every 5 years. Maybe 3 is net positive? At this point i think we are climate doomed without acceleration so productivity is environmentalism


I am still on my MacBook Pro 2015 because of the Keyboard and Trackpad.

So in about 2 years' time this will be 10 years of usage. I don't intend to replace it any time soon. For browsing it is fast enough. And if you look at the Louis Rossmann channels, it seems newer MBPs just aren't built the same.

It may be worth looking at it again in 2025, to see how little (or how much) laptops have changed over the past 10 years.


I'm a software developer and an older laptop is not great for productivity. I have an old Dell Latitude from 2017 and I pumped as much life into it as I could. Bought more RAM, an SSD but despite all that, it's a pain to run the IDEs I need to use for work. I keep it as a backup and work on an M2 Max now.

Same goes with my S.O. who is a Chemist. Older computers are just not ideal for new software.


Age is not really the only aspect here; almost certainly top tier 2017 Latitude is more powerful machine than bottom tier brand-new Latitude today.


I decided to buy a Mini-STX (AsRock X300 64GB RAM/Ryzen 5700G) and wait for a good laptop to become available. It's still portable, has a much better performance/price ratio, and you can buy a good (portable) high brightness matte display of your own choice. It may not be portable as in battery portable, but it's small enough to go in a suitcase.


"Laptops don’t change" - this is the most important reason if we're talking about Apple laptops. My 10-year-old MacBook Pro is almost absolutely the same as a MacBook Air of 2023: performance, screen, OS, weight, battery, keyboard. And old laptops look even better; look how ugly the new M1/M2 Air cases are in comparison with the first versions of the Air.


your 10 yo macbook pro is not nearly as fast as a macbook pro M2, not even close. Does your old Pro work as well for your -needs-? That is quite possible depending on what you're doing with it.


Typing this from my `MacBook Pro (Retina, 15-inch, Late 2013)` ... Also have two working ThinkPads in use in my household that are older where I've had to replace the CPU fans, at some point they just crumble, but otherwise work OK-ish (but maybe getting to a point where I should really retire them).


The Pre-touchbar Retina Macbook Pros were the perfect laptop for the era. They could run a bit hot (but what didn't, back then?). Excellent hardware quality, and can generally still run well today if you don't do anything demanding. I've got a 2013 13" that doesn't have a single issue other than age.


The heat and constant fan noise was a real problem.

That's the entire reason we upgraded my wife from a 2015 15" MBP i7 to a 2020 MBA M1. The extra hours of battery was a nice bonus, the main gain was the lack of noise.


The touchbar feels to me like a time bomb that explodes when Apple thinks it's time for you to give another stack of cash to them.


I still use a 13-year-old ThinkPad x201 with Fedora. Of course with an SSD and RAM upgraded to 8GB. I cannot imagine a more kid-proof laptop. Its construction is indestructible, yet compact and not heavy. Perfect for a home with kids. No wonder it is commonly used on space stations!


"I calculated that my keyboard had 90 keys and that replacing them all just once would cost me 1,350 euros."

This is a problem in the godawful 2016- macbooks, but with thinkpads, you can replace the entire keyboard, and do it yourself, for $30.


i think this widely varies between people and what they need their laptop for.

personally my previous laptop (7+ years old; the replacement is over 3) was ample for basic tasks even today, but it did not meet my personal needs (gaming, deep learning on cuda).

but there is something liberating in getting a very cheap but well-built old laptop if you can get away with using it like a thin client. personally, going forward it makes sense to instead invest in a repairable laptop like framework. while the old thinkpads can only be repaired around the core hardware, this way you should be able to get much more mileage.


>Between 2000 and 2017, I consumed three laptops

This is the story I really want to hear wtf


I am hoarding older thinkpads at this point. The soldered-RAM ones I am not touching. The older ones are cheap to upgrade and pretty snappy. I haven't bought a new laptop since 2009.


I have an old Toshiba laptop from 2008. I use it every day to learn m. My kids use it as an emulator for old games. It still has up-to-date software thanks to Guix.


I have a still-working 2010 Acer 4810TG upgraded with an SSD and 8GB of RAM. I use it to play very old games and for a little Internet browsing, and for that it still works.


There is one, galaxy-sized, deal-breaking issue with any used laptop: you can never clean them thoroughly enough. I'll leave it at that.


My version: Intel did not do shit for 10 years. That’s it.


Rather than buying an old laptop which was high end at its time, it’s better to buy a cheap laptop which is new. The processor will be more efficient and powerful, it’s going to have multiple newer ports and you don’t need to run an obscure version of Linux which won’t run tools that people in your industry use. Running Linux as your daily driver is hard as it is, running an obscure version of Linux on an old laptop as your primary machine will drive you crazy. You would end up losing productivity fixing all the bugs and random issues.


On my development machine I run at the same time: at least two instances of Visual Studio each containing a big microservice based app, an instance of Visual Studio Code containing a large front-end, Docker Desktop, Postgres, Redis, MongoDB, management apps for databases, at least three Windows Terminal tabs, dozens of tabs in browser, dozens of tabs in Notepad++, Teams, Outlook and some other random stuff.

I need beefy laptops and I usually change them after two years.


I always buy my phones used, from swappa.com... I've saved hundreds of dollars on phones that are in mint condition.


RAM, disk, and CPU cache all continue to improve. If you live on your computer and can afford to, it is absolutely worth it to continuously upgrade.

Donate your old machines to charity! Many people can't afford computers.


The same reasoning applies to cars even more.


I average 8 years and two batteries per laptop since I switched to MacBooks; prior to that it was about 5 years and one battery.

MacBooks have a much better lifecycle.


Don't the new models have soldered batteries as well? The lifecycle is a disaster, especially if you would like to change the drive or upgrade the RAM: you simply can't.


In some countries Apple provides a battery replacement service for the Macbooks at a fixed price, but you have to leave it with them for a little while like any other service.

I think the price is reasonable for the quality of the batteries. Apple's price was much lower than what a local Apple reseller store quoted for battery replacement: they pretended not to know about Apple's own service, and when I showed them, they said they couldn't match Apple's price as it was too low.

The inability to upgrade storage space is a big problem though. Apple storage is prohibitively expensive up front yet you know you might run out of space during the long life of the device. External drives work and are fast so that's the practical solution but it's not as convenient.


Yet another example of the Lindy effect. I've been using ThinkPad T40s since 2019.


It would be pretty cool if someone released a 10- or 15-year Chromebook.


I mean... you can save a few bucks on a laptop. Sure.

But the real cost is the downtime.

If I have an employee that I value at $100,000 a year, working 1800 hours, that's $55 / hour.

Assuming their work doesn't impact anyone else, it would be silly for me to take that employee out of commission for any expense that costs less than $440 / day.
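The back-of-the-envelope numbers in the comment work out as claimed; a quick sketch (using the commenter's own round figures, which are illustrative assumptions rather than real payroll data):

```python
# Sanity check of the cost-of-downtime arithmetic above.
salary = 100_000        # annual cost of the employee, USD (assumed)
hours_per_year = 1_800  # working hours per year (assumed)
hours_per_day = 8       # one lost workday

hourly_rate = salary / hours_per_year     # ~55.6 USD/hour, quoted as $55
daily_rate = hourly_rate * hours_per_day  # ~444 USD/day, quoted as ~$440

print(f"hourly: ${hourly_rate:.2f}, daily: ${daily_rate:.2f}")
```

So a single day of lost work roughly matches the price gap between a used and a new machine, which is the crux of the argument.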

Way smarter for me to buy a new laptop, which is less likely to have any issues, and then sell the old ones to offset the costs.

And I'd also point out that most people in tech make more than $100,000.

Also I'd point out... the new Apple Silicon is really nice. Battery life is a lot longer. Heat is a lot less. Screens are fantastic. No stupid touch-screen function bar. Great keyboards. MagSafe power... like, honest to god if you haven't got one of the new Apple Silicon laptops you're really doing yourself a disservice. You should get one! (=

(Also... my opinion... every other laptop maker should just quit. Just close shop. They can't touch the new Apple machines. That's how much better the Apple Silicon laptops are. Everything else that's being made is just e-waste by comparison. Even the Intel MacBooks from 3-4 years ago are just dogwater in comparison.)


> Way smarter for me to buy a new laptop, which is less likely to have any issues

I don't think that's a safe assumption.


It's crazy that you think that actually makes sense in practice. Yeah, you can put the numbers out there and it may "seem" correct, but in reality that is ridiculous logic.


And that is just one factor in favor of having a pre-emptive laptop renewal cycle.

In general, a more homogeneous fleet makes IT happier: it makes managing things easier, makes it possible to keep spare parts on hand, etc.

It also removes the need for negotiations over whether someone "deserves"/needs a new laptop, which might otherwise be a discussion wasting the worker's, the IT dept's, and possibly the managers' time.


Most people in tech definitely do not make over $100k US; what an America-centric POV.

