Ask HN: What does one look for in a laptop these days?
300 points by godDLL on Oct 16, 2021 | 719 comments
I'd like to understand if I'm missing out on any new developments since 2015 or so. Here's what I'd like to have:

*. Each goes: preference, then acceptable alternative; deal-breakers noted:

1. SIM preferable, WiFi acceptable, but Bluetooth has to be good

2. Spill and dust resistant keyboard that doesn't feel like typing on nothing

3. Trackpoint or trackpad, that works

4. Stylus or touch-screen that doesn't glare

5. Good power management, lasts through the day, done charging in 2.5 hours

6. Runs the hacky-mac or Slax, has a head-set jack

7. Good GPU, fast storage, two fast external storage ports

From my understanding I fit a kind of profile, and am very much not alone. But I'd like to know what the HN crowd's take on this is.




1. S3 Standby/suspend support.

If your laptop supports that, you can close the lid and it will power down ~everything except the RAM (sometimes a USB port still charges). In 'new' laptops, including almost everything Tiger Lake and newer from Intel, S3 is being phased out for Microsoft's "Modern Standby", known as S0ix (or sometimes InstantGo). This is much more like a mobile phone, where the machine stays on but attempts to use as little battery as possible.

So, peripherals might shut down, but you might still be able to get critical Windows Updates or receive an important notification -- all while your laptop is unplugged, lid closed, in your bag overnight. This is pretty bad for heat and battery, and often not what you might expect. My new laptop uses about 15% battery overnight, just doing nothing.

In the future, perhaps this mode will work as well as S3 used to for battery life, but finding your laptop fans on while it's closed inside your bag and getting really hot is not fun. This applies to lots of laptops currently.

2. A good screen. Brightness that goes high enough to be seen outside (400 nits or more) - and it'll usually auto-dim the rest of the time to save battery. Hard to go wrong on a Mac, but screen quality is really variable across other brands and models. You'll spend a lot of time looking at the screen.


Was going to post the same thing (albeit far more poorly explained).

Isn't Microsoft embarrassed about this? This has been an issue for my last two laptops over the last 5 years. How is it acceptable for this problem to still exist?

My Lenovo supports S3 sleep, but I didn't discover it until probably a year into owning it, and by that point my battery's ability to hold a charge had degraded to the point where I have to charge the thing every day, even though I mostly use it for basic web browsing.

I will never buy another Windows laptop, and if Visual Studio ran on Mac I'd probably switch my PC out also. I'm not a Mac guy, I find macOS fairly unintuitive, and really hate the placement of the Ctrl key, but at this point it's clear that Apple's model better lends itself to quality, and I'm sick of all the broken crap that is deemed acceptable in the Windows world.

As I was typing this, a great example of said broken crap: Lenovo popped up a MessageBox asking me to reboot to update my firmware. Didn't software developers all agree well over a decade ago that windows stealing focus was not a good thing? So why is a top-of-the-line brand in the PC world writing shit like this?


Your battery is likely worn from being kept at 100% charge constantly. Lenovo has an option to limit max charge to something like 80%, which greatly reduces battery wear; you just have to turn it off when you actually plan to run without AC for as long as possible.
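
On Linux the same limit is exposed through sysfs by the thinkpad_acpi driver - a minimal sketch, assuming the battery shows up as BAT0:

    # start charging only below 75%, stop at 80%
    echo 75 | sudo tee /sys/class/power_supply/BAT0/charge_control_start_threshold
    echo 80 | sudo tee /sys/class/power_supply/BAT0/charge_control_end_threshold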


This is really a poor software substitute for an old hardware solution that disappeared for the sake of thinness: removable batteries. In the past, with easily removable battery packs, you could simply pop them out once they were fully charged and pop them back in when you wanted to make sure they were charged up. When manufacturers started touting long-life-cycle batteries to justify internalizing them -- making them difficult to replace and impossible to quickly pop out -- software makers added this to try to extend battery life.

Before, I simply removed my battery when plugged in all day and significantly extended my battery life. If I needed it, it was mostly charged. The charge would of course dissipate over time, so if I had any plans of using my laptop in an actual mobile fashion, or wanted to move it from one long-term power source to the next, I'd pop the battery in, make the move, and pop it back out. I of course lost the UPS feature that a laptop with a battery pack also serves, but I could assess when my power source was or wasn't stable and pop the battery in under those conditions (not very often).

Most people I know use laptops as portable workstations. If you frequently operate in situations where you want battery alone, then the OS charge management works quite well. I don't, and most people I know don't. In today's world, most of the use cases where I'd operate under those conditions are better served by my smartphone.


They should just install a physical off switch for the battery, along with physical switches for the camera and networking.


What's wrong with the current solution of having a software switch, aside from lack of use/knowledge about it? What's the advantage to making it physical?


Software switches can be controlled remotely without your knowledge. A physical switch requires physical access to the device.


I am just talking about the battery. I don't see any privacy issue there.


Sometimes you really need to make sure there is no power flowing through your device, and you can't be sure of that via software when software problems are what you're trying to fix. Of course, I should have said recording devices and networking, rather than just cameras.

Also, true power off guarantees nobody is remotely activating eavesdropping on your device.


Lenovo has a battery-disconnect BIOS option specifically for doing service work.


Maybe the concern there is devices appearing to be turned off but really still running in low power mode? The only way to be sure is to physically interrupt the power supply.


The people writing the software have a vested interest in your battery wearing out so you have to buy a new laptop


Or externalize the camera and networking modules, so that one can pop them in and out like a battery. Yes.


This is why I still use my Thinkpad W520. If I don't need the battery I just take it out until I need it. I hate that we got rid of removable batteries in basically all devices.


My HP ZBook Power G8 has this feature as well, but it's only accessible through the BIOS. Luckily their BiosConfigUtility64.exe allows changing BIOS variables from Windows, and the change takes effect immediately. My two .bats for the 80% and 100% charging limits:

    C:\Programs\SP107705\BIOSConfigUtility64.exe /setvalue:"Battery Health Manager","Maximize my battery health"
    C:\Programs\SP107705\BIOSConfigUtility64.exe /setvalue:"Battery Health Manager","Let HP manage my battery charging"


Most laptop batteries are worn out by heat. They really do not like heat, but they'll absorb lots of it when placed and used in unfavorable conditions -- which most laptops probably are.


My laptop only heats up in sleep mode, unfortunately.


Heat mostly matters when charging or discharging rapidly, as far as wear is concerned. So it might be less significant than you'd think.


Well, I have a gaming laptop I bought quite new, but second-hand, that was almost never used off the cable (I saw the former setup) - and the battery was almost dead and blown up in size after some months of use. But since it is a gaming laptop, it generated massive amounts of heat under load. I bought a new battery and took care of the heat: it is still quite good after roughly the same time and amount of use, and I do charge to 100%.


Throttling of the CPU and GPU should surely account for 90% of what you describe, coupled with an old, uncycled battery that can't move ions anymore.


My two-year-old Dell Inspiron also has a BIOS option for a "Mostly on AC power" charging profile. I'll still give the power cord a yank every now and then and let it run on battery for a while.


Dell has had this feature since I was in college twenty years ago... it's such a great one.


I set it via tlp[0]:

    sudo tlp setcharge $start_percent $stop_percent BAT0
[0]: https://linrunner.de/tlp/


His battery is likely worn by recharge cycles, as the computer is being discharged more often.


Nope, it's from charging to 100% constantly, which accelerates wear massively.

I've had my Lenovo X1 Carbon for a few years set to 80% max charge and its battery life is still good.

Battery stats report 480 charging cycles, capacity has dropped only 8 percent from rated.


It’s not charging to 100% that wears batteries out. It’s storage at 100% that does that.

If you charge to 50% and keep it there, it basically lasts years.

If you use your notebook primarily on AC, the battery is charged and kept in “storage” until you unplug. Over the years, this is equivalent to a battery stored in a box.

So, time stored in a box at 50% is better than 80%, which is better than 100%.

But: if your notebook is used primarily unplugged from AC, then 100% is better than 80%, because you'll have fewer deep discharges.

Deep discharges are what form dendrites. Dendrites are what kill lithium batteries.


> If you use your notebook primarily on AC, the battery is charged and kept in “storage” until you unplug. Over the years, this is equivalent to a battery stored in a box.

I'm not sure what you mean; by default most laptops will charge to 100% and keep trickle-charging to that level consistently while on AC.

Setting a max charge of 40% would be ideal, but there is a point where you want to maintain some level of runtime when removed from AC, so 80% is a reasonable compromise.


It is terrible to limit max battery charge, so max battery charge doesn't decline!!!


It's akin to not driving your car at the redline constantly - sure, you can do it, but it's going to reduce engine life.

It's the same with batteries: keep them at 100% constantly and they will wear out. It's just a limitation of the technology.

Some devices now try to use "machine learning" to determine max charge; for example, iPhones will maintain a medium charge throughout the night, then do a final top-up to reach 100% just before you wake up.


> It's akin to not driving your car at the redline constantly - sure, you can do it, but it's going to reduce engine life.

Yes, of course. That's not the issue, though.

> Some devices now try to use "machine learning" to determine max charge; for example, iPhones will maintain a medium charge throughout the night, then do a final top-up to reach 100% just before you wake up.

That's different from -- even the opposite of -- advertising X max charge and only delivering 80% of X as the max charge. What I referred to is a bait and switch, where the manufacturer advertises one thing but provides another. Your example, however, is not a bait and switch; it aligns the technical reality with the marketing promises.

The situation is similar to purchasing a fridge that only operates at 80% capacity. All food spoils more quickly, but the fridge lasts longer. If that had been disclosed, instead of the opposite being implied by the marketing, consumers would almost certainly have made different purchasing decisions. That is why these sorts of things are generally illegal.

In your car example, if a car is advertised with 400 HP but never goes above 320 HP, that's called false advertising. Sure, the engine can hit 400 HP on a bench, but that's not why the car was purchased.


Right. So what vendors are doing now is calling batteries consumables that last one, maybe two years with the default 100% charge limit, to get max runtime on battery.

Consumers can either accept this and pay for a new battery every few years, or limit to 80% and get a much longer battery lifespan, if they can live with the reduced runtime.

I'm not aware of any vendors claiming X capacity while mandatorily limiting charging to 80% of X; rather, it's an option you have the choice to explicitly enable.


> Isn't Microsoft embarrassed about this?

Yeah, I was astounded when I recently found out about the "modern standby" idiocy. I've almost always used Mac laptops, but I just assumed that at some point in the last 20 years the PC world would figure out reliable sleep. And it seems like they mostly did, and then Microsoft broke it for no good reason?


I actually just want them to fix connected standby and properly be able to do these things with low power.

My phone (a OnePlus 6) can be on standby for nearly a week. That a Windows laptop's attempt at this mode drains the battery by the end of a workday is just pathetic. Instant resume should be available with no practical compromises.


That is seriously bad.

Just as a data point, my 2019 Mac sleeps perfectly for multiple days, no problem.


My 2019 Razer Blade does the same.


As a Mac user with the same gripe about the Ctrl key placement: remapping it to Caps Lock works wonders :)


I use so many keyboards, not all of which are under my control... I wish I could, but it would bring chaos to my typing world.


Something like that exists for Dvorak. Maybe someone makes a similar product that works that way with any keyboard.

https://www.keyghost.com/qido/


You can set this in the OS itself.


Wouldn't you need root/admin access for that?


> As I was typing this, a great example of said broken crap, is Lenovo popping up a MessageBox asking me to reboot to update my firmware. Didn't software developers all agree well over a decade ago that windows stealing focus was not a good thing? So why then is a top of the line brand in the PC world writing shit like this?

Don't think that macOS spares you from that either. Update notifications are equally distracting.


> Don't think that macOS spares you from that either. Update notifications are equally distracting.

You're right, but if I recall correctly the notifications do not steal focus.


> why then is a top of the line brand in the PC world writing shit like this

"It's OK when we do it"


My bet is that Microsoft and Windows laptop manufacturers aren’t embarrassed by this because approximately no one uses their laptop as a portable device. The laptops just sit on the desk charging 99.9% of the time. People use their smartphones the rest of the time.


Really? Maybe in 2020 or 2021, but not in 2019. My work laptop got dragged to meetings and taken on flights to remote offices. My personal laptop gets used in every room of the house.


Most companies issue laptops instead of desktops.

While commuting back and forth to work those are definitely asleep in a bag/briefcase.


Microsoft released Windows 11, so, no, they have no shame. Microsoft is a company without leadership, without direction, and without a central vision. The only feature of Windows that people like is that it continues to run the software they need. Other than that, the OS itself gets in the way, stops your workflow, and makes your life harder. And the only other commercial alternative is just as bad in other ways. The natural result of letting the computer market be controlled by a few big players.


JetBrains Rider. If you care about quality, don't bother with Visual Studio at all.


Or CLion, if you’re doing C++ or Rust development


Battery drain a problem? Join the Framework laptop movement. Mac bothers you slightly less than Windows does (and costs more for the same hardware)? Make the jump to Ubuntu and dip your toe into desktop Linux. If you're inseparable from Visual Studio, I'd think Wine could handle it.


> really hate the placement of the Ctrl key

Macs used to be really customiseable, and they haven't got around to ripping all of that out yet.

System Preferences > Keyboard > Modifier Keys

You can control how the various keys behave.


> and if Visual Studio ran on Mac I'd probably switch my PC out also

Not sure if it supports all the features you need, but it exists.

https://visualstudio.microsoft.com/vs/mac/


That’s just a rebranded Xamarin IDE - but for .NET development on Mac, I’ve found JetBrains Rider does most of what the full Visual Studio package does


I like Rider better even on Windows. My favorite thing about Visual Studio on Windows was always the debugger, and I think Clion and Rider have lapped them there in the past two years.


Yes it's just Xamarin modified to look like a native Visual Studio for Mac.

Much better off using Rider (also works nicely on Linux).


IntelliJ is still a little broken on Wayland, I think. Last I checked they were waiting on a patch to Swing, so right now it only runs under XWayland; it's fairly buggy and doesn't scale nicely. You also need to set some env variables or you'll get a blank screen. It still works pretty well though, which is better than any other similar IDE.
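
If it's the blank-window-under-a-tiling-compositor problem, the fix I've seen passed around for Swing apps is the non-reparenting flag - just a guess at which variable is meant here:

    export _JAVA_AWT_WM_NONREPARENTING=1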


Seconding Rider if you're on macOS. It shares its analysis code with ReSharper, so it's extremely featureful. In fact, it's usually faster than VS + ReSharper because it isn't running two code analysis engines at once and it doesn't have to run the analysis out-of-process.


Note that if OP expects anything like the "real" Visual Studio that runs on Windows, this is really nothing compared to it. I'd even go further: I'd actually prefer VS Code with the right extensions over this "Visual Studio".


Sounds like when we try to run iTunes on Windows - each competitor is intentionally crapping on the other's users.


Don't worry, the Music app on macOS is just as bad as iTunes on Windows...


It's more that Visual Studio is a massive project that was built around running on Windows. It would be a huge undertaking to bring it over to another OS and probably not add nearly enough value. The only reason macOS has "Visual Studio" at all is because Microsoft bought out Xamarin and rebranded their Xamarin Studio product.


Yup, it would just not be worth it to bring real VS to macOS. But then why dilute the brand by marketing a crappy product like Xamarin (which I think has always been bad; I honestly don't know why MS bothered to buy it) as if it were VS? Just call it Xamarin, or VS "lite", or anything - but not VS, as it is not VS.


Consistent branding. Visual Studio is the name they're using for all their code editing software. I don't think it's a good idea given the confusion it causes when people talk about the various programs, but they are distinctly named Visual Studio, Visual Studio Code, and Visual Studio for Mac.

Microsoft has never been good at naming things. They made that mess of Windows naming at multiple points and they are still making a mess with the Xbox brand.


iTunes is crappy on Macs, too!!


> Isn’t Microsoft embarrassed about this?

Do Microsoft Surface devices have this problem (the heat/fans spinning/high battery drain while sleeping, I mean)? If not, then Microsoft has no need to be embarrassed; it’s the rest of the PC laptop industry that should be embarrassed for not shipping hardware that does the right thing when the OS asks it to.

(Though, I mean, the rest of the PC industry should already be embarrassed, for still shipping new PCs in 2021 that don’t have Thunderbolt/USB4 ports, and desktop PCs that don’t even have a single USB-C port.)

There’s also the fact that Microsoft is (perhaps by accident of fate) “building for the future”; a lot of Windows 11’s assumptions in particular make sense given hardware and CPU steppings in mind that should have shipped a year ago, but which are caught in the COVID logistical conga-line and so won’t be seen for another year. That includes e.g. Intel CPUs with ARM-like big/LITTLE designs.


Yes, Surface devices have this problem. It was a frequent occurrence for me to put my Surface Book into my bag and later take it out completely dead or hot and drained.


I wouldn't put all the blame on hardware. Getting the OS and hardware working together is sometimes a back-and-forth.

It's not a 50/50 responsibility split, but it's definitely not 100% on hardware.



Isn't this the MonoDevelop/Xamarin Studio version of VS?


> Visual Studio ran on Mac

Visual Studio Code does. Is that the same?

Even has native M1 support.


It's not the same at all. Visual Studio is an IDE (like Eclipse) and Visual Studio Code is an extensible text editor (like Emacs).

It is pretty good though. It's enjoying a big surge in popularity lately and seems to deserve it.


Visual studio running on Mac? I’ve been using visual studio code on it for years no problem. I’m now realizing there’s a difference between the two. Care to explain a little more why you use VS and not VSC?


Visual Studio and Visual Studio Code are completely different things. VSC can do a subset of what VS can do, but some people really need the full power of a traditional IDE. VS vs VSC is like the difference between a smartphone camera and a full camera platform. You can rig up an iPhone (VSC) to do everything a D6 (VS) can do, but it won't be as usable.


But Visual Studio 2019 is available for mac.


It shares some code nowadays (it started from the MonoDevelop codebase). However, Visual Studio (not Code) for Mac is mostly targeted at C# or mobile development.

If you want to make Windows apps, it just doesn't support that scenario.


And they just announced they're rewriting it as native for Mac in 2022…


> if Visual Studio ran on Mac

It is! What do you mean? :)


The Mac version of Visual Studio is closer to a fork of MonoDevelop than mainline VS. Rider is a better experience than either though, IMO.


The implementation of the non-S3 sleep on most Windows laptops these days is not Microsoft's fault. Critical updates (firmware) always have priority, and even on Macs you get nagging pop-ups for updates all the time. That said, you can fix everything you mentioned with one checkbox. It's not the PC which sucks, it's you, the user.

For all the people downvoting: he is the equivalent of people taking "screenshots" with their mobile phone instead of using PrintScrn. I bet you complain about your own users all the time, and here you defend someone who can't even be bothered to look at the options he has. smh


The complaint was not that the firmware notification existed or was urgent, but that it stole input focus.


This! Support forums for all brands (big ones like Dell or Lenovo, but also niche brands like Framework) are plagued with requests from unhappy users whose laptops died in their bags. All because of the Modern Standby feature. Let's hope AMD mobile processors will work better, and that we'll be able to find decent laptops using them.


I had an AMD Ryzen ThinkPad. It had all the same issues with modern sleep. The solution is to enable S3 sleep in the firmware.

Unfortunately, even in S3 sleep, the battery would drain in a day.

I am now back to a MacBook Air (M1) and sleep works perfectly. Life is too short to deal with the terrible state of sleep on Windows/Linux laptops.


I'm fairly certain that your Macbook is actually hibernating, and not merely suspending. This seems to be the case with my 2015 Macbook. Suspend alone will drain your battery.

I have a Ryzen ThinkPad P14s with Linux. What you do is turn on S3 mode (they call it "Linux") in the BIOS, as you did. Then in /etc/systemd/sleep.conf you enable AllowSuspendThenHibernate. You can adjust the time; I have mine set to 30 minutes. I also have suspend set for when the laptop lid closes. Which means when I close the laptop, it suspends for 30 minutes. If I open before 30 minutes, it resumes right away. If not, it hibernates. The battery will last for practically days if you configure it this way.
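
For reference, the knobs involved look roughly like this - a sketch, assuming a reasonably recent systemd (the lid action lives in logind.conf rather than sleep.conf):

    # /etc/systemd/sleep.conf
    [Sleep]
    AllowSuspendThenHibernate=yes
    HibernateDelaySec=30min

    # /etc/systemd/logind.conf
    [Login]
    HandleLidSwitch=suspend-then-hibernate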

You have to configure Linux to hibernate and this will depend on your setup. Just know that it is possible to use LUKS encryption and hibernate to a file (or partition).

Another factor with my Macbook is that it seems to automatically resume from hibernate in the morning. Not sure what it's doing - maybe it's keeping track of when I typically open my laptop. But you can get this same behavior on Linux with rtcwake. If using LUKS, though, you'll probably need a keyfile rather than a passphrase for this to work. It would be neat if someone wrote a program that monitored the times you open the laptop and automatically adjusted the rtc wakeup times based on historical data.


> I'm fairly certain that your Macbook is actually hibernating, and not merely suspending. This seems to be the case with my 2015 Macbook.

Actually, as far as I understand, it's a mixture, like modern sleep in Windows. Like modern sleep, the system also wakes up to fetch e-mail, calendar events, etc. But in contrast to Windows, it actually works well and is very restrictive (only certain whitelisted apps can update). Apple calls it Power Nap:

https://support.apple.com/guide/mac-help/what-is-power-nap-m...

> Which means when I close the laptop, it suspends for 30 minutes. If I open before 30 minutes, it resumes right away. If not, it hibernates. The battery will last for practically days if you configure it this way

I know, but hibernate on Linux has many issues. First of all, properly restoring hardware state is even more hit-or-miss than S3 sleep (where e.g. the trackpad would often not come up correctly).

Hibernate on Linux also has all kinds of security issues. Hibernate currently does not work with Secure Boot + kernel_lockdown [1]. Hibernate with randomly-keyed encrypted swap requires all kinds of workarounds.

I am not willing to forgo security mechanisms that have been supported for some time on other systems.

[1] https://mjg59.dreamwidth.org/55845.html


I wouldn't be surprised if most people didn't notice that their MacBook was hibernating. Apple tries to hide it by showing you the login screen immediately, but that period where you can't interact is the device resuming from hibernation. It's a similar trick to what they do on iOS. When you open up an application that was suspended, it shows an image while reloading the saved state to make it seem like the app never closed. (As documented here: https://developer.apple.com/documentation/uikit/uiapplicatio... )


Thanks for the tip. I had a Lenovo Ryzen that would require a Win10 reset of the Broadcom WiFi after each resume. Frustrating. Finally got an Intel Lenovo -- no WiFi issues.


> Let's hope AMD mobile processors will work better, and that we'll be able to find decent laptops using them.

It doesn't have anything to do with the processor, does it? Surely neither Intel nor AMD have removed S3 support?

AFAIK it's the integrators (motivated by Microsoft's push of "modern standby"?) who make it unavailable in their BIOSes: S3 needs BIOS integration, modern standby is just S0. Or do you mean "hopefully AMD will have a better S0 idle"?


Intel 11th gen CPUs are 100% Modern Standby and DO NOT have S3 sleep. I don't know about AMD's support.


Wow, TIL I'm never getting an intel CPU again. Not that I was planning to, mind.


Lenovo still puts it back on some of the laptops, like the X1 Carbon 9th Gen. So, I suppose this is not strictly dependent on the CPU.


After looking for confirmation of this: S3 support seems to be either missing or broken in various ways. There's a Reddit thread where a System76 engineer talks about the mess: https://old.reddit.com/r/System76/comments/k7xrtz/ill_have_w...


> get critical Windows Updates

So you mean Microsoft can force your laptop to restart and lose all your work even when it's "off"? Amazing.


No, not when it’s actually off. If you’ve shut down, then it won’t do this, but if you put it to sleep it can.

That being said, I hate this. With my old laptop I could close the lid and pop it in the bag and it would lose maybe 5-10% battery per day; my new laptop happily unsleeps itself in the bag, which has zero ventilation.

So for safety reasons I have to shut my laptop down before putting it in my bag, then have to wait for it to start up (which is slow due to corporate software) before being able to use it again.


I’ve been using Macs for over five years now and before that I was all Linux. From reading this it looks like Windows has actually reverted to where Linux was over five years ago: having to fiddle with the machine to get sleep to work. That’s just sad.


Windows wasn't good at this in the past either. Didn't they used to have sleep, hibernate and a load of other options too?


Just sleep/hibernate/restart/shut down, as far as I can remember? - sleep vs hibernate being a distinction you want, in my view, as the battery drain/restart time behaviours are different. I'm sure I don't need to explain why separate restart and shut down options are a good idea.

(Never liked not having the sleep/hibernate choice on macOS. I know better than the computer what my upcoming plans are!)


Windows ha(s/d) suspend/sleep/hibernate/restart/shutdown.

Sleep was a suspend/hibernate hybrid that allowed fast restarts if you still had power, but would restore from disk hibernate-style if you ran out of power while suspended.


You could make Macs hibernate via the CLI until a few versions back at least; maybe you still can? I used that occasionally, but at some point sleep got so good I stopped bothering. If I disable the waking-up-to-fetch-mail-and-the-like thingy in Settings, mine will hardly lose any battery even when sleeping for extended periods.
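
If memory serves it was pmset - roughly like this, where mode 3 is the laptop default (sleep plus a safety copy of RAM on disk) and 25 is a true hibernate:

    # force true hibernation (RAM written to disk, power removed)
    sudo pmset -a hibernatemode 25
    # back to the laptop default
    sudo pmset -a hibernatemode 3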


Macs can't even toggle session-saving behavior through the CLI; your only chance to toggle it is logging out via the GUI.

My experiences with macOS involved tons of unexpected sharp corners like that. It really feels half-finished when you're coming from Linux, as ironic as that sounds.


I’ve been on Mac so long that all 5 options feel erroneous. I just close my laptop lid when I’m done.


I've had the same problem on Mac: if you leave an SD card in, it drains completely.


I just to try and gett TV rtf TTa we t feed define the t TTreally I just wh from G they rrdand I


(me typing on a mac keyboard)


Try Hibernate.


Not all windows updates require reboots.


Most don't even.


Quite a lot more stuff supports auto resume after reboot, including unsaved Notepad files.


But not REPLs and Jupyter notebooks.



nor Windows Explorer.


For File Explorer, set the value PersistBrowsers to 1 under:

    HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced
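
Or as a one-liner from a command prompt (same value, just written via reg.exe):

    reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced" /v PersistBrowsers /t REG_DWORD /d 1 /f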


You don't need to manually edit the registry for this. There's a checkbox in Explorer's settings.


Thank you! This reminds me of the old annoyances.org that listed a bunch of little tricks to improve your Windows 95 experience.


Also, Linux does not work well in this mode (problems waking up), so it is important to be able to switch it to S3 in the BIOS.


> In the future, perhaps this mode will work as well as S3 used to for battery life, but finding your laptop fans on while it's closed inside your bag and getting really hot is not fun. This applies to lots of laptops currently.

Modern Standby was designed for ARM-based Windows 10 Mobile initially, but I do feel that Intel pleaded with Microsoft to allow them to implement it too. Regardless, even the best Intel cores (except for their clean-room Atom designs; not all Atom-branded CPUs have these) run so hot that it's a disgrace Microsoft hasn't insisted on requiring S3 sleep.


So which laptops do a good job of reliably supporting S3 standby/suspend? Any idea if the Framework laptop does?


You can check which ThinkPad BIOS has the option at their BIOS emulation site: https://download.lenovo.com/bsco/index.html

For mine it's under Config // Power // Sleep State. Windows 10 means useless Modern Standby, Linux means S3. https://i.imgur.com/Y5CchL9.png

In this, covid helped me: I learned about this before I cooked my expensive new laptop in its bag :(


Wow, that is an incredibly cool thing for an OEM to offer. Never seen this for any other vendor before.


What does it mean if there is no sleep setting here? I haven't actually rebooted my P50 yet, but the simulator doesn't show this option.


The P50 very well might predate this Modern stuff.

Type

    powercfg -a

and see:

    The following sleep states are available on this system:
        Standby (S3)


True enough, thanks for the info. I can't complain either way, whatever suspend mode I have works well and quickly. Hibernate does work (all under Linux here as a rule).

I'll have to try powercfg next time I boot into Windows.


The Framework Laptop does not support S3 suspend. I have it running with S2deep in Linux, which works okay, but it isn’t a fantastic scenario. Hibernation is the way when the processor doesn’t support S3, unfortunately. Even Windows would fall back to it after a short duration.


Yes, this is my only beef with the laptop so far. I really hope we get a firmware update with S3 enabled soon.


Yep. The ThinkPads I've had running Linux have supported S3 sleep, but it feels like it's going away more and more.

You start with a laptop at 80% charge if you use Microsoft's stupid system. The more I use Linux, the more annoyed I get at Microsoft and how they keep worsening the entire PC experience every year.

Users are not users, they are cattle to be managed by Microsoft.


Is it really Microsoft's fault that vendors only listen to it? The fact is that Apple exists in its own closed ecosystem, and Linux runs second-hand on systems built for Windows. There are very few other folks dictating requirements. Laptop vendors for the most part don't want to be in the business of maintaining an OS. Companies like System76 and Framework are the way to address this.


Bahahaha.

Sorry, long time Linux user here. I just presumed the sucky state of this sort of thing was SOLELY due to my insistence on using Linux.

Relatedly, seriously - besides the obvious "mindshare" and so on (and I suppose "mostly for the people here"): Macs I get, but why do people still use Windows?


Also a longtime Linux user here, and I have the same question: why do people keep using Windows? Everything I hear about it sounds superbly frustrating, while Debian stable has been rock-solid for me for years.


I used to put my Windows 10 laptop in my bag at night, but every now and then I'd realise later that it had powered itself on, the fans were going nuts, and it was obviously really, really hot!

I don't think it was actually installing updates (I use the Enterprise editions), though I've no idea what it was doing. I've since started leaving it plugged in on my desk, and haven't noticed it turning itself on at all since then. Weird.

Seems mad they'd try to install updates while not on AC power, and I'd think it's fairly common for sleeping laptops to be placed in a bag overnight.


Check your wakeup settings. I have to use corporate Windows machines, and for some reason they're all set up to wake on some WiFi/Ethernet signal.


Reasonably fast boot time. My XPS seems to be booting cold about half the time I open it, due to power management or Windows updates, which is annoying. But what makes that extra annoying is that it spends like 45 seconds each time just in the BIOS step. What on earth does the BIOS do that needs so much time?

(Other than this I do actually quite like my XPS. Though I do wish I had gone without the touchscreen. It's just annoying because you can't wipe grime off without it registering a touch.)


You can mitigate it by configuring the power settings so that the laptop hibernates after 30 minutes. You lose your connected standby, of course.
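
If I have the syntax right, it's one powercfg call per power source (the timeout is in minutes; dc is battery, ac is plugged in):

    powercfg /change hibernate-timeout-dc 30
    powercfg /change hibernate-timeout-ac 30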


That's hibernation though, so it takes way longer to wake up and also does a lot of I/O on the probably-not-replaceable SSD.

(Also, why does your post show as being made on 2021-10-14 when this "Ask HN" is from today? Wat)


Anecdata, but my 5-year-old ThinkPad and 3-year-old Dell are doing fine (also a 10-year-old Asus). I've always hibernated for the night. 16GB RAM, takes at most a few seconds. I don't get this allergy to hibernation - is it such a killer to wait a few seconds when I know I won't be using the device for hours or days?


Takes seconds to resume and saves a lot of battery since the machine is truly powered off.


Sometimes posts get put on the "try again" queue by the moderator, where they are floated to the bottom of the home page to see if they will attract attention the second go-around. I don't know if that's the case here.


Ah indeed, the Ask HN itself is from the 14th and various comments are from then as well. Seems like it got put through that process which causes HN to "fake" the short timestamps to make everything look like fresh discussion.


How does S0ix work on Linux? Is there a way to power down more?
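
For what it's worth, you can see which modes the kernel offers via /sys/power/mem_sleep (brackets mark the active one; "deep", i.e. S3, is only listed if the firmware exposes it), and switch with a plain write:

    cat /sys/power/mem_sleep
    echo deep | sudo tee /sys/power/mem_sleep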


> 1. S3 Standby/suspend support.

Intel disabled S3 in Tiger Lake


Luckily there are AMD ThinkPads and they are great.


What is the reason for this decision? Idgi


Force everybody to use Win11, as a gesture to Microsoft.


> if I'm missing out on any new developments since 2015

Yes, you missed something huge.

Apple Silicon M1 or nothing. It's just too good. Low power consumption allows the laptop to actually be used as a laptop. The power use-to-performance ratio is unmatched by anything on the market. Apple is breaking us free from decades of stagnation on the hardware front.

Nothing else on the market comes close. I expect this to change in the future. I will happily switch to a Linux laptop once an energy-efficient alternative to x86 becomes mainstream for Linux. There's just no way I can go back to x86 after tasting the sweet nectar of the M1.


Agreed. I hate that the M1 is so good. I got an M1 MacBook Pro when my older laptop broke and I needed to temporarily deal with some iOS work. I bought it thinking I would regret it and have to replace it with another machine, especially being used to 15-inch devices. But it turned out to be such a great experience using this device without ever worrying about charging, and Rosetta 2 worked so much better than expected.

I hate that this device is so great that when I look at other options with Linux support I just get drawn back to this. It is really frustrating to have such great hardware be limited by the software - rather, the userspace (kernel and drivers seem solid). macOS is just so frustrating at times. The Asahi team's Linux support can't come fast enough.


What do you feel like you're missing in userspace?


Most of the important ones - package managers, containers, not calling home, etc. - are mentioned by the others. So I will mention things I don't think are discussed enough by Linux switchers.

The one I didn't think I would yearn for, but am reminded of every day: desktop environments and window managers. My friends couldn't stop praising the UX, but even after a long time trying to get used to it I am still annoyed by the application-switching keyboard shortcuts and the lack of ability to set system-wide shortcuts for launching basic things like a terminal. I won't talk about the lack of some CLI applications I loved, because I expected that when I moved (and it is better than I thought). I have tried iTerm2, HyperSwitch, Rectangle. I understand that different systems have different behaviours and I can't expect them to behave the same, but having observed my friends and others, the level of keyboard-driven userflow I can achieve even in something like Windows just seems worse here. It's not that it isn't possible, but rather that the behaviour has subtle quirks, so I can't always rely on it.

And lastly: animations and latency. Even after turning off all animations, it just is slow. I was used to having applications open before my fingers even left the shortcut keys, and here I find myself waiting for things to start. I thought I was being nitpicky until I had to use my 2014 Haswell laptop for something, and it was such a breeze compared to the demonstrably faster processor and storage of the M1. What's everyone's experience like?


> My friends couldn't stop praising the UX, but even after a long time trying to get used to it I am still annoyed by the application-switching keyboard shortcuts and the lack of ability to set system-wide shortcuts for launching basic things like a terminal.

Hammerspoon is the answer to all of these. The common solution is to rebind Caps Lock (using Karabiner-Elements) to be "hyper" (cmd + alt + shift + ctrl) and then bind global keyboard shortcuts to that using Hammerspoon.
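
A minimal ~/.hammerspoon/init.lua sketch (the key and app name are just examples):

    -- hyper+t launches or focuses a terminal
    local hyper = {"cmd", "alt", "shift", "ctrl"}
    hs.hotkey.bind(hyper, "t", function()
      hs.application.launchOrFocus("iTerm")
    end)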


If someone were complaining about inability to configure keyboard shortcuts on Linux and you recommended a solution like Hammerspoon, it'd invite an onslaught of ‘this is why Linux is too overcomplicated and annoying for normal people’ comments.

Setting global keyboard shortcuts on Linux doesn't require you to learn a whole-ass Lua API. It's just there, for all the popular DEs.


Oh I absolutely agree. MacOS in general is not very good for power users out of the box.


I vaguely remember having tried hammerspoon but can't remember why I didn't continue using it. Will give it a proper shot. Thanks!


Okay. Just tried hammerspoon and I love it. Love the documentation as well. Thanks.


Use iTerm2 and set the “hotkey”. That gives instant switch to terminal with one key.

Can you give other examples of command line app problems or keyboard driven workflow problems you’re having? I expect people here will be able to show you how to solve them.


Command+Space, t, <Enter> for the terminal (at first you'll have to type "terminal", but after 10 or so uses it becomes the first result for "t").

The same for other applications.

You can also run open -a application_name from the terminal if you want.


This is what I use and it is part of muscle memory now. But it is one of the thousand cuts that make the experience less than amazing.


This is a very poor replacement for something that really should be instant and cerebral like app shortcuts.


Even on Linux I rarely bother hitting a hotkey to open a terminal. Why? Because I have it load on startup and almost never close it.


- A good (not Homebrew and not quite Macports) package manager.

- Ways to permanently disable Gatekeeper

- Some kind of quick setup. On Linux I use a bootstrapping script; macOS does not offer the same extensibility.

- Real virtualization support like QEMU

- Real container support (if I set up another local Docker host on a Mac I swear to god...)

- Up-to-date coreutils

- A way to install Git without dragging along a few gigs of dependencies

- Ways to monitor my network usage and ensure Apple isn't spying on me

- Real encryption (eg. Apple/NSA isn't holding a master key)

- CUDA support

- 32-bit app execution

- A way to disable the hash that gets sent to Apple every time I open an app

That's just the starter kit, if I sat down with some pen and paper I could probably keep going ad-infinitum.


I feel you, but for me it's not a list of deficiencies in OSX. I came from Linux, then used OSX almost exclusively (because work) for about 4 years, and I still hate it. I agree the list of things above are real problems, but you could solve all of them and I'd still hate it.

For me it basically it comes down to this:

Macs are just similar enough to Linux to fool me into thinking I know what I'm doing. But then things are just subtly different, such that nothing quite works. It's like, if you were Satan and your Q4 objective was to design a computer that you could give to me - a "computer expert" - to make me feel like a moron every time I try to do literally anything at all on it... Apple computers pretty much hit it out of the park.

--

It's also a lot of little usability things. I thought Apple was supposed to be the usability king? Yet it patently cannot handle simple things like:

* I want "natural" scrolling on the touchpad, but "regular" scrolling when using a plugged-in mouse with a mousewheel. The options are in TWO DIFFERENT settings windows, but changing one changes the other. Why? Even Linux, not famed as the Usability King, had this figured out in 2012.

* I want one of my monitors to be vertical. When I plug them in, it's a coin-toss whether it will remember which one it is. 50% of the time I have to tilt my head 90 degrees and set the 270° monitor to 0° and the 0° monitor to 270°. In the end I wrote an AppleScript to do this automatically and bound it to a keyboard key.
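
On the scrolling thing, the note promised above: as far as I can tell, both checkboxes are backed by one and the same global default, which would explain why toggling one toggles the other - assuming current macOS still stores it this way:

    defaults read -g com.apple.swipescrolldirection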


> I want "natural" scrolling on the touchpad, but "regular" scrolling when using a plugged-in mouse with a mousewheel. The options are in TWO DIFFERENT settings windows, but changing one changes the other. Why? Even Linux, not famed as the Usability King, had this figured out in 2012.

This also drives me bonkers. The reason I think it is still this way is that at Apple everyone is using the Magic Mouse, which has touch scrolling on it (you scroll like on a trackpad, with your finger), so presumably most never toggle this. Another example of the pain if you go off the happy path the Apple developers use and keep tidy and clean.


I have this exact preference as well, and my solution was this little utility app: https://pilotmoon.com/scrollreverser/ Being a Mac user for many years, I've found that there's almost always some simple app that fits my UX preferences.


> - Ways to monitor my network usage and ensure Apple isn't spying on me

This may not be complete monitoring, but Little Snitch (from obdev.at) can monitor outgoing connections, alert you, and let you block or allow them. Last year when macOS Big Sur was released, this didn't work for a while for Apple's own programs, but that was later changed by Apple.


- A good (not Homebrew and not quite Macports) package manager.

nix?

- Some kind of quick setup. On Linux I use a bootstrapping script, MacOS does not offer the same extensibility.

Apple MDM implies that the hooks exist, although I know nothing of using that and it very well might be insufficient.

- Real virtualization support like QEMU

Isn't hyperkit decent?

I agree with the general thrust of your comment, though; just unsure of some details.


Wait, are you saying that if you set up FileVault to not use your iCloud account, Apple still holds a master key? I thought that was the whole point of using your own recovery key


Seconding this question!

2017 article says Apple doesn't have a copy without iCloud enabled, https://www.macworld.com/article/229963/want-to-recover-a-fi...


I can't respond in good faith without access to the source code. Everything that I would tell you is just conjecture based on either trust or hatred of Apple, which is equally useless to the consumer. Use your best judgement.


Re: package managers, have really enjoyed pkgsrc. Only wish there were more prebuilt packages.


What do you want to do that homebrew doesn’t do well?


Is ARM really the reason the M1 is so much better than current x86 CPUs? If you look at the competition (Qualcomm), it seems to me that it's Apple's expertise at designing a SoC, plus a more advanced process node, that makes it such a good CPU - not really the architecture.

Said another way, I highly doubt we will see a non-Apple ARM CPU outclass the x86 competition in such a way anytime soon, so saying "never x86 again" might not be very wise. ARM is not a magic bullet.


100% agree.

A lot of people asking for ARM-based machines don't really realize that the reason is not ARM itself; they just want faster/more efficient machines. To make an analogy for HN frontend devs: imagine a non-technical marketing person going to a developer and saying "I want this website in AMP, because AMP is fast." And sure, a lot of sites built on AMP are fast... but that doesn't mean you have to build a site in AMP to make it fast. You could build a simple site with similar principles and get it fast too. Similarly, not all ARM laptops are fast. And even if the laptop's hardware is fast, you want something that can efficiently run x86 code for compatibility purposes for many years.


It’s true that ARM alone isn’t the reason for the M1’s performance, but it’s definitely a significant factor. x86 is old — modern x86 chips are still backwards-compatible with the original 8086 from 1978 — and it’s stuck with plenty of design decisions that might have been the correct choice sometime within the past 45 years but no longer today. Whereas the M1 only implements AArch64, a complete redesign of the ARM architecture from 2012, so it doesn’t have to deal with legacy architectural baggage. (We’ve known x86 was the wrong design since the 80’s — hence why there’s no Intel chips in smartphones — but it hasn’t been realistic for anybody except Apple to spend 10 years and billions of dollars to make a high-performance non-x86 chip.)

Some examples:

- x86 guarantees strong memory ordering on multi-processor systems, which adds completely unnecessary overhead to every memory access. arm64 uses a weak memory model instead, providing atomic instructions with relaxed or acquire/release semantics (see https://youtu.be/KeLBd2EJLOU?t=28m19s for a more detailed discussion). This significantly improves performance all around the board, but especially with reference counting operations (which are extremely common and often a bottleneck in code written in ObjC/Swift): https://twitter.com/Catfish_Man/status/1326238434235568128

> fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current gen Intel, and ~6.5 nanoseconds on an M1

- x86 instruction decode is pretty awful, a significant bottleneck, and not parallelizable due to the haphazardly-designed variable-length CISC instruction set. arm64’s instruction set is highly regular and easy to decode, so Apple can decode up to 8 instructions per clock (as opposed to 4 for x86 chips). Most sources agree this is why the M1 can have such a big out-of-order-execution window and achieve such high instruction-level parallelism compared to Intel/AMD.

- x86_64 has only 16 architectural registers, compared to 32 for arm64. This means the compiler has a much harder time generating efficient, parallelizable code and must resort to spilling registers much more often.


The issue for me is that ARM is also really old now. I mean, just look at the ISA Apple has to use to run macOS on it: it's littered with NEON extensions and more cruft than you can shake a stick at. Simply put, Apple's implementation of ARM is decidedly CISC. On top of this, I'm still dumbfounded that they didn't go for a chiplet design, where ARM could truly shine: if Apple had gone the chiplet route, the M1 could have had a much higher IO ceiling and might have had a shot at addressing more than 16 gigs of RAM.

Apple has a much bigger issue, though. ARM doesn't scale: it's a fundamental conceit of the architecture, one that a lot of people are probably willing to take on a laptop that will mostly be used for Twitter and YouTube. This presents issues for the rest of the market though, and it will be fascinating to see how Apple retains their pro userbase while missing out on the high-performance hardware sector entirely.

I think x86 is pretty terrible too, if it's any consolation, but really it's the only option you've got as a programmer in the 21st century. I hopped on the Raspberry Pi bandwagon when I was still in middle school, I grew up rooting for the little guy here. Looking out on the future landscape of computer hardware though, I really only see RISC-V. ARM is an improvement on x86, but I don't think it's profound enough to make people care. RISC-V, on the other hand, blows both of them out of the water. On consumer hardware, it's able to accelerate pretty much any workload while sipping a few mW. On professional hardware, you can strap a few hundred of those cores together and they'll work together to create highly complex pipelines for data processing. On server hardware, it will probably move like gangbusters. Even assuming that cloud providers pocket half the improvements, a 5x price/performance increase will have the business sector racing to support it.

So yeah, it is a pretty complex situation. Apple did a cool thing with the M1, but they have a long way to go if they want to dethrone x86 in its entirety.


Where to start?

> ARM is really old now.

Well aarch64 was announced in 2011 so not really that old.

> Apple’s implementation of ARM is decidedly CISC.

CISC is a description of the instruction set not the implementation.

> ARM doesn’t scale.

No idea what this means but you can get 128 core Arm CPUs and address huge amounts of memory but perhaps you have another definition of scaling.

And so on.


As far as I understand it, “CISC” doesn’t mean “has a lot of instructions”, it means the individual instructions are themselves complex/composable/expressing more than one hardware operation. For instance, on x86 you can write an instruction like ‘ADD [rax + 0x1234 + 8*rbx], rcx’ that performs a multi-step address calculation with two registers, reads from memory, adds a third register, and writes the result back to memory — and you can stick on prefix bytes to do even more things. ARM doesn’t have anything like that; it is a strict load/store architecture where every instruction is fixed-width with a regular format and either accesses memory or performs a computation on registers.

Stuff like hardware primitives for AES/SHA, or the FJCVTZS “JavaScript instruction” don’t make a processor CISC just because they’re specialized. They all encode trivial, single-cycle hardware operations that would otherwise be difficult to express in software (even though they may be a bit more specialized than something like “add”, they’re not any more complex). x86 is CISC because the instruction encoding is more complicated, specifying many hardware operations with one software instruction.

I’m not exactly sure whar all the “cruft” is in ARM that you’re referring to. The M1 only implements AArch64, which is less than 10 years old and is a completely new architecture that is not backwards-compatible with 32-bit ARM (it has been described as being closer to MIPS than to arm32). NEON doesn’t strike me as a good example of cruft because SIMD provides substantial performance gains for math-heavy programs, and in any case 10 years of cruft is much better than 45.

I’m curious as to why RISC-V is different or better? I don’t know much about RISC-V — but looking at the Wikipedia article, it just looks like a generic RISC similar to MIPS or AArch64 (and it’s a couple years older than AArch64 as well). Is there some sort of drastic design difference I’m missing?


The only advantage I’ve heard put forward for RISC-V on single threaded applications is the existence of compressed instructions - which could reduce cache misses albeit at the expense of a slightly more complex decoder. I’m a bit sceptical as to whether this is material though as cache sizes increase.

Of course the flexibility of the RISC-V model allows approaches such as that being pursued by Esperanto [1] with lots and lots of simpler cores.

[1] https://www.esperanto.ai/wp-content/uploads/2021/08/HC2021.E...


ARM had THUMB, which definitely improved performance back in the GameBoy days — but they dropped that with AArch64, so presumably they decided it wasn’t beneficial anymore.


Indeed and IIRC the increased code density got them into Nokia phones too.

I find it hard to believe that they dropped Thumb from AArch64 without a lot of analysis of the impact on performance.


> On top of this, I'm still dumbfounded by the fact that they didn't go for a chiplet design where ARM could truly shine: if Apple had went the chiplet route, the M1 could have had a much higher IO ceiling and might have a shot at addressing more than 16 gigs of RAM.

Remember that the M1 is just a mobile SoC meant for the iPad/MacBook Air. It's so exceptionally good that people tend to assume it's targeted higher-end. 16GB max is fine for a mobile SoC in 2021. I can't wait for the M1X.


If you don't think ARM can scale any further, why do you think x86 can? They could easily double all the specs in the "M2" and slap two or more of them into a Mac Pro.


ARM isn’t _the_ reason. It is a reason.

If we were to go back in time to before Apple introduced its own SoC, and before it had acquired chip-design startups, even back then ARM was on the table, as that is what the iPhone ran.

RISC-V wasn’t a thing back then. So the alternatives were MIPS, Power or maybe a home grown instruction set.

So we just have to go with what the reality is now. Apple silicon has been well optimised for performance per watt. And it runs on an ARM instruction set that Apple itself helped design.


As for the competition, like Qualcomm: many Apple engineers (IIRC several tens of them) from the chip-design team left and formed a new company called Nuvia, which was then acquired by Qualcomm at the beginning of this year. Apple has had a big lead on SoCs for some years, but I wonder when/if Qualcomm's acquisition will start eating into that lead.


Agreed that architecture is only a (possibly small) part but Arm’s business model and the other things that Arm brings to the table are also relevant. For example the small / large core approach that Apple uses hasn’t previously been an x86 feature (although clearly that is changing).

Also, non-Apple Arm CPUs do outclass x86 in a huge segment of the CPU marketplace - on smartphones and tablets.

I don’t think that anyone previously has thrown the amount of cash at laptop / desktop / server Arm designs that Intel spends on x86 designs - we are now seeing a number of firms having a serious go (AWS / Ampere / Qualcomm / possibly Nvidia), so with Intel fighting back on its process issues it will be an interesting few years!


I'm curious to see how Alder Lake's clone of the ARM big.LITTLE design performs. I wouldn't be surprised if at the very least it lets Intel beat out AMD in the mobile CPU space. (Which would be great simply because Ryzen 4000 laptops are near impossible to get)


ARM is not a magic bullet but the M1 stems from the trend of phone processors growing up to be competitive in the PC market.

Since almost all (all?) mobile phones use ARM, I can see a similar process of adapting other ARM processors to the PC market.

But of course Apple is so far ahead that we really don't know whether and when there will be a competitive alternative, so it may as well be a streamlined x64, or a new ISA, sure.

It seems to all depend on some very experienced CPU design teams, and it seems that Apple actually lost some of them in the last year; I would track that news to get a better model for predicting the future than the ARM/x64 divide.


From listening to interviews of Apple tech folks, they also have done a lot of optimization of their CPUs based on calls frequently used by macOS. I don't recall specific examples, but the fact that they can control both the hardware and software allows them to profile and tune the system as a whole.


The M1 has very fast memory and bus access. It takes money and is not so sexy for marketing, but it works.


Like so many people, I love the hardware but hate the software. Hoping that once Linux is solid on there, I'll be able to get one.


I was in the same boat around 2012. After 11 years of Linux on the desktop and a crappy student laptop, I needed a proper laptop for freelance jobs. Bought a MacBook to put Linux on. Ruined macOS on day one. Reinstalled macOS. Got used to it. Loved that all the hardware and software just works. Never installed Ubuntu desktop again.

Would like to use Linux again on the desktop, but macOS is so nicely integrated with everything. The desktop experience is so much better than the GNOME or KDE version of the day. I code embedded C and web (Python). Also do the occasional CAD drawing and 3D printing. All this software just works on the Mac. Even corporate Slack works okay, MS Teams as well. Word, even! It sucks to be a Linux desktop user or fanboy in professional environments. Not a Mac fanboy, would switch if there were something better.

One drawback is that Docker is just awfully slow compared to running it on native Linux.


"Better to die on your feet than to live on your knees."

A reference to the freedom from corporate bondage you trade away for convenience in macOS.

(I don't have any significant issues with Ubuntu Mate, but I don't use it for many things however.)


From a practical point of view the software works pretty well (I've got an M1 Air, Big Sur), but I can understand if people want independence from Apple.

It seems that virtualized Ubuntu with UTM works ok: https://youtu.be/hnwK-nkXolc


I was going to replace my ThinkPad with a new ThinkPad, but when Apple released the M1 MacBook Air I had to change my mind. I paid the same that I would have paid for a T series, but got something that has the sleek, thin-and-light build quality on par with an X1, the performance of a P series, and battery life that hasn't existed on ThinkPads ever since they dropped Power Bridge. Oh, and it is fanless. If my MacBook Air fell into the abyss I would replace it with another Air right away, it is that good.

If Lenovo still made ThinkPads like they used to: reasonably priced, easy to upgrade and repair, with large external batteries, and with top-notch keyboards, I might have opted for a ThinkPad still. But they don't make those. All new ThinkPads have so-so keyboards, they all have internal batteries (and often too small), and they're getting harder to upgrade and repair with every generation. Lenovo has chosen to chase after the MacBook design, and that makes me less likely to buy ThinkPads, because a ThinkPad will never be as good a MacBook as a MacBook.


> If Lenovo still made ThinkPads like they used to: [..] and with top-notch keyboards, I might have opted for a ThinkPad still. But they don't make those. All new ThinkPads have so-so keyboards

I've seen this a few times and am curious: I got a Thinkpad at the end of 2019, and the keyboard is waaay better than the various HP, Dell, and Asus laptops I've had in the past. Did I get lucky or were all of those just way worse than I thought? What's the difference between old and current Thinkpad keyboards?


It depends on your definition of good, as for me most laptops have bad keyboards, and so-so is actually somewhat of a compliment. I do think thinkpad keyboards are still better than macbook keyboards, but not by as large a degree as they used to be.

This article is a good summary of the keyboard situation: https://www.notebookcheck.net/X1-Carbon-Gen-9-Lenovo-has-to-...


Not so, and here's why:

Apple is chasing a target, and so is Lenovo. But Lenovo is chasing the way a MacBook used to be, which is good in my opinion.

And Apple is skating wherever THE PUCK ARE THEY SKATING NOW SRSLY


Also, the M1X laptops (probably) release Monday and will be even more ridiculous.


I’ve been waiting for years, finally time to replace my 2015!


The M1X will be a trade-off, trading battery life for speed. It will still have great battery life relative to the dumpster fire of x86 laptops.


If the M1X 16" keeps the battery size of the current model, which is the largest possible battery you can take on an airplane, it will run a bit shorter than the M1 Air at full throttle (but get more done in that time), but significantly longer than the M1 Air for casual use, because it's presumably using the exact same efficiency cores.


Depends on what they do with GPU. If they put in desktop-class(-ish) graphics, then that would impact battery life heavily (and make me super happy OTOH).


You're actually right.

MacBook Air M1: 49.9Wh | MacBook Pro 16" 2019: 100Wh

The 16" MacBook Pro is rumored to have double the performance CPU cores, and either double or 4x the GPU cores. So for tasks that are strictly GPU bound, battery life could be about half of the M1 Air.

Considering that this is the absolute worst case scenario battery life (assuming of course that they don't make the battery smaller), the new 16" Pro is looking pretty exciting.
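
Spelling that worst case out as arithmetic (purely illustrative: it assumes power scales linearly with GPU core count, which the replies below note it doesn't):

    #include <stdio.h>

    /* worst-case runtime vs the M1 Air for a strictly GPU-bound task */
    int main(void) {
        double air_wh = 49.9, pro_wh = 100.0; /* battery capacities above */
        double power_ratio = 4.0;             /* rumored 4x GPU cores */
        printf("%.2fx\n", (pro_wh / air_wh) / power_ratio); /* ~0.50x */
        return 0;
    }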


And CPU/GPU core count is only one factor for battery life, so 4x the cores would not even result in 4x the power draw.


Beefier graphics often improves battery life (more cores at lower clock, like nvidia mobile vs desktop for the same model number: lower TDP, same perf). They might move to more power hungry RAM as part of it though.


The on-chip GPU Apple ships in their MBPs draws way less power than the dedicated Nvidia chip. This is especially noticeable in the battery life drop that occurs when the dedicated chip is used exclusively.


On-chip/package integration has many power advantages, and Apple is on a much newer process node with the M1 (TSMC 5nm).

The point I was making was that e.g. an RTX 2070 desktop has around the same compute power as the mobile version, but the mobile version uses much less power. It does this by being beefier (2560 CUDA cores in the mobile 2070, 2304 CUDA cores in the desktop 2070), but running at a lower clock.
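
A back-of-envelope sketch of why that works, using the common approximation that dynamic power scales with V^2 * f and that voltage roughly tracks clock. Only the core counts come from the comment above; the clock figures are made up for illustration:

    #include <stdio.h>

    /* throughput ~ cores * f; dynamic power ~ cores * f^3 (V tracks f) */
    int main(void) {
        double d_cores = 2304, d_f = 1.00; /* desktop 2070, normalized clock */
        double m_cores = 2560, m_f = 0.85; /* mobile 2070, assumed lower clock */
        printf("perf  ratio: %.2f\n",
               (m_cores * m_f) / (d_cores * d_f));                         /* ~0.94 */
        printf("power ratio: %.2f\n",
               (m_cores * m_f * m_f * m_f) / (d_cores * d_f * d_f * d_f)); /* ~0.68 */
        return 0;
    }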


It probably wouldn’t, at least not in every situation. It would probably be low power until more is needed.


They'd better bring some "pro" memory and IO along with it; they won't be replacing my ThinkPad unless I've got 64 gigs of RAM in my backpack.


What are you doing on your laptop where you can see such a big difference? My x86 laptop (Purism Librem 14) has an all-day battery and plenty more performance than I need. But I don't do things like compile code all day. If I were running a really big compute task I'd use a desktop or cluster.


This is what I don't get about high performance laptops.

Low end Ryzen 3 laptops have four cores, single thread performance within a few percent of the fastest processors in existence and will do something like render a web page at a speed indistinguishable from instantaneous.

The fastest laptops have twice that many cores. And if you actually try to use them, either half of them are low performance cores or they all clock down to a speed that makes them equivalent to a quad core desktop.

Whereas for the same price as a high end laptop, you get a desktop with 12 or 16 high performance cores and >16GB memory which is so much faster for intensive tasks that the laptop should be embarrassed to show up.

Why should I do anything other than buy an inexpensive laptop with a good screen and keyboard and a fast desktop I can access remotely for any kind of intensive workload?


I do agree ultrabook + powerful desktop is the best combo for most people who actually need a powerful computer. That's what I ran for many years. There are a few reasons why you might want a powerful laptop though, mostly coming down to the limitations of remote access.

Using a GUI over remote kinda sucks IMO. Even something like text editing is not as smooth as it would be locally. Photo/video editing, and especially anything 3D (CAD, gaming, etc.), is really painful.

There are also the security implications of enabling remote desktop. It's not a showstopper, but it's something I'd rather not worry about for my personal computer.

And finally, if you have the money for a powerful desktop and a powerful laptop, why not get both?

Anyway, I just gave away my old XPS 13 and replaced it with a Zephyrus G15. It's not nearly as fast as my 9700K + 1080 Ti desktop, but it runs most games acceptably at 1440p. C++ compilation is very fast, though obviously not comparable to an overclocked 9700K. The portability and overall fit and finish aren't as nice as the XPS 13, but now that I'm not taking the laptop everywhere as a student, I don't find it to be much of a compromise.


I agree in principle.

The real answer for why most do not do this is that IT organisations are already under-resourced and over-burdened, and it's cheaper to maintain a small set of laptop models (which most likely are bought with some sort of support contract) than it is to assemble and support desktop computers.


I just bought one and have to agree. I was aiming for the i9 because audio production tools hadn't migrated fully, but it wasn't available anymore so I went with an M1. Inside of a month, all of the tools and plugins I've bought in the past have updated for the M1. Including Waves plug-ins now.

Outside of software work I run Logic as my primary DAW and it’s like there are no limits. It’s fantastic.


The TDP of the M1 is between 15 and 20 watts: I use one third of that with Intel...

Crunching some numbers: some benchmarks seem to place the M1 as three times faster. On raw numbers, the M1 seems not much more "performant per watt" than the Pentium Silver - only slightly better. And, for a laptop, the emphasis is (in general) more on energy efficiency than on speed.
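
Making that explicit with the rough figures above (TDP midpoints, not measured draw - see the edit below):

    #include <stdio.h>

    /* crude perf-per-watt comparison from the numbers in this comment */
    int main(void) {
        double m1_perf = 3.0, m1_tdp = 17.5; /* ~3x faster, 15-20 W TDP */
        double ps_perf = 1.0, ps_tdp = 6.0;  /* Pentium Silver, ~1/3 the TDP */
        printf("M1:             %.3f perf/W\n", m1_perf / m1_tdp); /* ~0.171 */
        printf("Pentium Silver: %.3f perf/W\n", ps_perf / ps_tdp); /* ~0.167 */
        return 0;
    }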

Edit: danieldk rightly noted below that TDP is a "high load consumption estimate". Official data for "situational" consumption is a bit harder to find.


> The TDP of the M1 is between 15 and 20 Watts

That's the TDP. For light daily tasks (casual web browsing, e-mail, etc.) the M1 primarily relies on the energy-efficient Icestorm cores. While typing this comment and having some applications in the background (like Slack), package power use hovers between 40 and 300mW (roughly 100mW on average).

The energy efficiency of the M1 MacBook is remarkable. My wife had an M1 before I did. She was in some video meetings on battery for some hours; I wouldn't believe her when she told me she still had 80% battery remaining.


>I wouldn't believe her when she told me she still had 80% battery remaining.

Really? I feel like that's just having any 2021 laptop. I have a 13" Linux laptop with the i7-1185G7. I'm not going to pretend it matches the performance/watt of an M1, but a few hours of video calling makes a completely negligible difference to battery life for me. 20% for a few hours of video calls just feels baseline.

I've never been in a situation where I needed to get back to a charger, and can happily go the whole day of development and meetings with charge left over to spare.


If we are going to talk about "hours", we should also compare the batteries. This one is around 40Wh, according to `upower -d`; when it was new, I could do light work for 12..14h (Linux "desktop").

I have been trying to get some numbers from `powerstat`, intrigued by danieldk's result of 0.1W consumption: I am getting 3.7W during this browsing session. That 0.1W must be related to the CPU only. The result from `powerstat` is aggregate (if I raise the display brightness to an uncomfortable level, consumption rises to 4.5W). I do not know how to get the consumption of the CPU alone. Trying to check...

Edit: I am really not sure how to disaggregate consumption and find a number comparable to that 0.1W. Maybe, danieldk, you could provide instead the aggregate consumption - the total (not just CPU) average consumption of your system? And eertami? Just to get an idea.
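
For the CPU-only number on Intel hardware, the RAPL counters in sysfs give cumulative package energy, which is roughly the counterpart of the macOS package-power figure. A minimal sketch (the rapl:0 path is machine-dependent, reading it may require root on newer kernels, and the counter wraps around, which this ignores):

    #include <stdio.h>
    #include <unistd.h>

    /* sample CPU package power from Intel RAPL on Linux */
    static long long read_uj(const char *path) {
        FILE *f = fopen(path, "r");
        long long uj = -1;
        if (f) { fscanf(f, "%lld", &uj); fclose(f); }
        return uj;
    }

    int main(void) {
        const char *p = "/sys/class/powercap/intel-rapl:0/energy_uj";
        long long a = read_uj(p); /* cumulative energy, microjoules */
        sleep(5);
        long long b = read_uj(p);
        if (a < 0 || b < 0) { fprintf(stderr, "no RAPL?\n"); return 1; }
        printf("package power: %.2f W\n", (b - a) / 5.0 / 1e6);
        return 0;
    }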


> Really? I feel like that's just having any 2021 laptop. [...] 20% for a few hours of video calls just feels baseline.

I had a brief Linux/Windows excursion with a ThinkPad T14 Ryzen laptop. Three hours of video meetings was enough to drain most of the battery in Linux (worse power management, plus no hardware encoding/decoding in some video meeting apps). In Windows, it lasted a bit longer, but the battery would definitely be drained 50%, often more.

The ThinkPad was not a lemon. This seems like pretty normal drain if you look at people's experiences on the web.


The M1 is an SoC with CPU + GPU + NPU + NAND controller and many other blocks, being used fanless under 10W in the MacBook Air.

>Official data for the "situational" consumptions are a bit harder to find.

A single high-performance core uses ~4W at peak. Its performance per watt is quite far ahead of anything x86 at the moment. Intel Alder Lake might help a bit, but will still be behind in perf/watt.


This system for example, according to `powerstat`, seems to work at around 3.5W, with minimum values below 3W and maximum around 7.5W.

It would be interesting to get a "powerstat database" of different systems for comparison.


From one point of view, non-x86 efficient laptops have been mainstream for Linux users for a long time in the form of ARM Chromebooks.


Is this user specific though? I mainly still use a 2013 MacBook Air and I’ve never had power management issues. I can work 10-12 hours on a full charge. Recharge is pretty quick. 90% of my usage is browser, text editor, light terminal, and a pretty basic dev environment. The only times I’ve hit performance issues is working with video editing and graphics. I don’t game and don’t edit videos , so that was a minor pothole on this long journey.

If this is the main benefit of the M1, I'm likely to be underwhelmed is what I'm thinking. I'll upgrade to it next, but it's not going to force me to upgrade. I expect at some point Apple will start making my device suck via macOS updates like they've done with iOS and those devices.


+1 absolutely correct.

I bought an M1 MacBook Pro early this year and it is a beast, performance wise.

I just started a new ML job and my company sent me a new Intel 16” MBP, and while it is beast-like, not a beast.

One trick with the M1 is installing the ARM version of Homebrew, so programs and libraries installed are M1-native.


The M1 Air not having fans is the real game changer for me. If you haven't used a laptop without fans you probably don't realize how good things could be.


>>> Apple Silicon M1 or nothing

I've used ARM-based Chromebooks, as well as ARM embedded boards running Linux as a desktop. But I am intrigued by the Mac Mini M1 with 6K out. I wish I could get a demo of the 6K 32" Pro Display XDR. At $5K it's hard to justify, but it looks like the only 6K monitor available right now ;)


An M1 with official Linux support would be perfect except for the price (especially RAM and storage pricing; I need 128GB RAM and a 2TB SSD). I think the M1 is 'cheap' because Apple is buying Apple customers, not just selling laptops.


I've been hoping to see a RISC-V laptop for Linux, equivalent to the M1 Mac, but I'm not seeing anything on the horizon.


That was a significant development, six years late to the table by my personal count. So yes, I'm aware.

If only it did not come packaged in a sissy mirror powder-box I'd love it. The flimsy keyboard with no travel and the fragile glossy screen with no touch/stylus are deal-breakers for me. I did try to use those at the store, no good.


The AMD 5850U is almost as good as the M1, and it beats it in power performance benchmarks as well as in multicore performance:

https://www.cpubenchmark.net/compare/AMD-Ryzen-7-PRO-5850U-v...

Both have a TDP of 15 W.

I have a ThinkPad P14s Gen 2 (AMD) and can work on the machine the whole day without a charger, and it never ever gets hot.


TDP != actual power usage, especially when they're pushed to the limit.


Lol


No, he's correct on the side of speed. But that only ticks one of seven.


0. Linux compatibility, I am not super picky about getting everything to work perfectly but I should be able to trust that the software I prefer to use will run and my laptop will connect to wifi and play audio properly and so on.

1. IPS screen (mostly a given nowadays, but this wasn't always so)

2. Trackpoint, I hate trackpads. Pretty rare option so I could possibly be convinced otherwise if the trackpad is particularly good.

3. Decent keyboard, any laptop will always be primarily a text processing machine for me. (tablet version)

4. User serviceability, with battery and RAM being highest priority but expandable storage is nice too, and all other replaceable parts are a nice bonus.

5. General ruggedness, the whole point of a laptop is being able to travel with it and it's the kind of tech I expect to last me for years.

6. Battery life, though I am not super picky about this.

7. Everything else including performance, higher resolutions than 768p, thinness, appearance, touchscreen (roughly listed in order of preference).

Posted from my trusty Thinkpad X230t which is the only laptop I'm aware of that ticks nearly all my boxes. It's bulky and makes a vacuum cleaner noise when compiling and the battery only lasts 4-5 hours of normal use, though. Still, good enough for web browsing, various forms of text processing, watching videos, listening to music and the vast majority of what you might use a computer for.

I did look into the Framework laptop recently, but while it's probably the closest thing in a long time, the keyboard doesn't look good and there's no trackpoint (nor even physical mouse buttons).


Similar here. I've been spoiled by Lenovo, although it's not perfect. It's mostly a coding machine for me too.

0. Linux compatibility: minimal driver problems, hibernation can fail less than 1% of the time.

1. Touchpad with mouse buttons (ideally 3) is not negotiable.

2. A decent keyboard with all keys from the home/end/... block. Full-sized arrows would be nice, half-sized are a no.

3. Not bigger than a sheet of A4 paper.

4. A non-glare, wide-angle display.

5. Good WiFi sensitivity and performance.

6. A hard drive activity light is quite useful.

7. When it inevitably gets thermally throttled, it should still be useful for web browsing.

8. Mute/mic-mute buttons are nice. So are volume buttons.

9. Open firmware: WiFi, coreboot, libreboot. I'm a hacker, and don't like fences.

Same thoughts about the Framework laptop. I hope it becomes an option in the future.


> I could possibly be convinced otherwise if the trackpad is particularly good.

Genuine question: what about MacBook trackpads? We have a 2013 MBP, and it's leaps and bounds ahead of my Windows laptop which came out the same year. I'm not sure if other laptops have caught up with Apple yet, but MacBook trackpads are commonly referred to as the best in the business.


I've heard they're good, but Macbooks tick almost none of the boxes in the list, so I am not very interested in them.


I will say that Macbooks are more rugged than they appear, and the battery life on the M1 models is best-in-class. That said, they're not very open or Linux-compatible (yet, at least). I bought an M1 Air and love it, but it's very use-case dependent, I think.


Apple leapt ahead of other laptops again with their trackpad that uses haptics instead of an actual button. It takes consistent pressure across the whole trackpad, instead of being more difficult to press as you get closer to the hinge. It feels ridiculously convincing that you're actually pushing a button, and even has a cool feature where you get a second action by pushing harder. And it really does feel like you've pushed a button deeper.


Yeah, the haptic stuff is mad! I was amazed when I learned about it, I thought it was a real button with some sort of touch sensor.

You can literally change the feedback from the "click" (and the push-down strength) in Settings. Even in 2015 that felt decades ahead of its time.

If anything I still prefer that trackpad over a real button.


I’ve tried a lot of Windows laptops, and there’s still nothing on the Win/Linux side, even in Dell and MS laptops, that compares with the precision and natural feeling of a Mac trackpad.


They are basically an iPhone digitizer in landscape mode, so yeah, they are very much OK. A trackpoint is still preferable when editing text.


> I did look into the Framework laptop recently, but while it's probably the closest thing in a long time, the keyboard doesn't look good and there's no trackpoint (nor even physical mouse buttons)

It also has a screen that needs fractional scaling which makes Framework a deal breaker for me, no second thoughts about it. I'm not buying anything with a fractionally scaled display if I can help it.


Why not? Wayland has gotten to a stage where I can use it as a daily driver and have a better experience than with X.

Is it X support or something else?


200dpi according to the calculator, which is about right for 2x. What would you prefer?

2256x1504, 13.5”

https://www.calculators.tech/dpi-calculator#dpi-viewing-dist...

Says it is good from about three feet away.
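
For what it's worth, a quick check of that figure (the logical-resolution notes are my own reading of the numbers):

    #include <math.h>
    #include <stdio.h>

    /* pixel density of the 13.5" 2256x1504 Framework panel */
    int main(void) {
        double w = 2256, h = 1504, diag_in = 13.5;
        printf("%.0f PPI\n", hypot(w, h) / diag_in); /* ~201, the "200dpi" figure */
        /* integer 2x scaling gives a 1128x752 logical desktop;
         * the ~1.6x suggested below gives ~1410x940 */
        return 0;
    }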


200dpi on a laptop screen would ideally need 1.6x scaling, not 2x. It could vary between 1.4x to 1.8x but definitely not 2x. Not sure what's wrong with that website.

At 13.5 inches, it would've been better to take a page from some Surface laptops and go for 3000x2000 or 3200x1800, depending on the preference for aspect ratio. At ~260 PPI, 2x scaling works fine.


My tastes are fairly similar to yours. I'm curious which boxes aren't ticked, in your view, by the X200-series ThinkPads more recent than the X230?

I recently purchased one, and I did a bunch of research before deciding which to get. I settled on an X270 because it's user upgradeable, supports two SSDs, has two batteries, has hardware VP9 encoding/decoding, includes a USB-C port, and is hopefully much closer to the ruggedness of the impeccable X200-X230 range than the newer models. The only thing I felt like I compromised on was the lack of support for 32 GB of memory. I was able to purchase an i7 X270, 4TB SSD, 16GB RAM, and replacement batteries all for under $1k, and it blows any new laptop that costs twice as much out of the water for my needs.


The X270 has a 6th or 7th gen dual core CPU. If you get an 11th gen i5 like you'll find in the sub-$1000 laptop range you'll get something that is literally twice as fast as even the highest end 7th gen i7. That is a lot of performance to leave on the table.


> you'll find in the sub-$1000 laptop range you'll get something that is literally twice as fast

That's true, but it would require some combination of major compromises on the SSD, RAM, hot swappable battery, build quality, form factor, and keyboard/trackpoint. I'm almost never CPU limited in my day to day workflows, even on a 2nd gen i5, and I have cloud resources available when I need to do something CPU intensive like model training for work. It's a tradeoff that isn't right for everyone, but those other factors are far more important to me than the CPU.


The X220 is my primary computer. It never uses more than 4 of the 16GB RAM. I do run the fan at full speed almost all the time, though. The killer app (in a bad sense) is Zoom: it kills the CPU and can't handle virtual backgrounds.


How much RAM is used is determined by your pattern of application usage (and OS), not by your choice of computer.


I know. I just wanted to emphasize that Zoom is the only thing that makes an otherwise perfect computer not powerful enough (for me, of course).


I feel like the X-series Thinkpads after the X230 are an unhappy compromise, somewhat better in some ways and worse in others. The further away in time you get from the X230 the more they sacrifice things I care about.

The X270 feels more like a side-grade, trading in a certain amount of ruggedness and maintainability for a better screen and better battery life, but a pretty minor performance improvement.


Apparently 3rd-party manufacturers are looking at building keyboards with a trackpoint for the Framework laptops.


Have you ever used a trackpoint-style pointing device from anyone other than IBM or Lenovo and had it be any good?

I think the current Lenovo trackpoints aren't as good as the IBM devices used to be in the late 1990s. I assume they were sacrificed on the same altar as the keyboards.

But the ones I've used from Dell, Toshiba and HP were so aggressively terrible I'd rather just carry around an external mouse.

So I like the idea of a framework with a trackpoint-style pointer, but I am afraid it's likely to be awful.


I have used some of them. Indeed not as good, but usable.

Sadly Lenovo is taking a lot of wrong directions IMO: non-removable battery, soldering memory now, etc... I'd be happy to try something else and love the concepts behind the Framework.


Lenovo's descent into soldered RAM and the elimination of external batteries has me rooting for Framework too.

My daily driver is currently a T480 that's about 3.5 years old. The relative stagnation of CPU and even IO in laptops has it still feeling reasonable to use. I'm hoping that by the time I want to replace it, Framework will really have found their stride and my decision will be obvious.

And I'd rather be stuck with a touchpad than a soldered SSD, which feels like Lenovo's next logical move.


That would definitely incline me towards it a lot more, but I'd have to see it first.


I see a lot of mentions of mainstream brands in this thread. If Linux compatibility is a criterion, why not a laptop built with that in mind?

Just a reminder that these now exist, in no particular order:

- Juno Computers
- Tuxedo Computers
- System76
- Slimbook
- Starlabs
- Pine
- Framework
- Manjaro Computers

Admittedly, the same chassis are reused across a couple of these examples, but there's choice


How smooth is video playback of newer codecs, or even higher resolutions of any codec, at 4K? I thought about bringing my W520 back from the dead, but the cost didn't seem worth it.

Coding videos below 1080p are harder to read, and unless it's a school lecture, 720p is the minimum.

And is it your main machine, or do you have a separate system for media consumption?


Yes, you did explicitly describe a ThinkPad. Why is the trackpoint a preference?


Because it's just much more pleasant to use. Plus, being able to manipulate the mouse without taking my hands off the keyboard is also a big advantage.

I've never used a trackpad that did not have at least one of: bad palm rejection, inaccuracy around the edges, clicking that feels bad and/or makes the mouse cursor drift while trying to click.

Granted I have not used an expensive, modern laptop in a while and it could be much better now, but even then I just think the trackpoint is better UX and better posture, there's something uncomfortable about having to awkwardly squish your arm between you and the laptop to manipulate the cursor.

