Every time I hear about mobile Linux releases, I get excited at the chance to get away from Android and iOS, and then I get disappointed to find that the list of things that don't work covers about half the phone.
I have a feeling that the Linux developer community in general just constantly prioritizes the wrong things. They love to boast about technical achievements instead of doing something that would actually enable real-world regular-usage scenarios. Not helped by the fact that stable ABIs are basically nonexistent, especially for kernel modules.
Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
> They love to boast about technical achievements instead of doing something that would actually enable real-world regular-usage scenarios.
For mobile Linux in particular, I've found it's quite the opposite: I see projects like Phosh and KDE Plasma Mobile constantly showing UI and UX improvements (albeit at a slower pace than desktop projects), while basic hardware support is non-functional.
Of course I'm not expecting every UX/UI dev to abandon their project to jump into low-level kernel development and bring support for more devices, but it feels like the desktop environments are developing for a device that doesn't exist.
> Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
On a modern Linux distro (that isn't one of the "advanced" ones), the answer is yes. If you install something like Mint or Ubuntu, you have a graphical app store and driver manager (which AFAIK you only need for NVIDIA GPUs).
SFOS is sadly not well known in the US as it's not sold there, which has definitely hurt its popularity. The lack of VoLTE support for quite a long time didn't help either. Still, it's a great option I recommend to anyone who wants to escape the current duopoly.
That was a bit ironic, indeed, but at least the USB-A works!
For what it's worth, the majority of mechanical RGB keyboards and mice are USB-A anyway, so if you're fine with a very powerful machine that won't have internal keyboard support for a few weeks, it sounds like good advice anyway!
I'm unsure what RGB or a keyboard being mechanical has to do with it being USB-A, or what the relevance is, but yes, there are many USB-A peripherals available.
The point is that most of the keyboards for sale on Amazon are still USB-A, even the fancier ones, even though Apple users tend to portray USB-A as last-century and deprecated. USB mice also give faster, more fluid cursor movement than Bluetooth ones.
Also of note, even the most premium keyboards and mice are Full Speed USB 1.1, running at up to 12Mbps. You can verify this yourself through the Apple menu, About This Mac, System Report, USB, and look for your external USB keyboard or mouse.
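For anyone who prefers a terminal, a minimal sketch of the same check on a stock macOS install:

    # Dump the USB device tree; find your keyboard or mouse and check its
    # "Speed:" line -- Full Speed devices report "Up to 12 Mb/s".
    system_profiler SPUSBDataType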
Compare that to the USB-C USB4 ports, which are capable of up to 40000Mbps. And, to be full-featured, they need to support 100W or more of power input, as well as power output well in excess of what a USB-A USB3 5000Mbps or even 10000Mbps port requires. Which is to say, for the cost of a single USB-C port, a manufacturer can easily provide 4 or more USB-A ports, with change to spare. That would avoid unnecessary adapters and improve compatibility.
Not to mention that most memory sticks are still USB-A, too, and there are no Fit-style USB sticks for USB-C at all, only for USB-A. Which means it's far easier to semi-permanently extend the storage of a USB-A laptop than of a USB-C one, which you may want to do to try out a system without messing up your main Windows installation.
It's basically a jab at Apple's decision to remove a useful port, especially on the M4 Mac mini, where the USB-C ports are no longer all fully featured: the ports at the front have a different spec than the ones at the back. We all remember that now, at the time of the announcement, but imagine dealing with it several years down the line, when you'll have to be looking up specs and troubleshooting why things don't work the way you expect. They could have easily avoided it by NOT including the 2 slow USB-C ports, and including something like 4 USB-A ones instead.
> The only thing my post was addressing is that you called out RGB mechanical keyboards specifically when there are also plenty of non-RGB and non-mechanical options. It was bizarre, like this reply.
The point was that even the more "premium" products are still USB-A, not USB-C.
USB-A simply isn't going anywhere.
Personally, I find USB-A more useful than HDMI, since HDMI is kind of inferior to USB-C in every possible way. I've tried using a 43" UHD TV as a monitor, since they're as cheap as $149.99 USD brand new, but it had noticeable delay even at 4k@60Hz, and just didn't feel right. The UHD resolution at 43" itself would actually be perfect, since 1080p at 20.5in used to be my fav a few years ago (before QHD at 23" started reaching the sub-$200 range), but, alas, the specs of a proper monitor (brightness, matte, USB-C with PD and a hub) are just better suited for a display compared to a TV, even if the resolution itself may seem ideal.
> I've tried using a 43" UHD TV as a monitor, since they're as cheap as $149.99 USD brand new, but it had noticeable delay even at 4k@60Hz, and just didn't feel right.
This is typically due to default settings on TVs enabling various high-latency post-processing options.
Most TVs have "game mode" settings that disable all these at once, at which point latency is comparable to monitors.
Case in point: at both 60 Hz and 120 Hz*, 4K latency on my LG C4 is within a millisecond of the lowest-latency monitors listed here:
I fully agree that HDMI is inferior to USB-C, if only because quality USB-C-to-HDMI adapters are widely available, and mini/micro HDMI connectors commonly used on small devices (not including this laptop) are garbage.
* Probably also true at 144 Hz, but the linked table doesn't have a dedicated column for 144 Hz, complicating comparisons.
> The point was that even the more "premium" products are still USB-A, not USB-C.
Often they're both, with detachable cables. Again, it was just weird that you brought up RGB specifically (something not at all central to your original point, a gimmick marketed at 'gamers' which is pointless in a well-lit room), as well as mechanical keyboards specifically, as well as that you've carried on like this in reply. Please don't take the time to reply again.
An ARM64 is "a very powerful computer"? The whole promise of ARM is better thermals and long battery life, not screaming performance. With thermal/CPU frequency management not working, we don't even get that.
The finer-grained question of performance per watt versus silicon cost, set in the context of the use case, just gets lost in HN hardware conversations like this one, which ignore total cost of ownership, vendor politics, and such.
Everyone is buying the tool that does the job, or building that tool if they want to make that large investment...
There's a difference in cost per flop at home and in a data center.
I'm updating my wiring and air conditioning for a 7x5090 workstation because having that power for experiments under the desk is worth the cost (and fire hazard).
If I had to build 10,000 of those I'd be banned by NVidia from ever buying their hardware again.
Considering HN started in 2007 I don't think your expedition will yield a ton from the heyday of Mac PowerPC. Apple shipped the last PowerPC hardware in 2006.
To me it seems some people developed some highly political or even religious views on tech. Truth doesn't matter as much as it used to. It's more of "I like the idea behind X, I think X is cool so X should be the best thing ever".
That makes sense. If you have working USB-A, then any USB ethernet adapter supported by FreeBSD should work, right?
That’s actually a pretty big escape hatch for early development. It explains how you’d be able to get past having a nonfunctional keyboard pretty easily, for example.
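As a rough sketch of what that looks like on FreeBSD (the ue0 interface name is an assumption; it depends on which driver attaches, e.g. axe, ure, or cdce):

    # After plugging the adapter in, see which driver attached and which
    # interface it created (commonly ue0 for USB ethernet):
    dmesg | tail
    ifconfig

    # Bring it up with DHCP (as root):
    dhclient ue0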
If you prefer your usual (external) keyboard and mouse, which plenty of people (myself included) do, the rest of the list is kinda 'meh' as restrictions go.
Honestly when my current Helix 2 finally starts to die on me I'll be looking for a tablet or hybrid replacement since I neither want nor need an attached keyboard+mouse anyway, in my normal usage they're mostly just something that takes up desk space.
Obviously there are also plenty of people with preferences entirely incompatible with this approach, but so it goes.
Ubuntu has an experimental installation image for this laptop at https://discourse.ubuntu.com/t/ubuntu-24-10-concept-snapdrag... . Everything works except for audio and screen brightness control (I saw a patch for audio upcoming on LKML. I don't know about the brightness control, but it is stuck on high. Nevertheless, it still reports 12+ hours of battery with a bright screen.). It is a nice laptop, if you like the Lenovo T series.
Not buying anything Lenovo made ever again. The T14 G1 was the worst computer I've ever had the displeasure of using: extremely spotty USB-C connection, throttling to 0.2 GHz for no reason with no fix, and just terribly slow all around. A shame, since I loved the T450s dearly.
I'm using an X1 Carbon Gen 11, and for my purposes at least, it's an improvement over every previous generation.
I'd love to switch to a Framework one day, but I'm not willing to use a laptop without mouse buttons. (I don't care about the TrackPoint at all; I do care about having physical mouse buttons.)
I've been keeping a list of problems with my gen10:
1. The laptop overheats easily. It is usually hot to the point of being painful to touch. It has melted the adhesive of the rubber strip on the bottom, which has fallen off.
2. The trackpoint is malfunctioning. Several times a day, the mouse cursor will jump to the top of the screen, and be stuck there until I wiggle the trackpoint fully in all directions.
3. There's coil whine and clicking from some part of the power intake.
4. Battery life is extremely poor, usually on the order of ~2 hours.
5. Sometimes the trackpad buttons will stop working. You have to put the laptop to sleep and wake it up again to get them back.
I switched a couple years ago from X1 after I spent months without a working mic and had to get my screen replaced twice and it still didn't work.
I went with the ASUS Zenbook. It's not perfect in terms of Linux drivers or support but they are built solidly. I would pick them again over Dell, HP or the Chinese rebrands.
I've happily used Asus for the past 5 years. Great Linux support and no serious hardware issues. The only negative is that one of the arrow keys came off my ExpertBook B5 after 1 year, but it was easily glued back. Otherwise Linux works like a dream and the price was good as well.
Anecdotal, I've had an Asus ROG laptop for close to 10 years now and it's still mint except for the battery. Aluminium frame and solid with Linux (even bluetooth works on Ubuntu, imagine that). I'd replace the battery and keep using it, but I'm not sure if it's worth the investment at this point given that it's DDR3. But when I go out to buy a new one, it'll probably be a TUF or something.
Asus may have deplorable predatory customer service, but if I buy the thing from a local reseller they have to deal with that instead of me if something goes wrong, so it doesn't really affect me haha.
You could quite easily use a T420 keyboard (the last traditional ThinkPad keyboard) in a T440 with a few pins masked.
Actually, I may be getting the numbers wrong; it could be the T430 that I was thinking of. It's been rather a long time since I did any brain surgery on ThinkPads.
How's the build quality of Framework laptops? I fear that making them so modular might have required engineering tradeoffs with regard to build quality and endurance.
I got one in March 2022 and it's still kicking. Had to replace the heat sink and the keyboard this year. The support is kinda slow on the response, but the hardware itself is pretty nice.
I have a P14s G4 AMD that I am very pleased with. My only issue is the Qualcomm-made Wi-Fi that still doesn't work properly after a year because Qualcomm engineers can't figure out how to write a driver.
I rewrote part of their camera stack once, only to find that they hadn't managed cache coherence for the MIPI DMA, and hadn't connected the coherency domains to handle it in hardware. The ticket is probably still not being worked in their support portal.
Very rarely I see a little horizontal strip of corruption in my camera photos and roll my eyes.
Lenovo makes many laptops; some are good, some are bad.
This applies to pretty much every manufacturer. I worked in a computer store for many years, and every manufacturer had some models with high return rates, but also models with low/normal return rates.
Saying "I will never buy $x because I had a bad experience with a bad model" is almost always a mistake IMHO.
What does matter is how they deal with the bad models. Some brands were definitely a lot better than others, and I generally advised customers based on that.
Yeah, 155H + RTX 500 Ada.
I use it primarily docked, so I don't really have data for efficiency/battery life. I did stress it with a Linux kernel compilation (`make -j22`) a while back, and saw that the CPU frequency was up at 4GHz at the start, but dropped to 2GHz for the majority of the compilation, despite temps being in the 60-70°C range. I didn't try to troubleshoot it further since it's a work laptop and I don't compile huge projects on a regular basis. In terms of general system performance, I'm using it with Fedora KDE and it has been great.
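For reference, a quick way to sanity-check that kind of throttling on Linux while a build runs (the sysfs paths are standard cpufreq; `sensors` comes from the lm-sensors package):

    # Per-core frequencies in kHz, refreshed every second:
    watch -n1 'cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq'

    # Cross-check temperatures reported by the hardware sensors:
    watch -n1 sensors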
Sorry to hear that. I actually cheaped out and for the first time went for the L-series, L14 G4, but with Ryzen. Very happy overall, pretty much no issues whatsoever, running Debian. I miss the old keyboard, though.
I upgraded from a 10-year-old Lenovo to a MacBook Pro M1 w/Asahi Linux for a while recently. It convinced me that we're not ready for ARM Linux desktops for general-purpose, regular-person use.
Setting aside all the crappy Linux desktop software today (I have been trying multiple recent distros out on multiple new laptops... all the Linux desktop stuff now is buggy, features are gone that were there 10 years ago... it's annoying as hell), the ARM experience is one of being a second-class citizen. A ton of apps are released as AppImages or Snaps/Flatpaks, but they have to be built for both X86_64 and ARM64, and extremely few are built for the latter. Even when they are, they have their own bugs to work around, and there are fewer users, so you wait longer for a bugfix. The end result is fewer choices, worse compatibility, and less support.
I love the idea of an ARM desktop. But it's going to cause fragmentation of available developer (and corporate/3rd-party) resources. ARM devices individually are even more unique than X86_64 gear is, so each one requires more integration. I'm sticking to X86_64 so I don't have to deal with another set of problems.
One hopeful note: the developers for the Snapdragon X Elite are active on the kernel mailing list, and they are supplying patches for specific laptops, including the T14s. I run Debian, so I don't use AppImages or Snaps or Flatpaks, but I expect to have a fully functional T14s Gen 6 running Trixie when it is released as stable next year, assuming Trixie uses kernel 6.12 or (hopefully) 6.13.
They also butchered their entire dev kit rollout and didn't launch with any sort of (promised) Linux support. I'm pretty sure that, time-wise, marcan had Linux booting on M1s faster than Qualcomm got any sort of Linux movement going on their own hardware.
Counterpoint: I've been on an M2 Macbook running NixOS-via-Asahi-installer for about a year, and I've run into maybe 2 applications that I cannot find in the Nix repos or flathub. I have a stable, fast, long-lasting machine running Hyprland and all the productivity software I've needed. I'm currently missing an internal microphone and, I believe, Thunderbolt (USB-C works fine) but this machine is faster than and as stable as it was when it had macOS on it.
I am as general purpose, regular person as you're going to find, in this world at least. I stare at a sentence like "In a functional programming language, everything is a function" and just blink. But a few months of blood and suffering to learn Nix/NixOS and I am managing the family's computers from a single repository and working faster than ever.
No and, currently, no, though Asahi has the accelerated graphics and an x86 compatibility layer sort of working now, so I imagine that will come soon enough.
The usuals are there, like Libreoffice, though I use browser-based MS Office too. Firefox and all my plugins Just Work.
I do a ton of photo work with Darktable, which I have come to appreciate after years of fighting it. Writing tools. Software development tools. It's arguably overpowered for my needs, but that also translates into 16-hour battery life (less than macOS, but plenty), dead quiet, and a machine that does everything I ask without complaint.
For the kiddo, it's mainly about configuring and locking down the machine ... and getting it back up and running quickly if he breaks something. I've been using off-lease, years-old Thinkpads for him. No games to speak of, but he's more of an xbox kid anyway. I should probably do parental controls, but I have that largely handled at the DNS level anyway.
It's interesting to see the arm building issues still mentioned. I've been patching many packages in nixpkgs for apple silicon, and I can remember only one which had any kind of arm-related problem rather than darwin-specific. Snap/appimage packages have their issues of maintainers needing to spend the extra time. But sources? Not in my experience - I'd be interested to hear some examples.
I sadly am inclined to agree a bit about the Linux desktop stuff. I've been trying to get a nice, simple XFCE Linux desktop on a used ThinkPad I recently picked up; so far I've tried Debian and Mint, and they both have some issues with the keyboard mute button (it flickers on/off rapidly when I press it the first time, and the keyboard doesn't respond to input until I press the mute toggle again).
Then there are just weird quirks: you can't have a key command like Super-R and also a key command of just Super, because when you try to press Super-R it instantly triggers the key command you assigned to just Super! Like, while I'm still holding it down! Is it not a modifier?! Or there's something unmuting my speakers and mic every time the machine wakes from sleep/lock, and I think it's due to wireplumber, but Debian stable's wireplumber version is literally a year old (wtf?), so I can't find documentation on how to alter this default behaviour (especially because this version of wireplumber uses Lua for configuration? also wtf?)… No clue. (Also, why does so much Linux software lack man pages?)
Haha, that went on a tangent, but it's been a surprisingly frustrating experience!
Can you tell me about the battery life? I seriously need a new laptop to run Linux, and I need decent battery life. I don't want to buy a Mac just to get more than 3 hours of battery life.
When I was working in Denmark, we had a lot of Lenovo resellers providing better offers than the normal list prices. This was a couple of years ago; maybe it's still the case.
I just checked and the laptop can be had for $1036 USD or €1809 (includes 21% VAT), and the configurator doesn't even allow adding more than the soldered-on base 16GB of RAM. You can save yourself €500 and get 768GB of additional SSD storage by going HP, or save yourself €400 and get a 32GB model.
What an absolute shitshow. I'm surprised Lenovo sells laptops in Europe with these prices.
It is not easy to do this. A friend learned the hard way: she was visiting the US and needed a laptop. Their online shop doesn't take foreign cards, probably to prevent price arbitrage; lots of Reddit posts confirm the problem. You can always go to a brick-and-mortar store, but model selection is more limited, especially in the case of exotic SKUs.
I sent mine back. I thought the NPU would help with local LLMs, but there's nothing to utilize it yet; LM Studio has it on the roadmap, but it was a bit of a letdown. An M1 MacBook was 30 times faster at generating tokens.
Happy with my Gen 11 X1 Carbon (the one before they put the power button on the outside edge like a tablet ?!?).
I just got NPU based LLM inference working locally on Snapdragon X Elite with small (3B and 8B) models, but it’s not quite production ready yet. I know all llama.cpp wrappers claim to have it on their roadmap, but the fact of the matter is that they have no clue about how to implement it.
> M1 MacBook was 30 times faster at generating tokens.
Apples and oranges (pardon the pun). llama.cpp (and in turn LMStudio) use Metal GPU acceleration on Apple Silicon, while they currently only do CPU inference on Snapdragon.
It’s possible to use the Adreno GPU for LLM inference (I demoed this at the Snapdragon Summit), which performs better.
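As a sketch of what that path looks like: recent llama.cpp ships an OpenCL backend aimed at Adreno, and building it is roughly the following (the GGML_OPENCL flag name is my assumption from upstream docs, and an OpenCL ICD for Adreno needs to be installed separately):

    # Sketch only: build llama.cpp with its OpenCL (Adreno) backend.
    git clone https://github.com/ggerganov/llama.cpp
    cmake -S llama.cpp -B build -DGGML_OPENCL=ON
    cmake --build build -j

    # Then offload layers to the GPU as usual:
    ./build/bin/llama-cli -m model.gguf -p "hello" -ngl 99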
I find this to be less and less of an issue, because RAM has gotten so cheap that you can pretty much just max it out when buying. At the moment, going from 32GB to 64GB incurs a $193 markup for this laptop, which I think is entirely reasonable for a machine like this (although, honestly, I don't usually come close to reaching 32GB in my normal work).
The only notable exception here is Apple with their absolutely bonkers RAM upgrade prices, which is why I would never buy a Macbook.
EDIT: I just HAD to look: the MacBook Pro (ha!) comes with 16GB unified memory by default, and it will set you back $400 to go to 32GB, so more than 4x what Lenovo takes per gigabyte (64GB not even possible, of course).
You say RAM has gotten cheap, and then $193 for 32GB is fine with you? You can easily get 64GB for that price when buying separate modules.
I still think it would be beneficial for us to keep memory swappable at all costs. And if the connector is the problem, they should address that, rather than just accepting tactics that _enable_ manufacturers to set their own prices. I'm not saying they all do this, but plenty of them do, and Apple is the perfect example, like you say.
The $193.00 for an extra 32GB of LPDDR5X is simply an example of Lenovo's CTO website showing the pre-discount MSRP prices for all the upgrades, even though they run near-permanent discounts of 40% and more compared to their official MSRP for many products.
As pointed out in the other comment, the true price at Lenovo for this upgrade is only $112.80 — not as good as you'd get with the DDR5 SODIMM, but it's actually cheaper than what Crucial supposedly charges for their 32GB of LPCAMM2, which isn't even as fast as what Lenovo includes.
We wanted longer battery life, so they run RAM at lower voltages now, which makes it necessary to minimize paths as much as possible. It's not some conspiracy. And they are working on a connector, see the LPCAMM2 link in the other post, so maybe you can wait it out if you feel that strongly.
It's $1184.40 for the default CTO with 32 GB LPDDR5X-8448MHz, and the upgrade to 64GB LPDDR5X brings the total cost of T14s to $1297.20 USD.
Even though the upgrade is listed at $193.00, that's actually the MSRP before the near-permanent discounts Lenovo is famous for: 1297.20 - 1184.40 = 112.80. That is, the extra 32GB of LPDDR5X-8448MHz (actually a faster variant of LPDDR5X than the one used in the base M4) costs only a net $112.80 USD!
All together, that's $1297.20 for a machine with more AND faster RAM, at a cheaper price, than an M4 MacBook Pro that has a starting price of $1599.00 USD in the US, for just 16GB of the slower LPDDR5X-7500, compared to 64GB of the faster LPDDR5X-8448MHz with the Snapdragon ThinkPad.
Also, Apple is the only manufacturer in this form factor and price category to solder storage in their laptops; nearly everyone else uses standard 2280, 2230, or 2242 NVMe drives instead. Lenovo generally uses 2242 NVMe in their ultraportables, which is also compatible with the smaller 2230. The 2230 format has become popular (and cheap) thanks to its use in the Steam Deck handheld and its clones, so there's more competition in that form factor and prices are lower.
Drives with 2TB of NVMe storage in the 2230 form factor retail at about $150 right now (more expensive than 2280 but cheaper than 2242), compared to the $600 that Apple charges to go from 512GB to 2TB on a MacBook Pro (it's actually $800 on a MacBook Air or Mac mini to go from 256GB to 2TB!).
Not to mention that Snapdragon supports DisplayPort Multi-Stream Transport (DP MST) for daisy-chaining monitors, whereas macOS still doesn't (unless you're running Windows on an Intel Mac, that is!).
But, hey, at least Apple has finally given us support for 2 external monitors with the base M4 chip, without having to close the lid!
> Those of you who know me, know that I am not a big fan of the X86 architecture, which I think is a bad mess, mangled by market power considerations, rather than the CPU architecture this world actually needs, in particular in terms of performance/energy ratio.
Meanwhile, ARM is a complete disaster, a mess and mangled by profit considerations when looking at the complete lack of platform standardisation and issues around compatibility. Issues which require significant engineering effort to bring a single ARM device in line with any random x86-based device that came out this year.
I always ask about battery consumption... Apple seems to be in another galaxy right now. I decided to stop waiting and installed Parallels to run Ubuntu there... I really wish the best for Asahi Linux.
Using the Ubuntu experimental image on the T14s Gen 6, the screen brightness is not adjustable, so for me it is stuck on high. Nevertheless, Gnome claims 12+ hours remaining when near 100%. In Windows where I can adjust the brightness, the battery lasts longer. Battery life is much better than any other x86 Thinkpad I've ever owned.
The CPU is pretty fast as well. I haven't run real benchmarks, but C++ std::sort() on the Snapdragon runs just 10-20% slower than on my 4-year-old Ryzen 5 5600X desktop. Also, the base model T14s comes with 32GB of memory, which is very nice.
On the other hand, I dropped mine in the street, damaging the upper right corner of the display (physically intact, but dead pixels in the corner). Even though the case material is nice, the laptop seems to be more fragile than older Thinkpads. (I've dropped my T480 and T450 numerous times, and never had issues other than cosmetic.) So the $35 accidental damage protection was worth it.
I filled out the form online, and they immediately sent me a FedEx mailer. I sent it in, but they're estimating Dec 2 for the return. So that's more than a three week turnaround, which is longer than I wished. I'm in southern California and the repair center I mailed it to is in Texas. It was slow ground shipping.
They have two different higher tiers of protection for "next day" repairs. I'm thinking about upgrading.
The M3 Max laptops can cross-build FreeBSD in a fraction of the time of the ThinkPad: around 791 seconds for `make -j17`, versus 3210 seconds on the T14s (with `make -j12`), according to the post above.
Comparing unbinned to unbinned, you get 10 P cores on the M4 Pro and 12 on the M4 Max. Regardless the original comment was regarding the M3 series, where there was an even larger difference between the M3 Pro and M3 Max (6 vs 10 P cores).
You probably can follow build(5) from FreeBSD hosts instead.
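For reference, a cross-build of FreeBSD for arm64 from an amd64 FreeBSD host is roughly the following (TARGET/TARGET_ARCH are the standard knobs; the -j value is just an example):

    # From a FreeBSD /usr/src checkout: cross-build world and kernel for arm64.
    make -j17 TARGET=arm64 TARGET_ARCH=aarch64 buildworld buildkernel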
NetBSD is similar, but you need to edit `tools/llvm/Makefile` and make sure that you use the following target for `support-modules` instead:
support-modules: module-test.cpp Makefile
- if ${HOST_CXX} -stdlib=libc++ -c -fmodules -fcxx-modules -fmodules-cache-path=./module.cache \
- ${.CURDIR}/module-test.cpp 3> /dev/null 2>&1; then \
- echo HOST_SUPPORTS_MODULES=yes > ${.TARGET}; \
- else \
- echo HOST_SUPPORTS_MODULES=no > ${.TARGET}; \
- fi
+ # Just don't use modules for C++20 targets. Some compilers cannot support them.
+ echo HOST_SUPPORTS_MODULES=no > ${.TARGET};
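With a patch along those lines applied, the cross-build itself would look something like this (evbarm/aarch64 is the usual pairing for arm64, MKLLVM=yes matches the warning below, and the -j value is arbitrary):

    # Unprivileged NetBSD cross-build for arm64, from the top of the src tree:
    ./build.sh -U -u -m evbarm -a aarch64 -j 12 -V MKLLVM=yes tools release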
You can further speed up NetBSD builds by editing `share/mk/bsd.sys.mk` and removing the workaround for SunPro's cc. The repeated invocation of /bin/mv for each object file really does add up.
I have not tried cross builds of OpenBSD from other operating systems.
Word of warning: I ended up getting a lot of strange compiler segfaults within xgcc when using `MKGCC=yes` instead of `MKLLVM=yes` with NetBSD, specifically with floating-point-heavy code. I never did find out why that happens.
Do you need to be in that galaxy? I easily get 8h with this $1400 64GB RAM AMD Thinkpad with an OLED screen running unoptimized Ubuntu (yes it looks 90s anonymous). An equivalent notebook for most practical purposes from Apple would be at least 3.5× more expensive.
I'm curious what you mean by that? Just that it's usually a large fraction overall? At least it's lit per pixel, instead of blasting LED strips and subtracting the light we don't want, as with LCDs.
Really? Could you please name something that you'd subjectively consider to be "actual engineering" and that you'd arguably want to do on macOS or Windows rather than on Linux?
(the comment you replied to was clearly arguing quality rather than quantity, so that's what I'm asking too)
Anything graphics programming related. D3D and Metal are significantly more prevalent than OpenGL or Vulkan.
Anything CAD related, because there’s next to no professionally used CAD software on Linux.
Audio stuff. How many DAWs have a significant Linux user base?
And even beyond that, how many website devs are on Linux? Most people making product pages aren't on Linux, because not a lot of designers work on Linux and it's better to have a monoculture of devices for office management.
And your question of what one would rather do on macOS/Windows rather than Linux is again subjective, even if I scope it to stuff you can do on Linux, macOS, and Windows.
Flip that around, why would someone use Linux to develop when they could use macOS? Can you give a non-subjective answer? Then try the opposite.
Even if you’re developing for Linux deployments, you can still do most of it local and then spin up docker in the VM on demand.
The number of software developers who need to run a Linux VM on their Mac/Windows are a vast minority.
I suppose that by CAD you mean mechanical CAD, with which I have less experience.
On the other hand electronics CAD had been run mainly on Solaris decades ago, but for the last 20 years Linux has been the most likely host, including for the most expensive commercial professional solutions.
I have never heard of anyone using macOS for any kind of electronics design.
Yeah sorry. You're right, and when I read that I assumed the usual HN context of engineering == the kind of software engineering most often discussed on here. I see what you meant now. Here's where I was coming from anyway, reframed in the context of your comment:
Web dev is painful on Windows. NTFS really can't deal with 50k+ files in node_modules; I think it's still something like 50-100x slower. Not to mention that Windows by default _still_ tries to go and index your entire node_modules and silently poops the bed. This is one of the main reasons behind WSL's popularity, but that only works if you don't have your code on a mounted NTFS volume. Make the mistake of reopening the same repo you started working on in Windows in WSL2, thinking that you'll see an improvement, and you'll be disappointed - silently. You have to move your React project out of NTFS entirely and let the virtualised Linux kernel manage those descriptors if you want to get work done.
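Concretely, the workaround is just to keep the checkout inside WSL's own filesystem rather than under /mnt/c; a minimal sketch, with a made-up repo path:

    # Inside WSL2: clone (or move) the project into the ext4 home directory
    # so node_modules lives on the Linux side, not on mounted NTFS.
    mkdir -p ~/code && cd ~/code
    git clone https://example.com/you/your-react-app.git    # hypothetical repo
    # or: mv /mnt/c/Users/you/your-react-app ~/code/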
macOS has a similar problem - not with NTFS but with Docker / virtiofs. The web development experience is generally awesome on macOS if you're working natively, but if you want to work inside a devcontainer (to maintain consistency with prod) and depend on Docker, it slows down considerably. I've heard that OrbStack and Colima have made this much better on macOS recently, but I've not tried it myself. But for other, more serious software development scenarios, where you might want a local k8s environment or you're writing lambda functions that you want to test locally? You have to use Docker and take the hit. On Linux it has always just worked (podman aside). Not to mention that Chrome's memory management is way better there than on Windows (thanks, ChromeOS).
For the rest of these please keep in mind that I explicitly said I was talking about quality of experience rather than quantity of people having to suffer through it, so the whole 'how many X people are on Linux when their manager makes them use Windows' argument is one I was specifically trying to avoid. With that said, I'll try to answer the rest of your qs:
> Anything graphics programming related. D3D and Metal are significantly more prevalent than OpenGL or Vulkan.
Agree almost completely - you wouldn't be building most graphics to run on Linux, so why not develop where your target platform is best supported. I disagree with your assertions around OpenGL and Vulkan (see Android), but UE5/Unity support on Windows vs elsewhere proves your point.
> Anything CAD related, because there’s next to no professionally used CAD software on Linux.
Agree, again obviously. In my case I love Onshape, and it works really well on Linux (apart from spacemouse driver support, which is a spacemouse issue not an Onshape one - there's no such thing as a class-compliant spacemouse interface and direct usb access for chrome would need them to implement a full driver; they invested heavily in getting a hack working on Windows but obviously not worth it for Linux, if for no reason other than that their target userbase will be extremely accustomed to Windows because of historical software availability and compatibility). But yeah, Onshape is an exception.
> Audio stuff. How many DAWs have a significant Linux user base?
Ardour supposedly has some recent converts, and the kernel is supposed to be acceptably good at realtime stuff now, pulseaudio/jack are supposedly better now. Regardless, you're right - it's too little too late. FWIW last time I did any real audio work absolutely nothing came close to CoreAudio (even on intel hackintoshes, or _especially_ on intel hackintoshes vs Windows on the same hardware). I don't think that has changed much since. RME drivers make a difference on Windows but WMI still sucks and ASIO on windows still isn't as stable as mac. Reaper officially supports Linux (<3 Justin F) but it's still dependent on Wine and yabridge, i.e. will probably never be on par. Reaper aside (which is a genius piece of software, on par with Samplitude 7 and Logic 5.5.1).
> And even beyond that, how many website devs are on Linux
Almost all of the ones I know, with a few of them still on Mac but curious. Literally none on Windows.
> Flip that around, why would someone use Linux to develop when they could use macOS? Can you give a non-subjective answer? Then try the opposite.
Hopefully I did that when I explained myself above.
> Even if you’re developing for Linux deployments, you can still do most of it local and then spin up docker in the VM on demand.
> The number of software developers who need to run a Linux VM on their Mac/Windows are a vast minority.
I think I already answered this, but for my (admittedly ignorant initial) definition of "actual engineering", unless you're targeting iOS or desktop development, _everyone_ is developing for Linux as their primary target. Everyone.
I directly disagree with your final two statements, and that's kind of the point I was trying to make. For modern cloud/infra/saas/product/platform/web dev, i.e. the arguably subjective definition of "actual engineering", everything else is a compromise. Docker is, VMs are, WSL is.
Not to be flippant, but don't your final two paragraphs boil down to: "if you ignore all the non-Linux stuff, everyone is on Linux"?
Also why is that the “arguable subjective definition”? Why are we trying to define “actual engineering” at all, even if subjective?
On what scale are you defining that? Users? Complexity?
Not to be rude, but it feels like “people in a Linux bubble are just reluctant to admit that there’s a wider non-Linux world out there”.
You said “most of the engineering discussed on here” but there’s tons of posts about graphics engineering. It just happens that web/saas is the most approachable end of software engineering. Again, and not trying to be rude, I think this is a case of being too close to the trees to see the forest. Are you perhaps only clicking on the links to subjects that are relevant to your domain knowledge?
Based on the reviews I've seen, Snapdragon + Windows seems to be roughly equal to the M3 generation of Apple ARM laptops in terms of longevity and performance (each device winning and losing some benchmarks). Snapdragon didn't pull off their promise of consistently beating Apple, but they're extremely close, for a much lower price.
If Qualcomm continues to actually work on Linux, rather than letting enthusiasts do all of the work for them like Apple does, I think ARM on Linux is going to be all Qualcomm, with Macs yet again being second-class citizens in the Linux world. For Windows, it's already a choice of "do I want to be forced into using Windows for a couple hundred dollars of savings".
Mitchell Hashimoto has written extensively about his use of Linux in a VM on macOS. He published a NixOS configuration[1] which seems easy to use.
I recently bought a Mac mini M4 to experiment with this setup, and am strongly considering getting a MBP if it works as advertised. As a longtime ThinkPad user and F/LOSS enthusiast, it feels awful giving money to Apple to run Linux as a second-class citizen, but honestly there is just no comparable hardware that does what Apple has accomplished.
I have been running Ubuntu inside Parallels on an M1 MacBook for several years now, and am in general quite happy with it. What makes it less than ideal is e.g. that only OpenGL 3.3 is supported in Linux guests (Windows guests get 4.1), but for some reason the Vulkan support is actually quite good, and allows me to run the graphics tools I need. Having most AppImages out there unavailable on ARM64 is also sometimes a problem, but that's not Parallels' fault.
Parallels has some glitches (graphical flicker when it runs low on guest memory, less than stellar trackpad scrolling) but is otherwise very stable. I like that I have access to Linux and macOS at the same time, the other side is just a 3 finger swipe away, and cut-n-paste and shared folders work. Sound and video all work, though for things like zoom calls I tend to use the macOS side. All runs happily together with 16GB RAM for each side (and I often have both xcode and android studio open on the macOS side while compiling large projects on the Linux side).
So far, I’ve had a very good user experience, but I haven’t yet tried using it exclusively for an extended period to compare its battery life with that of a bare Apple Silicon macOS. Mapping shortcuts now...
I'm actually thinking of switching from a Mac back to the PC, since everything is done in a browser anyways, regardless of the system, but the lack of the fanless laptops in the PC world isn't promising.
ThinkPad X13s Snapdragon was fanless, but it's a bit old now, plus, only 2x USB-C, without any USB-A ports, and a screen that doesn't open 180°, unlike any other ThinkPad, meh.
One small tip I didn't appreciate for far too long with "mac-peer" laptops that have fans is to use them unplugged. My laptop almost never spins up fans when it is unplugged, and I can get a good batch of work done with the battery levels of today and fast charge over lunch. It's still not quite as ideal nor as long a battery life as Mac. But it feels so beautiful.
For the first time on Linux I feel better, like I am not just making sacrifices for my values, but like the actual whole all-around experience is better in most ways compared to my work Mac (an M2 Pro, so fans abound, and not as aesthetically pleasing as the Airs IMO). It's instantly snappy, I have a nice large SSD, I've already swapped out the RAM, no issues with key software, I have a theme with a desktop experience I prefer over the Mac one, and I can go to a presentation and type without fans stressing me out. As someone who's been in AI for a while, personally, I don't value GPUs or NPUs, but that would be a difference. That's really leaps and bounds over Linux on laptops in 2016 or 2010.
The x13s is still quite quick and useable - especially since you can pick them up for a song on the used market. The display only opening to ~135 degrees is a bummer though.
I'm a bit puzzled about their weird naming. "T14s Gen 6" when apparently "T14s Gen 5" was Intel based. Surely changing the entire CPU architecture deserves a new model name?
The actual model numbers of the T14s Gen 6 (AMD) and Gen 6 (Snapdragon) are 21M1 and 21N1, respectively; they still use the IBM-inherited "Machine Type-Model" system. It looks like this has now expanded into a 4-3-3 digit alphanumeric sequence like "21N1001PUS". In case anyone needed cues to tackle these confusions...
It's still better than other manufacturers. If you look at HP's website their laptops are literally just named like 'HP 14" Laptop', without a generation/year.
How about the Microsoft Surface Pro 9 being x86 while the Surface Pro 9 with 5G is ARM? My conspiracy theory is they did it on purpose to submarine ARM into enterprise environments.
What laptop do I get right now to run Linux with decent battery life? My dream would be a Mac-like 14-hour battery life experience running Ubuntu. I don't want to buy a Mac or be tied to their walled garden; it's literally my last resort. But I do need a laptop with better battery life than the x86_64 ones that get 3 hours on a good day.
> mangled by market power considerations, rather than the CPU architecture this world actually needs
Why are CPUs being built if not for the market? Who are they building CPUs for? And who decided what kind of CPU the world needs?
I like the idea of alternative architectures as much as any other geek, but this kind of thinking that has lately permeated the subject comes across as academic arrogance.
I also got one of these. AFAICT for Linux, we need to wait for kernel 6.12, which is still at the rc stage but should be ready at the end of this month. As a NixOS user, I'm keeping track of this repo [1] for support.
Meh. Until support reaches full feature parity with x86 I’ll just keep running Linux in a VM on my m3 max MacBook. I do want an arm64 Linux or BSD laptop with the same ease of use and support as x86 though it’ll take time.
At home, I develop on a very beefy x86_64 desktop machine running Ubuntu.
When travelling overseas for work, I have an x86_64 laptop which provides me decent performance but only lasts 3 hours or so. All my colleagues are rocking Macbooks which last a whole day and I can't even take a piss without thinking about a power outlet.
What is the battery life for a setup where you only use the host macOS as a shell for a Linux VM? Can you run X11 applications remotely with `ssh -X`?
I sincerely HATE the whole Mac OS ecosystem but right now, they are the best in terms of battery life for a mobile device, and I need that.
So you’re probably not going to like the solution but it involves even more money!
I'm subscribed to Parallels for Mac, and I run Fedora with i3 in it. It's graphically accelerated and really, really fast, and I just use it like I would anything else, except now I have the option to use macOS.
What you're going to hate even more is that I'm finding myself just trying to make macOS more like i3, finding tools that try to get there: yabai and others.
In the end I am almost entirely in a terminal with tmux, and I have a Safari or Chrome window open for docs and Slack for all the work distractions.
Battery life in this setup is probably 80-85% of what I get in macOS, and that's already amazing. But I have an M3 Max, which is overkill and likely hurts battery life by itself.
I couldn’t for the life of me get ssh -X working on macOS so I went the full os route :shrug_emoji:
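For what it's worth, `ssh -X` from a Mac normally needs XQuartz installed and running first; a sketch of the usual setup (the remote host name is made up):

    # On the Mac: install the X server, then log out and back in once.
    brew install --cask xquartz

    # Forward X11 from the Linux box; once logged in, run any X client there:
    ssh -X user@linux-host    # try -Y if -X hits X11 authentication errors
    xeyes &                   # runs on the remote, displays via XQuartz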
A lot of people use a laptop with an external keyboard and mouse because they're sufficiently picky about their keyboard and mouse to be happy to deal with the additional hassle.
I use the top half of a Helix 2 with a Thinkpad Tablet 2 Bluetooth Keyboard because I'm one of the three people in the world who actively likes the optical trackpoint.
If you almost always use the machine on a desk rather than literally on your lap (for which I use the same keyboard paired with an 8" tablet) it's not even -much- additional hassle.
So maybe it wouldn't work for you, but "basically useless" is silly.
I think generally people who carry an external keyboard and mouse around along with their laptop still think they're using it 'as a laptop.'
Most laptops IME mostly get used on a desk whether with additional paraphernalia or not; 'laptop' gets used to describe the class of machine more than whether it's touching your legs or not.
If you meant literal in-lap use it probably would've been better to specify it, and if you didn't mean 'useless' entirely it would probably have been worth clarifying that since your questioning whether the recommendation was sarcasm rather suggested you -did- mean useless entirely.
Language is a pain, sadly, but I don't think lojban is going to win any time soon.
—————
What does not work: Keyboard, mouse, TB & USB-C ports, thermal/freq mgt.
Conclusion: Highly recommended