I'm surprised that people call this a weak laptop. Clicking through to the article, I thought it would have 2-4GB RAM and a weak CPU. It has 16GB of RAM and a very recent 3.10GHz Intel CPU. What are you doing on your laptops? This is a decent laptop, and it's 13". The software you use is crappy if you can choke this machine w/o running a bunch of VMs concurrently.
> I had 16 GB of RAM in 2010 and it was not even a top computer at the time
> It has 16GB of RAM and a very recent 3.10GHz Intel CPU. What are you doing on your laptops? This is a decent laptop
What exactly are all of you people doing with your laptops? I'm running an old W500 ThinkPad at 2.8GHz dual-core, 4GB DDR2, and a 500GB HDD. The only times this machine has struggled were when I was using memory-leaking software (Firefox, etc.) or software that tried to load a whole large file into memory instead of reading it in chunks.
If I need to work with heavy graphics I can attach an eGPU over PCI Express, or an FPGA if I'm doing specialized calculations.
But besides that, I don't understand why anyone needs all this gear.
> What exactly are all of you people doing with your laptops?
C/C++ project builds. Losing two cores just about doubles build times, and template-heavy C++ code makes the compiler want gigs of RAM all to itself. In principle I should do this on a desktop or server, but IT isn't on board with that.
I have to agree with the other poster on this. I only do a full compile once at the beginning and once at the end. Every other time, I just recompile the changed parts and relink them with the unchanged object files.
But if you're talking about building packages from source, then I agree. It's the one thing that sometimes makes me question whether I should upgrade. But after a full night of compiling Qt5 from source, that lingering doubt usually vanishes.
Do you do full recompilations all the time? Don't you use incremental builds? If you do, compilation time should only rarely be a problem. But reading people's comments, I get the feeling there are many projects out there building with a shell script like "gcc -c a.cpp; gcc -c b.cpp; ...; gcc -o prog *.o".
Changing something close to the root of a large dependency tree triggers a lot of recompilation. The lack of real modules and the reliance on `#include` in C++ make dependency graphs larger than they need to be.
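For illustration, here's a minimal sketch (hypothetical file and class names, not from any project mentioned here) of how a forward declaration shrinks the include graph: the header only needs the *name* of Logger, so editing logger.h no longer forces a rebuild of everything that includes widget.h.

    // widget.h -- hypothetical example
    #pragma once
    class Logger;                  // forward declaration; no #include "logger.h"

    class Widget {
    public:
        explicit Widget(Logger& log);
        void draw();
    private:
        Logger* log_;              // pointers/references need only the declaration
    };
    // Only widget.cc has to #include "logger.h" for the full definition.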
Not necessarily. Say I'm editing "bob.cc" and have meanwhile edited "common.h", which is #included by many, many files. While I'm working on "bob.cc" there is no need to rebuild the entire project until I want to see how it links with everything else (which can be postponed quite a bit if you have some tests); so I either run "make bob.o" or "make test_bob && ./test_bob". I don't know much C++ and all this is theoretical, but I think this is the ideal workflow for C/C++ development, and it's why incremental builds are useful, especially when the code is nicely separated into testable units.
Not full recompilations, but when we're working on a feature release, switching git branches routinely requires a lot of files to be rebuilt due to header changes.
Why are you using a fancy 2.8Ghz dual-core machine when you could be using a netbook from 2008 with a 1.6Ghz Atom and 512MB of RAM?
Is it that strange that people have different use cases from you? Plenty of things peg the CPU at 100% for extended periods of time: video encodes, 3D renders, building code, slicing for 3D printers, etc. Certainly I could do my work on a dual core machine, and I did until a few months ago, but there are competing machines that are cheaper and offer twice the performance. Why would you not want a quad core CPU when they are the exact same price as the old dual core models?
At my webdev gig last year I used a 2008 24" iMac with a Core 2 Duo and 4GB of RAM, and it worked perfectly once I put in a request for a 250GB SSD. The only time it choked up was using ImageOptim on a ton of images, and that could always be offloaded to a server. I think computers are lasting a lot longer than they used to.
Linux is not very demanding of memory, but sometimes I have to maintain legacy code written in C#, so I use VirtualBox with Win7 and Visual Studio, which is demanding on memory.
You don't get the point of Purism's product line. The focus of their offering is freedom. The rest are peripherals, and more than decent ones at that. Given their size, you can't expect them to compete with the big players on price. But a disabled ME and a modifiable laptop are worth that price.
The totally closed firmware and ROM on the M.2 NVMe SSD is no more or less open than a modern wifi card with full open-source Linux kernel driver support, which also has a closed ROM running it.
For some reason, wifi vendors typically ship devices without any radio firmware at all and leave it to the driver to load it, rendering the device 100% useless without loading some external proprietary blob.
A key difference is that a hard drive can't secretly send information out. I'm fine with an isolated component the rest of the architecture can treat as a black box (even sending only encrypted data to it). But the wifi chip can easily build its own IP packets and leak a bunch of information to the internet or it can have an easily exploitable backdoor.
A hard drive is a huge attack vector. In particular, if you're running full-disk encryption with a tiny unencrypted ext2 boot/grub2 partition, malicious firmware on the disk can intercept the plaintext keystrokes of a passphrase unlock for FDE. This is a known intelligence-agency attack vector.
This specific platform has the TPM module's feature set disabled, no? Since the code running inside the TPM is proprietary and closed. To the best of my knowledge, super GPL-zealot users rarely choose to store a key in the TPM for full-disk-encryption unlocking purposes.
The SSD runs its own proprietary firmware that controls the raw disk device itself. If it wants to insert a blob of code into your bootloader or grub2 that can do keystroke interception on a full-disk-encryption unlock, it can. This is the same idea as a technique used by intelligence agencies in a typical "evil maid" attack.
Yes, I was more curious if a malicious actor (the hard drive) with access to the nvme bus could manage to exfiltrate data directly via the network interfaces without involving the OS.
It's in an m.2 slot, and good Linux-friendly ac cards are less than $20. Sure, it's wasteful to buy a device with a part you know you'll throw away, but it shouldn't change your purchasing decision.
It's becoming increasingly acceptable to just bleed resources, paying down your technical debt or lowering your development costs by requiring users to have stronger machines. It's implicit crowdfunding, but in a very stupid way. Steve Jobs said it best in [0].
I recognized the author instantly (I know him from his Pepper & Carrot comic, which he notably makes with an all-FOSS setup and releases under CC-BY-4.0). If he says the laptop looks nice, is quiet, and works well for his typing and art, that should be a glowing endorsement. But people are inventing flaws: pointing out that the GPU is a weak integrated one, that the CPU is dual-core and too weak for compiling lots of C++, that 16 GB of RAM is too little (I happen to have 32 GB, but 16 GB is not little at all), and totally ignoring that this is clearly a slick, small ultrabook (with an ultra-low-power CPU [1]) meant to run all FOSS with no fuss (feel free to steal that catchphrase), meaning no Windows, no AAA games, etc.
I do personally use a noisier, bulkier, more powerful laptop with even more than those "just 16 GB" of RAM [2], but I'd not call the Librem 13 weak, especially if it fits the owner's desired usage and does it in a size slim and convenient for his travel patterns.
I do wonder if my perspective comes from the fact that I try to use hardware to the fullest. E.g., for 5 years (2011/2012 to mid 2016) I ran a bulky, business-y, hand-me-down HP laptop bought around mid 2010 (IIRC) with a dual-core Intel CPU, 2 GB of RAM, a weak integrated GPU supporting only OpenGL 2.1, and a 500 GB HDD. It came with SLED preinstalled, the only Fedora spin it could run well was Xfce, and before Fedora, when I ran Windows 7 on it, I had to disable window transparency and all the Aero baubles. I abandoned it only after the keyboard broke; by that point the battery was no longer holding a charge either, so I decided it wasn't worth sinking money into fixing and upgrading it to fit the times (battery, SSD, new keyboard, more RAM, etc.) and moved to a brand new laptop and Windows 10. It wasn't a money issue (or I'd have kept that husk of an HP alive) but more a personality issue: I didn't do anything power-hungry regularly, so that laptop was perfectly fine for my web browsing, learning PHP, C, C++ and Lua, doing university work, writing LaTeX, etc., and I knew moving back to Windows from Fedora would be an effort in itself too.
It seems to me that people don't know their use cases very well, and then don't really know how to map them to hardware. That's okay for the "end user" type, but most of us here are programmers, professional or amateur, and for us that's bad. I have never used a computer with more than 8 gigabytes of memory, and I haven't ever needed one. What I do with my laptop: run Emacs (negligible CPU, 200-300 megabytes of memory) and Qutebrowser (negligible resource usage, not always open) on Debian with a couple of systemd jobs and Xmonad. I could easily get by working on an RPi (theoretically, never tried), and my 4GB, 1.80GHz (i3) laptop is way more than enough. If there were a codebase that required 30-odd gigs of RAM, I simply wouldn't run or compile it on my laptop. If I'm doing graphics, then I know I need a better machine for that. But people don't consider these variables; they buy the machine with the biggest specs they can afford. And then there are the C++ programmers complaining that they need crazy amounts of memory because of templates. If a file takes multiple minutes to compile, the problem to fix is obvious, but people choose to throw money and silicon at it instead. They can do whatever they want, but they should quit the BS of calling a machine like this weak.
> totally ignoring that this is clearly a slick, small ultrabook (with an ultra-low-power CPU [1]) meant to run all FOSS with no fuss
Yeah. We complain we can't have a decent laptop with FOSS, and a couple of years later when we have one, we whine that it can't hold the entirety of English Wikipedia in its memory or that its right Shift key is slightly shorter than we'd like. So ungrateful.
I searched GitHub for the most-starred C++ projects, then downloaded and compiled the Godot engine, because it didn't require Docker and seemed fairly straightforward to build. The initial build took 30 minutes. Then I added a random semicolon to the file editor/connections_dialog.cpp and recompiled: it took a few milliseconds shy of a minute, the bulk of which was not the compilation of that file (I re-ran the whole build, including linking; if I had only recompiled that file it would probably have been quicker, but I don't know how to use SCons). None of that process was visibly bound by memory or processor capacity; in fact, I continued my Sunday computational tasks (i.e. watching silly videos, skimming Reddit) while it compiled, without any slowdown. And this is a whole, complete, very widely used game engine, including a game editor, which is obviously a graphical program. I wonder how much quicker it would compile on a machine like the ones many people here claim they need.
Godot might be a special case because it's so well made by an Argentinian industry veteran (gaming is kind of a spartan environment, especially on consoles) and is a very polished and reasonably well-funded FOSS project; it's like a crown jewel.
I remember (I hope I'm not misremembering; I can't find it now) that Firefox took more than 4 GB of RAM to link, which ironically meant the 32-bit version had to be built in a 64-bit environment. Then again, who regularly rebuilds Firefox on their laptop? (The devs do, by definition, but that's a very small subset of the population.)
C++ has shortcomings with regard to build times, small changes triggering big rebuilds, linking time blowing up, and so on, but it's clearly the only language that fits the niche it sits in (maybe C can compete, though it's a bit too spartan for some people, and maybe Rust in the future). There are also mitigations, like forward declarations and the pimpl idiom (sketched below), and some of the criticism is just goofy to me.
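To make the pimpl mitigation concrete, a minimal sketch (my own illustrative names, not from any real codebase): the private members move behind an opaque pointer, so changing them recompiles only one .cc file instead of every file that includes the header.

    // big_class.h -- the header exposes no private details
    #pragma once
    #include <memory>

    class BigClass {
    public:
        BigClass();
        ~BigClass();                 // defined in the .cc, where Impl is complete
        void doWork();
    private:
        struct Impl;                 // opaque type, defined only in big_class.cc
        std::unique_ptr<Impl> impl_;
    };

    // big_class.cc -- heavy private state lives here, so edits stop here too
    struct BigClass::Impl { int counter = 0; };
    BigClass::BigClass() : impl_(std::make_unique<Impl>()) {}
    BigClass::~BigClass() = default;
    void BigClass::doWork() { ++impl_->counter; }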
As an example of the goofy criticism: there was a story [0] of a hellish project with more managers than programmers, no working coffee machines or toilets, crazy employee turnover, employees who didn't know how to code properly, version-tracking software changed a few times with the entire history thrown away each time, simple operations taking seconds or minutes, physical paperwork required to apply to edit a file, the entire thing bordering on a scam, etc. The author's main takeaway, which he reiterated in 2018: C++ is bad.
I mean, I guess you can dislike it for all its warts (and it has plenty), but would everything have been fixed if this had been a Rust, Python or C# project and all of the other craziness remained?
Maybe I'm just biased because I know C++ pretty well inside and out, even kinda like it, and never got forced to work on a bad legacy project written in it (then again, any crazy legacy code is bad, regardless of the language). But lots of people bash C++ mercilessly; even here on HN people say it's impossible to write in it, while using a web browser that's certainly written in C++ and relying daily on compilers, OSes and runtimes made in C or C++ that host their language (unless they're all Free Pascal developers or something (FPC is self-hosting) and the joke is on us for not using Free Pascal).
They are rare (and it's hard to google for them), but I got one in the summer of 2016 for just under 5000 PLN (about 1400-1500 USD, and that already includes 23% Polish VAT) [0].
I bought my new work laptop from System76 in January this year with 32 GB RAM for around $1800 (including VAT paid upon importing the device into the EU). They are one of the few vendors that make it simple to configure a laptop with 32 GB RAM.
I make do with 4GB RAM and a 1.8GHz CPU. If I need anything that exceeds that capacity, be it games or graphics or compilation, I invest in a proper device or service for it. And if you need gigabytes of RAM for each compilation, you have to fix your build system, switching from full rebuilds to incremental builds. We're talking about a thirteen-inch laptop here, i.e. something one can shove in the pocket of some older overcoats.
I bought this computer for work just recently. Like he said, I really appreciate how easy it is to open it up and modify it. I'd add that it's very lightweight and small, but has specs just as good as my old MacBook Pro.
Librem 13 would without a doubt be my next laptop if it had a >=Retina screen, USB-C†, and ideally a form factor as close to the current-gen MBP as possible.
I think they have a real opportunity to win over the group that doesn't like Windows and is getting increasingly scared off from macOS >= High Sierra, but not if it means a downgrade in any aspect of our current hardware.
† (Edit): I see that it actually does have one USB-C port, but I'm thinking more along the lines of dropping all non-C USB ports and charging over USB-C.
Just commenting in the hope that the Purism team also reads HN: I can totally second this. With a 2K-ish resolution and a nice i7 with 4-6 cores + HT, I'll immediately replace my aged MBP with a Purism Librem 15. On a second note: your offer of a privacy-first notebook is worth as much of a premium to me as Apple's OS and design are.
Fair enough about wanting at least one USB-A port, but I can't wait to upgrade from my 2013 MBPr to something that charges over USB-C. MagSafe is nice, but the magnetic breakaway USB-C cables look fine too, and it'd be much nicer to have chargers set up all around the house that work for all of my devices and don't cost a small fortune.
As amazing as Retina/HiDPI is, is it really worth trading "freedom" for? I have found high-quality traditional ~96 DPI displays to be really quite fine. Objectively, what does HiDPI provide? Do you write code faster? Do you write better words? I don't mean this sarcastically, but in a positive, inquisitive manner. HiDPI is one of the few real innovations of the last decade, and yet, as cool as it is, it doesn't add much as far as I can tell.
I went to a Dell. 32 GiB RAM. Xeon Quad Core. Heavy. But great.
My eyes don't strain when I use smaller fonts that let me see more of what I'm working on (graphics, code, whatever) at once. The lowest-PPI display I use is a 108 PPI 27" 1440p display, but it sits much further back than a laptop screen does, and there I have the physical real estate to use much larger fonts than I do on a laptop. On a laptop, where I'm under two feet from the screen at all times and physically constrained, I have different needs.
I would not use a laptop that couldn't at least match a Retina MBP's 220 PPI. It's noticeable, and my life is worse without it; the "freedom" to use a laptop that gives me eyestrain is not a tradeoff worth making to me (and Purism's touchpads are pretty bad, too).
It is perhaps a niche use case here on HN, but as someone who regularly reads Chinese text, Retina has totally changed my reading experience. Pre-HiDPI you basically have to use a bitmap font (meh).
That's a good point; I'd never heard before how Retina displays can improve the legibility of Chinese text. It makes perfect sense. As a person who reads Japanese regularly, I'm now considering getting a Retina display for this reason.
I'm not even sure what "freedom" I'd be giving up. I don't care about upgrading old machines; I don't even care about modifying the OS. I just care about getting work done quickly and with a nice experience. Freedom, for me, is not having to think about the Linux kernel or carefully inspect software licenses to be sure I'm not polluting "freedom." I like MIT- and Apache-licensed open source software, but the almost religious fervor surrounding the Richard Stallman fringe of the tech community wears on me like the persistent yet infuriating good nature of Mormon missionaries.
I want the freedom to just get my work done without having to consider an ideological struggle each day. I care more about WHAT I create and less about the purity of the tools upon which I create it. I submit pull requests for open source software I use, and I am happy to buy licenses/donate (such as Sidekiq Pro, or donating to the Vapor team), but beyond that, I'm not going to perform tech gymnastics or compromises to make some "free" computer work "almost as good as a Mac." You know what IS actually as good as a Mac? A Mac.
Lots of respect for the FOSS folks; I appreciate your mission despite not sharing the obsession.
FWIW, I could have gotten a Dell with a nice HiDPI display, but it is a lot of work to make it a good experience on Linux. Only macOS has a good HiDPI experience; Linux is functional but takes effort up front.
To add another opinion, I'm really quite the fan of HiDPI displays but I found that 1080p is just about sufficient for my own 12"-13" ThinkPads. 96 dpi now feels limiting in terms of space and font fidelity, I would never consider going back after having worked with a higher resolution. It's nice to have smaller window decorations and taskbar without resorting to 6-8px fonts, and it really does feel so much nicer on the eye.
For a laptop and target audience like this, bumping it up to 1440p probably makes sense in the next iteration. It's also important to note the trade-off: higher-resolution screens deplete the battery more quickly. It sounds like the battery in the Librem 13 is just about par (if even) with current runtime expectations for Linux; bumping that up to 8h would be a qualitatively much bigger jump for me than increasing the resolution.
That, and going 14" with a thin-border "infinity" display. Also, can we please get the Lenovo key arrangement for arrow and PgUp/PgDown keys? This continues to be a deal-breaker for me with most laptops.
Alienware sells a 1440p 13" display which is fantastic. (They sell 4K in the 15".) It runs fine on Linux too, without the sacrifices Purism saddles you with, notably the touchpad and keyboard.
It's a bit heavy, especially for a 13", but it's probably my next non-Mac.
Have you tried plugging a charger into that port? I have a Dell 5450 with only one USB-C port, which is marked as DisplayPort; however, my USB charger has no problem charging it. (The PC warned me at first that the charger outputs at most 29W, but I didn't need the full 60W anyway; that's a limitation of the charger, not the port.)
Unfortunately, the cost would probably be even higher than a MacBook's if it approached that quality (Retina display, form factor), because there are no economies of scale.
I would say that cost is the lowest of those concerns for something I need to do my job everyday, and a Librem with MBP-level specs would be worth more to me than a MacBook Pro.
I really like the Purism design aesthetic, but hardware-wise they just don't pack enough punch for me to work with.
Last year I ended 10+ years of developing on macOS because their latest "pro" edition had a weak CPU and didn't support more than 16G of memory (don't even ask me about the Touch Bar).
After some research, I went with a quad-core System76 with 32G memory (could have gone to 64) and 4TB SSD. The laptop is an absolute beast for software development; various compile/build times were cut nearly in half (versus my previous MBP), it is a screamer. While I do love it, I must admit it's pretty heavy.
If Purism made a quad-core laptop with 32G+ memory, I'd be very interested.
I did that for years with great success. But it does depend on the nature of the work; for the past year or so I've been needing to run multiple VirtualBox VMs, a heavyweight IDE, and other memory-hungry desktop apps.
Cloud servers aren't great for interactive work like developing/debugging in an IDE or running a VM of a graphical OS (Windows). I still use remote servers when I need real CPU/memory horsepower.
Yup! My ThinkPad 25 might not be super light, but this side of 2011 it's the only seven-row layout, so it'll do, and it's 3.5-3.9 lbs depending on which battery I take with me. I have a server rented at Hetzner that few laptops could match, and none you'd want to haul around if you don't drive.
Only the small ones, the X2x0 and T4x0s models, come with one soldered module limiting them to 24GB. Once you move to the real workhorses, even the T450 from 2015 supported 32GB, which was quite rare for DDR3 (https://www.reddit.com/r/thinkpad/comments/3kmlzv/people_kee...), and every T4x0 since then has, because they sport two DDR4 slots. Same for the T5x0. The P50, P51 and P71 support 64GB. You were saying...?
It's the same reason the MacBook Pro tops out at 16 GB: most laptops use U-series CPUs and optimize for thinness.
U-series CPUs are typically paired with (soldered) LPDDR3 for battery life and thinness, which maxes out at 16 GB. Xeon and H-series CPUs are paired with DDR4 for performance and can therefore go past 16 GB. AFAIK, U-series CPUs supporting 32 GB will come with the Cannon Lake generation and its LPDDR4 support, which should start shipping this year.
Lenovo used to sell the T470p, which marries an H-series CPU to a small (but not thin) form factor, but since the 8th-gen U-series chips now also have 4 cores / 8 threads (basically matching the 7th-gen H series for performance), they've silently dropped that line, opting for a simpler line-up based on U-series parts across the board, topping out at 16 GB in many cases.
The T470 and the T480 both support 32GB just fine. They didn't drop a thing, especially because the T470 and the T480 are the exact bloody same machine aside from generational IC swaps: same chassis, same planar, and both have the same two SO-DIMM slots. The T480s is an entirely different matter. Finally, the T470p has a very limited life because it doesn't have Thunderbolt 3. The ignorance in this thread is painful.
The 8th-gen U series is not "basically matching" the 7th-gen H series: the base clock (i.e. what you actually get under load) of the i7-7700HQ is 47% higher than that of the i7-8650U (2.8 GHz vs 1.9 GHz).
I have a 32GB T460p (Skylake) myself and it is still an awesome machine. All-day battery life during normal web dev with the extended battery pack, but it can also handle a 20 GB RAM Hadoop VM without bogging down, and BioShock Infinite ran nicely on the Nvidia graphics. It has just one real compromise: it looks very dated.
It seems they intend the T480 series to be its successor, now that you can get quad-core CPUs with dedicated graphics in that line.
The HP ZBook series is readily serviceable and supports 32GB RAM through 2+ non-soldered slots, even on the gimped versions running Intel U-series processors. They also make non-U versions with HQ-series processors (and I think the suffix changed for more recent generations?).
We use these as our default machines for devs in my organization.
They're solid, well-built machines. Their chiclet keyboard is a little bit off from others' in a way I cannot accurately describe, so there is a bit of a muscle-memory learning curve. This has been a repeated experience for all our employees, but it's not a huge deal; maybe a week or two to get back to your normal speed.
I'm looking for a new work laptop now. Ideally I'd get a matte 4K screen, a mechanical keyboard, full Linux driver support, and a minimum of 16GB of RAM. (I've been bitten by my Oracle DB VM pausing due to too few resources, especially when also running a separate Windows VM to access a client network via some godawful VPN that doesn't play well with Linux; there are a few.)
But with 16GB+ of RAM, I really start to think I need ECC: a fair chunk of that is going to be filesystem cache, and I really don't want bit flips.
Any decent, light-enough laptops with the above specs and ECC RAM?
The Lenovo "search by specs" page ignores their own customisation options, which is a bit frustrating: Lenovo will tell people they offer laptops with 16 GB RAM when many of those laptops can be customised from Lenovo with 32 GB.
Yeah, the more recent versions are limited to one slot with 4GB or 8GB soldered to the motherboard, while the second slot maxes out at 16GB. So you can't get to 32GB.
I am not quite sure, but I believe Purism actually designs their own hardware. They are pretty open about hardware selection, which would be weird if they only sold rebranded Clevo.
Besides the case matching the Clevo barebone of that year perfectly (except for the hardware switches, which sit in a plastic bezel on the display hinge, so that's a custom/modified part), there are teardown pictures online, and the innards perfectly match Clevo machines as well.
It'd be neat to instead do this with the latest 8th-gen i7 mobile chips that have 6 cores, or the Ryzen 2700. Those 6 cores will make a huge impact for most software development people. Maybe 32GB is possible if you can swap it yourself, even if it's officially unsupported.
> Those 6 cores will make a huge impact for most software development people.
I'm a software developer, and I fail to foresee such an impact. For me personally, and for the vast majority of the people I worked with directly in the past.
Could you explain how 6 cores (and possibly 32GB of memory) would make a dev's life easier, as opposed to 2-4 cores and 8-16GB?
Compiling and building. A really big C++ project can take hours to build. My 18 core/128GB iMac Pro builds in 2.25 hours what it takes a 64 GB, 12 core Mac Pro over 5 hours to build.
I’m probably on the far extreme of normal, but more cores and more RAM make a huge difference in tasks like rendering, compiling, building projects with huge dependency chains.
For writing some JavaScript web application, serious horsepower makes only a marginal difference in actual productivity. But on "big" stuff it can save hours per day, which means savings of real money.
> A really big C++ project can take hours to build. My 18 core/128GB iMac Pro builds in 2.25 hours what it takes a 64 GB, 12 core Mac Pro over 5 hours to build.
While I understand such monsters exist, I cannot help but balk in horror at the thought of millions of lines of code written in such an unsafe language. Things must have gone wrong on several levels to get to this point.
It still can on JS toolchains. There is multiprocess webpack (happypack) now, Flow threads really well, and the extra cores can even help when restarting a bunch of processes at once and live reloading. Shaving off even a second helps productivity when you do it dozens/hundreds of times a day.
Indeed, it was an over-generalization; of course it does not apply to every software developer. But as a C++ developer I would appreciate extra cores to cut down compilation times. Even this won't be true for every C++ developer, in case their code base is relatively small or very well modularized, or they can use something like distcc.
Ah, unacceptably slow compilation times… I agree C++ is really bad at this (at least by default). This is especially frustrating when you know of Jonathan Blow's language, Go, and Wirth's compilers.
We shouldn't need incremental builds to be fast. Incremental builds have their own problems, which I'd rather avoid if at all possible. (They're more complex and leave more room for error than rebuilding everything every time.)
I want to compile 100K lines of code or more in less than one second. Compared to that, C++ is slow as molasses.
Well, this review is concerned with using a laptop for Krita drawings and Blender modeling/rendering on the move.
The time taken to render the Blender BMW test scene was listed as 27 minutes.
I'm typing this on a late-2013 MacBook Pro that I really love, but if I could upgrade to a similarly sized laptop that runs Linux really well and can render in Blender performantly, I would.
I didn't see a price in the article. But there's a page on the Purism website to calculate prices: https://puri.sm/shop/librem-13/ Judging from that, I'd guess the model in the article is about $1700USD.
Aargh, I just noticed the keyboard violates another of my cardinal usability rules: everything directly below the Enter key must be a Shift key. This one invented a need for a second Fn key and then fulfilled it by taking away some of the right Shift key; worse, it took the area touch typists actually use to press Shift.
Chromebooks, Apple, and Microsoft understand this. Dell mostly does. Logitech does not, nor do second-tier manufacturers who try to shrink down layouts without user-testing with conventionally trained touch typists.
Purism, if you're reading this, please: right Shift must fully overlap the horizontal space of the Enter key.
Wow, this seems like a pretty nice laptop that supports OSS software out of the box.
Keyboard, screen, and form factor seem to be exactly what people want. I wonder how the trackpad is (usually a major problem with non-Apple devices to get "right").
I also wonder what the battery life is like since I didn't see that in the review either.
I know I'm not normal about this, but I can't use a Linux/Windows laptop without physical mouse-buttons on the touchpad. This is actually the only reason I didn't buy a Librem13.
Apple's touchpad is better than the other OSes' at least in part because the software driver is smarter. I think Purism themselves were working to improve the Linux driver, so there's hope on that front.
I read Apple's touchpad-driver source code about 10 years ago, back when it was available to read under some OpenDarwin Apple license. It was pretty neat. IIRC, they draw a perfect circle from the center of the touchpad, with the radius extending to the top and bottom. Any time there's a keypress on the keyboard, within a small time delay, any mouse motion outside of the circle is discarded. Brilliant! In comparison, the Linux and Windows drivers of the day would typically just say: within some larger(!) time delay of a keypress, discard ALL touchpad motion. Which is noticeably more frustrating: you had no way to reliably move the cursor at all while typing.
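If I had to sketch that logic today (from memory; the names, units and the 50 ms window below are mine, not Apple's), it would look something like:

    #include <chrono>
    #include <cmath>

    struct Pad { double width, height; };  // touchpad dimensions, same units as x/y

    // Discard motion only if we're just after a keypress AND the touch lands
    // outside the centered circle whose radius reaches the top and bottom edges.
    bool discardMotion(const Pad& pad, double x, double y,
                       std::chrono::milliseconds sinceKeypress) {
        using namespace std::chrono_literals;
        if (sinceKeypress > 50ms) return false;    // typing window has passed
        const double cx = pad.width / 2, cy = pad.height / 2;
        const double r  = pad.height / 2;          // spans top to bottom
        return std::hypot(x - cx, y - cy) > r;     // outside circle => likely a palm
    }

The nice property is that deliberate pointing near the center still works while you type; only edge touches (where palms land) get suppressed.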
In any case, MacBooks became less enjoyable to me when they removed the physical button, but their driver is still far enough ahead that it's not so bad to use, and their UI requires somewhat less right-clicking than I need on Linux. But for Linux usage I require precise left and right clicks, which I can only get from real buttons.
I'm currently using a Dell Latitude 7380. The touchpad is great, with real buttons :) But hilariously, I found (and taught them about) a bug in their keyboard firmware that caused me frequent typos, and it affects all XPS 13 and Latitude 7xxx laptops. Last month they issued a BIOS update to fix it, specifically for the Latitude series. But if you have an XPS 13, I think they have not provided a fix yet, so you are probably running into typos that are not your fault!
I suppose I shouldn't leave on a cliffhanger, even if it's off-topic.
If you have a Dell XPS 13, try the following in any editor:
1. Type the letter "k".
2. At virtually the same time, type the letters "o" and "k": press them at essentially the same moment, but with the "o" just before the "k".
You should expect to see "kok", but if you encounter the bug you'll just get "ko". The problem occurs with any two keys on the keyboard typed in that "A, BA" pattern.
Let me know if you have the bug! If it's not solved in the latest BIOS, then Dell probably needs a kick in the pants from a real XPS 13 owner to solve it (even though I tried my best to convince them that it has the same bug as the Latitude 7xxx, which is basically the business equivalent but with my beloved touchpad buttons).
Just to add a data point, I recently bought a Latitude 7370 (refurbished from eBay), and it does not seem to have this issue (BIOS 1.7.4). And now that I look at this version on Dell's site, it looks like I really need to update!
Side note: If you're looking for a laptop like the XPS13, under $500, and don't need as much processing power, check eBay. The refurbisher I purchased from seems to have lots of these, and may take offers. I got one with 3200x1800 QHD+ matte touch screen, m7-6Y75 CPU, 16GB RAM, and 256GB SSD for $475 shipped in the US. The service tag says it was in use almost exactly one year, and it looks brand new. I've only had it a few weeks, but so far my only complaint is that the battery life could be better (seeing ~4h average).
So, after updating the BIOS to the one released in Feb 2018 (1.15.3), I can reproduce the issue, but only about once every ten attempts. Hopefully I didn't just start a nightmare :)
BTW, if I had to guess, BIOS 1.8.3 sounds like it probably introduced the issue:
> I can't use a Linux/Windows laptop without physical mouse-buttons on the touchpad. This is actually the only reason I didn't buy a Librem13.
I have the same laptop (different colour, different branding, same hardware).
It has a physical button, hidden at the bottom of the trackpad area. It has all the haptic feedback of a mouse click, and it distinguishes left and right clicks. You can also left-click with one finger, then drag with another (I've just tested).
The physical buttons you're asking for are there. They're just not visible.
Interesting. This was the main reason I sent back my XPS 15. I hit the problem especially with the hjkl keys, since I use Emacs devil-mode, and my cursor would suddenly jump all over the place. That problem and the coil whine. When I returned it, they didn't mention that others had the same problem.
The battery is really good for a GNU/Linux laptop. I was used to laptops lasting two hours maximum while typing and 30 min while painting. This one can go almost four hours while typing and 1h30 while painting, sometimes even more; it depends on the backlight settings and your activity.
No, they are not in the same world. One of my colleagues has it, and it doesn't compare to my previous machine, a 2013 MacBook Pro; even against my ThinkPad X1 5th gen, I think the Mac still has a better touchpad. It's probably mostly because of libinput. Never tried a Mac touchpad on Linux.
I've always hated clickpads in general, including Apple's. The Dell XPS's is by far the best I have ever used, and I actually use it now instead of just disabling it.
Purism says it lasts up to 12 hours with light usage, and I can attest to that. The only time I need to make sure to plug in my computer is when I'm using a lot of CPU.
Though that charger is not fully standards-compliant, it is one of the more compatible chargers out there because it's popular. So if the laptop doesn't work with it, it probably doesn't work with the others, either. That's unfortunate. Thanks for the info.
Family of five. When we travel I want to take a small number of chargers, each of which can serve all our devices, rather than having to remember a special snowflake permanently paired to exactly one device.
Home is similar. One commodity charger on the table next to the couch can serve any of our laptops, tablets, or phones.
That's the world my family has now, with a couple male-to-male USB-C to micro-USB cables as adapters for legacy devices that haven't died yet. It's as easy as I'd imagined, and until something even more standards-based and capable than USB-C comes along, we don't plan to change. Unfortunately that rules out this laptop, which is too bad because I like what it stands for.
Has anyone else noticed that, on Purism's website[1], you can only select "Don't Include" for the TPM dialog box if you have also selected the "English (UK)" keyboard layout option? Or is this some bug in my particular web browser?
For a company whose advertising copy on their "Why Purism?" page includes, "We believe people should have secure devices that protect them rather than exploit them,"[2] only allowing removal of the TPM chip for a particular keyboard layout is a pretty big red flag.
The TPM is a security feature; you don't have to use it. I have not used it on my laptop, but I see from the documentation that it is different from some other trusted-computing systems in that the end user controls the keys and what is loadable, not the vendor.
The laptop looks nice enough, but the lack of dedicated PgUp/PgDown/Home/End keys is a deal-breaker for me. I'm also on a 13.3" (Lenovo E31-70, previously an HP ProBook), and fitting these keys really isn't a problem.
I wonder when it will become illegal to produce such laptops as states progress with surveillance and data collection about citizens. I would really like to buy a privacy-oriented laptop, as I don't feel comfortable knowing someone can look into my private life.
But this laptop is sadly too weak for the work I do. When something more up to date comes out, I am definitely going to buy one.
That's a probable future scenario: computers with privacy-invasive data collection, or some kind of security/encryption bypass, required by law.
I feel like those of us who are privacy-conscious need to support projects/companies/products like these, that are transparent/open-source and respect their users' privacy.
I also agree with your sentiment that this particular laptop model might not be powerful enough for "serious work" - although it's getting close, and I'm considering something similar in the near future.
I don't see any specs on the wifi card there. It's presumably an Intel 802.11ac dual-band Mini PCI Express card (likely 2x2, not 3x3), which has decent Linux kernel driver support. But if it's not specified, it could be a lot worse: a shocking number of "new" laptops still ship with 2.4 GHz-only, 1x1 (SISO) wifi cards, because they're cheap as hell.
They say on their website "Atheros 802.11n w/ Two Antenna"
Wifi cards are actually a huge issue if you are concerned about privacy. You have no idea what the firmware is doing, and it has access to your entire memory through the PCIe interface, and to the internet, at the same time.
I don't see how using a Qualcomm (Atheros) 802.11n chipset card is any better than an Intel one: developers do NOT get access to the code running its firmware. Same as Intel.
If you look at https://en.wikipedia.org/wiki/Comparison_of_open-source_wire... you can see that a few of the Atheros chipsets have both OSS firmware and OSS drivers, while the Intel ones only seem to have OSS drivers. Perhaps they are using one of those with the OSS firmware/driver combination?
It really doesn't matter; the actual RF baseband inside the chip is still closed source. Atheros never released it for any model. The firmware interface to the OS driver, yes, but not what's going on under the hood.
It's still better than proprietary firmware though, right? The baseband does not have as deep access to the system (like DMA), so having the firmware be OSS should at least be better than the alternative.
I do not see the logic in having an M.2 NVMe SSD attached to the PCI Express bus, which certainly runs a totally closed-source binary in its onboard controller. That's in the laptop, and it's OK? It has access to every disk read/write operation, manages a small RAM cache for the SSD flash, organizes the write wear-leveling algorithm, etc.
But they have to go with some weird, ancient 802.11n card instead of a modern 802.11ac 3x3 MIMO, dual-band Intel chipset card, because they don't like the binary blob in the ROM of the Intel Mini PCI Express card?
At some point you have to trust the devices you're attaching to the PCI Express bus in an x86-64 system, or you won't have any useful functionality left.
I agree in principle, but there is also something to be said for not having a closed system that can both talk to a wireless network and do DMA.
Anyway, the more open the components of the system, the better, and if this shows there is interest in a more open system, the next one might be more open than this one.
The problem is that many wifi adapters require loading a binary blob at initialization time. "Burnt-in" firmware is practically the same as hardware, which you already trust since you bought it.
That's not so true of the firmware blobs the vendor maintains in the "linux-firmware" kernel.org repository, which are absolutely required for the device to function.
What is up with that battery life? Lenovo X laptops with older batteries that I have (i5-i7) run 15+ hours while coding and browsing, and lighter laptops I have with Ubuntu/Debian get well over 20 hours. Does this thing have a very small battery, or is it too power-hungry? Such short battery life is a shame, so I wonder whether it is software or hardware.
I carry both (laptop and TypeMatrix) in my bag. Heavier, but bearably so. And since I use a Roost laptop stand, I can't use the built-in keyboard most of the time anyway.
I used to do the same. Eventually the cable of my TypeMatrix 2030 USB was damaged, and I tried to fix it, but I didn't know what I was doing back then. I replaced the first one with two new ones: one to keep at home and one to keep at work.
I bought my first TypeMatrix 2030 USB in 2010 and kept using them until December of last year, when I bought an ErgoDox EZ Shine with Kailh Thick Gold switches, completely black with nothing printed on the keys. The ErgoDox EZ keyboards are ortholinear like the TypeMatrix and have similar keys in the middle, but they are split in two and have legs so you can angle them. They are also programmable, so you can assign your own mapping. My keymap is based on Dvorak, of course.
The ErgoDox EZ is quite expensive but it has been worth it IMO. Very much so :)
The ErgoDox EZ is a bit more convoluted to travel with, and you wouldn't be able to use it without a table, but then again I had already stopped carrying my keyboard and laptop most of the time; I now only bring them if I am going someplace and will be away from home for many days. Laptops aren't that great anyway: short battery life, and the screens are too small. I mainly use a stationary computer with dual 24" monitors, 32GB of RAM, an AMD Ryzen 7 1700 CPU and an Nvidia GTX 1060 6GB graphics card.
There are two things that are really nice about having a programmable keyboard:
1. I can use it with any computer without having to configure layout customization on the computer itself. I just set the computer's keyboard layout to US QWERTY if it isn't already (I live in Norway), and I can use my keyboard with the Dvorak-based variant I have defined.
2. It can send different signals based on whether you tap a key or hold it down. Some people remap their Caps Lock to Ctrl, some to Esc; typically Emacs users map it to Ctrl and vim users map it to Esc. I use vim, but I think having Ctrl in that position is nice as well, so I have that key configured to send Esc if I tap it and Ctrl if I press and hold. (A rough sketch of the idea follows.)
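The tap-vs-hold decision boils down to timing the keypress. A minimal sketch, with my own names and a made-up 200 ms threshold (the real logic lives in the keyboard's firmware, not on the host):

    #include <chrono>

    enum class Action { SendEsc, HoldCtrl };

    // Decide what the Caps Lock position should do once the key is released
    // or the threshold passes: a quick release counts as a tap (Esc),
    // anything held longer acts as a Ctrl modifier.
    Action resolveCapsKey(std::chrono::milliseconds heldFor) {
        using namespace std::chrono_literals;
        return heldFor < 200ms ? Action::SendEsc : Action::HoldCtrl;
    }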
Unless Purism can prove they are NSA-proof, I would not buy their computers. They expect a very large premium for the fact that their computers are open source.
Don't get me wrong: I believe what Purism is doing is great, and with the direction society is heading, this is a much-needed product. I just feel like they could be doing more to prove that they have a truly secure (from the NSA) and open machine.
With the Chinese social credit system and the Vault 7 leaks, this is going to be a massive market segment.
Actually, I wouldn't be surprised if Purism itself was a secret NSA division :)
That would be the smartest way for them to target specifically the people who are actively trying to avoid being spied on (so, logically, people who have something important to hide).
Really? I want Intel graphics. Fuck dual GPUs on my developer laptop. I don't want to figure out how to disable or switch a separate NV/AMD GPU if all I'm doing is photo editing and running Docker images.
I'd really prefer it if the XPS 15 or the 15" HP Spectre had the option of an Intel card instead of an Nvidia GPU I'll never use.
I have a MacBook Pro 15 with a dedicated GPU at work, and battery runtimes are abysmal, even after accounting for the bright HiDPI display. Great for gaming, but no thanks for a portable development laptop.
And from what I understand, these days that's not so much the case, except when running the very latest games on High/Ultra, or maybe when driving more than one 4K screen at 60Hz or above.
Unless you're talking about AMD's new iGPUs: Intel's desktop i7-8700K with UHD Graphics 630 barely scrapes over 60 fps on average playing Overwatch at 720p on low settings [0]. I think it'll take a long time before any iGPU can play recent-ish games at even low settings in 4K.