> Dual core is pretty weak
> I had 16 GB of ram in 2010 and it was not even a top computer at the time
> It has 16G of ram and a very recent 3.10GHz Intel CPU. What are you doing on your laptops? This is a deeecent laptop
What exactly are all of you people doing with your laptops? I'm running an old W500 Thinkpad at 2.8GHz dual-core, 4GB DDR2, and a 500GB HDD. The only times this machine has struggled were when I was using memory-leaking software (Firefox, etc.) or software that tried to load a whole large file into memory instead of reading it in chunks.
If I need to work with heavy graphics I can stick an eGPU into the PCI express or an FPGA if I'm doing specialized calculations.
But besides that, I don't understand why anyone needs all this gear.
C/C++ project builds. Losing two cores just about doubles build times, and template-heavy C++ code makes the compiler want gigs of RAM all to itself. In principle I should do this on a desktop or server, but IT isn't on board with that.
But if you're talking about packages from source, then I agree. It's the one thing that sometimes makes me question whether or not I should upgrade. But, after a full night of compiling Qt5 from source, that lingering doubt usually vanishes.
Is it that strange that people have different use cases from you? Plenty of things peg the CPU at 100% for extended periods of time: video encodes, 3D renders, building code, slicing for 3D printers, etc. Certainly I could do my work on a dual core machine, and I did until a few months ago, but there are competing machines that are cheaper and offer twice the performance. Why would you not want a quad core CPU when they are the exact same price as the old dual core models?
Linux is not very demanding of memory, but sometimes I have to maintain legacy code written in C#, so I use VirtualBox with Win7 and Visual Studio, which is demanding on memory.
Outside of that, 4 GB is survivable using Linux.
Hard drives don't have that problem.
see the "attacks against disk encryption" section.
I think you’ll still have the problem with this thing being on a pci-e bus though.
It's becoming increasingly okay to just bleed resources and pay off your technical debt, or lower your development costs, by requiring users to have stronger machines. It's like implicit crowdfunding, but in a very stupid way. Steve Jobs said it best in .
I recognized the author instantly (I know him from his Pepper & Carrot comic, which he notably makes with an all-FOSS setup and releases under CC-BY-4.0). If he says the laptop looks nice, is quiet, and works well for his typing and art, that should be a glowing endorsement. But people are inventing flaws: pointing out that the GPU is a weak integrated one, that the CPU is dual core and too weak for compiling lots of C++, that 16 GB of RAM is too little (I happen to have 32 GB, but 16 GB is not little at all). They're also totally ignoring that this is clearly a slick, small ultrabook (with an ultra-low-power CPU) meant to run all FOSS with no fuss (feel free to steal that catchphrase), meaning no Windows, no AAA games, etc.
I personally use a noisier, bulkier and more powerful laptop with even more than those "just 16 GB" of RAM, but I wouldn't call the Librem 13 weak, especially if it fits the owner's desired usage in a slim size that suits his travel patterns.
I do wonder if my perspective comes from the fact that I try to use stuff to the best and fullest I can. E.g. for 5 years (2011/2012 to mid 2016) I ran a bulky, business-y, hand-me-down HP laptop bought around mid 2010 (IIRC) with a dual-core Intel CPU, 2 GB of RAM, a weak integrated GPU supporting only OpenGL 2.1, and a 500 GB HDD. It came with SLED preinstalled, and the only Fedora spin it could run well was Xfce; before Fedora, when I ran Windows 7 on it, I had to disable window transparency and all the Aero baubles. I abandoned it only after the keyboard broke, and by that point the battery was no longer holding a charge either, so I decided it wasn't worth sinking money into fixing and upgrading it to fit the times (battery, SSD, new keyboard, more RAM, etc.) and moved to a brand new laptop and Windows 10. It wasn't a money issue (or I'd have kept that husk of an HP alive) but more of a personality issue: I didn't do anything power hungry regularly, so that laptop was perfectly fine for my web browsing, learning PHP, C, C++ and Lua, doing university work, writing LaTeX, etc., and I knew moving back to Windows from Fedora would be an effort in itself too.
 - https://www.folklore.org/StoryView.py?story=Saving_Lives.txt
 - https://www.intel.com/content/www/us/en/processors/processor...
 - http://www.x-kom.pl/p/293547-notebook-laptop-156-msi-pe60-6q...
> totally ignoring that this is clearly a slick small ultra book (with an ultra lower power CPU ) and meant to run all FOSS with no fuss
Yeah. We complain we can't have a decent laptop with FOSS, and a couple of years later when we have one, we whine that it can't hold the entirety of English Wikipedia in its memory or that its right Shift key is slightly shorter than we'd like. So ungrateful.
Being a developer does not mean everyone is running hundreds of docker instances on their laptops.
I have yet to own a machine with more than 8 GB, and most of the time it sits idle at 4 GB, even when compiling C++ code.
I remember (I hope I'm not misremembering; I can't find it now) that Firefox took more than 4 GB of RAM when being linked, which ironically meant the 32-bit version was being built in a 64-bit environment. Then again, who regularly rebuilds Firefox on their laptop? (Yes, the devs, by definition, do, but that's a very small subset of the population.)
C++ has shortcomings with regard to build times (a small change triggering a rebuild, linking time blowing up, and so on), but it's clearly the only language that fits the niche it sits in (maybe C can compete, though it's a bit too spartan for some people, and maybe Rust in the future). There are mitigations (pimpl, forward declarations, etc.), and some of the criticism is just goofy to me.
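A minimal sketch of the pimpl idiom mentioned above (the Widget class and its member are made up for illustration). Shown as one file here, but the point is that `Impl` is only forward-declared in the header, so changing private members in the .cpp no longer forces a rebuild of every file that includes the header:

```cpp
#include <memory>

// --- what would live in widget.h ---
class Widget {
public:
    Widget();
    ~Widget();            // must be defined where Impl is complete
    int value() const;
private:
    struct Impl;          // forward declaration only; no members exposed
    std::unique_ptr<Impl> pimpl;
};

// --- what would live in widget.cpp ---
struct Widget::Impl {
    int heavy_member = 42;   // edit freely; header consumers don't recompile
};

Widget::Widget() : pimpl(std::make_unique<Impl>()) {}
Widget::~Widget() = default;
int Widget::value() const { return pimpl->heavy_member; }
```

The destructor is declared in the header but defined in the .cpp because `std::unique_ptr` needs the complete `Impl` type to delete it.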
E.g. there was a story of a hellish project with more managers than programmers, no working coffee machines or toilets, crazy employee turnover, employees not knowing how to code properly, version tracking software changed a few times with the entire history thrown away each time, simple operations taking seconds or minutes, physical paperwork required to apply to edit a file, the entire thing bordering on a scam, etc. The author's main takeaway, which he reiterated in 2018: C++ is bad.
I mean, I guess you can dislike it for all its warts (and it has plenty), but would everything be fixed if this was a Rust, Python or C# project and all of the other craziness remained?
Maybe I'm just biased because I know C++ pretty well inside and out and even kinda like it, and I've never been forced to work on a bad legacy project made in it (but then again, any crazy legacy code is bad, regardless of the language). Still, lots of people bash C++ mercilessly; even here on HN, people go and say that it's impossible to write in it, while using a web browser that's certainly written in C++ and relying daily on compilers, OSes and runtimes made in C or C++ that host their language (unless they're all Free Pascal developers or something (FPC is self-hosting) and the joke is on us for not using Free Pascal).
 - https://news.ycombinator.com/item?id=16793884
I had 16 GB of ram in 2010 and it was not even a top computer at the time
 - https://www.x-kom.pl/p/293547-notebook-laptop-156-msi-pe60-6...
There are others, including Lenovo and System76. They're all top of the line though, at the same price levels as the Dell you mention.
And I personally won’t buy Lenovo.
Great laptop for travel and customizability.
Some are disappointed that it's not a superpowered battlestation, but that's not what it's designed to be. It's really good in its niche.
I think they have a real opportunity to win over the group that doesn't like Windows and is getting increasingly scared off from macOS >= High Sierra, but not if it means a downgrade in any aspect of our current hardware.
† (Edit): I see that it actually does have one USB-C port, but I'm thinking more along the lines of dropping all non-C USB ports and charging over USB-C.
Charging over USB-C is ok but MagSafe was better and I almost never care about which side I'm charging on.
So mixed USB-C is the best bet IMHO.
I went to a Dell. 32 GiB RAM. Xeon Quad Core. Heavy. But great.
My eyes don't strain when I use smaller fonts that let me see more of what I'm working on (graphics, code, whatever) at the same time. The absolute lowest PPI display I use is a 108 PPI 27" 1440p display--but it also sits much further back than I do when I use a laptop and I have the physical real estate to use much larger fonts than I do on a laptop. On a laptop, where I'm under two feet from the screen at all times and am physically constrained, I have different needs.
I would not use a laptop that couldn't at least match a Retina MBP's 220 PPI. It's noticeable; it makes my life worse not having it. The "freedom" to use a laptop that gives me eyestrain is not worth it to me (and Purism's touchpads are pretty bad, too), so sticking with the better hardware is a tradeoff I'm happy to make.
I think for most people that answer would be a resounding "Yes". Me included, unfortunately.
I want the freedom to just get my work done without having to consider an ideological struggle each day. I care more about WHAT I create and less about the purity of the tools with which I create it. I submit pull requests for open source software I use, and I am happy to buy licenses or donate (Sidekiq Pro, for example, or donating to the Vapor team), but beyond that, I'm not going to perform tech gymnastics or make compromises to get some "free" computer working "almost as good as a Mac." You know what IS actually as good as a Mac? A Mac.
Lots of respect for the FOSS folks; I appreciate your mission despite not sharing the obsession.
For a laptop and target audience like this, bumping it up to 1440p probably makes sense in the next iteration. It's also important to note the trade-off: higher-resolution screens deplete the battery more quickly. It sounds like the battery in the Librem 13 is just about on par (if that) with current runtime expectations for Linux; bumping that up to 8h would be a qualitatively bigger jump for me than increasing the resolution.
That, and going 14" with a thin-border "infinity" display. Also, can we please get the Lenovo key arrangement for arrow and PgUp/PgDown keys? This continues to be a deal-breaker for me with most laptops.
It's a bit heavy, especially for a 13", but it's probably my next non-Mac.
Unfortunately the cost would probably be even higher than a MacBook's if it approached that quality (Retina display, form factor), because there are no economies of scale.
It's not hard to justify spending money on tools.
Last year I ended 10+ years of developing on macos because their latest "pro" edition had a weak CPU and didn't support more than 16G memory (don't even ask me about the TouchBar).
After some research, I went with a quad-core System76 with 32G memory (could have gone to 64) and 4TB SSD. The laptop is an absolute beast for software development; various compile/build times were cut nearly in half (versus my previous MBP), it is a screamer. While I do love it, I must admit it's pretty heavy.
If Purism made a quad-core laptop with 32G+ memory, I'd be very interested.
This way your lap doesn't get hot and your battery doesn't drain. Plus, you don't have to carry around a paving slab.
Cloud servers aren't great for interactive work like developing/debugging in an IDE or running a VM of a graphical OS (windows). I still use remote servers when I need real CPU/memory horsepower.
Seems like it's difficult to get 32 GB+ of RAM on recent Lenovo models. There actually aren't many options out there.
The U series CPUs are typically paired with (soldered) LPDDR3 for battery life and thinness, which maxes out at 16 GB. The Xeon and H series CPUs are paired with DDR4 for performance and can therefore go past 16 GB. AFAIK U series CPUs with 32 GB will come with the Cannon Lake generation and its LPDDR4 support, which should start shipping this year.
Lenovo used to sell the T470p, which married an H series CPU with a small (but not thin) form factor, but since the 8th gen U series chips now also have 4 cores / 8 threads (basically matching the 7th gen H series for performance), they've silently dropped that line, opting for a simpler line-up based on U series across the board, topping out at 16 GB in many cases.
Damn shame IMO there won't be a T480p.
It seems they intend the T480 series to be its successor, now that you can get quad-core CPUs with dedicated graphics in that line.
That was the thing I liked about the T470p: for performance, it was the last step before doubling the price and getting a bigger, heavier P-series.
The amount of power they put into 14" that stays cool under heavy load was damn impressive.
We use these as our default machines for devs in my organization.
They're solid machines and well built. Their chiclet keyboard is a little bit off from others' in a way I cannot accurately describe, so there is a bit of a muscle-memory learning curve. This has been a repeated experience for all our employees, but it's not a huge deal; maybe a week or two to get back to your normal speed.
But at 16 GB+ of RAM, I really start to think I need ECC: a fair chunk of that is going to be file system cache, and I really don't want bit flips.
Any decent, light-enough laptops with the above specs and ECC RAM?
The Lenovo "search by specs" page ignores their own customisation options, which is a bit frustrating. So Lenovo will tell people that they offer laptops with 16 GB RAM, when many of their laptops can be customised from Lenovo with 32 GB.
But Purism doesn't design these, Clevo does. Purism pretty much only chooses the anodization colors and stickers.
Maybe you're confusing them with System76?
I'm a software developer, and I fail to foresee such an impact. For me personally, and for the vast majority of the people I worked with directly in the past.
Could you explain how 6 cores (and possibly 32 GB of memory) would make a dev's life easier, as opposed to 2-4 cores and 8-16 GB?
I’m probably on the far extreme of normal, but more cores and more RAM make a huge difference in tasks like rendering, compiling, building projects with huge dependency chains.
While I understand such monsters do exist, I cannot help but balk in horror at the thought of millions of lines of code written in such an unsafe language. Things must have gone wrong on several levels to get to this point.
Maybe UNIX compilers need to do some improvements here.
And when using the experimental modules support it is even faster.
I want to compile 100K lines of code or more in less than one second. Compared to that, C++ is slow as molasses.
What could you possibly need such a humongous processing power for? Do you develop games or edit videos or something?
The time taken to render the Blender BMW test scene was listed as 27 minutes.
I'm typing this on a late 2013 MacBook Pro that I really love but if I could upgrade it to a similar sized laptop running Linux really well that could render Blender in a performant manner I would.
I confess I am surprised Krita pushes this laptop to its limits. It's a freaking 2D drawing program, how does it manage to hog the CPU like that?
Chromebooks, Apple, and Microsoft understand this. Dell mostly does. Logitech does not, nor do second-tier manufacturers who try to shrink down layouts without user-testing with conventionally trained touch typists.
Purism, if you're reading this, please: right Shift must fully overlap the horizontal space of the Enter key.
(the SHIFT key is to the right of the UP_ARROW key)
Why laptop manufacturers feel the need to get creative with the key placement is beyond me.
Trips me up every time I am without a decent keyboard (to be fair: I mostly use it docked).
Keyboard, screen, and form factor seem to be exactly what people want. I wonder how the trackpad is (usually a major problem with non-Apple devices to get "right").
I also wonder what the battery life is like since I didn't see that in the review either.
Apple's touchpad is better than the other OS's at least in part because the software driver is smarter. I think Purism themselves were working to improve the linux driver so there's hope on that front.
I read Apple's touchpad-driver source-code about 10 years ago, which was available to read under some OpenDarwin apple license. It was pretty neat. IIRC they draw a perfect circle from the center of the touchpad, radius extending to the top and bottom. Any time there's a keypress on the keyboard, within a small time-delay, any mouse motion outside of the circle is discarded. Brilliant! In comparison, the Linux and Windows drivers of the day would typically just say: within some larger(!) time-delay of a keypress, discard ALL touchpad motion. Which is noticeably more frustrating -- you had no way to reliably move the cursor at all while typing.
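Here's my reconstruction of that heuristic as a sketch (not Apple's actual code, and the 250 ms window is an assumed, tunable value): within a short window after a keypress, touchpad motion is only honoured inside a circle centred on the pad whose radius reaches the top and bottom edges; anything outside is treated as a stray palm and dropped.

```cpp
#include <cmath>

struct Pad { double width, height; };  // touchpad dimensions, arbitrary units

// Returns true if a motion event at (x, y) should be kept.
bool accept_motion(const Pad& pad, double x, double y,
                   double ms_since_keypress) {
    const double kWindowMs = 250;      // assumed post-keypress delay
    if (ms_since_keypress > kWindowMs)
        return true;                   // not typing: accept everything
    const double cx = pad.width / 2, cy = pad.height / 2;
    const double r  = pad.height / 2;  // circle touches top and bottom edges
    const double dx = x - cx, dy = y - cy;
    return std::hypot(dx, dy) <= r;    // inside the circle: deliberate motion
}
```

The win over "discard everything after a keypress" is that a finger resting near the centre of the pad can still move the cursor mid-typing, while a palm brushing a corner is ignored.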
In any case, Macbooks were less enjoyable to me when they removed the physical button, but their driver is still ahead enough that it's not so bad to use, and their UI requires somewhat less right-clicking than I need especially on Linux. But for Linux usage I require precise left and right clicks which I can only get from real buttons.
I'm currently using a Dell Latitude 7380. The touchpad is great with real buttons :) But hilariously, I found (and taught them about) a bug in their keyboard firmware causing me frequent typos, which affects all XPS 13 and Latitude 7xxx laptops. Last month they issued a BIOS update to fix it, specifically for the Latitude series. But if you have an XPS 13, I think they have not provided a fix yet, so you are probably running into typos that are not your fault!
If you have a Dell XPS 13, try the following in any editor:
1. Type the letter "k".
2. At virtually the same time, type the letters "o" and "k", with the "o" landing just before the "k".
You should expect to see "kok", but if you encounter the bug then you'll just have "ko". This problem occurs with any two keys on the keyboard, if typed in that "A, BA" pattern.
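My guess at the failure mode (pure speculation; Dell never published a root cause) is a per-key debounce filter that drops a second press of the same key arriving within the debounce window, without noticing that a different key was pressed in between. A sketch of that buggy logic, with a made-up 50 ms window:

```cpp
#include <string>
#include <utility>
#include <vector>

// Each event is (key, timestamp in ms). Returns the keys the firmware
// would actually emit under the hypothesised buggy debounce.
std::string buggy_filter(const std::vector<std::pair<char, int>>& events) {
    const int kDebounceMs = 50;        // assumed window
    int last_seen[256];
    for (auto& t : last_seen) t = -1000;
    std::string out;
    for (auto [key, ms] : events) {
        bool bounce = ms - last_seen[(unsigned char)key] < kDebounceMs;
        last_seen[(unsigned char)key] = ms;
        if (bounce) continue;          // dropped as a "bounce", wrongly:
                                       // another key came in between
        out += key;
    }
    return out;
}
```

A correct filter would reset per-key state on release (or on any intervening key), so the fast "k, o, k" sequence would survive.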
Let me know if you have the bug! If it's not solved in the latest BIOS, then Dell probably needs a kick-in-the-pants from a real XPS13 owner to solve it (even though I tried my best to convince them that it has the same bug as the Latitude 7xxx, which is basically the business-equivalent but with my beloved touchpad buttons).
Side note: If you're looking for a laptop like the XPS13, under $500, and don't need as much processing power, check eBay. The refurbisher I purchased from seems to have lots of these, and may take offers. I got one with 3200x1800 QHD+ matte touch screen, m7-6Y75 CPU, 16GB RAM, and 256GB SSD for $475 shipped in the US. The service tag says it was in use almost exactly one year, and it looks brand new. I've only had it a few weeks, but so far my only complaint is that the battery life could be better (seeing ~4h average).
BTW, if I had to guess, BIOS 1.8.3 sounds like it probably introduced the issue:
- Resolved internal keyboard double letters issue.
This is usually done for a reason (namely, terrible keyboard hardware).
I have the same laptop (different colour, different branding, same hardware).
It has a physical button, hidden at the bottom of the trackpad area. It has all the haptic feedback of a mouse click, and it distinguishes left and right clicks. You can also left click with a finger, then drag with the other (I've just tested).
The physical buttons you're asking for are there. They're just not visible.
The battery is really good for a GNU/Linux laptop. I was used to laptops with two hours maximum while typing and 30 min while painting. This one can go almost four hours while typing and 1h30 while painting. Sometimes even more; it depends on the backlight settings and your activity.
The trackpad is the biggest question though: it’s Apple’s trademark and no other manufacturer seems to do it even close to as well.
Very annoyed that the Dell XPS 13 is barely compatible with any USB-C chargers. It won't even recognize USB-IF certified chargers.
Home is similar. One commodity charger on the table next to the couch can serve any of our laptops, tablets, or phones.
That's the world my family has now, with a couple male-to-male USB-C to micro-USB cables as adapters for legacy devices that haven't died yet. It's as easy as I'd imagined, and until something even more standards-based and capable than USB-C comes along, we don't plan to change. Unfortunately that rules out this laptop, which is too bad because I like what it stands for.
For a company whose advertising copy on their "Why Purism?" page includes, "We believe people should have secure devices that protect them rather than exploit them," only allowing removal of the TPM chip for a particular keyboard layout is a pretty big red flag.
The TPM is a security feature--you don't have to use it. I have not used it on my laptop but I see from the documentation that it is different from some other trusted computing systems in that the end user controls the keys and what is loadable, not the vendor.
Or maybe a soft version of it that says the most likely explanation is that it's a mistake.
I feel like those of us who are privacy-conscious need to support projects/companies/products like these, that are transparent/open-source and respect their users' privacy.
I also agree with your sentiment that this particular laptop model might not be powerful enough for "serious work" - although it's getting close, and I'm considering something similar in the near future.
Wi-fi cards are actually a huge issue if you are concerned about privacy. You have no idea what the firmware is doing, and it has access to your entire memory through the pcie interface, and the internet at the same time.
in the year 2018
I don't see how using a Qualcomm (Atheros) 802.11n chipset is any better than an Intel chipset card, developers do NOT get access to the code that's running its firmware. Same as Intel.
But they have to go with some weird, ancient 802.11n card instead of a modern 802.11ac 3x3 MIMO, dual-band Intel chipset card, because they don't like the binary blob in the ROM of the Intel Mini PCI Express card?
At some point in time you have to trust the devices you're attaching to the pci-express bus in an x86-64 system or you won't have any useful functionality left.
Anyway, the more open the components of the system, the better, and if this shows there is interest in a more open system, the next one might be more open than this.
Not so much for the firmware blobs the vendor maintains in the "linux-firmware" kernel.org repository and that are absolutely required for the device to function.
I bought my first TypeMatrix 2030 USB in 2010 and kept using it until December of last year, when I bought an ErgoDox EZ Shine with Kailh Thick Gold switches. Completely black with nothing printed on the keys. The ErgoDox EZ keyboards are ortholinear like the TypeMatrix and have similar keys in the middle, but they are split in two and have legs so you can angle them. They are also programmable, so you can assign your own mapping. My key map is based on Dvorak, of course.
The ErgoDox EZ is quite expensive but it has been worth it IMO. Very much so :)
The ErgoDox EZ is a bit more convoluted to travel with and you wouldn't be able to use it without a table but then again I had already stopped carrying my keyboard and laptop most of the time -- I now only bring them if I am going someplace and will be away from home for many days. Laptops aren't that great anyways; short battery life and the screens are too small. I mainly use a stationary computer with dual 24" monitors, 32GB of RAM, an AMD Ryzen 7 1700 CPU and an Nvidia GTX 1060 6GB graphics card.
Here is the ErgoDox EZ website: https://ergodox-ez.com/
Here is the layout I created for my keyboard using their online tool: https://configure.ergodox-ez.com/keyboard_layouts/qppwjy/edi...
There are two things that are really nice about having the keyboard be programmable:
1. I can use it with any computer without having to configure layout customization on the computer itself. Just set the keyboard layout of the computer to be US QWERTY if it isn't already (I live in Norway) and I can use my keyboard with the Dvorak based variant that I have defined.
2. It can send different signals based on whether you are tapping or holding a button pressed. Some people remap their caps lock to ctrl, some to esc. Typically people that use emacs will map it to ctrl and vim users will map it to esc. I use vim but I think having ctrl in that position is nice as well, so I have the button in that position configured so that it sends esc if I tap it and it sends ctrl if I press and hold.
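The ErgoDox EZ runs QMK firmware, where this is called "mod-tap". The core decision reduces to something like the sketch below (my simplification; the 200 ms threshold is an assumption, and the real firmware's tapping term is configurable and also resolves early if another key is pressed during the hold):

```cpp
// Tap-vs-hold resolution for a single key: a quick press-and-release
// emits Esc, while holding past the threshold acts as Ctrl.
enum class Key { Esc, Ctrl };

Key resolve(double held_ms) {
    const double kTappingTermMs = 200;  // assumed "tapping term"
    return held_ms < kTappingTermMs ? Key::Esc : Key::Ctrl;
}
```

The nice part is that this runs on the keyboard itself, so the dual-role key works identically on every computer you plug it into.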
Here is a photo of my room that I took just a couple of days ago: https://i.imgur.com/NcqDL9N.jpg
Don't get me wrong. I believe what Purism is doing is great, and with the direction society is heading this is a much-needed product. I just feel like they could be doing more to prove that they have a product that is truly secure (from the NSA) and open.
With the Chinese social credit system and the vault 7 leaks; this is going to be a massive market segment.
That would be the smartest way for them to target specifically the people who are actively trying to avoid being spied on (so, logically, people who have something important to hide).
Or maybe you are a NSA agent trying to send false flags! :p
Every time I see Intel HD Graphics I think "weak GPU". Bad for the brand image, Intel.
I'd really prefer if the XPS15 or HP Spectre 15in had the option for an Intel card instead of an nvidia GPU I'll never use.
There's an open question https://forums.puri.sm/t/install-tianocore-payload/2629 and a somewhat-relevant developer comment at https://forums.puri.sm/t/librem-v15-boot-issues/1202/7 .