Been extremely happy with mine the past couple months. The little modular port attachments seemed like a novelty at first, but now it feels absurd that you'd buy a laptop with a bunch of "hardcoded" ports that you can't ever change.
The only real Linux-related quirk I've run into so far is that you have to disable panel self-refresh (it's on by default and causes stuttering). Other than that tiny thing, I pretty much just installed my stuff and started using it.
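For anyone hitting the same stutter: on Intel graphics, panel self-refresh is normally disabled with a kernel parameter. A minimal sketch, assuming a GRUB-based distro and the i915 driver (your distro's bootloader config may differ):

    # /etc/default/grub -- append the i915 panel-self-refresh switch
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.enable_psr=0"

    # regenerate the bootloader config (Debian/Ubuntu style), then reboot
    sudo update-grub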
One little anecdote: I got a card in the mail from Framework saying that there was a problem with the cable for the touchpad, and it had instructions on how to fix it. Contrast that to my experience with Apple where they would delete forum threads for laptop problems and spend years denying issues until legal action forced them to acknowledge it.
Anyway, I'm a fan. I'm really looking forward to when the marketplace opens up with some new parts. I really want my blank keyboard. I'm hoping 2021 will be the year I can own a laptop without a god damn windows logo emblazoned on the keys.
Maybe I need some convincing here. How is the "modular port" concept any different from a universal port with dongles (i.e., how Macs have been since 2016)? To me, the fact that the port attachments are recessed is little more than a gimmick. Especially as all of my devices have transitioned to USB-C anyway, dongles/"modular ports" feel more like a stopgap measure than a problem worth a permanent form-factor change.
I find it hilarious that we spend multiple thousands of dollars on sleek, elegant hardware and then hook up chunky plastic dongles to overcome their bad hardware interfaces.
So I love the idea of these ports (agreed, they're basically "recessed dongles").
I couldn't lose them / forget them. They wouldn't take up space in my bag while I'm traveling. I could "set and forget" them to perfectly match whatever desktop / docking setup I'm using. In five years when my wireless VR system uses some as-yet-unknown hardware interface, I can swap a single component out to support it. Seems like brilliant design to me.
The crazy thing here is that it’s not so hard to hit the standard set of “pro” ports —
USB-C x 4 (new standard blabla)
USB-A x 2
SD x 1
HDMI x 1
Phone/Mic
I applaud the modular approach but Apple’s donglevision was the pure distillation of user-hostility between the Bean-Counter in Chief and the SVP, Thin Stuff.
And all the industrial sheep who followed them. May we all recover…
IMHO it’s a shame that Lightning didn’t become the standard connector instead of USB-C. There’s a reason I say that: socket fragility.
The socket is the most expensive part of the connection and when it breaks, it’s bad news. If you’re lucky and the bean counters didn’t overrule engineering over a microcent saving, the socket is on a daughter card otherwise you’re stuck with a one-port-down device, or an expensive motherboard replacement.
Instead, the Lightning connector is as stupid as it gets; the worst issue is pocket lint, which you can easily remove with a toothpick.
I'd disagree with that one, with Lightning the pins are in the connector. I've broken my iPhone before trying to get lint out of the charging port and bending the pins by accident.
USB-C, on the other hand, has all the pins cable-side, so you don't have to worry about ramming whatever you fancy into your phone or laptop, since the device side is just a PCB with pads on it rather than anything you can bend.
In my experience a sewing pin is just about the perfect thickness to get into an USB-C port. If you're even halfway careful you can dig out all the lint without damaging anything.
I do have to say that USB-C seems to be much more lint-prone than micro- or mini-USB. I have never needed to dig out any lint on my previous phones, but have had to do so a fair few times on my latest phone.
Thanks for sharing your anecdote. Mom & Dad didn't give me a call about your phones not charging yet, but knowing that it's doable is a relief.
I don't remember digging lint out of any previous USB generations either. Only a couple of USB-A connectors, which were integrated to some smaller MP3 players (yes, I remember them!).
The receptacle housing generally prevents you from bending the connector tongue, and even if you manage to do it somehow, AFAIK it's almost never FR4 in the receptacle (although USB-C receptacles printed directly on 0.8mm PCBs work great!). It'll generally be some sort of injection molded thermoplastic that's fairly flexible, so even if you manage to bend it, it'll spring back.
As for pins vs pads, you can make pads almost arbitrarily more durable by increasing the gold plating thickness, whereas it's really hard to make pins not bend.
I disagree. In my experience the Lightning port/connector is the main pain point of the iPhone. Even if you clean it out with a toothpick, it stops charging well (you have to fiddle with it and even hold it in a specific position to get it to charge), the cord falls out easily, etc. I truly hate the Lightning connector. I'm not sure if USB-C will be better, but it certainly can't be worse.
Apple can figure out how to put USB-C on a phone, but perhaps does not because Lightning's connector has a preferable RF sensitivity profile.
Not sure if it's measured as significant, but it's the sort of thing that RF engineering concerns itself with (preventing anything from detuning antennas or otherwise raising the noise floor).
> perhaps does not because lightning's connector has preferable RF sensitivity profile
Given that literally every other smartphone on the market has a USB-C port, i'd say this is not the reason why Apple used a non-standard connector, failing (voluntarily) to comply with European interoperability laws/standards.
FYI, the interoperability standards passed recently were for the charging block. Every device has to charge from a USB-C port, so that’s why Apple switched.
Apple are slowly migrating to solutions that feature USB-C at some level, even if they keep Lightning on the phone itself. That's because EU authorities have signalled that they know Apple are taking the mickey, and will continue making more stringent rules until Apple play ball. E.g. https://www.bbc.co.uk/news/technology-58665809
Micro-USB was the standard back then. All phone manufacturers had custom proprietary (though mostly jack-based) connectors, but due to European regulations they all switched to micro-USB to comply with standards... all except Apple, of course, who went another way.
Apple was not exactly unaware of these developments, as they have repeatedly signed the memorandums of understanding surrounding charger interoperability, according to wikipedia.
For what it's worth, Lightning is dramatically better than Micro-USB ever was, and IMHO it's still better than USB-C in terms of form factor (it's thinner, allowing for thinner ports, so thinner devices) but not compatibility (I have to have a bunch of C-to-lightning cables and a bunch of C-to-C cables).
> but perhaps does not because lightning's connector has preferable RF sensitivity profile
... or because Lightning has a commercial "preferable profile" - it's another form of lock-in, and at the hardware level no less; such an extremely desirable feature, from a commercial perspective, is very, very hard to give up. It would open the door to a world where phone accessories are effectively universal, and surely we can't have that.
To your point, Apple also earned money from a 2+ year head start on shipping thin-connector phones: Lightning shipped in Sep 2012, whereas USB-C had only reached a stable design in Aug 2014 and would take even longer to ship in devices.
Didn't Lightning ship (Sep 2012) before usb-c was standardized (Aug 2014)?
So I have no idea about the RF engineering comparison, but Apple seems to have stopped waiting for consensus on what USB-C would be, wanting to ship something thin almost two years before Samsung, Google, and LG could even design something thin.
I wonder if some research suggests a port change would stall phone upgrades among enough users of earlier models, and that phone upgrades are lucrative.
In itself the Lightning connector apparently is better than USB-C, but just look at the market of USB-C devices (and also USB-A, which can be connected to USB-C easily), compare it to that of Lightning, and it becomes obvious which is better for you and why Apple wants the other.
I think one reason Lightning works great is that it is made by Apple. Apple is expensive, and therefore it can afford to spend the couple of extra cents needed to manufacture a good connector and install it properly.
USB has to go on devices where the port already represents a sizable fraction of the cost, and there is a race to the bottom over who will produce the least expensive parts, and of course, it is shit. If Lightning were the standard, we would probably see a lot more failures, simply because not everyone has the same quality requirements as Apple.
Of course, USB doesn't have to be terrible, but when you compare USB to Lightning, you compare a mixed bag of good and bad parts to only good parts. To be fair, you should only compare USB implementations from reputable, expensive brands against Lightning.
Changing a connector is not trivial if you have hundreds of millions of customers and a massive ecosystem of third-party vendors, each with their own roadmap.
Of course, Apple could pull it off if they really wanted. They’ve done it with iPads. But please don’t frame it as customer trolling. We’re better than that here.
Before USB-C, the universal, international standard was micro-USB, and every phone manufacturer (at least in Europe) was bound by law to implement it. Apple has changed its connectors since socket interoperability became effective, and they could have adopted the standard. They just purposefully ignored the consumer-respecting standards in order to keep their $40-connector business flowing.
According to European regulations, Apple's actions are strictly illegal, but if any law enforcement actually cared to protect people from wealthy corporations, we probably wouldn't have any climate change, tax evasion, planned obsolescence, science-denial smoking ads, corporate land grabs, companies stealing water supplies from local populations... As always, laws that protect the weak from the powerful are betrayed, while laws that protect the powerful from the weak are strongly enforced.
> Before USB-C, the universal, international standard was micro-USB and every phone manufacturer (at least in Europe) was bound by law to implement it.
That is false. The EUC program was about PSUs, not device ports, and Apple was compliant by providing PSUs with detachable cables. Furthermore, the EUC never legislated on the subject; they considered that the voluntary covenant worked well enough and no legislation was necessary.
> Apple has changed its connectors since socket interoperability became effective and they could have adopted the standard.
They were already complying, and the “standard” at the time (micro-USB) was bad; not using it was a good thing.
> They just purposefully ignored the consumer-respecting standards in order to keep their 40$-connector business flowing.
At this point you’re just outright lying.
> According to European regulations, Apple's actions are strictly illegal
You are, and I want to make it clear that this is an objective affirmation, high as a kite.
> The EUC program was about PSUs, not device ports
Are we talking about the same thing? You seem to reference this memorandum of understanding [0] promoted by the European Commission (and signed by Apple), whereas i reference further developments such as this vote [1] which was widely advertised in the press at the time.
I am unaware whether that vote was actually turned into a regulation, but i am fully aware that the European Commission is not the entity deciding on regulations in the EU (although it has way too much power to override the EU parliament).
> the “standard” at the time (micro-usb) was bad, not using it was a good thing
OK, micro-USB was not the best. Still much better than using custom proprietary connectors overall. Just look at how much money and how many resources were saved by reusing existing cables: do you remember the hot mess of the early 2000s, trying to find a spare compatible charger when a phone charger broke?! Now i can't remember the last time i had to buy a phone charger, because there's an abundance of standard cables. It's a net win for me and my wallet, and a net win for the environment.
Also, not going with a standard you deem bad is fine... if you're working to either improve the standard or replace it with another one. Which Apple never did, as they were happy to have their custom hardware which their fanatic customers would buy no matter the price.
> At this point you’re just outright lying.
I may be misinformed on specifics, but i'm for sure not lying. If you're implying that Apple (or any multinational corporation for that matter) acts in good faith, you have some research to do on how industrial capitalism operates and its actual consequences on people.
>> According to European regulations, Apple's actions are strictly illegal
> You are, and I want to make it clear that this is an objective affirmation, high as a kite.
OK, i'm high as a kite, maybe? Does that make my message wrong in every aspect? Apple has been known to break, and been condemned for breaking, many european regulations already [2] [3] [4] [5], often engaging in actions they knew were illegal. I'm not a lawyer so i can't comment on the technical legality of their Lightning connectors, but i can for sure, as a european citizen, say that they knowingly and willingly violated the spirit of the law to further their profit.
And as a pseudonymous person on a random orange forum, i can say you should take more time to correct facts with actual sources, instead of defending evil corporations while accusing your peers of lying.
Let's be charitable here: the Lightning connector appears to be more durable than USB-C, at least on the device side. There's no protrusion, whereas in USB-C the contacts are on a very thin prong that can be damaged if something small enough manages to get inside the connector.
I'm not aware of such issues, but i'm personally still running micro-USB devices only so i have zero clue. Let me know if you have links/resources on this issue.
However, i'm fully aware these were not the arguments presented by Apple when they refused the USB standards. If Apple cared for durability, which they definitely don't [0], i'm sure a lot of people would appreciate that and maybe standards could be improved across the industry.
The fact that Apple never cared for any form of standard that i know of [1] does not give them a lot of credit.
[0] They pioneered making it very hard to replace your own battery and flipped the finger on everyone by using non-standard screws on purpose. Seriously, how can it be legal to sell a product which requires any form of tooling to change a battery?! Let's not even get started on software obsolescence on iOS/macOS...
[1] USB and VGA, sure, because they were forced on them. Maybe FireWire? But even then i'm not sure it was a standard back when Apple started using it... On the software side, apart from email, DNS and WWW clients they also don't respect any standard protocols: AirPlay, iCloud, etc.
> I follow smartphone world quite closely and have never heard/read about USB-C issues.
This isn't an effective point as the "smartphone world" is plagued by ephemeral devices which are either susceptible to programmed obsolescence or are caught in an upgrade treadmill due to a myriad of reasons (non-replaceable batteries failing, screen problems, camera issues, hardware failing due to wear, blocked software updates, fads, etc..)
> Changing a connector is not trivial if you have hundreds of millions of customers and a massive ecosystem of third-party vendors, each with their own roadmap.
And yet not only has Apple already done exactly that for iPhones (specifically: migrating from the iPod connector to Lightning), but virtually every Android vendor has done the same for Android devices (specifically: migrating from USB micro-B to USB-C).
> And yet not only has Apple already done exactly that for iPhones (specifically: migrating from the iPod connector to Lightning)
That’s the point though, is it not? Apple was just out of a connector switch, which required users to throw out all their old accessories and get new ones. They were not going to do that again within just a few years.
> so has virtually every Android vendor done exactly that for Android devices (specifically: migrating from USB micro-B to USB C).
Historically, Android had nowhere near the accessories ecosystem of Apple.
I believe that is why Apple is starting the switchover to USB-C, with the new iPad using USB-C:
* the dock connector lived for about 10 years, we’re approaching the 10th year of Lightning, that’s a pretty good lifecycle for a connector
* the universality of USB-C amongst Android manufacturers means there now is a large ecosystem of accessories and Apple won’t have to rebuild their ecosystem from scratch
I wouldn’t be surprised if the iPad was basically a warning shot, and Apple switched the rest of their mobile devices over to USB-C with the 2022 releases.
> I feel that only confirms my point. The switch happened ten years ago, yet many people are still being mad at Apple today over it.
It disproves your point from multiple directions:
1. It demonstrates that Apple has no qualms about abandoning proprietary connectors and leaving an entire connector ecosystem stranded overnight.
2. It demonstrates that said connector ecosystem has no qualms about adapting to a new proprietary connector - let alone a standardized one.
And no, I know of precisely zero people upset about switching away from the iPod connector. The only thing anyone is upset about is the fact that Apple chose a different proprietary connector instead of using that opportunity to standardize.
> I think I’m failing to see your point. None of those vendors has any amount of control over the USB accessory ecosystem, or do they?
The bigger players absolutely do manufacture their own accessories, but that's secondary to my point: that the accessory market readily adapted to phone manufacturers switching connectors on its own. Apple, if anything, would have an easier time for the exact reason you indicate: Apple has control over the Apple accessory ecosystem, and can use that control to put additional pressure on accessory makers.
What do people think they're saying with this? No, this has literally happened here, by one of "us", so "we" are clearly not better than this. Heck, "we" have done and are continuously doing far worse than this.
In case it’s not entirely clear: I was referring to HN guidelines.
Accusing others of acting in bad faith is never helpful. I will continue to remind others of the rules, no matter how often they’ve been broken in the past.
Cool. Yet this behavior (and worse) is widespread and not being punished by moderators except for the most blatantly obnoxious cases. Guidelines are only relevant to the extent that they are enforced, otherwise they're just a wish list, not a code of conduct.
Speaking of good faith, a good faith reading of their comment would be that they think Apple is intentionally maintaining a non-standard connector for their smartphone range despite knowing that switching to USB-C would be beneficial to their users.
Alleging that Apple acts in bad faith hardly seems like a violation of HN guidelines. If anything, the claim that they put the needs of the users first would seem the preposterous one as one would expect them to be beholden to their shareholders (and thus profit) above all, not their customers, and there are plenty of reasons why maintaining their own connector might be more profitable.
Accusing people of acting in bad faith is impolite. Accusing companies of acting in bad faith when there is ample evidence of their wrongdoing is one's duty as a consumer.
No Ethernet? WiFi is nice and all, but when I get a docker-compose project that decides to pull down the internet, I really love the fact that I'm on a gigabit network.
This is where it goes wrong. Everyone thinks their particular favourite port is a 'pro' essential, and we end up with Homer-cars with a thousand ports. Just use USB-C. Almost everything can go through USB-C.
But not everything can go through the USB-C cable you have on hand.
That's the annoying bit with USB-C. We may have (almost) standardized on a single plug/socket shape, but we didn't escape the essential complexity - the fact that one type of connection cannot handle all the use cases we'd like it to. We just pushed that complexity into cables. Instead of having to deal with separate data, network and graphics ports, users now have to deal with potentially separate data, network, graphics and charging cables. I'm not convinced this is an improvement, because USB-C cables are a bottom-feeder market that will not hesitate to outright scam the buyer.
At this point I'm not sure it's an improvement. I feel like the optimum would be a small number of standards targeting mutually incompatible applications. That, or forcing some specification requirements on USB-C and standardizing some capability labels.
I hope that this is what USB4 will bring, since, IIUC, USB4 is basically the IF's name for Thunderbolt-4-capable USB-C. This was enabled by Intel contributing the TB4 spec to the committee, in a shockingly benevolent move that I guess may have been the greatest internal political feat Intel staff pulled off in the last decade.
Edit: Oh and presumably the ports on the Framework are USB4, they just can't say that yet because the certification is still in the works.
Ah it seems I was slightly off, it's TB3 not TB4, "The USB4 specification is based on the Thunderbolt 3 protocol specification." [0] But it does require: USB-PD, PCIe & DP tunneling, minimum 20 Gbit speed, max 40 Gbit speed. Stated goals to "minimize end-user confusion".
I've seen some peripherals and such with it, no laptops yet though. The spec was released in 2019, so considering hardware cycle time we should start to see more devices soon. It's pretty cool that Framework will likely be on the leading edge of that wave.
Is Dell still gimping them to 10Gbit/s like they used to? I considered a 13" XPS for years, but then turned to Apple because of this ridiculous decision.
I do not want all my USB-C cables to be able to handle 90W. That would make them very thick and expensive.
I do not want all my USB-C cables to support the maximum 40 Gbps speed (or whatever it is). That would require them to have all 19 wires and shielding and all, and again, would make them expensive and short.
And just imagine how much a 90W maximum speed 3 meter cable would cost...
I prefer having one power cable, one fast cable and then a bunch of disposable cables for general use cases.
I prefer all my cables with the same heads to be exactly the same. Who thought it was a good idea to make them different? As if someone buying the cable will know the difference.
But then you lose the flexibility of using one port in many different ways. What we need is some standard color coding or other clear visual indicator on cables to reflect their capabilities.
I've opined the same before. Just put standard-colored rings (with textures, if we want to be sight-flexible) on the cables, when they're shipped from the factory.
Standardize the colors through the IF, and bam, you can tell at a glance what a cable is capable of.
Like resistors, except I don't think cables are likely to shrink too much in the future.
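Something like this, say (a minimal sketch; the colors and tiers below are invented for illustration, not any real USB-IF scheme):

    # Hypothetical ring-color scheme for USB-C cables: one ring encodes
    # power capability, a second encodes data capability.
    # Illustrative only -- not a real USB-IF standard.
    POWER_RINGS = {"red": "240 W", "orange": "100 W", "yellow": "60 W"}
    DATA_RINGS = {"blue": "40 Gbps", "green": "10 Gbps", "grey": "USB 2.0 (480 Mbps)"}

    def describe_cable(power_ring: str, data_ring: str) -> str:
        """Read a cable's capabilities off its two factory-printed rings."""
        return f"{POWER_RINGS[power_ring]} charging, {DATA_RINGS[data_ring]} data"

    print(describe_cable("orange", "blue"))  # -> 100 W charging, 40 Gbps data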
You'll never get Apple to do that though. They didn't with USB-3, they didn't with mouse/keyboard, they won't put them on their cables. And since they're the premium brand, everyone else will have to follow them.
I'm not sure why you're being downvoted, those examples are all absolutely true. There's no chance of Apple complying with a spec which doesn't meet their sense of industrial design (which there's no way this would).
I will say, they're not alone. Look at Razer, for instance. Electric green is not exactly a part of the USB 3 standard.
Which is a great solution if all you have are Apple cables, but colored rings fall off or get broken. It's like the US solution to healthcare: "don't be poor". It's not practical in a real-world sense. People are going to buy (and make) whatever shit cable they want, and regulations and standards don't mean a thing.
I would have to go with the parent comment, although I know what you're saying, as it would reduce cost.
I think what we've learned throughout the years is that the color-coding idea doesn't work out in practice because, as an earlier comment noted, the USB-C market is a bottom-feeder market. There has to be an exact, rigid specification of USB-C cables that all of them follow (i.e., USB4/TB4). Anything more complicated than that, like color coding, results in giant scams by manufacturers, outright mislabeled cables, or impossible-to-find cables on an e-commerce search engine. I just don't want to deal with any of that anymore. It feels so much nicer right now to look up TB4 on AliExpress and be done with it, no more worrying or guessing.
USB-PD permits 100 W with 20 V at 5 A. If that's carried over just two round copper wires (I don't know whether it is in USB-C) they would need to be 18-gauge or thicker for safety—about a millimeter in diameter. If they're copper, that's about 7.3 grams of copper per meter, 14.6 grams including the return path. Copper is expensive: almost US$7/kg. So a 3-meter 5-amp DC or two-phase cable would weigh about 45 grams and contain roughly 30¢ worth of copper. You could drop both the cost and the weight by going to aluminum. If you were designing the system from scratch, you could use 3-phase AC to cut the weight by half again, and use 48 volts to cut the weight by another 58%.
I don't have any idea how thick the 19 wires have to be for USB at 40 Gbps, but I imagine the answer is "not nearly that thick".
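A quick back-of-the-envelope check of that arithmetic, assuming the standard 18 AWG cross-section (0.823 mm²) and copper at roughly US$7/kg:

    # Sanity-check the copper math above.
    AWG18_AREA_CM2 = 0.823 / 100.0   # 18 AWG cross-section, mm^2 -> cm^2
    CU_DENSITY_G_PER_CM3 = 8.96      # density of copper
    CU_PRICE_USD_PER_KG = 7.0        # rough market price

    def copper_mass_g(length_m: float, conductors: int = 2) -> float:
        """Copper mass of a cable: cross-section * length * density."""
        return AWG18_AREA_CM2 * (length_m * 100.0) * conductors * CU_DENSITY_G_PER_CM3

    mass = copper_mass_g(3.0)        # 3 m cable, supply + return conductors
    print(f"{mass:.0f} g of copper")                              # ~44 g
    print(f"${mass / 1000 * CU_PRICE_USD_PER_KG:.2f} of copper")  # ~$0.31

which lines up with the ~45 g and ~30¢ figures above.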
Bottom-of-the-barrel vendors absolutely won't give a fuck and will cheap out without telling you.
The advantage of USB 2 is that it's so simple that it's very hard to screw it up. You pretty much have to intentionally do it if you want to create a dangerous cable. Even the shittiest cable will work with the vast majority of devices (it might slightly heat up, voltage may sag at the receiving end meaning it will charge slower, but it'll somewhat work).
USB-C is significantly more complex and requires active electronics in the cable itself in some cases, and the potential for higher voltages means a faulty/recklessly-designed cable could request higher voltage from the charger and blow up whatever's connected at the other end.
It sounds like you could maybe make substantial progress by just separating the differential pairs (?) by a millimeter or two of dielectric, giving you a ribbon cable, with much lower crosstalk than the round kind. Bonus points if you color the dielectric rainbow colors.
I feel like it's harder than shielding and tolerances, given that longer passive cables don't seem to exist despite the very high prices people are paying for active cables.
There might be issues of attenuation; we're talking about signals in the GHz range, where you have to use waveguides instead of wires to get low losses.
> I do not want all my USB-C cables to be able to handle 90W. That would make them very thick and expensive.
The difference between the minimum and 100W is that the cables need to support 5 amps instead of 3. That's not much difference at all considering there are data wires too.
Supporting 240W requires a couple tiny components in the plug. That's also barely anything.
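For concreteness, here's how the fixed USB-PD voltage/current levels multiply out (a quick sketch covering the standard power range plus the 240 W extended range from USB PD 3.1; treat the rule set as an approximation of the spec):

    # USB Power Delivery: P = V * I.
    # The standard power range tops out at 20 V, and pulling 5 A requires
    # an e-marked cable. EPR (extended power range, PD 3.1) goes to 48 V,
    # which is where the 240 W figure comes from.
    levels = [
        (5, 3),   #  15 W - baseline
        (9, 3),   #  27 W
        (15, 3),  #  45 W
        (20, 3),  #  60 W - max with a plain 3 A cable
        (20, 5),  # 100 W - needs a 5 A e-marked cable
        (48, 5),  # 240 W - needs an EPR-rated cable
    ]
    for volts, amps in levels:
        print(f"{volts:>2} V x {amps} A = {volts * amps:>3} W")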
By “expensive” are you talking in the $20-30 range for a single sufficiently long cable? I don’t replace cables that often, and I don’t see the big deal in paying a reasonable price for a high-throughput cable when needed.
They are, if you buy the right cables. Just buy cables which have the capabilities you want and which you can identify at a glance. It's pretty easy, as long as you're willing to put about 5 minutes into the effort one time.
It's definitely an improvement, because you can still carry the one cable that does it all, and use it for everything, even the things that don't actually require it.
Honestly? Presentation. That's why I consider it a dumb argument in general. People mention "Homer's car" or equivalent memes from works of fiction as some kind of ridiculous contraptions, but don't bat an eye when a show like Star Trek does the same. The big difference, IMO, is that Homer's car is delivered to you up front, a solution looking for problem(s). Star Trek's tricorder or roundabout or a starship only happen to show a different one-off feature every episode - so the realization that the equipment is deeply multipurpose, and has all those features already present, kind of flies past people who're not into this sort of thing.
The issue for me is how silly it is to hard-code these arbitrary and often single-purpose connectors in the laptop.
A laptop should be a general computing device. So why hard-code something as weirdly specific as an SD card reader into it? Give it the functionality to have any IO device attached (USB-C) instead.
How many additional watt-hours of battery would they have been able to fit in the laptop if they didn’t have the carve-outs for such dongles?
Say what you want about the MacBook’s lack of user-replacability, but it’s basically a tiny chip board about the same size as the iPhone’s, with a big box of batteries holding it.
> How many additional watt-hours of battery would they have been able to fit in the laptop if they didn’t have the carve-outs for such dongles?
Looking at the insides, I'm going to guess about 2 watt hours. Or they could have made it unmeasurably thinner.
> Say what you want about the MacBook’s lack of user-replacability, but it’s basically a tiny chip board about the same size as the iPhone’s, with a big box of batteries holding it.
Framework has 55 watt-hours. The obsolete MacBooks have 41. Both Intel and M1 MacBook Pros have 58. Both Intel and M1 MacBook Airs have 50.
Sounds like that lack of user-replacability isn't necessary.
Man I loved the idea of PCMCIA back in the day. Mobile network access (EDGE, if I remember correctly) via one of them on my chunky Toshiba was awesome.
I was fascinated by PCMCIA, because I had a laptop I was trying to put OpenBSD on in 2001 and its ethernet port was not working, so the card was a workaround. I always wondered why it didn't really take off in Europe; it was a simple and pretty compact way (for the time) to get very advanced stuff into a laptop. I guess it was expensive to produce and the name was atrocious. I believe it got more popular in Japan.
I very much enjoy multiple USB 3 ports, ethernet, card reader on my laptop and do not have to carry any dongles. And I can easily hook 2 x 4K 60P screens using built in HDMI and mini-DP ports. It also has thunderbolt 3 so I can still hook anything extra should I ever wish.
That argument would make far more sense if these anorexic laptops at least compensated for the removed ports with more USB ports. But no, you get the same pathetic 4 (at most) as always.
Even worse when they're the pathetic failure (host-side) that is USB-C, so nothing fits without a dongle anyway. Bonus points if you have to waste one of them for charging the laptop, yay!
From what I have seen, it is more the hobby/semi-professional range that has all the possible connectors built in, whereas at the high end it is more modular and you buy different modules depending on the connectivity you need. Especially if you need to fit it into a rack. E.g. you might only have some DSUB 25-pin connectors, but they cover dozens of analog I/O channels in a minimal amount of space.
I guess some of those are analogue? I guess you can't squeeze those all through the same physical form factor connector. You can with digital, so let's reduce the clutter and do it!
I'm not a musician, but I've seen plenty of DJs setup their gear. It is clear that the connectors and cables are designed to be physically durable. They work in environments where even a beefed up USB cable would only last a few gigs, since building compact connectors for consumer grade electronics is at odds with the day to day reality of commercial applications.
I'm sure that other factors play a role. The economics of going digital would be terrible if it meant replacing a significant amount of equipment every time a new standard took over the market. Again, pointing to USB (since that it what everyone seems to associate with universal digital connections), we have seen three major iterations and a number of minor ones over the past 30 years. That's hardly the type of cycle that businesses want to hop onto given that a tiny operation requires thousands of dollars of equipment, where any given component may be anywhere from a couple of years old to over a decade old.
> we have seen three major iterations and a number of minor ones over the past 30 years.
... None of which broke existing functionality. I can plug a full-speed device from 2000 into a USB-3 A port and it will work perfectly (as long as there is still software support for the vendor-specific drivers that might have been necessary for non-class-compliant devices).
Except for USB, they are all analogue, and some are mutually interchangeable.
3.5mm TRS, dual 3.5mm TS, dual RCA, 1/4" TRS, dual 1/4" TS, and XLR cables transfer the same kind of signal, and you can easily convert between the connectors with dongles.
Mixers have all of these so that you wouldn't have to.
The utility is not thinking about where the f***ing dongle is when you just want to plug something in.
Yes, exactly, let’s suggest musicians use USB-C, and every third cable won’t work, and they’ll be able to put on a concert but with no guitar, exactly like the devices in front of us when we try to work.
The only insurance against “the USB-C downtime” is a subscription to Amazon Prime 24hr delivery and another $68 (no kidding) Apple cable.
There's thunderbolt, USB 3, USB 4. External adapters of varying quality and capabilities are often inferior to even budget integrated stuff.
For example, getting a 4K 60FPS HDMI dongle was going to cost me >$100, and the cheap ones I had overheated. Meanwhile a budget laptop with HDMI and integrated graphics works fine. Getting a dock with gigabit ethernet, high-res HDMI, a decent SD reader and a fast hub was >$200 last time I checked, and not that portable either.
Cannot agree more. USB-C is a big mess. I've got quite a few cables in different specs. Some can do 100W PD, some support DP alt mode, some are Thunderbolt 3, some are USB 3.0 and allow a maximum of 2A, and some only support USB 2 but can deliver 5A. All that mess aside, some started to fail after being used just a couple of times.
That covers pretty much every common scenario when travelling.
USB-C for a connecting to a dock, HDMI for a meeting room screen, USB-A for reading a flash drive.
A Homer car is a MacBook with a bunch of stupid HDMI and USB-C-to-USB-A dongles hanging off it so you can read a flash drive or connect to a meeting room screen.
My newest laptop gets a fairly consistent 700-800Mbps on WiFi.
Don't get me wrong, I still prefer ethernet to avoid packet loss and reduce latency, but download throughput isn't a problem I notice on WiFi anymore (since I'm also only on a 1Gbit/s line).
What a bizarre retort. It should also be fairly obvious that if you carry your laptop somewhere else you're not going to be able to reach it with the ethernet cable, either.
If you have a good WiFi network at home, that's great when your laptop is at home, but if you carry it outside your home you are at the mercy of whatever infrastructure you find there.
Usually if you need to transfer large amounts of data you can still plug in an Ethernet cable in e.g. an office.
Because usually, you can't just throw a 100m cable through a building and plug it into some random Ethernet socket. And who wants to carry around a spool of cable?
I don’t know really… yes Ethernet is nice to have but in 16 years of using an MBP as my “pro” machine in a big company I needed it like twice, 14 years ago. So yeah in principle, you’re right.
Why should anyone waste the precious space in an ultraportable laptop on the newfangled and time-unproven technology which Ethernet is? I want my Token Ring port back to connect to my ring in a box with a Boy George connector – to celebrate the diversity of the computing I have filled up the basement and the attic of my house with.
I'm not disagreeing with the value of a wired Ethernet connection, but both my new ThinkPads (P1 and X1E Gen 3) have gigabit Wi-Fi. Connected to my Asus RT-AX86U, I got a 935Mbps download on a speed test over Comcast.
I had heard that 802.11ax (Wi-Fi 6) was pretty good, and it sure looks that way so far. I have some good Cat 8 Ethernet cables, so I will experiment with that too.
I have fiber optics internet connection at home, and my 4 year old MacBook Pro does consistently over 500Mbps (peaks close to 700Mbps) over WiFi. Granted, it’s still not 1Gbps, but I can’t think of any regular scenario where it would make a significant difference.
I live in an apartment building, while I have a 5g router in my living room my work room is separated by a bearing wall, but even in the same room I often get random interference where the internet starts stuttering.
My desktop with ethernet is way more stable than my MBP's WiFi. Also, ping is noticeably lower for games.
Hmm, I think the RJ-45 fits well enough on my 2019 Acer Aspire laptop. With the spring-loaded flap shut, it ends up being no thicker than the HDMI port next to it.
Apple did one good thing, which is make every USB-C port have the same capabilities (charging, thunderbolt). Windows laptops, especially once you get down into the budget section, are absolutely atrocious at this, you have to read little lightning symbols and can only charge from a special port...
Which came at the cost of just having fewer ports. 2 USB-C ports is a joke even if they are both thunderbolt 3 capable. One is taken by charging if you don't have a thunderbolt dock with power delivery, leaving you with effectively a single port.
It's always been that way on Macbooks. It's also simplified with USB 4, which means the newest Macbooks just support everything under Thunderbolt / USB 4 on every port. Older Macbooks may have had some Display Port shenanigans because of differences between DP 1.2 and DP 1.4 and whether it was over Thunderbolt 3 or USB 3.1, but all modes were basically supported.
Any port on any Macbook with USB-C can be used as the charging port, which is a big deal all on its own compared to most non-Macbook laptops that use USB-C charging.
USB-C is much more complicated than most of us would anticipate, so I would prefer to make it more specific: 2 Thunderbolt and 2 USB 3.2 Gen 1. And I don't know when I last used SD, so let's save that slot for something else. And on a computer, I would prefer DP or mini-DP over HDMI.
A laptop is a computer you use on the go. I don't see how this usage pattern includes that much of external hardware to use all those ports. Smartphone, data stick - that's it.
There may be a kinda-permanent place where one uses their laptop most of the time. I don't see any problem having a dock station there with all the ports and power routed via a single USB-C or Thunderbolt port.
The problem is not the industry. The problem is people using laptops where they should use desktop computers. Which are, coincidentally, modular and expandable through the roof.
Not everyone is rich enough to also buy a desktop computer or have space for it? Not to mention that the hassle of duplicating software and data files between a laptop and a desktop is too much work for anyone who does not actually like to spend time on tech.
If I am on a tight budget, getting a desktop instead of a laptop is a no-brainer. There are very few fields where a laptop is a must, and most of those apply to employed individuals, so the burden of providing the hardware is on the employer.
> The crazy thing here is that it’s not so hard to hit the standard set of “pro” ports —
Given that the cheapest USB hub allows you to plug in half a dozen USB-A devices and SD cards, and given that frequently they are not used at all by anyone, why would it be preferable to add 3 dedicated ports instead of just using one of the four available USB-C ports?
The same goes for the HDMI and phone/mic ports.
In fact, nowadays you have monitors that not only support video over USB-C but also serve as USB-A hubs, which means that with a single USB-C connector you can get everything you mentioned in your example.
Insulting all Apple users by calling them "industrial sheep" may put you into conflict with having your views given reasonable consideration, not to mention the site guidelines. It's not generally okay here to call people names for disagreeing with your views.
I'm not positive, but I think they were talking about Apple's competitors rather than their users. Samsung, for example, dropped the aux port for dongles soon after Apple.
I accepted the correction from others here in accordance with a site guideline about this exact scenario:
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
I was unable to come up with a good faith and plausible interpretation; others found one that I'd missed, and thus I retracted my objection. The author apparently later confirmed their interpretation, but that was not factored into my retraction, and is not relevant to the guideline I'm trying to adhere to.
To your comment about "disingenuous", I've spent most of my life being misunderstood for making perfectly logical statements that other people decided were some sort of slander instead of trying to understand in good faith given the context that I'm a nerd with social disorders. So I'd prefer to avoid being upset with someone else over a misinterpretation when I wish others would be less upset with me about them.
Good approach, and I guess I can see my comment to be along the lines of those "perfectly logical" ones, but ignoring the social context: I try, though :)
I don't understand the logic. When the original usb-c MBP came out I spent $30 on Monoprice for usb-c to whatever cables and never looked back. I even still have many of those cables 3 laptops later.
People would actually comment about dongle gate in Meetups and I'd show them my usb-c to micro-usb cable... ...oh the look of shock in their eyes... "You mean... you never bought a dongle?". The concept of a cable with usb-c at one end and anything else at the other was completely foreign.
I had that complaint when working in a 5 story building and spending a third of my awake hours in meetings here and there.
That $30 dongle becomes a dangling bit you'll have on your laptop all day; it will hit stuff, get under the laptop, or, worst-case scenario, get stuck between the screen and the keyboard when you don't pay attention. As it's always dangling, it also becomes loose over time and gets flaky accordingly.
Back then having a HDMI port was standard, no dongle being the norm. So yeah, having the choice between needing a permanent dongle or not, the answer is obvious.
What changed for me is WFH; otherwise I think I'd still wish for no dongle until USB-C projectors and displays rule the world.
I have a monitor that acts as a USB hub and power source; everything is plugged into it, and then one single cable connects it, all of that, and power to my company-issued MacBook. Every meeting room used to have HDMI and DP and Thunderbolt connectors, but no more, because every company-issued laptop is now capable of Thunderbolt (MacBook, or Dell Precision series if you opt in for Linux).
If the company officially used many laptops with USB-C ports, it would make sense to have a USB-C-to-HDMI adapter in every room with a projector, rather than making everyone carry their own.
Isn't plugging unknown USB-C-anything a huge security risk? It would be easy for a visitor to "forget" an evil adapter in a meeting room, and if employees are in habit of using them, boom.
Yes, I'm sure you could make one which acted as a USB-HDMI converter as well as a rubber ducky. Sprinkle a few around, maybe bribe a cleaner to leave one in a meeting room, and you're set.
I think it's even easier than that; we now have cables that are normal shape and size of a USB connector that have an embedded system in them with a webserver, keyboard emulation, mass storage and wifi for a remote attacker to connect to.
It's also easier to just ask to quickly use the worker's computer to get a presentation going.
"Oh yeah sorry, my presentation is made in PowerShell instead of PowerPoint".
Yes. Transition periods are always painful in that respect, worsened this time as a ton of “business” line windows laptops still have a HDMI port, same for Dell’s linux offering for instance.
When the first full USB-C Macs went out they definitely were the odd ones out in the company, and even now there’s still that split between run of the mill windows laptops and macs. Adaptors are more common, but it’s still not great.
It plays more on the "why don't you just ... ?" question that arises when you ask for adapters to be standard in every room.
It reminds me of asking to include decaffeinated pods in our recurring coffee orders for the espresso machine. The person had no opinion on coffee, but wasn't convinced they needed to accommodate the minority that was concerned.
Luckily, we could always make it worse; there were times when full-sized ports were thought of as standard, yet we had PCs and Macs with mini versions that were specific to the manufacturer (like mini-composite, AV jacks, mini-VGA, mini-DVI).
I concur! I have about 3-4 different USB-C-to-whatever cables and one USB-C-to-female-A port for thumb drives. My thinking has always been that having all USB-C "future-proofs" for future configurations... maybe I will have two HDMI external monitors in the future, rather than DisplayPort and DVI? Easy, just get two USB-C-to-HDMI cables when that scenario arises. Cables allow for so many different configurations, rather than proprietary modular adapters that any given company might give up on or decide to sunset older versions of for new ones with more features. After living what you just described for the last few years, I can't for the life of me fathom how this modular approach will gain mass appeal. USB-C with cables seems far more flexible to me.
Similarly, I bought some adapters that I carry around. I travel between a couple of locations, and I bring just one charging wall plug, and one 10-foot USB-C cable.
I have adapters that convert the usb-c to micro and lightning, to also charge my airpods, flashlight, etc. Each adapter is about 3/4" (2cm), female USB-C end, and male end of lightning/micro. I've glued them together so that it's just one little thing to take.
I hated carrying around 3+ cables, so this has been a welcome change.
It's true that I can only charge one thing at a time, but that's not an issue for me except in rare circumstances.
This sounds like a much better solution than mine. Rather than cables I should have gone with little adapters. Then I just need to take a couple usb-c cables and I can work with any legacy port.
As it stands I typically have four cables in my briefcase but at least they are still smaller than a mouse collectively.
I love the modular laptop concept, but not for ports. For those who don’t want them hanging, these are perfectly color matched, made of the same kind of aluminum as the Air, and sit flush. I prefer it to having extra bulk to the base laptop.
Example: my external USB mic (much better than the internal one) and my USB disk for daily local backups, connected to two different ports this morning (and many other days.)
Your unwillingness to understand or empathize is a form of dishonesty. Just because you personally never needed a dongle doesn't mean such situations don't exist or that they are somehow boundary conditions.
I don’t see a problem with them. The majority of users never need one, and even when I use them, I usually use them infrequently. I often leave them on the ends of cables. My DisplayPort cable has a USB-C dongle left on it, so it’s like it’s natively USB-C anyway.
Sure, if you do some weird stuff or have an extreme use case, I can see why you would want more built in ports, but for the majority of users, they only plug in the charging cable and maybe video out.
Not fair. Very few laptops have a serial port (the GPD Micro PC is the one I came up with) and maybe zero have a modem, but some laptops have a VGA port. Manufacturers know that VGA is still used but serial/modem aren't.
These days I suspect that something like a toughbook might be the only option for those sort of ports - although that GPD Micro PC does look quite fun.
Even 10-15 years ago, proper serial ports were becoming extremely rare, but there are times when you need a proper one.
Around that time we resorted to PC Card/ExpressCard serial ports for occasions when USB-to-serial isn't good enough, although they were relatively expensive (3-4 times more than a USB serial dongle).
(The use case in that scenario was field engineers connecting to a very wide variety of odd equipment, like fire alarm panels and door entry systems, that sort of thing - USB dongles were massively inconsistent and unreliable - different dongles would be compatible/incompatible with different kit, was a right mess).
Obviously, these days, ExpressCard slots are also quite rare.
(The alternative is hauling out my old IBM T22, which I think maybe came with Win98... mostly still works apart from the battery).
Serial is less useful as you need a serial cable, so if you're going to carry a serial cable you might as well have one with USB on the end.
If you're going into an RJ45 serial connection (like I am at the moment), then an ideal laptop would have multiple RJ45s which could be used as either 10G or serial with a standard cat5 cable (not a specially wired one).
In the Japanese market, some recent models support VGA, but mostly from domestic brands. Some models are made by Clevo or whatever, so possibly also available in other markets. Here's a list: https://kakaku.com/pc/note-pc/itemlist.aspx?pdf_Spec047=1
I use a USB-Serial cable about once a week. I use an ethernet cable dozens of times a day.
Framework means that, in theory, I could have a laptop with, say, 4 ethernet/serial ports (switchable) and SDI, and that's far more useful to me than USB-C.
All the conferences I’ve been to had dongles readily available. Most used HDMI. Once I had a hastily arranged breakout room that had VGA for the projector. I think this is a non-issue?
I always carry it with me but usually they have some sort of screen casting tech around already.
I do make it clear from the planning stages that they need to provide either one of 3 video inputs to their selected system (HDMI/DP/screen casting) or the computer that I can use to remote into my 13" (this is what they usually choose if they have older screens or projectors).
Then you'll take a dongle with you to that conference? Are you telling me you would always waste one of the 4 Framework ports for Display Output X that you only use once a year?
Ethernet and serial dongles are a requirement for emergency maintenance inside of datacenters. But of course not many people on HN spend time in datacenters anymore...
It's also needed for just making sure your internet is set up properly at home. Nobody cares about your speed test over WiFi, but ISPs sometimes care if you can't get anywhere close to the rated speeds over Ethernet.
And of course that setup still requires Ethernet. You can't set up a WiFi AP over WiFi.
I'm curious as to how those work out. The modules are too short to fit a VGA, serial, or ethernet port and be flush with the laptop, but I think you could make one that extends further out and above, and would still have some benefits over a dongle.
> The modules are too short to fit a VGA, serial, or ethernet port and be flush with the laptop...
I'd be fine with a pop out style port for those ports. Won't be flush while in use, but I'd happily accept that to trade off having to carry around dongles. I'd rather pack and carry a small "stick" of these modules stacked together than a bundle of dongles.
I give it three months tops before someone starts selling a Pez-like "dispenser" that stores these modules. If the module bodies were designed to stick together though, that would spark joy in my inner Marie Kondo.
I would argue that a USB-C to VGA or HDMI cable is just a longer dongle. What if you take your USB-C-only laptop to a remote office to do a presentation, but your six foot USB-C to HDMI cable isn't long enough to reach the port because the projector is mounted in the ceiling and has a standard HDMI cable routed to the lectern? I'd much rather have the Framework with a HDMI port on the device than struggle with a common situation like that.
> What if you take your USB-C-only laptop to a remote office to do a presentation, but your six foot USB-C to HDMI cable isn't long enough to reach the port because the projector is mounted in the ceiling and has a standard HDMI cable routed to the lectern?
I personally really like the idea of what Framework is doing and wish more laptops followed suit, but that is a trivially solved problem you identified:
Of course it's trivially solved...with a dongle for your dongle! Or you could avoid dongle-ception by using a modular laptop like the Framework, or even a standard laptop with an HDMI port; even current-gen models from Dell, Lenovo, and HP still have it as an option especially on business-oriented machines. It all comes down to what your everyday requirements and tolerances allow for.
But again, the "dongle" argument is moot and not really a reason to either consider or avoid the Framework, for me at least. It's more about the device being open and repairable, and arguments about dongles are just attempts to justify one's current USB-C only device.
> It all comes down to what your everyday requirements and tolerances allow for
Agree completely. For me, Apple's USB-C only ports isn't an issue as everything I use plugs in via one or two TB3 cables (depending on personal vs work laptop) and daisy chains from the monitor or a TB3 dock so no dongles needed at all, but I still appreciate the design choice Framework made and think it's a good strategy.
I don't see how putting a cable in my backpack is going to be better than a dongle, and I'm certainly not going to a client for the first time then complain they don't have the right cable.
Not to mention the dongle supports several ports.
But you know what is better than either?
The framework laptop solution of letting me configure the port I want before going to my client.
Theoretically you could emulate the signal with software/drivers, given that USB-C has 24 pins. But there are actually display standards/signals built into USB-C, so you "just have to" convert the digital signal to analog for VGA; but then it's no longer a stupid cable and more like a dongle.
Never have I seen a greater push against good design. The laptop ships with USB-C, if you didn't pick that up.
YOU CAN USE YOUR USB-C TO VGA CABLE IF YOU WANT.
Or, if you don't want, you can grab a VGA module out of the drawer where you store all your retired dongles, slide it into your laptop, and there you have it.
This is not about using a module vs a cable. My comments refer to using a cable instead of a dongle. People make it seem as though using a dongle is the ONLY way to, for example, connect your MacBook Pro to a TV when you could just use a cable for it.
I was going to present from my phone to a projector the other day, but (probably due to the wear and tear of plugging in the charger every day for several years) the connection was glitchy, so I asked if I could borrow a newer phone and got one a few months old: still glitchy, so I had to use a PC anyway. The plan was that I was going to walk around with my phone during the presentation...
What I'm trying to say with this story is that, for example, monitor cable connectors are designed to fit tightly (VGA and DVI even having screws) to give a constant signal, which you don't get from USB-C unless you stand still.
I carry a battery-powered projector for this reason when talking with customers, providers, or partners. I use a standard airport suitcase for that.
It just makes no sense spending lots of time trying to adapt to every visited person's obsolete infrastructure. If necessary I even have a blackboard and colored chalk in my car and get away with them.
When I go to the meeting room, if I don't need to use my projector, great, but I will never use VGA; too much hassle.
- some conf rooms don't have a projector, but flat screens, smart whiteboards, or a remote conferencing setup that needs you to plug in, and/or no wall that fits the bill for projection
- some conf rooms don't have a place to put your projector and get a good picture; theirs is on the ceiling
- unless you buy a very good one, some conf rooms won't have lighting dim enough for your projector to be readable
- it addresses only the projector problem, not Ethernet, SD cards, USB-A, etc.
- a good projector is way more expensive than a few dongles, easier to break, and harder to replace if lost, broken, or forgotten at home
Not to say it's a bad idea to _also_ have a projector.
In 2019 my new employer sent me a new MacBook Pro. I couldn't connect it to my home office monitors, which had VGA and DVI ports, so I asked for dongles.
Rather than try to sort out the cable confusion, they simply shipped me brand new monitors (which I wasn't asking for). I also needed dongles to attach my keyboard and mouse; IT provided those as well.
My point: dongles are still an issue. Not everyone throws out their displays/keyboard/mouse every time Apple comes out with some new version. My 2010 Dell displays still work just fine, and it would be great if I could plug them straight into my laptop.
While I'm not a big fan of dongles, they make them slim enough to just leave attached to the device. I've had one attached to my mouse for 2 years now and it doesn't add much bulk. My Samsung phone came with one so small that you can't even tell it's there (other than the extra width of the USB-A part).
I have one; it's USB-C on one end and regular USB on the other, and it just flips around in the protective sheath to whichever end you need. It's also super fast, though I've never spent much on high-end thumb drives to compare it to.
I got it at Target. Love how easy it makes going between my USB-C-only MBP and other random computers.
You're being downvoted which seems a bit weird, but I agree at least for myself. I went USB-C only in my house, and it's been pretty excellent, up until my new job gave me a Windows laptop that has exactly one USB-C port and requires Mini-DisplayPort 1.4 for its display output.
Well, not everyone wants to carry a brick. Sure, ThinkPads are great because they have every port ever invented, but I still prefer a thin laptop (if you have USB-A or Ethernet ports you can't have a thin laptop) with the option of using a dongle once a month if I need it.
There are plenty of thin laptops with USB-A ports. The ThinkPad X1 is both thinner than a MacBook and has two USB-A ports. And there are laptops just 2mm thicker than a MacBook Air that have Ethernet via some clever mechanical engineering.
> I find it hilarious that we spend multiple thousands of dollars on sleek, elegant hardware and then hook up chunky plastic dongles to overcome their bad hardware interfaces.
I don't understand how anyone can come to that conclusion. Your "cheap plastic dongles" jab is actually a testament to how superb the interoperability is: you're complaining, for some reason, that we are free to plug "cheap plastic dongles" into a high-end device, when in reality this means we can plug in even the cheapest plastic dongles and expect them to work. How is this a bad thing?
Back to the "cheap plastic dongles" complaint: I do use one from time to time, and the reason is quite simple. I have USB-A devices which I've used for years, but I also have a couple of laptops which only pack USB-C ports. Should I throw away perfectly good hardware just because a random guy on the internet dislikes cheap plastic dongles? Should I base my purchasing decisions on whether a laptop supports legacy ports? Or should I just spend $10 on a dongle intended to be used occasionally and stop worrying about inane details?
Those who prefer spending their time on relevant things don't even see why the proper etiquette of pairing peripherals with computers would be worth complaining about. Why do you?
The point of the original comment seems to have gone right over your head.
> So I love the idea of these ports (agreed, they're basically "recessed dongles").
It's juxtaposing two types of dongles: the common, cheap plastic ones and the sleek, integrated ones from Framework's laptop, in order to show that Framework's align better with the design ethos of the device itself.
> Should I throw away perfectly good hardware just because a random guy on the internet dislikes cheap plastic dongles? Should I base my purchasing decisions on whether a laptop supports legacy ports?
Again, you're missing the point. Both of these types of dongles will support your USB-A devices without you throwing anything away in your pique. One will just look good, feel good, and integrate with the machine you're using, while the other won't.
I wouldn't think of them as recessed dongles. After installing them I haven't changed them out at all so far. It's more that you can configure things how you want. If you wanted 4 USB-C ports and that's all, just do that.
I would love it if all my devices had transitioned to USB-C, but they haven't. I still occasionally need USB-A, and sometimes I need an HDMI out. So... that's what I have. And if I stop needing USB-A, I'll get rid of it and put in another USB-C. You could even do a single USB-C port and then 3 storage attachments if you wanted. Nobody is ever going to sell a laptop like that, but for someone who really needs storage and doesn't care about connectivity, that might be perfect.
If you're okay with dongles then you're probably fine. I'm not. They clutter up the workspace, occupy permanent space in my bag which is annoying, and often enough aren't around when I actually need them.
It also goes a little bit beyond that: if one of your ports stops working (e.g. rust, water damage, etc), you can just buy a replacement port and you're back on track, as opposed to "welp, I guess I'll have to do without it..."
The modular port is a universal port with dongles. Except that (1) it doesn't take extra space outside of your computer, (2) it is cheaper than mainstream dongles (Apple sells USB-C to HDMI for $70, while the Framework HDMI expansion card is $20), and (3) it is fully open source and you can actually print/sell your own.
Luckily, USB-C means you don't have to buy any accessory from Apple, and have the world of low cost peripherals at your disposal, like the $13 version on Amazon [1].
Quite the opposite: it means you can start with plenty of USB-A ports and, when you don't need them anymore, switch them to USB-C, without changing your laptop.
It means when one port wears out, fixing it is easy and cheap, and doesn't immobilize your machine.
It means you can change your ports to fit HDMI or Ethernet as needed, without stuff sticking out of your laptop, all ugly and taking up space on the desk.
What upsets me about the solution is that the whole laptop is limited to 4 ports. They don't even offer the obvious "2 in 1" dongles where one dongle would contain e.g. two USB-A ports.
Many other laptops aren't modular - but they offer more than enough ports to make up for it.
You need at least 1 USB-C port for charging. If you want to be able to use an external mouse and keyboard without a hub, that's 2 USB-A ports that you need. That leaves you the choice between having one USB-C port OR HDMI for the last port. There isn't even an Ethernet option at all. And you still get the potential downsides of the ports being adapters from USB-C, if I understand correctly.
If you grab a Lenovo P14s, you can't swap one of the two RAM sticks (limiting you to 48 GB), opening it takes a little bit more work, and replacing some of the more integrated components is going to be harder (SSD is trivial). In exchange, you get 2x USB-A, 2x USB-C, HDMI, MicroSD, and full-sized (not flip-out/break-off) Ethernet. Plus an optional built-in smartcard reader, plus some proprietary docking port extension around one of the USB-C ports. Looking at the Gen 2, you can get that at around 3/4 of the price with a similar or better config (including a somewhat serious GPU) as long as you order on the weekend (when Lenovo's non-ripoff pricing is in effect), and you can also add a fingerprint reader and NFC if you want.
The Framework laptop has upgradeable RAM, but you will have to upgrade it yourself (discarding the RAM that comes with it) if you want more than 32 GB, and no matter what, it won't ever support more than 64 GB. Is supporting at most 64 GB really that much better than a laptop with 64 GB soldered in?
By the time you want to upgrade the CPU, the mainboard won't be compatible, so what's really there to upgrade?
> Is supporting at most 64 GB really that much better than a laptop with 64 GB soldered in?
Yes. Some people (like me) can't afford to buy the highest-spec laptop. So I'll go with a lower memory and storage version of a machine, and then expect to upgrade the RAM and storage when I've got enough. That's exactly what I did with my current ThinkPad T440. And for a current-gen Apple, I can only buy an 8 or 16GB machine, and the price difference is pretty significant for me.
> The Framework laptop has upgradeable RAM, but you will have to upgrade it yourself (discarding the RAM that comes with it) if you want more than 32 GB
FWIW, AIUI you can order the "DIY" Framework with no RAM at all, so there's no need to discard anything. But also, I just looked now, and you can order the DIY edition with 64GB too.
> By the time you want to upgrade the CPU, the mainboard won't be compatible, so what's really there to upgrade?
The mainboard! (Well, that's Framework's plan at least.)
And the mainboards can run standalone too, so in theory you can use it to automate your house in the future or something. :)
(With regard to the 2-in-1 aspect, I think it's important to remember that this is Framework's first product range, they need to limit their scope to not spread themselves too thin.)
And people are already starting to experiment with hacking together their own modules, e.g. https://www.youtube.com/watch?v=0_uOzNt-xwY who was prototyping with off the shelf "magsafe" style adapters & essentially "rehousing" a wireless mouse dongle.
So with a "universal" wireless mouse + keyboard dongle rehoused in a module you could get two of your ports at least. :)
I don't agree with most of your points, but I do (at least on initial review) agree with this:
> They don't even offer the obvious "2 in 1" dongles where one dongle would contain e.g. two USB-A ports.
I would have thought that by making the expansion ports slightly wider, including 2 USB-A would be possible.
At my side gig running livestreams, I often end up with more than 4 USB-A devices connected, unfortunately, so with USB-C charging and HDMI, I'd need a dongle to use this device even with such a 2x USB-A expansion.
- USB-A dongle for wireless mouse
- External USB-A sound card (to connect the mixer board)
- External USB-A camera
- External USB-A flash drive to load up PowerPoints etc. provided by the presenter
- ... and then maybe I need to plug in my USB-A Yubikey to authenticate. Or a 2nd flash drive.
That said, using a USB-C hub/dongle for cases like mine isn't the end of the world.
I'll give a reply not seen here yet. I haven't bought the Framework laptop (yet), but I can see the module appeal. There are all sorts of hacker-ish ideas that I could imagine stuffing in there, and the fact that I don't need a dongle means they're always attached and ready to be thrown in a bag. My first idea:
Framework offers a 1TB storage module for their ports! I back up my root OS via ZFS snapshot to USB every so often now. How great would it be to have a storage port that holds all the recent snapshots of your important datasets. And the possibilities are endless. The fact that they don't change the form factor of the laptop and that they're always attached is actually a big deal.
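A rough sketch of what that snapshot-to-module backup could look like (the dataset and pool names, rpool/ROOT and usbpool/backups/ROOT, are made up for illustration; it assumes the expansion card is formatted as its own ZFS pool):

    #!/usr/bin/env python3
    # Rough sketch: snapshot the root dataset and copy it to a ZFS pool
    # living on the USB-C storage expansion card. All names hypothetical.
    import subprocess
    from datetime import date

    SRC = "rpool/ROOT"            # hypothetical root dataset
    DST = "usbpool/backups/ROOT"  # hypothetical pool on the storage module

    snap = f"{SRC}@backup-{date.today()}"
    subprocess.run(["zfs", "snapshot", snap], check=True)

    # Stream the snapshot into the backup pool; once a first full copy
    # exists you'd switch to incremental sends with `zfs send -i`.
    send = subprocess.Popen(["zfs", "send", snap], stdout=subprocess.PIPE)
    subprocess.run(["zfs", "receive", "-F", DST], stdin=send.stdout, check=True)
    send.stdout.close()
    send.wait()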
Good point. For what it’s worth my last MacBook had a TB3 port physically wear out. Thankfully it was covered by warranty, but the connector saver concept is definitely compelling.
This was something that was more prominent during the Micro-USB era. The little metal "tabs" on the male end of microUSB connectors would start to wear out after a thousand+ plug/unplugs resulting in a loose connection that wasn't reliable.
With USB-C, the connector was designed with consideration of a bunch of factors, one of which I would assume is the lifespan of the end connectors: USB-C has thicker, more resilient plastic hooks built into the inside of the male plug and stronger mating latches in the female end of the connector.
It's when you need to adjust the cable multiple times until the connection happens, and then work very carefully not to move anything.
Ports are sometimes very fragile. My old laptop has only 2 of its 4 USB ports working.
Connectors are rated for a given number of connect/disconnect cycles. For USB-A it's a minimum of 1500[1].
If your laptop has a cheap connector which isn't rated for more, and you do two cycles a day (start/end of day, start/end of lunch), then you'll go through the rated number of cycles in about two years.
Doesn't mean the connector will fail right away but it might start to act up. Connectors are not forever.
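The back-of-the-envelope math, for the curious (1500 is the USB-A minimum cited above; the two-cycles-a-day pattern is just an assumption):

    # Back-of-the-envelope connector lifetime estimate.
    rated_cycles = 1500   # minimum insertion/removal rating for USB-A
    cycles_per_day = 2    # plug/unplug at start/end of day and around lunch

    days = rated_cycles / cycles_per_day
    print(f"~{days:.0f} days, roughly {days / 365:.1f} years")  # ~750 days, ~2.1 years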
The modularity means you can change your workflow or peripherals without needing to find a dock or dongles long term. You can travel with the HDMI dongle in for putting a movie on a hotel TV or use with an external monitor somewhere. Maybe you need that SD card reader for most of your photo work but only on weekends or trips when you go process images immediately or need to offload them from the SD card.
Regardless, the port can be what you need it to be or just a useful USB-C, you aren't tied to whatever ports the OEM thinks you'll need forever even though it may only be valued by a small number of consumers.
That small number is still enough to drive sales for Framework. I'm interested for when I need a better laptop, even though I sit at a desk with a desktop in use almost all the time. This appeals to those interested in more control over their device's configuration, expansion, and modifications, and over the various port and IO options. I can't say I'd buy many of the USB modules (I'd rarely use most anyway), but the mentality is there, and so far I have trust in the product. It's not meant to appeal to GAMERS or enterprise execs, just those that want more control over their devices.
My devices aren't USB-C (e.g. headset, tablet) and I use HDMI cables a lot. So it's great for me. Turns out thanks to the modular port design it's also great for you!
The main advantage is that the dongles are all built into your laptop. With dongles I have a pile of them in my bag (that takes up more room), I have to remember to carry them around with me if I'm in a conference room, etc.
The thing is that the dongles are integrated into the laptop body. For a portable machine, this is huge. I love my 16" MBP and do think that 4x USB-C is great for connectivity, but having to carry up to 4 dongles with me any time I move between working at home and the office is pretty much a nightmare. The built-in dongle ports, if you might call them that, are a much more elegant solution, something Apple should have invented (and could have sold for a lot of money). On top of that, they seem to provide even more "dongles" than laptop ports out of the box.
It is also probably mechanically much more robust to plug cables into your dongle box instead of into the (motherboard-mounted) port directly. If one of the dongle boxes breaks, it should be cheap to exchange.
> How is the "modular port" concept any different than a universal port with dongles ...
At least the "modular port" adapters are not dangling from the side of a laptop as the dongles do. Dongles totally ruin the esthetics of otherwise slick MacBook for me.
The thing about dongles is you don't always have them with you. The biggest change I noticed when I got a USB-C-based Mac is that I couldn't just plug it into every projector through HDMI, so I had to start planning ahead more. Same with getting photos off of an SD card.
Abstracting the ports makes a ton of sense. I have some hardware laying around - headphones, e-reader - which is perfectly good, but part of me wants to replace it just because it would be much nicer to have USB-C everywhere instead of micro usb. I could see this as something which could significantly extend the lifetime of the hardware by removing those types of compatibility concerns.
Absolutely agree. If I ever had to go back to a "docking station", which I'm sure many on here have used in the past, I'd be ripping my hair out.
A single multi-port dongle with HDMI, an extra USB-C, and USB-A means I connect one thing to my laptop at my desk. And you don't have to press the whole laptop onto some weird device that can scrape the back of your laptop.
If more people had experienced docking stations of 10 years ago they would also be excited for these dongles.
I assume if all your devices are USB-C you realistically have no horse in this game, and that's great.
For those who have different needs than the vendor provides, permanent or temporary, the difference between recessed or not (in other words, part of the computer versus something I need to carry/lose/forget/misplace) can be huge.
I would say it's quite similar to the difference between a touch device with an integrated pen holder, where the pen disappears into the device (so it can't fall out), and one without.
It is a "gimmick", but if you are frequently shuffling your laptop around, having a bag clear of dongles and "floating" stuff is a world of difference.
I mean, even thinness in laptops is a gimmick (it's actually the first thing I'd do away with to get maintainability, battery life and better cooling/performance/noise — fanless, anyone?), but it sells like hot cakes.
I don't get it either. I dock my MacBook and my monitor supports USB-C which also provides power and a bunch of other standard USB connections. I don't even really need a dongle anymore.
I think this kind of thinking is what keeps Apple sailing high.
Headphone jack gone? Get AirPods. Oh, Bluetooth drains the already undersized battery even faster? Why don't you have a power bank yet? And yeah, keep it on you always. Isn't that normal? Or Apple has a shiny battery pack. Maybe buy two.
Glass back breaks? Well, you gotta lose something for wireless charging. But I don't do wireless charging. Why not? Go buy another thing.
There was this TV show with a line like "happy to comply" in it.
Yeah, I think Apple's approach is the right tradeoff, though I admit Framework's is cool from a nerd-who-likes-gadgets perspective.
If there's enough of a market for that for them to survive, that's cool, but I think there's a reason it's not the default design (and it isn't some cynical one about planned obsolescence).
I think the proliferation of USB-C has finally solved the universal port issue: you can just buy a 3rd-party dongle with HDMI/DisplayPort, USB-C (with pass-through charging), SD card, USB, etc., and it works great. Meets my needs pretty well.
External dongles? Apple specifically did this to extend the revenue of their boxes. I can't stand Apple any longer. Currently I prefer the HP Omen (the support from the executive escalation support team is stellar).
But the idea of needing a FN dongle whenever I want to do something is FN archaic. Plus they are overpriced, bulky, and much more prone to eventual failure of either the dongle or the port (from flexing about when you're on a soft surface like a bed or something).
I have two AOC USB screens that I use, so I have one laptop and three screens and it all fits into my backpack. The external USB screens are the only "dongles" I want.
The modular ports are my least favorite part as well. The fact that you have to buy one USB Type-C module just to be able to easily plug in your charger is crazy. Another just to have a reachable USB port. Maybe 1 or 2 modular spots would be nice, but put in some standard Type-C ports and monitor connectors without an upcharge, or include at least 2 Type-C modules for free.
It has 4 Type-C ports; that's what the modules plug into. So the Type-C port module is basically a one-inch extension cable. They do recommend you buy 4, so it's $80 and not $20. Having ANY upcharge just to get an easily reachable port for your laptop's power supply seems like a design flaw.
Unless they've done research showing that the Type-C port on the mainboard is a common point of failure that needs soldering to fix, I don't see the point. I've had to clean lint out of ports, but I've never broken a Type-C port on a computer.
The USB-C passthroughs are nominally $9 each, so it's not $80 to fill out the bays.
More importantly, they're also built into the price estimate already, so when it says "$999" or whatever, that's including four $9 cards. It doesn't cost you any more to switch some of them for USB-A's instead, and other choices like HDMI, Micro SD, whatever, will be an upcharge.
A more savvy (sneaky?) approach might be to say that 4 cards are included and then only quote the increase over the base price for the things like HDMI that cost more, but I suppose they wanted it to be seamless in terms of how the pricing appears if you want to order more than 4.
Obviously there's a real sense in which engineering went into having this system and the things take up space, so there's a cost to having them, but I don't think calling them an upcharge is really legitimate; they're built in to the quoted prices.
Sorry, my mistake on the price for the USB-C passthroughs, they are $9 along with the type-A modules. All others are $19.
I was looking at the DIY edition, where the module price is not included in the estimate, so it is an increase in price to get any modules. It looks like you are looking at the prebuilt options, and I see those do include 4 Type-C modules in the standard configurations and price. I think they should do the same for the DIY editions and let you remove them if you want.
No, the Type-C ports are $9 each, so it is less than $20 for two. You can reach the port without a module, but sure, they could throw in one or two for free, I guess. You should suggest that. I know users do not like being nickel-and-dimed, and the two modules probably do not increase the BOM much.
That said, they may be planning for a situation where you bring your own modules or buy from a third party if you wish; it is an open design, after all. Consider the enterprise use case: an org could keep a batch of cheap third-party USB-C modules for replacements, and then order the laptops themselves on demand.
Framework is all about reducing waste, so only giving users what they ask for is part of that.
Not trying to be snarky, serious question: How does a design that requires everyone to purchase at least one type C module reduce e-waste? It adds manufacturing overhead, shipping overhead, etc to every laptop.
It only reduces waste if mainboard/laptops are discarded due to a failed charging port. Does that happen often with type C connections on mainboards?
How is the state of Linux power saving (i.e., mobile battery life) in 2021? Last time I checked, you could expect almost 2x the battery life on equivalent hardware with Windows or macOS.
With tlp and a 5.10ish kernel, my T480 handles light web browsing at 3-5 W depending on display brightness (wqhd display). Heavier websites, high-res video decoding, etc. will bring it up to 7-10 W. Maybe 20 W when compiling something, or when attempting to use the (worthless) Nvidia GPU (which I leave powered down most of the time).
It has a dual battery setup with a 24 Wh internal battery and a 72 Wh external battery (I use the bulbous 6-cell, but you can get a thinner one if you prefer), so even when setting the charge stop thresholds at 80% to preserve cell lifetime, the computer easily lasts all day on battery.
That being said, it would be easy to find a random laptop that either idles at something stupid or that can't go to sleep properly because of just one wacky peripheral, making Linux power management seem bad.
Half the battle is picking the right starting point (e.g. a ThinkPad), and then setting everything up so you can actually measure the power in all of the different states and confirm that your setup works properly, before relying on it and ending up disappointed at the airport.
Start with this. If there are peripherals you don't use (something as small as a Bluetooth adapter or as large as a GPU), figure out how to power them down, and get the computer to put them in that state by default.
On ThinkPads (and probably on others), you can monitor the instantaneous power consumption by poking around in:
/sys/class/power_supply/*
There are widgets that can monitor this for you and alert you if anything seems amiss, but I just have a script that puts some text and symbols in my menu bar.
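The core of such a script can be tiny; something like this minimal sketch gets you the raw number (assuming a ThinkPad-style sysfs layout where BAT0 exposes power_now in microwatts; some batteries expose current_now/voltage_now instead):

    #!/usr/bin/env python3
    # Print battery status and instantaneous power draw from sysfs.
    from pathlib import Path

    BAT = Path("/sys/class/power_supply/BAT0")

    def read_int(name: str) -> int:
        return int((BAT / name).read_text().strip())

    try:
        watts = read_int("power_now") / 1e6            # microwatts -> watts
    except FileNotFoundError:
        # Fallback: microamps * microvolts = picowatts, hence / 1e12.
        watts = read_int("current_now") * read_int("voltage_now") / 1e12

    status = (BAT / "status").read_text().strip()      # e.g. "Discharging"
    print(f"{status}: {watts:.1f} W")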
Thanks, it looks neat. I'll check out your scripts; I'm especially interested in the power-draw part for myself. I had set up the 80% charge cycling before, but that was on my previous installation and I haven't had a chance to put it back on.
I have a friend whom I convinced to buy a ThinkPad and stick Ubuntu 20.04 on it. Unfortunately he got a more expensive X1 with an Nvidia graphics card, and strangely it overheats when he plugs it into AC to charge. I suspect the power settings are all over the place. I've not had a chance to get a log from him yet. How did you "power down" your Nvidia card?
First I made sure X was working the way I wanted with Intel graphics. There are a handful of ways to do hybrid graphics on Linux, depending on what you want to be able to do, and how old your hardware is (including "I didn't even want this thing, just keep it powered off forever"). The Arch wiki has some pretty good guides [1] that are helpful even if you're using a different system.
bbswitch seems to work on my system, and the interface is really simple [2]. But there's also this page on acpi_call [3], which suggests that bbswitch is old and unmaintained and that newer systems do something different. From a quick scan, it looks like the Arch wiki also mentions this approach.
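To give a sense of how simple that interface is, poking it could look something like this sketch (needs root, and assumes the bbswitch module is loaded):

    # Query and toggle the discrete GPU through bbswitch's /proc interface.
    BBSWITCH = "/proc/acpi/bbswitch"

    def gpu_state() -> str:
        with open(BBSWITCH) as f:
            return f.read().strip()   # e.g. "0000:01:00.0 OFF"

    def set_gpu(state: str) -> None:  # state is "ON" or "OFF"
        with open(BBSWITCH, "w") as f:
            f.write(state)

    print(gpu_state())
    set_gpu("OFF")                    # power the card down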
As far as drivers go, I know everyone likes to dump on Nvidia for their closed source mess, but on every system I've used with Nvidia hardware (desktops and laptops), I've found that the Nvidia drivers have universally been more reliable than nouveau, so that's what I use.
When I bought my T440s in 2013 (8 years ago now), I could push it to 18 hours with wifi off, and could get 10+ hours with wifi on. I seriously doubt Windows or macOS would have gotten much more juice out of it.
I currently use a T440s with Arch Linux and fairly recently replaced the batteries. I feel like I only get 5 hours or so. Although if I'm on my work computer, with my T440s open next to me and lightly used throughout the day, then it easily lasts all day, with like 30-40% remaining at the end of the workday. My biggest beef is that replacement batteries these days seem bad; even brand new Lenovo-brand ones degrade really fast. My assumption is that they're not being manufactured anymore, and so a "new" battery is actually new old stock that has been sitting around for years. I am tempted to upgrade to a newer laptop.
New batteries shouldn't have lost capacity over time. A lot of "new" batteries are just knockoffs, but there's no way of knowing without opening them up.
If you want good aftermarket batteries, KingSener has a solid reputation. Only go for aftermarket batteries that are upfront about being aftermarket, open about the cells they use, and well reviewed, or for sealed Lenovo batteries in original packaging.
I recently bought a brand new, sealed 72Wh original battery. It sat sealed for so long that the cells went below 3.1V each and the pack locked itself. Looking at the self-discharge curves, I'd expect this to be the case for any actually new battery.
To make it work I had to open the battery and use a specialized charger to feed it 0.1A of current until it got to 3.3V per cell; I then programmed it to charge at 6A, after which the battery pack decided to start up.
Thanks, I'll try that brand. I buy mine from Encompass, the official seller of Lenovo batteries, so they shouldn't be knockoffs or unsealed. But one battery went from 90% health down to 30% in the course of six months. And they're not cheap!
Can't comment on mobile, but I've set my laptop to sleep after 10 min of inactivity and 3 min after lid close, and it lasts the whole day (4.1Ah). I'm not using it for 8h straight, but rather a combined 3h or so, probably; with these aggressive sleep settings it easily stretches to some 8h, and I don't fear it just dying at the end of the day.
ExpressCard was supposed to replace it. It's literally a pin-for-pin PCI Express slot. But the "market" spoke (more like Apple decided they were too good for useful ports) and laptops got slimmer and less able to be expanded.
Thank you, early user. Hopefully people like you will make enough noise that manufacturers come to their senses and put "repairability" back in the feature set.
It's not just compute devices; I've been waiting for weeks for a Samsung repair technician to arrive to fix a year-old fridge. I remember going to the showroom as a kid with my dad, and the salesperson would pitch the availability of parts and repair centers for consumer electronics products.
> absurd that you'd buy a laptop with a bunch
> of "hardcoded" ports that you can't ever change.
This is great for multi-monitor users. Depending where in _my setup_ the other monitors are, I don't want the power and video cables, or docking port cable, to cover part of the external monitor. Being able to move the ports' locations is genius.
I really wish people talked about this stuff more when it comes to laptops. Hot temps and loud fans are a super turn-off for me. I'm willing to compromise performance for a quieter, cooler runtime, but usually all I can find out about a laptop is its processor clock speed and a horribly inaccurate battery life estimate.
Computer thermals are weird. You can ultimately choose between a gimped device that stays cool under load or a spectacularly hot, dynamically clocked CPU. I normally cut the clock speed of my CPU in half on any new laptop, since I'm really only going to use it for text editing/SSH. That alone is good enough to keep it below 40°C, but there are other ways to achieve a similar effect.
Loud fans and hot temps are fine for me, in isolation. But knowing those hot temps (and to a degree those loud fans) are slowly killing my laptop, and likely killing my battery much faster than you'd imagine, yeah, it suddenly becomes a bigger priority.
For me, I've never found a laptop as good as the older thinkpads at handling temps.
I had a 16" MacBook Pro that was so hot and loud all the time that I just couldn't stand it anymore and sold it to buy an M1 MacBook Air, which thankfully doesn't even have a fan.
I don't understand how it didn't come up in any reviews, because you don't have to push it super hard to make it happen.
I rely on notebookcheck [1] for stuff like this; best notebook reviews I've ever read. If you go down to Emissions, there's a "Temperature" subsection with pretty detailed info.
Yeah, being limited to only 4 ports is a complete non-starter for me especially when you are required to use one of them to charge the laptop. So practically everybody is going to end up filling one of those modular ports with a USB-C module, when they could have easily fit two dedicated USB-C ports in the same space, with no practical loss in expandability.
I think the issue with this is the number of Thunderbolt ports provided by the chipset. They have to choose between having only 4 ports that all work the same way, or more ports that don't all have the same features.
In the future if another alternative to USB-C PD arrives, you could presumably swap out to that module (of course, presuming it's otherwise compatible with the internals).
Also, is there any reason to believe someone wouldn't make a two-port USB-C module provided the market for such a thing exists? Maybe off-the-shelf ICs to do this aren't available (which would make this more expensive and thus less likely), but that doesn't mean that will always be the case.
> In the future if another alternative to USB-C PD arrives, you could presumably swap out to that module (of course, presuming it's otherwise compatible with the internals).
The module itself works over USB-C PD, so the only effect of such a module would be to convert the new standard back to USB-C PD with all its limitations, plus the cost of conversion.
True, but that still might be desirable if there's a new cable standard that is otherwise compatible -- beats making the entire laptop obsolete.
As a thought experiment: If the Framework laptop existed prior to USB-C PD, it would have been a very cool feature to be able to add a USB-C PD module and effectively upgrade the laptop to support it.
It'll be interesting to see if the internal USB-C design works out in the long run. My instinct would have been to build a larger (proprietary) internal connector for the modules that included charging, constant and switched power, USB, and as many PCIe data lines as possible.
I was reading some forums on their website a little while ago and they said that seemed unlikely with a decent amount of space being used by the mounting mechanism and whatnot. It would be nice, though.
No one wants all the different ports though, they usually just want the correct ports. If you're plugging more than 4 things into your laptop at once you should be using a dock.
If Framework opens this up for other companies to make modules, you might also see something like a USB port module with an integrated Logitech wireless mouse receiver. So then your mouse isn't even taking up a port.
I know Bluetooth is a thing but you get the point. All of those tiny devices that people keep plugged into their laptop ports 24/7? This is a much better form factor for them, and you don't necessarily need to sacrifice a port for them either.
I don't want all the different ports at once, but I definitely use more than 4 in total. My current laptop has the following (and I'd love to have more):
- 2 USB-C (1 of which is used for charging, I'll probably have reason to use the other at some point with increasing USB-C adoption)
- 2 USB-A (1 for a wireless mouse, 1 frequently used for flash drives and whatnot)
- SD (used occasionally - cameras and with an adapter for micro SD in phones)
- RJ-45 (used occasionally, probably more often soon)
- HDMI (used somewhat regularly)
- Headphone jack (also built into the Framework)
So with the Framework I'd be missing out on 3 ports. I could survive with that, but it'd be pretty sub-optimal. Thankfully I shouldn't be in the market for a new laptop for 5+ years, so hopefully Framework will have more options by then.
If you don't want all the ports at once, then it sounds like modular ports is perfect? You can just attach whichever combination you need on any given day / moment. Or am I missing something?
Fair enough, that wasn't really something I was thinking of. Of course, at that point you're really just trading a bag of dongles for a bag of port modules. Not having to remember to put in/bring my HDMI port for a presentation is a little bit convenient.
My first laptop, bought way back in 1993, had a scheme where either or both of the two(!) removable batteries could be replaced with a plug-in module. I had one that gave me a SCSI port and another that gave me a 2400 baud(!) modem.
It was better in theory than practice. You couldn't hot-swap modules (because this was 1993, after all) and driver support was iffy. I sold that laptop a few years later and didn't own another laptop until I bought my first PowerBook in 2002. I've kept dongles in my bag from time to time for connecting to external monitors, but most of the time I never really bothered with it.
I did like my late-90s Dell Latitude where I could replace a CDROM module bay with a 2nd hard drive or battery. It was even hot-swap if I recall correctly.
That said, these days I know exactly what I want in a laptop and most mid-spec+ laptops have an abundance of what 90% of users need.
What I LOVE about the Framework (don't have one yet) is the "cyberpunk-yness" of the thing....
Imagine the day when our kids are rummaging through a pile of various modules looking for just the right one to plug into their Deck.
This, to me, truly feels like the "deck" from Neuromancer of olde!
What will be great is once 3rd-party aftermarket modules blast off..
Fiber interfaces, all sorts of other modules, and the inevitable future HN post about fake dongles with spy hardware from China, "beware of keyloggers on foreign modules", etc...
I hope that certain elements are attached by magnets.
---
Can you directly attach two machines side-by-side with a USB-C cable? What if you could chain multiple of these boxes together and have a second deck, which is headless, and just swipe between the two desktops on the screen... one keyboard and screen and two decks? I have always wanted this.
I truly think it's absurd that we haven't yet been able to use machines like Legos. I think these decks offer a path to that, with multiple decks.
How does the touchpad perform? I'd love to move away from Apple, but don't want to spend the money on a laptop to be disappointed with that aspect. My experience with non-Apple touchpads has been sub-par so far (looking at you Dell), and it's one of the factors that keeps me ensconced in the Apple ecosystem.
I am also wondering. I have seen lots of reviews on how cool the modularity and repairability is (and I love those things, too!), but not much about the day-to-day usefulness. Is the touchpad responsive? Functioning multitouch? Does the fingerprint sensor actually work?
And of all these, the touchpad is by far the most important, and like you, it keeps me tied to my MacBook.
The baseline, preassembled model starts at $1000. Windows 10 Home, quad-core i5, 8 GB RAM, 256 GB storage, a nice 2256x1504 display, thin and light (1.3kg, 11.7" x 9" x 0.6"). Compare that to your other thin and light options at this pricepoint.
XPS 13, $1020
* i5
* 8 GB RAM
* 256 GB storage
* 1920 x 1200 display
* 1.2 kg, 11.6" x 7.8" x 0.6"
MacBook Pro: $1300
* M1
* 8 GB RAM
* 256 GB storage
* 2560 x 1600 display
* 1.4kg, 12" x 8.7" x 0.6"
It's almost a no-brainer, even without considering the repairability, unless you like macOS. Unfortunately, not many people see repairability as a feature yet due to the toxic status quo, but this could change. I think that after brand recognition is established, this laptop could legitimately be competitive in the laptop market, and not just appeal to hardcore techies.
It doesn't look like a no-brainer to me (already a Mac user). The MBP has a better screen, better battery life and performance, a metal body, is completely silent, and has nicer software.
That said, you should get the Air instead which is $999 with almost exactly the same specs, minus the annoying touch bar, and a tad lighter at 1.29kg.
For the record, I also own a Mac: the MacBook Pro, 2019 baseline model. I used to enjoy the battery life (though it has been slowly getting worse, to the point where I have considered reaching out to Apple about it, since the battery is glued in and I can't replace it myself). I like the metal body. I also like macOS. My MacBook is not completely silent because it's an Intel Mac, but the M1 would be, that's true. As far as "better screen", I'm not so sure; on paper, the resolution is nearly the same. I also like the trackpad. But anyway, you need to decide whether these things are worth the complete lack of repairability. As a reminder, your Mac's battery will die at some point, and your Mac will become useless. I'm seriously considering selling my MBP and buying this instead.
Word, I'm pretty fed up with throw-away culture, especially with big ticket items.
I've used pretty much all the developer laptop contenders as daily drivers at some point. You get used to what you get used to, so marginally better isn't so important, I've found.
I'm going to pick one of the framework laptops because they look decent, but my secondary objective is to support a company that at least is walking some of the walk I want to see.
> I'm pretty fed up with throw-away culture, especially with big ticket items.
I love what Framework are doing, but integrated hardware does not immediately equal throwaway culture (says someone who still uses their late-2013 MacBook Pro).
I think it's fair to say, though, that even if you don't have a throwaway culture, Apple certainly does and has been cultivating it in their users heavily (often by integrating their devices more than is necessary). Seems you managed to escape their wiles, but many haven't.
No, I think there are certain aspects of Apple products that are definitely not throwaway culture, mainly software support. The hardware also holds up really long (I also use a 2013 MBP, and I have several iPhone 6s users in my family). The main issue is that in case of defects, repairs are not economical, and I wish Apple would recognize that. Given the long viability of hardware and software, I have reasonable doubt that it's a decision made to maximize profit; more likely it's a legacy Jony-Ive-sized blind spot.
Apple makes hardware that is extremely durable, I and others use MacBooks that are getting close to 10 years old. I really doubt people will use the Framework for longer.
The main reason people buy new phones and macbooks is not that the old ones stop working, it's that the new ones are desirable, they are even slicker than the one you've got.
Apple adds features in an attempt to lure people to "upgrade" from a nearly identical product: Touch Bar, Face ID, "Apple silicon". Most of this stuff doesn't change the value proposition of the core product. I'd say that the majority of Apple's user base is largely there for status reasons, and those people upgrade to keep the status they feel the brand grants them. As someone who isn't an Apple fan but who uses some of their products, I am very aware there are exceptions, but I'd suspect that, like me, those exceptions also don't follow the press-release-driven upgrade cycle.
It's this type of consumerism that I'm personally not interested in. I want a tool, I want it be under my control. I want to be able to maintain it for its natural useful lifetime. That thinking does not belong in the culture Apple is fostering.
(Edit: and yeah, I get that Apple isn't the only culprit here, it's just that if a company seems to be fostering a culture that fits my ideals, I'm willing to compromise on 'slickness')
Apple Silicon was a very significant upgrade, my M1 Air is just insane. It's a ridiculously good computer.
But if you want a tool, why do you want it to be "under your control"? I think most people are more like me, I just want a really nice computer that works well, I have exactly zero desire to replace the RAM or upgrade the SSD.
I own an M1 Air. It is a macbook, and seems fine so far, but it's over-hyped.
I'll agree that it seems that currently most people are like you. They don't think about maintainability when it comes to belongings. If they care about the effects their actions have on the environment, they might want to start.
If you are buying a new phone and laptop every two years, where are you putting the old one? What happens to the toxic materials in the batteries? Did you need to have a new machined aluminum shell, or could you have just replaced the part that was bad inside it?
That all said, I don't have any illusions about changing the course of the average person's thoughts or habits via an HN post.
Like I said in the original post, I'm done with buying throwaway goods if I have an alternative. I'm happy that an option like the framework exists.
I definitely don't buy a new laptop every two years, few people do, and when I do buy one, like most people, I sell or hand it down. Same with a phone, I don't think many people throw away 2 year old phones.
Using a laptop intensively for 5-6 years cannot be called "throwaway culture".
When it's finally unusable, Apple will recycle it for free, at least if you buy a new one at the same time. I don't know what they do with the battery, but I can't see why it would help if it was easily user replaceable.
Maybe you feel that the M1 is overhyped because you only had your previous laptop for a year or two? Coming from a 2016 MBP the difference was huge. And of course that laptop is still in use 5-6 hours a day.
Most importantly, assuming Framework is still around in 10 years, do you think you'll use it for significantly longer than 8 years, which I suppose will be the lifetime of my old MBP?
I have three old macbooks laying around that people just gave me because they didn't know what to do with them - all broken in some way. I wonder how many people actually recycle their laptops, and am worried about the answer.
AirPods die within a few years and then get tossed in the trash. Modern MacBooks aren't built like they used to be. I don't see any reason you couldn't use the Framework Laptop for that long. Just replace anything that wears down over time.
My 2016 MBP is in perfect shape, except the keyboard, which was overly sensitive to dirt from the outset. Not sure if that counts as modern? My new MBA seems extremely well built too; not sure what you are referring to.
Using my AirPods Pro at least 5 hours a day since they were released, also perfect shape. Like the AirPods 2 I handed down. I'm sure they won't last forever, but they've already provided a hell of a lot of usage for 200 bucks.
How is the battery life on your 2016 MBP? Like I said in one of my other posts, my 2019's battery is already slowly degrading. I'm at ~300 charge cycles after ~2 years of moderate usage. Apple says the battery is "consumed" after 1000 cycles[1], so it seems my battery is already 30% consumed, which doesn't make me very hopeful for the longevity of this device.
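If you want to check your own numbers, a quick sketch of the idea on macOS (it reads the cycle count that ioreg reports for AppleSmartBattery; the 1000-cycle figure is Apple's, as cited above):

    # Read the MacBook battery cycle count and compare it to Apple's rating.
    import re
    import subprocess

    out = subprocess.run(
        ["ioreg", "-r", "-c", "AppleSmartBattery"],
        capture_output=True, text=True, check=True,
    ).stdout

    cycles = int(re.search(r'"CycleCount" = (\d+)', out).group(1))
    rated = 1000  # Apple's figure for modern MacBooks
    print(f"{cycles} cycles -> ~{100 * cycles / rated:.0f}% of rated lifetime consumed")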
I wish it was easier to find actual data on battery life of 2016+ MBP models. Googling for "how long does MacBook Pro battery last" yields mostly useless results. There's a macOS app called "Coconut Battery" that tracks battery health over time, with an optional reporting feature that uploads the data to a server for comparison purposes.[2] Unfortunately the online viewer only allows you to view data by each individual model. It might be more helpful to aggregate the data from all 2016+ models, since the battery is the same, afaik. The online viewer also doesn't show how many reports the averages are based on. I wonder if it would be possible to scrape their data, to produce an evidence-based answer to the question "how long will my MacBook probably last?", rather than relying mostly on anecdotal evidence from Mac users.
Anyway, I think the idea that a $1300-$6500+ laptop ships with a glued-in, non-replaceable component that Apple themselves admits is "consumable" is just absurd, regardless of how long it may last.
> Apple offers a battery replacement service for all MacBook, MacBook Air, and MacBook Pro notebooks with built-in batteries.
I believe you probably mean "user replaceable" in your comments above. I do see the value in that for myself, which is why I still have my 2012 MacBook Pro. Nevertheless, I continue to use my iPhone 6s after 3 battery replacements (I paid Apple for two, and one was covered by them due to a recall).
I point this out because I think your criticism of Apple designing disposable hardware is less true of them than their competitors, at least in my experience. My Apple laptop and phone are the first Apple ones I’ve bought, and have outlasted my PC laptops and Androids by at least 2x.
If you want to make a criticism about Apple designing hardware that’s costly to repair, that’s something I could agree with.
The batteries on 2016+ MacBook Pro are not Apple-replaceable, either. They replace the entire top case because they can't replace just the battery.[1] (If anyone has any source for the contrary, please provide it.) Assuming it's not user-replaceable nor Apple-replaceable, can we agree that it cannot be called a "replaceable" battery? Yes, technically it may be possible, but if even the company who made it thinks it's too difficult, it's not very useful or accurate to call it "replaceable". To be honest, I disagree that it should be called "replaceable" even if only the company who made it can replace it. It's a linguistic debate at this point.
By the way, the 2018+ MacBook Air does have a replaceable battery, in that Apple can replace it, so by extension users can too (if they can track down a replacement).[2] Apple must adopt the adhesive pull-tab approach on the MacBook Pro; the glued-in battery is completely unacceptable.
> I think your criticism of Apple designing disposable hardware is less true of them than their competitors, at least in my experience.
Keep in mind that, as the most profitable tech company in the world, Apple deserves to be held to a very high standard. Time and time again other companies follow their lead. Also keep in mind that Apple uses this profit to actively lobby against right to repair, making things worse for everyone in the long run.
Though laptop makers mostly don't care about repairability, and any repairability their machines have is usually an accident, I don't think most other laptops on the market are quite as egregiously anti-repair as the MacBook Pro.
As a user, my main concern with a battery replacement is that I get a new battery, that the device continues to work, and that it doesn’t take too long. If Apple decides the best way to meet those objectives is to swap the top case, I’d still consider the battery replaced.
> Assuming it's not user-replaceable nor Apple-replaceable, can we agree that it cannot be called a "replaceable" battery?
I could agree to calling these batteries “non-user replaceable”. What you’ve described is that Apple can do it, but they replace more components at the same time.
For what it's worth, I like MacBooks, but I skipped that generation of MacBook Pros because it didn't seem worth it to me. I bought a non-Retina MacBook because I didn't like the soldered-on components. After having 3 iPhone 6s battery replacements done by Apple, I guess I'm somewhat okay with glued-in components as long as _someone_ can service it.
> If Apple decides the best way to meet those objectives is to swap the top case, I’d still consider the battery replaced.
The only reason Apple "decides" this is because they designed the laptop with a non-replaceable battery. It would cost less money and be much less wasteful for them to replace just the battery, but they can't, because it's too difficult, because they designed it that way. This isn't the "best" way, it's most likely the only way for them to do it at their scale. They replace just the battery in the MacBook Air because they designed it with a replaceable battery.
The top case is replaceable -- Apple can replace it, and so can the user, if they can source a replacement. But the battery, individually? It's not replaceable. That iFixit guide you sent is not news to me. This is what I was referring to when I said:
> Yes, technically it may be possible, but if even the company who made it thinks it's too difficult, it's not very useful or accurate to call it "replaceable".
Most people (including myself) would never consider performing this procedure on their device. Apple won't do it. You would be hard-pressed to find a repair shop that would. It's far too risky and time-consuming.
Your definition of "replaceable" seems to be "theoretically possible to replace, even if extraordinarily difficult (by design)". Under this definition, basically any part in any product is "replaceable", so it is not a useful definition. Apple could probably not have made it any more difficult than it already is to replace the battery, and they don't even do this themselves, so it's perfectly fair to call it non-replaceable. Definitions are subjective; we must collectively decide how to use the word "replaceable" in this context, and I see no reason to adopt the almost meaningless, Apple-friendly definition. For example, depending on the definition, you might even be able to say the MacBook Pro battery is "user-replaceable". Does "user-replaceable" mean "designed to be replaced by the user", or "easily replaceable by the user", or "theoretically possible to replace by the user"? If the latter, then the MacBook Pro battery is user-replaceable, I guess.
I think a better definition of "replaceable" would be "feasible to replace, at least by the company who made it", and an even better definition would be "feasible to replace by the user". Either way, your definition is not useful.
> After having 3 iPhone 6s battery replacements done by Apple, I guess I’m somewhat okay with glued in components as long as _someone_ can service it.
The iPhone 6S battery uses adhesive pull tabs like the MacBook Air, so it is (easily) replaceable (by Apple, or by users, or by independent repair shops). The screen is lightly glued on, but that's standard procedure for phones these days, and the removal process isn't too difficult, risky, or time-consuming with the right tools [1].
In fact, it has been relatively easy to replace the battery in every iPhone released since at least the iPhone 4, and every iPhone since then has scored at least a 6/10 on iFixit [2]. So, afaik, the non-removable battery is currently only a problem with these Apple products: MacBook Pro (1/10 iFixit), AirPods (0/10), AirPods Pro (0/10).
The battery life was never stellar, compared to previous and of course current Airs, but my daughter can use it a full school day without plugging in.
>Anyway, I think the idea that a $1300-$6500+ laptop ships with a glued-in, non-replaceable component that Apple themselves admits is "consumable" is just absurd, regardless of how long it may last.
This is where my view differs, since I can just hand it in for replacement, I don't have any issues with that. To me it doesn't matter at all if I can do it myself or not. Well except if you have to leave it at the Apple store for several days, then that's very inconvenient.
In general I just see it as a practical and financial question, and since I use my laptop 10 hours a day, and earn a very good salary from it, 200 dollars more or less every three years is really insignificant.
In years past you had to bring up the MBP, but now the MacBook Air tips the scales.
I got my 16 GB MacBook Air for $1000 at Microcenter. For development purposes it's the same as a Pro, except it doesn't have a fan (and has never throttled) and doesn't have a Touch Bar (which is great).
The MBA is an insane value with the advent of the M1, a complete turnaround from the old days.
Also, it has a trick up its sleeve when it comes to this:
> the complete lack of repairability. As a reminder, your Mac's battery will die at some point, and your Mac will become useless.
First off, Apple will replace the battery in an MBA for $129 (vs $200 for the MBP), so that's a little alarmist...
But more importantly, the new Air uses stretch-release adhesive (think Command strips) and doesn't require removing the logic board for battery replacements. The MBP didn't inherit this improvement.
It's no Framework, but it makes the value proposition that much sweeter...
The battery is indeed doable, but the keyboard and screen on the MacBook Air are more of a challenge, although the parts shouldn't be hard to find in a few years. With the Framework laptop those parts are easy to replace but bespoke, so Framework's repairability will be determined by parts availability in half a decade.
This is a good point. "Repairability" is a lot less important to me than lifetime. Right now, I have a MacBook with AppleCare. The author's criticism of AppleCare is that it can take a week for Apple to repair the computer. That's reasonable for me.
So the real test of the Framework computer is not in the first week. (Although the initial impression is very impressive!) The real test is whether I can, 3 years after buying the computer, replace the battery more easily than Apple could replace the battery in a Mac. Modular hardware is limited by the availability of the parts, and Framework doesn't have the brand to convince me that they'll be around for longer than I can get my computer serviced by Apple.
Do you mean the criticism is reasonable, or the one-week repair time? Because the latter is absolutely not reasonable for anyone who uses their laptop as their primary working device. A week of downtime is completely unacceptable.
I'm a bit conflicted about this. On the one hand, yeah, being unable to work for a week is obviously bad; on the other, devices are going to fail, and it could sometimes be the difference between _you_ wasting a week debugging or not.
I like the idea of being able to fix my gear if I'm away on vacation or there's no apple store around. In fact, I'm overall a big fan of repairability because I'm a bit of a tinkerer.
But if it's a work machine? Then I'd rather have a replacement on standby. Apple's Time Capsule/Time Machine stuff works better than other backup/restore systems I've used; replacing a machine is a 2-hour process if you have something usable in stock.
Either way: I work in Europe, so these machines literally cost more than a monthly salary; it's cheaper for the company to have me sit on my hands for a week than to keep a spare laptop for me.
There are a lot of likely failures on a laptop/computer which shouldn't require more than an hour of downtime, because they should be trivial to fix, but which can take your Apple device out of business for weeks. As much as I love my Apple devices, this alone might drive me to the Framework laptop.
Things which should be fixable by the user/IT personnel on site:
- the battery
- the fan
- the keyboard
- storage
Luckily I have discovered an independent certified service provider for Apple devices, which is as flexible as the Apple rules allow for repairs, but my experience of getting the fan of my Mac Mini fixed by Apple was horrible.
If you were to buy a proper business computer like a ThinkPad or a Dell Vostro/XPS, you would have a next business day or sometimes even 4 hour on-site warranty where a technician would come to wherever you are in the world and fix the part for you right there.
1 week is an unacceptable wait for a business computer.
Agreed. I had a tech come to my hotel room in Hong Kong and fix my Alienware (Dell) computer; he replaced the screen and keyboard under warranty in 2016. I had another Dell that I dropped off for same-day service.

I have a ThinkPad T430u I bought back in 2012, and it is running great. Very solid. I run Kali Linux on it, and I do a lot of coding or writing on it. I love the keyboard, and I am one of those fans of the thumb nub! I have owned all sorts of computers since 1978, and this is by far my most rock-solid one. The original battery is still in it, and while it is not holding the charge it did, hey, 9 years is a long life for a battery in these things.

I am now using an MSI GS65 Stealth, and it is great, but let's see how it holds up. I've dropped it twice in two years, but it seems fine. I use it for my 3D work, CAD, and UE4 fun. I may buy the Framework to see how it goes, although like some others, I worry the parts will not be available two years down the road, and if they are, whether it's at a normal price. It's a chicken-or-egg scenario: people need to buy them to create a market for this to happen.
>I'm a bit conflicted about this, because on the one hand: yeah being unable to work for a week is obviously bad, but, devices are going to fail and it could sometimes be the difference between _you_ wasting a week debugging or not.
These devices are sold as "Pro" presumably meaning Professional. Not being able to work for a week is not professional. There are companies that give you a repair time in hours, not days.
>it's cheaper for the company to have me sit on my hands for a week than to have a spare laptop for me.
I'm self-employed, so a week of downtime costs me a week's salary. But even so, I wouldn't want to sit on my hands for a week, because I consider that unprofessional.
Resources can be shared so even for a company with 10 employees it quickly becomes viable to have one spare laptop/Mac mini/whatever at hand just in case.
If you can't wait a week, go buy a new one and restore to it. When your repair comes back return it. Apple's return policy is 2 weeks and is actually 45 days when you ask nicely.
Another thing to consider is that, if everyone is reluctant to try the Framework laptop because they're not sure about long-term support, it will never receive said support. I'm willing to take the risk here, for the greater good! I don't even perceive the risk to be that high, to be honest.
> if everyone is reluctant to try the Framework laptop because they're not sure about long-term support, it will never receive said support. I'm willing to take the risk here, for the greater good!
I dunno, this whole idea of 'trickle down innovation' that ends up coming at the expense of the working class is a very bad deal when you consider the amount of advanced technology available today that has been enclosed/commoditized as 'intellectual property'.
Another example is Musk's scammy 'secret master plan' hustle that tells a feel-good (yet misleading) story to the propertied class that they should 'buy a $170,000 Tesla to help make mass-produced Teslas possible for poor people'. Which is a rich story when you see that big oil companies, together with governments, suppressed viable electric car technologies for years ('Who Killed The Electric Car?' documentary). My point is that these are deep systemic issues that should be remedied at their root, instead of being presented as something the non-propertied class should plan and pay for, especially when you consider that we have actually already paid for it (remember we gave them those big juicy low-interest government loans).
At this point in time, buying a modular laptop like a Framework computer should carry no risk at all.
We need to just grow many more open source standards. All that proprietary hardware and software does is remove valuable feedback loops and lessons from the commons. It criminalizes cooperation, interoperability, repairing and repurposing. Only the propertied class wins here.
All the above arguments only compound and multiply when you consider that most of the technology that exists today was developed with taxpayer backed government loans (Mazzucato: The Entrepreneurial State), meaning that, as she puts it, "we have ended up creating an ‘innovation system’ whereby the public sector socializes risks, while rewards are privatized".
Even if Framework goes out of business, most crucial parts that are likely to fail are standard (hard drive, memory, battery), so you'd still be able to replace them yourself. And even for the ones that aren't (like the modular ports), the schematics are available [1], so anyone would be able to make new ones.
> The author's criticism with AppleCare is that it can take a week for Apple to repair the computer. That's reasonable for me.
One of my biggest reasons for using Apple laptops was the now-defunct Joint Venture program. I enrolled all our company laptops under that program expressly for the loaner laptop during repairs and priority support. The ability for employees to walk into the nearest Apple Store to the current client account and walk out with a loaner the same day to restore from backups and return to the client the following day was a no-brainer business expense.
Now that that program has no functional replacement for that benefit, switching to Framework looks extremely attractive, especially as more of our work finds itself in containers.
I'm currently working in China. My 2015 xps has been getting slow.
So I took it to a nearby Dell garage for them to replace the 9550 motherboard with a 9570 one for roughly $300 while keeping all of the other components unchanged.
I've honestly been shocked at the level of service.
Sadly, the 9500 has moved to a different chassis so further upgrades are impossible..
Does anyone know if Apple is even capable of replacing the battery, or if they replace the entire top case assembly, on 2016+ MacBook Pro models? Apparently the Air's battery is indeed replaceable (...by Apple) but I'm not sure the Pro's is, which is incredibly wasteful, if true.
> The real test is whether I can, 3 years after buying the computer, replace the battery more easily than Apple could replace the battery in a Mac
No, the real test is whether you, 3 years after buying the computer, can replace the battery more easily than you could replace it on the Mac. Applecare just covers the cost of a new laptop when your Mac breaks during warranty. "Repair" is a generous way of putting that.
Agree with everything said here, as an MBP user (2010->2015->16->19) for the last however many years (PC before then, and as a secondary now).
The closed ecosystem and continued hardening of even access to inside of my machine is troubling and obviously anti-consumer. Those points coupled with the lack of even 32GB of RAM in an M1 model (not to mention the cost of the brand) makes my next computer a probably-not-apple machine.
I also have the 2019 MBP, although not the base model, because I'm not a monster. Didn't the base model that year still have like a 128 GB SSD or something comically small? (I'm only jokingly being judgemental here, although it does baffle me why someone wouldn't upgrade at least a tiny bit, to a level of practicality, if you had the choice to buy the computer in the first place.)

I'm kind of satisfied but not pleased at all with it, and I only have it because my 2018 non-touchbar model had a hardware issue, which they replaced it for, and the 2013 model before that was stolen. It's marginally upgraded to a 256 GB SSD and 16 GB of RAM. I'm not pleased specifically because of how loud the damn thing is. I was able to score a deal on an old gaming PC that runs quieter (or at least less annoyingly) on more demanding tasks; meanwhile I can't browse Instagram or watch video without creating a ridiculous noise. The 2013 model did not have this problem, and yes, I'll acknowledge that video is a demanding task and modern websites are garbage, but c'mon.
I did at least upgrade to 256 GB storage (...for like $200 or something ridiculous), haha. I just didn't think to mention that minor upgrade.
Edit: I'll also mention that the fan noise hasn't been a problem for me (I rarely notice it), but that may be because I don't use too many demanding programs.
Ah I see. A friend of mine who definitely has the money and makes it as a developer bought the real base model with baseline storage, and it's a bit funny how little he can do with it. With 256 it gets me by most of the time, but I'm out of luck if I need VMs or games really. I can pick one. Didn't have the budget at the time to upgrade past that though, for the absurd prices you stated.
Sometimes the fans will spin up in the middle of the night for some kernel process presumably.
> As a reminder, your Mac's battery will die at some point, and your Mac will become useless.
You can replace the battery yourself for like 50 bucks, or have Apple do it for 2-3x that. We have MacBooks from 2013 in perfect working order at home.
You technically can, in 66 easy steps and about 1-3 hours [1], all the while risking damaging your laptop. Compare that with 3-6 minutes [2] for the actually replaceable battery on the Framework.
Replacing the battery in an MBA and in an MBP are completely different experiences.
I did the same with an MBA, replacing its battery in less than an hour, by virtue of the battery not being glued to the case, not having to worry about hidden clips built into the case, and not having to carefully dig out the guts of the system and put them back together.
The MBP is in a totally different class -- as iFixit shows, you don't start removing the battery until step 51! And they're not kidding: to get to the battery, you have to do things like remove the trackpad assembly and pry out the logic board assembly.
Sure, you can amortize the 1-3 hours of labor over years of device ownership, but at every step, you're dealing with delicate parts, putting you in danger of turning your expensive, trusty daily driver into a brick.
Batteries are a wear item, guaranteed to have to be replaced. Apple's managers, designers, and engineers could show more empathy for their customers by making it easier and less risky to replace their wear items.
The thing is, I don't even know if you're paying Apple $100-200 to replace the battery, or if you're paying them that much to replace the entire top case assembly along with the battery; since it's so damn difficult to replace just the battery, I don't know if they even try. This is at least what they used to do. [1] I tried to research whether this is still their practice, but it was proving difficult. Very wasteful if so, and likely all just to make it difficult for the user to repair their own machine. (For what it's worth, recent MacBook Airs reportedly have Apple-replaceable batteries.)
Whatever else, I can guarantee you that they are not doing it on purpose in order to make it harder to repair them. They have just been prioritizing thinness.
> or if you're paying them that much to replace the entire top case assembly
The link says that "previous" MBAs had replaceable batteries, and I know that the 2012-2014 models did, so I don't think that was ever the case for the Air.
> For comparison, the previous-generation MacBook Air has a screwed-down battery that can be removed and replaced by Apple and its service providers without a top case replacement, in line with other non-Retina notebooks.
Also, this article was written in November 2018, and here's what it says:
> the battery can be individually replaced in the new MacBook Air [...] In all other MacBook and MacBook Pro models with a Retina display released since 2012, when a customer has required a battery replacement, Apple has replaced the entire top case enclosure, including the keyboard and trackpad.
This implies that, at least from 2016-2019, they were replacing the entire top case assembly in MBP. I have no reason to doubt they still are. The article is mainly about the MBA so it is a little confusing, to be fair.
One final thing I'd like to add is that you can have a slim computer with a replaceable battery. You probably typed this comment from one [1]. I don't see why they couldn't adopt this adhesive-strip approach for the MBP. Perhaps it would leave a teeny bit less room for the battery, shaving 30 seconds off the battery life of a laptop that already has significantly better life (when new) than most? I don't think this """trade-off""" (if it even exists) is worth it. There are also plenty of non-Apple examples of laptops with replaceable batteries and a MacBook-level slim design. The XPS 13 apparently beats it: 15.35 mm vs 15.6 mm. The Framework Laptop is not far off at 15.85 mm.
Of course, having the ability to replace everything else is great, but the battery should be a given. None of the other components are "consumable", though they might still fail eventually, or you might want to upgrade them.
Consider also that anything that Apple does to make their product less repairable by the end-user also makes it less repairable by independent repair shops. I'm not sure why anyone would waste their time and money taking their Framework Laptop to a repair shop just to have them replace the battery when they could do it themselves in a few minutes (and you don't need to "live on HN" to follow basic instructions), but it's still an option. Whereas this is usually not an option with MacBooks.
I don't know if this was directed at me, but I'm not saying it isn't easier and cheaper to replace a Framework battery. My point is that almost nobody cares. You replace your battery at most once, after like 4 years. 100 dollars every 4 years is insignificant for almost all laptop owners.
I agree that they don't care. Perhaps they should, though? It's just unnecessarily wasteful (assuming they replace the entire top case assembly, which they probably still do) and expensive. By the way, it is currently $200 [1], but it could change. I don't think it's too unrealistic to imagine that Apple does this intentionally in order to get you to buy the latest model. Why keep investing in an old, dying laptop rather than just get a new one? It makes sense to invest in the Framework Laptop because everything is replaceable, including the mobo/CPU. But it doesn't make sense to invest in an old MacBook that might have other unforeseen, unfixable issues (unless by Apple for a fortune) in the future, even after a battery replacement.
If MacBook users could either replace the battery themselves or take it to any repair shop (if they somehow don't have a few minutes to spare), they wouldn't have to face the "repair or upgrade" dilemma until much later in the laptop's life. For Framework users, it isn't a problem at all.
> I'm seriously considering selling my MBP and buying this instead
Apple's trade-in program is pretty good. My 2012 MacBook Pro was still tradeable in 2019 for something like 400 bucks. I would consider that also if you don't want to abandon the Apple ecosystem.
Before having to use a Mac for work I preferred a P53 and an X1 as work laptops, but honestly now that I'm forced to use macOS again I don't miss Linux at all. Would be great to be able to choose though, but I'm kinda locked on Xcode because it's our build system.
Do they go arbitrarily far back, or cut off at some point? (I have a 2013 Air I've barely used for years... Should've thought of it sooner.) The site just lists model names for the Macs, which is oddly non-granular compared to the iPhones by number: https://www.apple.com/uk/shop/trade-in
There's an estimator below where it says "Select your device for an estimate", and it walks you through a serial number thing and then gives you an accurate estimate. I think it does cut off at a point but I'm not sure when, and it's not the same for all devices.
Perhaps, but I think it’s a fair comment to say that more effort has been put into making it nice, objectively, than any Linux desktop alternative.
Maybe some folk prefer the alternatives, and that’s fair, but if you did some pretty objective metrics of bugs, visual inconsistencies, user workflows for standard tasks (like adding new hardware…) well, it’s impossible to deny that Apple has significantly invested in making these things “nice”.
It's really quite difficult to argue the software is really the compelling offering with Framework laptops.
You can already get that on other systems if that’s your thing, and that’s never been enough to do more than barely raise a few eyebrows from folk who are particularly keen.
I consider package management and selection of packages as something nice. I also consider interoperability and freedom as something nice. A lot more effort for those has been put into the Linux personal computing ecosystem than MacOS.
... By subjective, I do not mean KDE vs Gnome vs Windows Desktop vs MacOS Desktop and all the Desktop Apps. FOSS is more than these types of things, I don't care about that stuff, it's just noise to me, the less of it is the better... Values are subjective, I value different types of things, but at the same time I am aware the primary values that DEs and big desktop apps concern themselves with are the most important to a large set of users, and that's ok - but you are ignoring the values outside of that area.
It's like I'm using my bike to ride down a mountain... you are using your bike to do city racing and don't understand why skinny tires and fixed gear and foot straps are not important to me.
I think you'll find that the effort is measurable, and not subjective.
How can you possibly argue that the army of programmers Apple has devoted to working on this one stack in one domain do less work than the widely distributed work done by a far smaller number of programmers on a far larger number of desktop stacks?
Don't be ridiculous.
I didn't say that the outcomes were better, just that more effort has been put in, in terms of hours-of-human effort.
> GP: It's really quite difficult to argue the software is really the compelling offering with Framework laptops.
That's a very different statement to "Apple has more person-years." In addition, by that measure, Windows is just as good, and works just fine on a Framework.
Also comes from a company that does its best to make products irreparable, irrespective of the environmental damage it causes, to pad their bottom line.
If we don't vote with our wallet, for the right thing, we deserve the worse future that we are going to get.
I do like macOS more than Ubuntu (which I also use), and lately the customizability of Ubuntu has gone down too (as an old Mac user, I want a menu bar; I actively dislike menus on each window), and the M1/M1X look real sweet compared to Intel.
But Apple is the most anti-consumer company there is, and I'd consider buying a Framework to give them a boost and hopefully get a laptop that will not need a full replacement for years. The ecological footprint of electronics is bad enough as it is; no need to reward companies that want you to throw away whole computers after 3 years.
The fact that an i7 scores the same as the M1 in those benchmarks already tells you they are not very representative of reality. Plus, power consumption will be massively different between them, which is crucial for a laptop.
I like macOS and the Apple product ecosystem. For me, those things plus the nice premium build quality are key and worth the extra cost.
I don't use Windows, and I'm not really a fan of spending like 24 hours a year just dealing with Linux driver issues and other headaches, only to have a worse user experience.
That's just my preference though, so I'm probably never getting the Framework laptop, but I think it's a cool product.
I mentioned in another comment that the MacBook Air is the same price, has similar enough performance for all practical purposes (better than an i5 by miles), no touch bar, and a more user-friendly battery replacement procedure.
If it were 1920 wide at any height of at least 1080, I could display 1080p media with no scaling (it's really hard to get players not to scale while full-screening, and scaling from 1920 wide to 2560 wide can never work well, even with high-quality Jinc or even cutting-edge machine-learning scalers); but also, just streaming desktops that are already on standard monitors.
But really the TLDR is that “non standard resolution” is a bit of a red herring. There are many sizes and shapes of monitor available and the concept of “standards” is an anachronism.
Even something as seemingly obvious as 1080p is a flawed standard, because many TVs enlarge the input slightly to crop away junk on the edges, resulting in a physical 1080p screen showing an approximately 1060p image.
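For what it's worth, on the player side mpv can be told not to scale at all, which keeps 1080p content pixel-perfect (letterboxed) on a wider panel. A minimal sketch; the option is documented in mpv's manual, and the filename is just a placeholder:

    # play without any scaling; black bars fill the remaining panel area
    mpv --fullscreen --video-unscaled=yes video.mkv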
Nothing comes close to the single-thread performance of the M1 at its power level, even a year after its release. [0] The top item in the list is the M1, a 10W CPU. The second is an Intel chip requiring 125W. The highest-scoring i5 also requires 125W, and sits 15th in the list.
Just a reminder that the M1 MacBook Air has no fan, and is still at the top.
When choosing a laptop you of course look at more factors than just performance, but for many, that alone will be an extremely important consideration. Not to mention that, incredibly, there isn't even the conventional battery-life tradeoff for that top performance. In that sense, the M1 is a no-brainer.
M1 does beat every other laptop processor atm in single core speed, but the latest Ryzen 5X00U (and 4800U) are better in multithreaded perf (both absolute and per watt) - https://www.cpubenchmark.net/power_performance.html. Hopefully the Framework Laptop offers mainboard upgrades with these at some point.
I very nearly bought an M1 Macbook, but realized I didn't want to live with currently early stage linux compatibility. I'm still thinking of buying an M1 mac mini as a little home server though - that tiny power consumption combined with such good CPU performance is perfect for that use case (although it would be nice if I could somehow attach a bunch of hot swap HDD bays to a mac mini too).
To me the M1 is the differentiator, not even because of its performance, but because of what it allows the laptop to be: absolute silence with 20-hour battery life. To have it also be screamingly fast is very nice, though.
You forget the computer is there! I have been so overwhelmingly happy with my M1 (with RAM upgrade) as a dev machine. It's THAT good. My co-workers cry when they see how fast my docker containers build.
Would love to get some feedback on M1 for development.
I am used to working in a Linux VM (VirtualBox and Vagrant on macOS) and do mostly PHP/Python web development. It seems that VirtualBox won't be supported, and there is only one Linux VM option available, [UTM](https://github.com/utmapp/UTM).
I would hate to invest too much time for a new dev environment just for M1. How's your experience?
That's very odd, I compile a TypeScript react project practically all day while I'm working on it and haven't noticed anything. Could be legitimately faulty if you're seeing a load of issues.
Performance of the M1 is indeed impressive, and I don't mean to imply that these CPUs all perform relatively the same. But I have a 2019 Intel MacBook Pro with an i5, and it does everything I need a laptop to do: take notes, browse the web, and do some light programming and gaming (e.g. Minecraft at 60 FPS). It also runs x86 programs directly rather than through an emulation layer (though this will become less and less of a problem as time goes on).
I have thought a little about this: it is not _that_ much better than the last-gen Ryzens. How much of that is because Apple bought more or less all of TSMC's 5nm process?
Early adopters and hardcore techies may love repairability, but they will buy whatever comes out next.
What counts is the majority of buyers after these techies.
It's similar to the early buyers of a Tesla who could say "ahh, it will be so eco, I won't have to buy another car for 10 years" and, 3 years later, buy themselves the new model, as they normally would have with a combustion-engine car. Consumers first.
The key point is that repairability is important as a marketing ploy but isn't a real, fundamental issue for wealthy techies. They will just get another laptop in a year or so.
It's the non techies that are important, not us! It's the majority and long tail, not the early adopters that this laptop should be for!
As a "wealthy techie", I have the opposite view: as much as it's not difficult to me to buy the things I need or want, I hate the amount of waste generated by replacing things before they need to be replaced. I currently have a 2018 Dell XPS13. I really want a Framework laptop, but my current laptop works well enough (I wish it had more RAM and a larger SSD, though, which is a big part of why I want a new one), and it's hard to justify that waste, even if I can sell or give my old laptop away to someone else who needs it.
The 51nb "Thinkpads" achieve three goals you may find useful: reuse, upgrade & repairability. In effect, you are recycling while upgrading!
In addition to repurposing some of the old electronics & the entire chassis, they have massive performance, complete repairability & upgradability, a max of 64GB RAM & 3 storage disks, all in a smallish (but thick) Thinkpad form-factor. Oh, and they come with two power supply options (USB-C and barrel jack) and have functional VGA ports and ethernet jacks. Batteries are a bit of a problem though.
So you're not a techie in the sense it's being used.
Techie can mean someone who's an expert on technology, but when it comes to talking about techies as consumers, it's about people who always chase the latest and greatest.
It's tautological, you're not a techie in this context if you don't consume new tech.
> It's the majority and long tail, not the early adopters that this laptop should be for!
Why?
This is the way we end up having only mass-market, lowest-common-denominator products. Not fighting to get quality tooling for our niche is one of the important reasons we don't get any.
The real problem with the Framework laptop is that the ordinary consumer will not assemble their own laptop or upgrade it later. They will just go and buy a $500 cheap laptop with a 15" screen and if it breaks, buy a new generation.
> Unfortunately, not many people see repairability as a feature yet due to the toxic status quo
I'm struggling to see the feature in the OP's article, considering that they have gone through more laptops since I bought my MacBook Pro 2016 than I've owned laptops in my life.
Why was it necessary to replace a thinkpad every year, and what makes you think that the author isn’t just going to replace this one?
I have a feeling I’m going to have my MacBook Pro 2016 for longer than the author has this one.
I do support repairability, and I really don't see why we let companies get away with ducking it, but at the same time, there are now 3 different versions of the Fairphone, sort of defeating the point of it. I'm sure frame.work is better than the Fairphone, but I think you get the point I'm trying to make.
Unless you get faulty hardware, you’re probably better off taking good care of it. I mean, I still have an old 14” iBook PowerPC that works perfectly well with Linux. Its hardware is obviously not up to my current needs, but I don’t see why my current MacBook Pro 2016 won’t live with me for another 5 years at least. And by then I hope legislation has forced Apple into making things more repairable.
But there isn't actually a very good reason to replace your hardware unless it stops working. At least not in my mind, and none of my Apple products have stopped working on their own. I've been luckier than some people in that regard, but I also still own a Sony TV from the early 00s that's perfectly fine when hooked up to the MacBook, so sometimes it's also about buying the things that are tested, so you don't end up in one of the famous Apple recalls or with the issues that come from them.
It's very anti-consumption not to buy the newest thing, I know, but I sort of think the swap to frame.work for the sake of buying something repairable defeats the entire purpose of wanting something repairable, unless you make the swap after your old machine literally breaks.
It would be a no-brainer if Macs were still on Intel. But post-M1, I'm not so sure…
The Air gives this thing a run for its money at a (albeit $21) lower price.
That said, I’m 100% behind the Framework concept though - I like the direction they’re heading in. If I needed a Linux/Windows laptop it would be a serious contender for me.
As a hardware nerd and standard laptop user I love the M1.
As a dev, I feel it's not ready yet on the software support side. It will eventually come, but if we follow the "don't buy now on the promise of future updates" principle, right now not being on AMD/Intel is a downside.
So I’m currently on Intel Macs and looking to move to an M1 imminently (holding out to see what MacBook announcements come this autumn).
What are the biggest issues you’ve seen so far from a dev perspective?
The Framework laptop has seriously got me considering moving away from macOS to Linux as a daily dev machine and keeping my Macs for home use (and music production for which I don’t want to leave Logic Pro).
But by the same token I’d love the crazy battery life, no fan noise and raw speed of an M1 for my main coding machine….
I also own an M1 Mac, and it has subtle or large compatibility bugs with specific development tools and applications.
Two cases where my work was impacted:
1) I work with Laravel running on a couple of Docker containers. There's a complicated bug where a memory leak, exclusive to Rosetta, is caused by some low-level package that all PHP Docker containers rely on. It caused all my PHP containers to be permanently out of memory.
It's not an issue I had in the beginning, and I don't know the exact cause, but I wasn't able to fix it once it started happening, so I had to move away from Docker for development (which luckily wasn't a problem). One possible workaround, pinning the image platform, is sketched below after the second case.
2) It can't build OBS. OBS, like many other open-source software packages, needs a maintainer to add support for M1 build steps. There was a major dependency that didn't have M1 build support yet, so it was simply not possible until that gets fixed.
This is the case for many packages, it just needs time before everyone catches up.
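If you hit emulation-specific weirdness like that memory leak, one thing worth trying (a sketch, assuming your images ship arm64 variants; the php tag is just an example) is pinning the platform explicitly:

    # run the native arm64 variant, avoiding emulation entirely
    docker run --platform linux/arm64 php:8.0-cli php -v
    # or force amd64 explicitly to reproduce the emulated behaviour
    docker run --platform linux/amd64 php:8.0-cli php -v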
But for the rest it's perfect, imo. The battery life is insane, the near-silent fan is insane, the lack of heat is insane. The only battery/heat issues I ever get are when I game on it, which it isn't necessarily meant for either, but just saying.
Edit: Oh, I forgot to mention that while the keyboard feel is great, its build quality is... questionable. One time a key just got stuck out of nowhere. Just stuck, nothing to do. I tried repairing it myself, but oddly, 3 months ago (not sure about now) there were no guides on how to safely pop it loose, clean it, and put it back.
I broke it, unfortunately. This has not been my experience with any other laptop or keyboard ever. I hate this part of Apple's philosophy the most: "It's broken? Please give us a lot of money."
For perspective, I moved to the Mac because I didn't enjoy compiling from source or debugging install logs.
I use Homebrew, and a number of packages are still x86-only, needing to run in a Rosetta terminal. My Ruby install is one of these.
For PHP, phpbrew ultimately couldn't build the version I need, so I bailed out of it and moved to Docker. Same for MySQL 5 (later versions seem to be available for the M1).
That was right when Docker announced the Desktop pricing change, and I didn't know what my org would do, so I started checking the other options. Except docker-machine/engine is supported independently only on Intel; M1 chips need Docker Desktop to run containers (at least that's my take, I'd love to be wrong).
Basically, right now my local dev relies on Rosetta and Docker, and I'm seriously thinking about buying an Intel NUC as a side machine to remote into.
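For anyone setting this up fresh, the Rosetta side is just a couple of commands. A sketch, assuming stock zsh; the installer URL is Homebrew's official one:

    # one-time: install Rosetta 2
    softwareupdate --install-rosetta --agree-to-license
    # open an x86_64 shell; anything started from it runs under Rosetta
    arch -x86_64 /bin/zsh
    # inside that shell, an Intel Homebrew installs under /usr/local as usual
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"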
I disagree. If you configure it, it will cost you a ton of money. The base configuration doesn't have even basic ports.
On the contrary: today at work we bought, for 900 euros (700 without taxes) plus less than 100 euros for an extra 16 GB RAM module, a ThinkPad T14 that has all the features of the base model, but with a 512 GB SSD, Ethernet directly on the laptop, a better keyboard, a TrackPoint, more USB ports, a fingerprint reader, and a Windows 10 Pro license.
To me, this laptop doesn't make a lot of sense. Regarding repairability, it's just like any other ThinkPad. The modular I/O, what is its purpose? You are adding components that can break, consume power, and waste space. And still they don't provide Ethernet integrated on the laptop itself, so you always have to carry around a USB adapter that doesn't work as reliably as an integrated one.
> The base configuration doesn't have even basic ports.
The preconfigured model comes with your choice of 4 ports, the options being USB-C, HDMI, microSD, USB-A, and DisplayPort. The default config is 4x USB-C, but you get to choose.
> a fingerprint reader
Framework has that too.
Anyway, as far as I understand, if you configure it yourself, it will actually cost less money. This is the cost of the preassembled version, but for example, you could buy the "DIY edition" with everything that the preassembled model has, just without Windows, for $909 (~775 euros). The price drops even further if you remove the SSD, RAM, or Wi-Fi module and buy your own.
I understand that everyone has their own priorities when it comes to a laptop. I'm only comparing laptops within the "thin and light" category here. At 1.5kg, 13" x 9" x 0.7", the ThinkPad T14 probably falls into this category, but it is slightly heavier and bulkier than the options I listed. One of my top priorities is thin/light, but I understand those more concerned about performance or # of ports and Ethernet connectivity.
Additionally, I like this company's dedication towards right to repair, and buying the laptop supports them. It's a huge bonus that the laptop has competitive specs, though.
You should be comparing it to the MacBook M1 Air which is $899. Battery life & smoothness/performance they achieved with M1 is phenomenal. Golden handcuffs.
Let's be honest here, windows is going to be slow as a moonwalking Michael Jackson with that processor. Windows is, I kid you not, an order of magnitude slower than linux/macos.
What? Windows 10 isn’t particularly slow on my i7-4770K. You’re telling me that the i5-1135G7 (which benchmarks faster) is going to perform worse than my 8 year old CPU in general OS use?
If you're only used to Windows you might not notice how inefficient it is, and you may also be doing the "right" kind of workload for Windows. On an i7-6700HQ ThinkPad I did the experiment and ran both Windows and Linux for an extended period. In my experience Linux was about 30% faster for typical web-development tasks (npm install, Angular build, …), but performed around the same for web browsing, and was much slower in video calls.
Solaris (commercial and open-source derivatives), freebsd, openbsd, at least a dozen linux distros (from debian to things like gentoo, sourcemage, LFS...), MacOS, Windows... a handful of "toy" OSes... I've run a lot of different OSes over the years. I've used heavyweight desktops, lightweight desktops, straight X, straight terminal... across a large variety of hardware from the 90s until maybe 2016 or so. The point being, that I actually do have a lot of experience in a wide variety of environments with a wide range of interactive experiences.
Currently, my primary working environments are some sort of Unix, with CDE, stumpwm, or dwm - pretty ultralight environments by today's standards. I do think these environments can be described as "fast".
As to the actual machine in question (with the i7-4770K), that's my gaming desktop. It's run mostly Windows over the years, with some OS X thrown in there. For the past year or so it's been pretty much just OS X, but I do boot into Windows from time to time for this or that. Neither of these OSes is as fast as the stripped-down environments that I prefer, but neither is appreciably faster or slower than the other, either, in my experience (unless something is broken; I could of course tell you horror stories about both platforms). From my perspective, both of them are enormous inefficient monstrosities, but the hardware is also really fast.
> In my experience linux was about 30% faster for typical web development tasks (npm install, angular build, …)
Yeah, these types of things aren't great on Windows. Especially if you run into corner cases; a lot of tools that were written for Unix environments go to the dogs on Windows. I don't know what most of the individual issues are, and I don't really know that it's even a matter of CPU (vs. I/O latency, deadlocks, etc.). But the poster I was responding to seemed to be saying that Windows itself is too much for a modern i5 to handle, and that just isn't my experience.
You don't know what you're missing out on. I have a ryzen 3600+nvme and it takes less time to boot into Linux, unzip a large file, and boot back into windows than it takes to unzip the file on windows, even if anti-virus is disabled.
Unzipping is a single threaded task, so it's going to run at the same speed even on a 5950x.
On the hardware side, sure, it would be great to have cutting-edge tech. Money is a factor. Realistically, my hardware is good enough for my present needs.
On the software side, I am absolutely aware of what is available.
File operations do seem pretty slow on Windows. What I've noticed is that latency for individual access seems really bad. Does that latency actually scale with CPU speed?
Edit: btw, what were you using to unzip the file on Windows? How about Linux (unzip?)
What Linux kernel/distro, what sort of zip file (large files vs small, encrypted?)
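In case anyone wants to reproduce the comparison themselves, a rough sketch (big.zip is a placeholder archive; Expand-Archive and Measure-Command are standard PowerShell):

    # Linux
    time unzip -q big.zip -d out
    # Windows (PowerShell)
    Measure-Command { Expand-Archive -Path big.zip -DestinationPath out }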
I have an XPS 9560, and yeah, the build quality ain't great. Had to replace the motherboard and battery after two years, and while the lid is aluminum, the bottom is not, so if you have it hanging over the edge of a desk or on a non-flat surface, it flexes enough that the touchpad button no longer registers. These issues might be resolved in newer models, but I probably wouldn't go with a Dell again, unless it turns out that they are the best of the worst after a comparison.
Anyone know of a Linux equivalent to Alfred and BetterTouchTool? I can get by with Gnome but these two tools are such a workflow gamechanger. Workflows and copy buffer with seamless integration in particular.
I almost feel like I have to buy this since it really is the type of laptop I’ve always wanted - but the allure of the M1 is still there, making me second guess myself.
Hear, hear! When coding (or reading), vertical space is much more important than horizontal space. Reading 300 characters wide is impossible; consequently code also has to have shorter lines. So if your screen is ultra-wide, you either split it into areas (for which a laptop screen is not big enough) or just waste a part of it. Vertical space, OTOH, is almost always used 100%. That's why I have all my widescreen displays in portrait mode too: still enough horizontal space, and an amazing abundance of vertical space!
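On Linux/X11 the rotation is a one-liner, for what it's worth. A sketch; the output name DP-1 is an assumption, so check yours with a bare xrandr first:

    xrandr                                  # lists connected outputs
    xrandr --output DP-1 --rotate left      # portrait; --rotate normal undoes it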
I just got one of these last weekend (I ordered it in early August) and so far it's really great. The modular I/O and general mission of the company was what initially sold me on it, but now, actually being hands-on with it, I definitely feel secure in my decision to get one. I can't overstate how good these modular ports are.
I also really like that you can bring your own hardware in a lot of cases. For example I had an extra M.2 SSD laying around, so I ordered mine without one and installed it. You can also do this with the RAM, and even the wifi card.
The only thing I’ve disliked about it so far is the arrow keys on the keyboard. Having full size keys for left and right but split keys for up and down feels weird, I would have preferred all full size arrow keys and a small right shift(because let’s be honest, when was the last time you used the right shift key?).
For anyone curious about Linux on it, I’m running Arch and had basically 0 problems specific to the device. It’s my understanding there were some incompatibilities with certain kernel versions before so maybe some of these problems exist in distros like Debian with an older kernel, but I have had no issues.
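If a newer kernel ever does regress on you, one low-effort hedge on Arch (a sketch; assumes GRUB as the bootloader) is keeping the LTS kernel installed alongside the default one and picking it from the boot menu:

    sudo pacman -S linux-lts linux-lts-headers
    sudo grub-mkconfig -o /boot/grub/grub.cfg   # regenerate the menu so the LTS entry shows up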
All in all, it’s just an exciting project and nice to see innovation in the space that isn’t just rounded corners or a sleeker edge or something where they take modularity or performance away for the sake of aesthetics.
> because let’s be honest, when was the last time you used the right shift key?
Worth a try if you don't have the habit yet: use the pinky of your right hand when typing capitals with the left hand and the pinky of your left when typing capitals with the right.
Yeah, this is a weird comment from the OP. I use the right shift for 99% of my shift typing. This is probably because my version of "homerow" for keyboards is left shift, a, w, f, spacebar and spacebar, ., p, [, right shift. Which likely stems from years of gaming.
That's probably what I was taught, but in practice why? What's my left hand going to do with all its hundreds of milliseconds of free time while my right types a capital letter?
Eh? If I hit shift with my left and the letter with my right, that can't possibly be more stretching (it's probably less) than hitting both (the other) shift and the letter with my right?
For me it's more a matter of putting as little stress on my hands as possible. Pressing two buttons (shift and key) with one hand is just a little less comfortable than pressing just one key.
>when was the last time you used the right shift key?
I generally use the shift key closest to the key I'm typing. For keys near the center, I favor the right shift key.
The keyboard would absolutely be the show stopper for me, if I didn't just get a new laptop last year. The lack of dedicated Page Up and Page Down keys is unacceptable.
Finally somebody else mentioned it. I will never buy a laptop without dedicated Home, End, Page Up, and Page Down keys.
I've used Chromebooks without them before, and I still to this day can never remember how to select while jumping to the end of the document, for example. It's like four keys all pressed at the same time in a very awkward way.
From the Mac side, Fn-Up/Down support seems to be universal for replacing the missing Page Up / Page Down keys. I bet Chrome is Ctrl-Alt-Fn-Down, copying Mac:
Down - Cursor trajectory
Fn - Page instead of Line
Command (Alt) - Document instead of Page
Shift - Text selection mode
(But I had to hit the keys and then look at my hands to figure out what they were, because I'm just used to keeping the modifier layers in muscle memory, so I could be wrong.)
On the contrary, I have those dedicated keys, and I am constantly accidentally hitting Insert when I want Home, which screws up line editing (which is 99% of the time what I'm planning to do after hitting Home).
For what it's worth, here are also [1] the Framework community forum thread about Arch and [2] the Arch wiki page about Framework in case you're interested.
I'm still waiting for mine to arrive (in the next batch) but I plan to install Manjaro when it does, and am cautiously optimistic that it'll be mostly painless.
Might be worth noting (I forgot about this when making my original comment): I wasn't able to get my USB install drive to boot without disabling Secure Boot, but Secure Boot isn't something I care about, so it wasn't a problem for me. I've heard it works, but I can't comment on the specifics.
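If you want to check the firmware state from a running Linux system before fiddling with it, mokutil can report it (a sketch; the package is available on most distros):

    mokutil --sb-state    # prints "SecureBoot enabled" or "SecureBoot disabled"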
I wish that I needed a laptop so that I could buy one. I sincerely hope this company succeeds so that they are around when I do need a laptop.
> Yesterday, I put my 2019 Thinkpad on my pile of "laptops to refurbish and donate." I've bought a new Thinkpad almost every year since 2006. I think that's over.
It's addressed in the article and elsewhere in these comments, several times. They donate them after a year in order to upgrade. Seems a perfectly reasonable and responsible use case if you want a new laptop every year.
Having read the article, I don't think he really does. He buys a new one, refurbishes it and gives it away. Why that's a reasonable idea isn't explained (or at least it's not to my satisfaction).
Buying himself a new laptop every year was his motivation for quitting smoking, as he realized 17 years ago he spent about two laptops worth on cigarettes every year. It’s in the article, near the end.
Please get checked for glaucoma if you haven't already. Goes for anyone reading this comment who feels their vision is slowly getting worse. I started getting it in my 30s, it can strike early.
Yep, this is where I'm at too. 13" is just not enough space for me to get anything done productively. On my 13" MBP with my standard font size, my code editor can't show a full line of code without me having to scroll (VSCode with mostly default settings, font size 12).
"Self described early adopter consumer sees early adopter friendly project and supports it"
One might see some cognitive dissonance between the state of mind of being pro recycling and supporting reduce and reuse and being an eager consumer. However early adopters dance this line and lead the way for the masses to come after.
Only one of them has broken so far, and it was only an issue with the display. I repurposed it into a homelab/Podman host and it's been able to work just fine!
As a quick aside, if you're ever one of the 15 people who will likely do this, buy a Thinkpad dock. They're cheap, and it basically triples your I/O!
Just to continue the question, what do you do with the other laptops? You should have at least 4 or 5 more which are unaccounted for. I imagine only one is used currently.
Not them, but I keep a long trail of old laptops and generally do in fact keep them all in active use on a regular basis. For me part of the appeal was that I like distro-hopping, so multiple machines made it easy to keep rotating OSs without much trouble. The core bits (browser profile, password manager) are synced, and my projects live in version control that's easy to pull to any machine that happens to not have it yet, so I just... grab the closest machine when I want to do something and go. (And I tend to have them laying around multiple rooms so there's always one at hand)
I gave an old workstation to my mom and a spare T440p to my brother, now the x201 and T460s occupy my tinker station and bedroom respectively. Oh, and there's also a T420 that my other brother uses as a media server, but that's not really mine anymore :p
This is kind of a weird piece that doesn't really amount to anything. I understand being in the "honeymoon phase" with a new piece of technology, but I don't feel like Cory is aware that he's doing that here. There are some signs: he loved ThinkPads originally, but over the course of several years the company and the quality of the product went down the tubes. Now he's got a brand-new laptop that he's only had for one month and is declaring it the best thing since sliced bread. But it's not really a fair comparison: a brand-new niche product that hasn't been battle-tested in any way versus a long-established brand that he used for years. What will the Framework be like in many years? He offers extremely optimistic ideas, but obviously nothing concrete, because he just got the laptop.
I dunno, it just felt weird to me to be like "I loved this product I used for years, but it sucks now" and then say "I love this new product I've barely used!" without a hint of self-awareness that all the optimism and initial love for a product in the world won't keep it from turning into a pile of junk. How long until a "I went back to Thinkpads" article? A year, two, three?
Why be so negative? He seems to like how easy it is to tear down the machine and how easy it is to install Ubuntu. That's it. I don't think this will change in 10 years, and I don't want to wait 10 years to hear about his experience.
I did think it odd that he was talking about the durability of a product that he'd barely unwrapped. But his other points seem cogent, and are orthogonal to how long he's used any of the products.
I don’t think he really talks about the durability, as much as the repairability.
For example, he notes that he hasn’t road tested it yet:
> However! Most of my use of this computer was from my sofa, while I was recovering from hip-replacement surgery. I haven't road-tested it at all.
> But I'll note here that if it turned out that a component failed due to my usual rough handling, I could replace it with a standard part in a matter of minutes, myself, in whatever hotel room I happened to be perching in, using a single screwdriver.
That reads to me as a pretty specific disclaimer and that he hasn’t stressed the durability, but more that he has a plan if it isn’t as durable.
(Though it’s a long article, and maybe he said something about durability that I missed)
I have a Lenovo Carbon X1. Can’t get more than 16 GB of RAM. Can’t get larger than 1 TB hard drive. Really crappy wifi chipset that blows up when I use a VPN.
I kinda feel like he explained what happened to ThinkPad. They went from IBM, which for $150 per year would send out a tech anywhere in the world to fix your problem, to Lenovo, who… let's just say isn't as good.
How is it weird? He very clearly described the degradation in quality of ThinkPads. For what it's worth, I agree.
Also, using a laptop for a month is more than enough time to get a feel for how things might play out over the long term. He was clearly impressed enough to commit to an opinion in a short space of time. If his opinion changes, I'm sure we'll hear about that too. I think that's fair enough.
This is just how Cory writes in general. You’re not going to get a lot of measured maybes. He is an opinionated guy.
And in this case, what’s the harm? If a laptop doesn’t work out, it can be replaced. This one just got released, so a long-term reliability test isn’t even possible yet.
Or better yet, after 5-10 years of frame.work laptops being wildly successful, they will slowly start killing off the upgradeability/fixability of the laptop.
- we had 5-10 years of great laptops (or more likely 7-12, since people will put more effort into repairing their current models if the new ones are worse)
- if they do this, the market has already been proven, and anyone with a few spare million $ can go and grab it.
I like what they are doing, but I am sticking with my 2 primary thinkpads for now, which is fine as well and certainly eco :) They are still a perfect level of repairability (t430s and a t470p) even if I can't upgrade the CPUs on them.
I am excited over seeing this project, but these things have stuck out for me:
1. I am concerned about long-term screen hinge strength. I can't see how it's built, but I will not cheap out on that after dealing with bad hinges (screwed into plastic rather than a metal frame, on a Dell).
2. I am concerned about the durability of the screen. I am not an expert at this, but with my current ThinkPads I have no worry about the screen if I chuck one in a bag, or about my cats, who sometimes stand on top of the laptop when I put it on the floor.
3. I watched Louis Rossmann's takes on it. I was hoping he would go into the tactile feel of the keyboard; I can wait for commentary on that from a ThinkPad user, though. I can try to stop using the TrackPoint (I would hate unlearning it), but a good laptop keyboard is essential. (I declare the best laptop keyboard I ever experienced to be the one on my T430s. Lenovo has made the key travel lower and lower over the years. The T470p and others from around that year are OK, but don't really compare.)
If anything were to go wrong though, hey, I can at least repair it :D But, I continue to stare in wonder over this. I'd love to be proven otherwise on these points.
The case/hinges have always been the first thing to fall apart in all my past laptops. This is my third, and the plastic case is showing its age. The other two are still running well, but they can't be moved around at all anymore. Basically, I want something as tough as an Apple MacBook but without the Apple crap on it (I need Linux).
My next laptop will have the best case and hinges, or there won't be another one. I can't stand seeing perfectly running old machines made unusable by cheap assembly.
I've been using a Thinkpad T470 for the past 2 years, and the hinges are really solid... they look like they'll last many more years. I had a couple of T400s before which basically disintegrated around the hinges, so I know whereof you speak, but at least this series of Thinkpads don't seem to suffer from that problem... looks like they learned their lesson.
> I can try to stop using the TrackPoint (would hate unlearning that)...
Same here. I’m addicted to it and can’t get over the tactile feel. I guess this will go the way of the mobile phone keyboard, assuming it hasn’t already.
I guess a workaround for now (until someone designs a trackpoint keyboard that fits in the framework) could be Lenovo's bluetooth trackpoint keyboard, I use it with every laptop and it's been great
Annoys the crap out of me that Apple tries to claim how "environmentally friendly" they are and yet the biggest problem is they make all their computers be disposable and extremely difficult to repair. They've gone out of their way to do this by soldering in memory and SSD, gluing batteries in, etc. Shame on them.
Putting upgradeability aside, Macs typically have longer usable lifespans as evinced by their relatively high resale value. Anecdotally, it's common to find 5-7 year old MacBooks being used by their original owners (I'm typing on one right now), and Apple will offer around ⅓ of the original value on a 5 year old machine as a no hassle trade-in because their refurb partners are able to sell them (you can usually get more selling privately).
It would be nice to see some objective stats on this though.
Macs can have a very long usable lifespan, if they don't require a repair. Then even young machines are quickly totaled as soon as they are out of warranty. I wonder how many MacBooks ended up in landfill because of broken keyboards which were too expensive to repair (like $600?), though keyboards shouldn't just break, nor cost more than $100 to exchange...
My 6-year-old iMac has a fan which sounds horrible; it would be easy to clean if I could just access it... and while the HD is actually still upgradable, I can't reach it any better than the dirty fan.
And of course there are the batteries, which do fail quite often after about 5 years and are also not meant to be replaceable. In Europe, though, it might be difficult to get Apple to exchange the battery of older laptops.
Though Apple doesn't encourage DIY battery replacement, it's not that hard. On my old 2013 Air it's really easy apart from sourcing a decent battery, and on the new M1 Air it seems quite doable if you don't mind glue. https://www.ifixit.com/Answers/View/675334/Is+it+possible+to...
Yes, it is possible; a colleague of mine replaced the battery of a 2012 MB Pro. But it's not easy, and it involved chemicals which should not be handled by anyone without basic lab experience. Things have become somewhat better, but the question remains: why isn't it easy, and why doesn't Apple offer battery exchange for a longer time?
If they really care about the environment, they should at least offer to fix your old MB at a reasonable price.
(At their typical battery replacement prices, there should be a healthy margin anyway).
The design of the MacBook Pro changed in 2016, and it became less repairable and easier to break (see: issues with butterfly keyboards). It would be difficult to convince me to buy a used 2016+ MacBook Pro.
I remember the days when you could EASILY replace your portable's battery (after 3+ years), EASILY upgrade your RAM and HDD... now if any of that fails, or the system needs more memory, you have to replace the entire computer. I'm sure this is by design, so you spend more money and buy new stuff instead of upgrading what you already have.
Yeah, Apple was really good at replaceable batteries, keyboards, hard drives, RAM, and wireless cards... right up until they decided they didn't want to be. IIRC it was sometime around when they started calling PowerBooks MacBooks... 2006 or thereabouts.
Ehn.. I used to be like you. Once I used a Pixelbook, a super-thin fanless device, I was in awe. The Pixelbook Go was a similar fanless device as well. But then I realized that they have their limitations. With a fan, one can achieve much better performance when needed. So now I prefer a laptop that can passively cool for normal day-to-day work, and then take advantage of the fan for heavy workloads.
I got the Framework laptop, which should be able to push the Intel chip to its max 28 W TDP. I heard the fan is big, and thus not annoying. I am curious to see how it will turn out (my batch 3 order gets delivered tomorrow).
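Out of curiosity I looked up how to check that once it arrives: the package power limits should be readable from Linux's intel_rapl powercap sysfs interface. A minimal sketch only - it assumes the standard powercap layout and the usual "package-0" domain name, and paths (and read permissions) can vary per machine:

    # Sketch: print the package power limits (PL1 "long_term" / PL2
    # "short_term") from the intel_rapl powercap sysfs interface.
    # Assumes the standard layout; may need root on some systems.
    from pathlib import Path

    for rapl in Path("/sys/class/powercap").glob("intel-rapl:*"):
        if (rapl / "name").read_text().strip() != "package-0":
            continue
        for name_file in sorted(rapl.glob("constraint_*_name")):
            idx = name_file.name.split("_")[1]
            label = name_file.read_text().strip()  # e.g. "long_term"
            uw = int((rapl / f"constraint_{idx}_power_limit_uw").read_text())
            print(f"{label}: {uw / 1_000_000:.1f} W")  # microwatts -> watts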
Yeah I guess I don't have a ton of experience here outside of mac laptops, but the idea of a PC laptop that only kicks in the fans when truly intensive tasks are being run sounds pretty nice.
My experience has mostly been that when any work is being done, the fans spin up. It's jarring when you're trying to focus on the problem at hand.
You need to find your fan rpm/noise threshold. On my laptop the fan becomes noticeable above 2500 rpm. Someone commented about adjusting the fan curve (I have no idea how); I just manually capped the CPU performance with a powersave profile instead.
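In case it helps anyone, this is roughly what "capping CPU performance" looks like through the cpufreq sysfs interface. A sketch only, assuming the intel_pstate driver (which exposes just the "performance"/"powersave" governors) and run as root; the 2 GHz cap is an example value, not a recommendation:

    # Sketch: cap CPU performance via cpufreq sysfs (needs root).
    # Assumes intel_pstate; other drivers offer different governors.
    from pathlib import Path

    MAX_FREQ_KHZ = 2_000_000  # example cap: 2.0 GHz, pick your own threshold

    for policy in Path("/sys/devices/system/cpu/cpufreq").glob("policy*"):
        (policy / "scaling_governor").write_text("powersave")
        (policy / "scaling_max_freq").write_text(str(MAX_FREQ_KHZ))

    # Finding your rpm threshold: fan speed is usually readable from
    # hwmon, though the layout varies by machine.
    for fan in Path("/sys/class/hwmon").glob("hwmon*/fan*_input"):
        print(fan, fan.read_text().strip(), "rpm")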
You're right about cooling for Intel/AMD, with the note that the M1 is a whole different beast: it can deliver good performance with only passive cooling anyway.