> It's almost as if Apple aren't even basing their hardware strategy on what developers are complaining about on Hacker News.
I’m imagining someone who went into a fallout shelter about 10 years ago, somehow only had HN as a source of news, and suddenly re-entered the world. What would such a person be surprised to learn?
I suspect Apple still being a successful company would come as a huge shock to such a person. So would the success of SystemD and Docker and Kubernetes and Dropbox and microservices, the complete takeover of widescreens in the display market, and the continued survival of Tesla.
I bet a stock picking strategy that invests in companies and ideas that HN users hate, and shorts the ones that HN users love, would be right far more often than it is wrong and would make tons of money.
The first Macbook Air was a seriously limited device, style over substance. Expensive, weak CPU, mechanical hard drive, low battery life.
It was with the second generation (2011) that they knocked it out of the park: they added fast Core i5/i7 CPUs, an SSD, greatly improved battery life, two (!) USB ports, and a choice between 11-inch and 13-inch models.
And most of all they sold it at an entry-level price (by Apple standards). Considering that the competition at the time were bulky Sony Vaio, HP, and Dell "bricks", it was like a laptop from the future.
The Macbook Air wasn't in any way intended for developers, though. All the people I knew buying Airs (even first gen) were the frequent travelers and the types of people who do loads of work involving email, text documents, and slideshows. Nothing that required heavy processing, but work that involves being away from your desk and having to haul your work with you.
A lightweight PC small enough to slip in a folder was what they wanted.
The Macbook Pro was generally for more of the processing-intensive work and development side. Recently I've been seeing/hearing more and more dissatisfaction from that crowd.
It wasn't a powerhouse, but the second-generation model was the first successful "thin" laptop that made reasonable compromises so that it could be used as a development machine.
In an era of remote servers and web scripting languages, not every developer needs a full-blown workstation.
I agree it wasn't intended for developers, but it ended up being adopted by many of them.
I used a 2013 Air with maxed specs as a dev machine for a while and still use it for web browsing and while traveling rather than lug around my more expensive MBP. With an i7, it's a very capable machine with great battery life.
Doing front end development, it was more than powerful enough for anything I had to throw at it.
> In an era of remote servers and web scripting languages, not every developer needs a full-blown workstation.
I haven't really found that to be the case, especially these days with "modern" frontend tooling, like Babel/Webpack/etc, static analyzers like ESLint/Flow/Typescript, and the various types of frontend testing. Every bit of performance helps.
You can debate whether all of that is necessary, but the reality is many companies use them.
Apple's 11-inch MacBook Air was a fantastic device, but they stopped improving it and Dell came out with the XPS 13. It is a similar form factor, has a bigger display, and runs Linux.
Neither were guaranteed hits when they were introduced. Apple’s biggest claim to fame at that point was the iPod. Even after the iPhone was released, it was years before critics looked at it as a product that could withstand the test of time.
In 2008 the iPhone was a $500 subsidised device (at a time when other phones were "free" or up to $200) which really didn't do anything. It didn't do MMS, third-party apps didn't exist, Google Maps didn't have turn-by-turn. I'm sure the user experience of a reasonably sized capacitive touch screen was great compared to the alternatives - but I never even saw one in real life, and on paper it was very expensive and didn't have anything to sell me on it.
Once the iPhone 3G came out it was clear the device was a phenomenon and not a rich person's toy, but initially the market was pretty confused, from what I remember.
I am not an Apple fanboy, but I had a couple generations of Danger products (by far the most modern at the time) prior to the release of the iPhone and owned a few generations of iPhone from the start. The 1st generation iPhone was revolutionary and deserves every last bit of credit it gets. The 3G was a yawner, like adding leather seats to a flying car. Yeah, 3G is nice... but seriously, the car is freaking flying, how did you not notice that??
Exactly. The iPhone had the revolutionary UI and the revolutionary functionality (a functioning web browser that worked with most pages not specially made for phones) at the moment it appeared.
I think the fact that you didn’t see it in person is the reason you didn’t expect it to be a success.
The 3G had 3G, but otherwise same hardware specs as the original.
What sold me was that Apple got the UI right (“smooth as butter”), the web browser and email actually worked, and jailbroken it was almost Unix in your pocket! It really felt like a phone with so much potential, far ahead of anything I had seen. The things you mention were just a matter of software updates. By the way, 3G and the App Store both came out in 2008.
Yeah to me it was clear after seeing a smooth UI in an internet ad. I got the 1st gen - what surprised me was the empty shop without a queue. Maybe it was luck but to me it was an obvious game changer without even touching it.
The original iPhone was unsubsidized. It required a contract until it was unlocked, but you could only buy it at the full retail price, forfeiting any subsidies.
$500 with a required contract is a subsidised device; it doesn't really matter how you phrase it. AT&T paid for it to be exclusive and the retail price would have been set taking that into account. If it wasn't there'd be no reason for Apple to allow a carrier exclusive with a required contract - the buyer would own the device.
The original iPhone was unsubsidized, and it was GSM only, so there wasn't an option to offer it on Sprint and Verizon then. The 4 GB iPhone was $499 and the 8 GB one was $599. Because Apple was new to phones, they only wanted to deal with one carrier initially.
Back then, you could get a Treo or a Blackberry for less than half as much, as they were subsidized by the carriers.
And because paying that much for a phone at that time impeded sales, Apple discontinued the 4 GB model and dropped the price of the 8 GB model to $399 two months after it was released. They also "gave in" to the subsidy model, since that was the standard practice: https://en.wikipedia.org/wiki/IPhone_(1st_generation)#Releas...
It was clear to me that there was a phase change when the iTunes store came out - wow, was that 2003? Anyway, the reason I am not rich is because I didn't believe in Steve Jobs, so I didn't have the faith to hold AAPL stock forever. But the iTunes store was the point at which it was clear something spectacular was happening and Apple was moving in to a new world, compared to the days when the WSJ constantly called them "beleaguered".
> So would the success of (...) Docker and Kubernetes and (...) and microservices
It's hard to tell if they're a success. With enough time and money you can make almost everything work. It does not mean that there weren't better alternatives available.
Tesla? Why would anyone on HN doubt survival of Tesla?
Apart from making impossible claims about delivery deadlines, they weren't accused of any particular sins worth mentioning. And the Tesla Roadster definitely looked like the big future in 2008.
That's fine, why not? Isn't this the whole startup thing? Burn money until you IPO or get acquired? Otherwise it is looked down on as a 'revenue business'.
I'd go further and say that Apple isn't basing their hardware strategy on the needs of any real person. It's a laptop with only USB-C ports (and the 13-inch only has two of them). You can't even plug a USB stick into it. The only people who asked for a laptop like that are working at Apple.
Honestly, I've been working with the USB-C only MBP as my only development machine for a year now, and I haven't once wished it had a classic USB port on it. At my desk it's plugged into monitors which have ports on them. Having a single cable for all my peripherals is quite nice.
Because I go to meetings and work from home sometimes. Only thing I'd use a standard USB for would be charging, which isn't really something I do in meetings, and I have plenty of chargers at home.
This is seriously where we're supposed to be for a pro laptop?
It shouldn't work "pretty well" when you're using a $2000+ machine. It should work perfectly every single time. This is what built Apple into the company it is. "Pretty well" wasn't good enough.
This sounds like the type of concession I would have made in 2005 on my Gentoo laptop. "Yeah, USB doesn't work, but there is a workaround that sends that data over wifi! It even usually works!"
Uhm, no, you're just supposed to spend the $19 on a USB C -> Lightning cable to connect them.
The wifi thing is just convenient for many people. If it's not good enough or "pro enough" for you, there is still a cable that you can use.
I don't really see the problem here. Assuming I buy the base model MacBook Pro and the cheapest iPhone 8, I've just spent $2000, and I don't see how I can justify complaining that I need to spend an extra 1% on a cable to connect them.
It's not a workaround, it's an additional capability.
Before they added this feature, a USB cable was the only way to do this, and it still works, and works reliably. If you have a USB-A connector on your Lightning cable, you'll need an adapter, if you have a USB-C connector, you won't.
From the OP's complaint, he didn't like having to do this with USB, so I suggested the WiFi option to avoid cables. Maybe you read that as "the USB option does not work", which is not the case.
WiFi depends on the quality of your WiFi network. Mine is rock solid, so "pretty well" is my engineer answer to cover the cases when AP is flakey, range is crap, etc.
You didn't answer the question. It doesn't need to be so dysfunctionally thin and light to travel between work and home sometimes. Nobody is going to suffer from an extra pound of weight.
He did answer the question, only with reasons that you don't seem to agree with.
I also mostly use my MBP plugged in and yet still want my laptop to be as light and thin as possible for when I'm carrying it around during the day/commuting. An extra pound of weight makes a huge difference to me walking home.
> An extra pound of weight makes a huge difference to me walking home.
That's hard to believe. Are you walking 50 miles? How would you notice an extra pound? What about the rest of the stuff you carry, your clothes, your shoes? Is everything else already optimized?
Why does the (likely) most expensive full-size computing device need to lose weight vs everything else, especially when it's infringing on the actual user experience of working with it?
> That's hard to believe. Are you walking 50 miles? How would you notice an extra pound? What about the rest of the stuff you carry, your clothes, your shoes? Is everything else already optimized?
Not everyone is particularly fit. Some are old, some don't have time to exercise, some choose not to, some don't have the choice and some others are and just prefer the convenience.
> Why does the (likely) most expensive full-size computing device need to lose weight vs everything else, especially when it's infringing on the actual user experience of working with it?
I think this is largely a generational issue. I bet there's a large swath of young professionals right now who grew up with ultraportables like the MacBook Air but are looking for just a bit more performance right now. These people are also likely to be more accustomed to typing on shallower and lighter keyboards (including virtual and mechanical), so the change won't be as big for them as for people who grew up with typewriters and Model Ms.
There are multiple product lines. Those people who need extreme lightness should not choose the Macbook Pro line then, which should remain focused on functionality first.
I think most pros doing particularly heavy processing nowadays have a desktop or server to offload to in addition to their laptops, and performance has improved enough for pros with light to moderate processing that higher portability is better value for most people.
Nobody is saying Apple shouldn't have thin and light options.
Apple's prices and hardware designs made the 15" MacBook Pro a popular desktop replacement. For the same price as a 13" MBP and a 21.5" iMac, you could get a 15" MBP that would outperform the iMac in some cases.
Mobility is functionality, which is why things like battery life and working keys matter.
Is it really that hard to believe that some people prefer to have a lighter laptop. My daily driver is thinkpad but I really appreciate my mbp when traveling or for conferences because it's much more portable and I can carry it with ease
The point is that it should be a choice, and they have the Macbook line for that if it matters more to you. It shouldn't be the priority for the Pro line when it hurts the UX; that's the issue.
My personal machine is a 12” MacBook and my work machine is a 13” Touchbar MBP. I use a very small messenger bag and so yes, I do notice the extra weight on my shoulders when I commute with the MBP.
I have optimised my bag heavily but already have a few non-negotiable medical items I must carry to stay alive and so I like to get that weight back from my laptop.
I understand that weight isn’t a big deal for everyone however it is for me. If my company offered smaller machines I would switch instantly.
In places other than the US, the majority of commuters are supporting the weight of their laptops on their backs (on public transit/walking/cycling), rather than having it sitting beside them (in a car.)
When I was in primary school we were taught that our backpacks shouldn't weigh more than 10% of our body weight. That's not much when you're in third grade, but I think most adults can comfortably carry laptops that are two millimeters thicker and weigh a pound more than the Macbook.
What do you need adapters and dongles for if your goal is to go out to a cafe or a park and work with the computer? Do you, like, need a 7 TB media drive with you at all times or something? Are you constantly plugging it into different monitors? Not like you wouldn’t need adapters for that unless you exclusively live around DisplayPort screens anyway. Do you carry around a wallet full of DVDs and an external drive for those?
Strawman - most people are not working in a cafe or park, but an office where you have to plug your laptop into actual things - projectors and so on. And almost all the projectors I use at my work, my customers' work, or the last few conferences I've been at, are HDMI - which is cool because I have an HDMI port right there on the side of my [2014, may-it-last-forever] macbook pro. It's next to the SD card slot where I can put in the SD cards from my rather nice digital camera, so I can edit photos when travelling. I earn money from photography, so as someone who is technically therefore pro, both of these connectors are useful for pro work, that isn't just typing things in cafés.
> which is cool because I have an HDMI port right there on the side of my [2014, may-it-last-forever] macbook pro
I would point out that modern non-Apple PCs don't tend to have HDMI ports, either. They either have mini-HDMI ports, or OTG ports. The dongle requirement is nearly-universal for connecting modern devices to projectors, to the point that you may as well just buy a set of dongles for each projector, rather than for each laptop.
> It's next to the SD card slot where I can put in the SD cards from my rather nice digital camera
Is there a reason you can't plug the camera into the computer (using a USB cable) to transfer the photos instead? That's what I see the photographers at my own office doing.
(I asked one just now, and they said: if you're trying to transfer photos outdoors, moving an SD card also has the chance of pushing dust into the SD card slot on the camera. They say they've had SD card slots wear out/break before. And for cameras that take micro-SD, when moving the card they're always a little paranoid they might drop the thing on the floor and lose it. All in all, cables are just easier if it's your own camera and your own computer. SD cards are just for passing off to other people.)
Easy - half the conference venues don't come with dongles, or if they do, the AV guy is dealing with a crisis in one of the other 17 conference rooms and won't be able to get to yours within your 20-minute slot (this happens A Lot). I do have my own HDMI-VGA adapter though, but the VGAs seem to be dying out, thankfully.
SD cards - I've never had an SLR or a high-end mirrorless camera (but I haven't bought either for maybe 18 months) that's been able to transfer through itself at anything like the speed of a fast SD card in the slot. Makes a big difference when you want to dump 64 GB of pics. As for dust, my favourite camera has been to Syria, the desert southwest of the USA several times, Morocco, Antarctica on a sailing ship (so being sprayed a lot), and not to mention my local beach. As well as just being the camera I throw into my shoulder bag when going out. Unless the photographers in your own office are war photographers or dirt-bike specialists, I've probably given my kit a harder life than they have. I've never had an issue with pushing dust into the SD slot of my camera or laptop. I did however dent the ISO knob on my camera when trying to tether it to my laptop while sailing across the Drake Passage; a wave tipped the boat and I instinctively grabbed the MacBook and let the camera hit the deck. I learnt my lesson. Fair point on micro SD though, but I have never used them for photography. I would probably be concerned too about fumbling, especially if wearing gloves.
True - but I doubt that's really a significant factor in the MacbookPro target market.
(They're much more likely to optimise them for being carried in artisan faux-vintage messenger bags by people riding share-scooters between home and the FAANG shuttlebus...)
> Nobody is going to suffer from an extra pound of weight.
I would and did for a long time, and I am very glad for the trend toward lighter laptops. Try considering for just a moment that there are many people in the world who aren't as able-bodied as you. Small changes can make big differences.
I know a lot of people with physical disabilities. I've been one myself; I'm married to one.
Nobody's asking Apple to make all of their laptops heavier. And I don't think anybody's asking for 9lb / 4kg monstrosities from Apple, either.
What I see is a lot of people wishing for pro laptops from Apple that eschew their "lightness at all costs" philosophy in favor of a more balanced approach that offers some extra functionality in exchange for, say, an extra pound of weight or something.
That wouldn't preclude Apple from selling smaller, easier-to-carry models.
You can have light without being anorexically thin. The ThinkPad X1 Carbon has a 14" screen and weighs half a pound less than the MacBook Pro, and it has the missing ports.
> Nobody is going to suffer from an extra pound of weight.
For many things in life, it’s not about whether or not you suffer, it’s about what’s more or less pleasant. And for many, that even trumps an occasional inconvenience.
An external HD or removable SSD drive would be a much better system for moving between home and work. It doesn't matter how heavy the base system is if you're just moving a tiny disk, but there's no money in that for Apple.
I see this so often it's becoming ridiculous. "Pfft, it works fine as long as you use it less like a desktop!" Well, I can build a much more capable desktop for one third the price, so no thanks.
If the intention is to use it as a desktop then why keep sacrificing function for fractions of a mm?
The intention isn't to use it as a desktop. The intention is to not plug anything into the computer (other than the charging cable, sometimes) except when using it as a desktop.
Everything is wireless these days. Input devices, audio devices, networking.
You basically only need a cable to charge the thing (with your USB-C charger), for mass storage (where most new mass-storage devices are USB-C devices as well, and if you have a well-used old one you can keep a dongle stuck to it), and for displays (at which point you are in a "docked" use-case, and so you can just use a USB-C dock or Thunderbolt-display-with-dock.)
All other use-cases are so rare or esoteric that you can just put them off until you get home and to your desk.
Or are you constantly balancing your laptop and a USB-only printer on your lap or something?
So as soon as I get up from the desk I should stop using anything USB? Never connect to anything HDMI unless I'm at my own desk? I think that's preposterous. Not to mention, Apple wouldn't be selling very many dongles if this was really how most people used their laptops.
> and I haven't once wished it had a classic USB port on it.
While I haven't missed classic USB (thanks to a dongle with old USB slots), I have however wished I still had a MagSafe power cord.
It also doesn't help that the dongle I bought is a multi-port one (ethernet, USB-A, HDMI, and others), and for whatever reason, using it on battery noticeably reduces battery life.
It's decent when your employer springs for a "dock"-type device that allows a single USB-C hookup to the laptop, including power, which is the setup I had at my previous role.
It sucks when you have to attach and detach a whole bunch of dongles each time you sit down at your desk, as with my current role.
I could buy the dock thing myself I suppose, but then it's personal property I have to worry about and could disappear, etc.
Lately I've been working at a company that uses YubiKeys for 2FA. You can use a traditional YubiKey, but the YubiKey Nano is especially popular because you can put it in your laptop and leave it there. It barely sticks out at all, just enough that you can touch it and activate the touch sensor. [1]
I started noticing people in meetings with their new MacBook Pros and six inch cords sticking out the side, and on closer inspection, there was a YubiKey Nano plugged in at the end of the cord.
Meanwhile, the folks with ThinkPads or older MacBooks just had a Nano tucked into one of their USB-A ports like it was no big deal.
I bought myself the full sized Yubikey instead of the nano because I'm very much of the opinion that the inconvenience is a feature, not a bug — do you _really_ want to leave your Yubikey attached to the device that holds your password manager too?
That's a great point about not leaving the YubiKey plugged into the laptop all the time. But the company actually suggests that people get a Nano and leave it plugged in for convenience. Maybe that isn't the best security advice!
Thanks for the pointer to the USB-C YubiKey Nano - it looks like a fine solution for the dongle problem.
That's even more funny, because the MBP's touch-bar is essentially the same type of device as the YubiKey—an isolated offboard CPU with its own TPM that can be unlocked with a fingerprint and then asked to encrypt secrets for the host/parent PC. The touch-bar just has more screen.
I'm surprised nobody's hacked the YubiKey app into making use of the touch-bar as the YubiKey "device."
Doesn't that kind of defeat the point of the Yubikey? I would think you would want to separate the key and the computer when you yourself are separated from your computer.
The copy on the Yubikey Nano's website says, "designed to remain in port." Removing it is still an option, but it doesn't sound like it's the point of having it.
You can also use the fingerprint sensor as a replacement for the YubiKey. Which works arguably as well - at least it satisfies the criterion that you separate the key and the computer when you yourself are separated from the computer.
It's not just a cosmetic issue. The people with the six inch cords are leaving them plugged into the laptop all the time as they walk around the building, just like the people with the USB-A ports are leaving their YubiKeys plugged in all the time.
The difference is that if they bump into somebody or something, that dongle is likely to get broken, and it could even damage the USB-C port in the MacBook. That's not going to happen with a Nano plugged directly into a USB-A port.
> I'd go further and say that Apple isn't basing their hardware strategy on the needs of any real person.
Real person here! I love the USB-C only approach. I love being able to charge from either side of my laptop, and I love having single cable to plug in at my desk for RAID/monitor/ethernet/charging/etc. I want my laptop minimal; if I want it to do more, I can add more.
I on the other hand wouldn't mind if they made the laptop a little bit thicker in exchange for better battery life.
When the discrete graphics chip is being used, plus a retina display and/or several dongles plugged in for ethernet, USB-A, or whatever, battery life on my 2017 MBP while doing development work is maybe 3-4 hours.
It was quite a shock to go to that from the 8-9 hours I used to get from my older MacBook air. Whereas previously I would leave certain apps (notably image editing apps or VMs) open while developing, I now find myself opening them, doing a task and then closing them so as not to activate the discrete graphics chip and drain the battery.
> I now find myself opening them, doing a task and then closing them so as not to activate the discrete graphics chip and drain the battery.
Sounds like what's really needed here is a way to tell macOS not to use the discrete graphics chip when on battery power.
Is there no way to do that? It's been a while since I used an MBP with discrete graphics. There used to be a menubar app for that, but I always felt it should have been an option baked into macOS itself.
There is an app for that (https://gfx.io/), but it doesn't switch automatically when battery is in use so you have to enable it manually (and you have to close any apps that are using the discrete chip first).
Mostly though I just use that app to know when the discrete chip is enabled, and then quit whatever program enabled it when I've finished with it.
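For what it's worth, there's also a command-line route worth checking. On some dual-GPU MacBook Pros, `pmset` exposes a `gpuswitch` key; this is an assumption about your particular model, since single-GPU machines don't have the key at all, so check `man pmset` and the output of `pmset -g` first:

```shell
# List current power-management settings; on machines that support GPU
# switching, a "gpuswitch" entry appears in this output.
pmset -g

# Force the integrated GPU (0 = integrated only, 1 = discrete only,
# 2 = automatic switching). Requires admin rights and a dual-GPU model.
sudo pmset -a gpuswitch 0
```

Note that, like the menubar app, this is a global setting rather than an on-battery-only one, so you'd still be flipping it by hand.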
They can only make the battery 99 Wh and still have it allowed on an airplane. The battery is already ~83 Wh, so sadly, there may not be much to gain there.
While that's technically true, it's worth noting that part of the weight and thickness savings in the 2016+ MBPr came from simply reducing the battery from 99.5 Wh to 76 Wh. I strongly dislike this change.
Just take a look at the Lifebook U937/938. Thinner and lighter than the Air, but has 1xUSB-C, 2xUSB-A, SD, HDMI and even RJ45(!). All the while sporting a 15W TDP CPU and slotted RAM. (And yes, battery life is good as well.)
I don't have the new body style of Macbook, I have a 2015, but I can tell you that when I switched from a PC laptop to the 2015 Macbook, it was thin enough that I could now travel with my laptop and an iPad in my carry-on. Before I had to make the choice of carrying only my laptop or packing a roll-aboard to be able to fit everything I needed for a work trip.
What does a thinner laptop enable you to do? What a question. Might as well ask why compact cars are more popular in the city than pickup trucks.
I had a similar experience back in 2012 when I bought a 13" MacBook pro after carrying around a PC laptop.
But, I wrote "at this point".
Does 3mm, at about the same weight, really make much difference?
I suppose incremental improvements(?) in "thinness" accumulate, and in 2028 we'll all be complaining that the new MacBook Pro at 2.8mm thick is easily bent and has no ports.
"At this point" does a marginally faster CPU make a big difference? Not to me, my "workhorse" laptop is a 2013. But if they stuck with a 2013 CPU in a 2018 Macbook, people would go ape shit even though for people like me, the faster processor doesn't change a single thing about how the machine works.
Way too often people on here see something and think "that doesn't help my use case, therefore it's a stupid decision" and never stop to think "hey, maybe someone has a use case I don't have, and maybe this helps them". To you, 3mm doesn't sound like a lot. To me, a few hundred MHz or another few GB of RAM doesn't sound like a lot. But you don't see me screaming in the comments thread "how dare Apple put a faster processor in their Macbooks instead of shaving off another few mm of thickness!" because that's just ridiculous. Someone out there is happy that the machine is faster, even if that person isn't me. I have the ability to put myself in someone else's shoes and be happy on their behalf.
Some people's laptops sit on a desk all day and that's fine. Mine doesn't, and every mm shaved off is another pair of socks I can pack into my luggage.
Always with the hyperbole. How would you feel if your laptop suddenly increased 2mm in thickness but didn't gain any weight? Most people wouldn't even notice the difference if you didn't point it out to them.
Definitely, they should put a SCSI port on the next version.
Weren’t we having this discussion when they dropped the floppy drive? Should they keep USB-A forever? When is it OK for them to drop USB-A? The original move to USB-A was met with resistance from people who were angry that suddenly their ADB or other older devices wouldn’t connect — but then device makers saw the advantages of USB and quickly moved to embrace it. They are starting to see the advantages of Thunderbolt 3, and eventually USB-A will go the way of the serial port — but not without the requisite hand-wringing and “get off my lawn” first. In 3 years, nothing but dollar-store junk will still use USB-A.
> In 3 years, nothing but dollar-store junk will still use USB-A.
Good. Then I'll give up my laptop with USB-A and buy one with USB-C only. In the now, where I live, many things still use USB-A. This is the main reason I didn't even consider a new MacBook Pro for replacing my 2015 one.
A while back someone posted an article comparing Nikon and Canon in the professional photography market starting in around the 1970s.
Tellingly, Nikon did what HN seems to want: they had a very conservative approach to "pro" model cameras, emphasizing backwards compatibility and respecting what their "pro" users asked for. Canon, on the other hand, experimented a lot and, despite not actually inventing many of the new ideas, was willing to try them out on "pro" cameras.
This difference of approach cost Nikon its leading market share among "pro" users. There may be a lesson there.
To add another point: I've had a 2017 MBP for over a year now and I don't mind that it only has USB-C plugs. I have an adapter for the rare cases when I want to plug in some older USB device.
I actually really like their push towards USB-C everywhere. The only major device I own that doesn't use USB-C is actually my old iPad. If they killed their proprietary adapter in favor of USB-C I'd be very happy.
"If I had asked people what they wanted, they would have said faster horses." -- Henry Ford
Nobody asked for USB ports when they had serial/parallel ports, or for the floppy drive to disappear, or to lose the VGA connector.
I don't know if Apple have made the right calls here (on either the USB-C ports or the keyboard), but they do have some history of making eventually-winning calls on at least some of these things. (They _also_ built the Newton, so I'm not gonna claim they _always_ get it right.)
It'll be interesting to see how these design decisions fare over the next 3-5 years...
> Nobody asked for USB ports when they had serial/parallel ports, or for the floppy drive to disappear, or to lose the VGA connector.
I don't recall being annoyed by those changes. In particular, the VGA->DVI transition was handled really well. Just about everything was DVI-I and would accept a VGA signal over a DVI cable, so you just bought a VGA->DVI cable and continued using VGA.
Macs with USB but no SCSI connector were a big cause of complaints where I was back then (I worked around lots of designers with SCSI Syquest and Zip drives, and all their client archives of high-res photoshop and page layout work on them), and _everybody_ in my circles thought Apple were insane for leaving floppy drives off... (These were a long time back though...)
To be clear, I remember those transitions being very smooth, but I was a Windows user. The first machine I bought without a floppy drive was my laptop in 2005 when CD-RW had long since replaced floppies on the Sneakernet.
My school was mostly iMac G3s and I vaguely recall the lack of a floppy drive being annoying once or twice, but the Windows machines had floppy drives and you could easily share files on the network, so it was never a big deal.
Heh - my school had Apple IIs and BBC Micros. My first year at Uni we used a VAX 11/780. I guess we had somewhat different experiences.
On the other hand, I think the most impressive and smoothest transitions I've ever seen in the entire personal computer space were Apple's transition from 68K to PowerPC processors and, almost as smooth, their transition from PowerPC to x86. I'm still incredibly impressed with the attention to detail they showed getting those major changes to work so smoothly for users.
> I don't recall being annoyed by those changes. In particular, the VGA->DVI transition was handled really well. Just about everything was DVI-I and would accept a VGA signal over a DVI cable, so you just bought a VGA->DVI cable and continued using VGA.
Just about everything was USB-C and would accept a USB-A signal over a USB-C cable, so you just bought a USB-C->USB-A cable and continued using USB-A.
I don't see a difference, except the part where you say the transition was handled well. I wouldn't say that about the move to USB-C...
There are a lot of USB devices that do not attach via a cable. You can't just use a different cable in that case, because you were not using one to begin with.
VGA devices always connected via a cable, so you weren't adding bulk. Even adapters for devices with integrated cables (grrr...) were pretty minor compared to the sheer heft of the VGA cable itself.
> Nobody asked for USB ports when they had serial/parallel ports, or for the floppy drive to disappear, or to lose the VGA connector.
I remember all of these technologies, and they all had major pain points. Serial and parallel ports were slow and needed tuning to work with IRQ/DMA (don't quote me on this), floppies were slow and unreliable and VGA was blurry on LCD monitors.
There's no current replacement for USB that does the job better. Nothing for HDMI. Same for SD cards. Nobody was asking to get rid of these ports so that they could use the next big thing. Apple had Thunderbolt 2 co-existing with USB on their older laptops; they could do the same with the newer ones. In fact, USB-C charging is limited to 100W, so 15" MacBook Pros running at their limit could power-throttle or lose charge during intensive workloads.
Indeed, and perhaps the height of absurdity: if you were to go buy a brand new iPhone 8 today, the cable that comes with the phone has a USB-A connector, not a USB-C connector, at the far end.
So you can't even plug a brand new iPhone 8 directly into a brand new Macbook Pro without buying a cable as a separate purchase.
Except you don’t need to, generally. Setting up a phone doesn’t require a computer connection and, since USB-A phone chargers are ubiquitous, it doesn’t make much sense to go USB-C on iPhone’s default cable yet because the days of the iTunes setup process are long gone. The primary use of that cable that comes with iPhone is for charging — and it does include a charger. Unless you are the .025% of iPhone owners that do development, your complaint really doesn’t matter. With AirDrop, AirPrint and iCloud, there really isn’t much point to connecting to a computer for the vast majority of users.
Generally except if you commute/travel via long distance by train or bus and there is only one power outlet per passenger, and you need your phone as a modem for your laptop. Or at an airport with limited outlet availability. Or at a coffee shop with limited outlet availability. Or if you prefer to back up to your laptop locally via iTunes.
>"Unless you are the .025% of iPhone owners that do development, your complaint really doesn’t matter."
Telling people their view doesn't really matter on account of a statistic you completely made up? Dismissing other people's views because their use case differs from your own? Brilliant.
I can't remember the last time I plugged my iPhone or iPad into my computer. And I can remember the last time I borrowed someone's USB-2/3 charger. It was like, two days ago.
I'm with Apple on this having been the right choice.
Yeah except Apple has failed to do that twice in a row now.
Both the iPhone 8 and iPhone X shipped after the introduction of the USB-C-only MacBooks. And both phones have shipped with only a Lightning-to-USB-A cable.
So instead of telling me to "remember my words" you might instead try to remember the facts.
I recall a review in APC magazine in mid-2003 of Lindows, a live CD Linux distribution, mentioning that you could pair it with a “cheap-as-chips 128MB USB drive for $70” for persistent storage. The phrase stuck with me due to its absurdity—you can get a lot of chips for $70! Now, fifteen years later, you can almost get 2000× the storage for the same price. (I say “almost” as APC spoke of 70 AUD which is somewhat less than 70 USD.)
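Back-of-the-envelope, the "almost 2000×" figure checks out (the 2003 numbers come from the comment above; the 2018 price point is an approximation):

```python
# Sanity-check the "almost 2000x" claim: in 2003, ~$70 bought a
# 128 MB USB drive; fifteen years later roughly the same money
# buys about 256 GB, i.e. around 2000 times the storage.
mb_2003 = 128
multiplier = 2000
mb_2018 = mb_2003 * multiplier   # 256,000 MB
gb_2018 = mb_2018 / 1000         # 256 GB
print(f"{multiplier}x of {mb_2003} MB is about {gb_2018:.0f} GB")
```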
I don't think I've used, or seen anyone else use, a USB stick in about a decade. We can't just keep these legacy ports around forever, cluttering up our devices.
USB dongles are absolutely standard in pro audio. Products made by Steinberg, Avid, and many plug-in shops will not work without a USB dongle.
Aside from a few high-end audio interfaces, most music hardware - including licensing dongles - is USB-A.
Of course you can buy the inevitable wart-on-a-cable adaptor from Apple. But it's both fragile and ugly, and when you have an iLok with thousands of dollars of software on it, you really don't want fragile.
Edit: to be fair, some of these licensing schemes are moving to cloud licensing. But that has issues of its own - and not all of them are moving.
The USB-C adapter is fragile and ugly? Ugly is debatable, but fragile seems misplaced. Having an iLok on the end of a short flexible wire seems far safer than having it rigidly protrude from the side of a laptop.
As for high end audio interfaces, replacing the included USB-A cable with a USB-C cable seems a trivial matter. And sooner or later they'll be included in the box.
They just need a cable that goes from their device to a USB-C port. That's it. Most of the time, the device in question has a USB B port on it so all they need to do is buy a USB B to USB C cable and they're good to go.
Sure, sometimes they need to plug in a dongle and will need an adapter until USB C dongles become commonplace but that is hardly an issue at all.
> USB dongles are absolutely standard in pro audio.
That is like saying everyone is using floppy disks, so why did Apple remove the floppy disk drive. You will need to use a dongle for a while, but everyone moving to USB-C/wireless is a good thing.
> I'm certain that until pioneer, m-audio, etc. jump on board, they will lose consumers.
I am sure Apple must have thought of that before pulling the plug on USB-A. The point is, this is a classic Apple move: let go of old tech and embrace the new.
I had stopped using floppy disks completely by high school. The reliability of floppies of that era was so poor that despite only having 56K at home, I'd start an FTP upload of my documents to my free 20 MB dialup ISP space when I left for school and by the time I arrived they were all safely there.
True, I guess. And I'm one of them. Sometimes I use numerous Linux, Windows or macOS VMs for testing stuff. At 5-10GB per VM, that's way too much for my system array. So I use LUKS-encrypted HDDs and SSDs via USB. With my pitiful uplink, cloud storage for hundreds of GB would be unworkable.
My wife, the professor, is finally going to replace her 2011 Macbook Air. She was excited to check out the new 13" MBP. Then she saw the touch bar (without a real escape), hated the keyboard after typing on it in the Apple Store, and realized she'd have to replace the dongles she uses to present to her classes (VGA and HDMI). What really killed it was when she realized she'd have to have yet another dongle to plug in her mouse.
She already has a Linux desktop both at home and at work, and asked me if I'd help her put Linux on a laptop for her. We're looking at the latest X1 Carbon.
I don't understand this. They're very, very clearly marketing the Macbook Pro to developers. Read their press release, they have a section devoted to Unity performance for goodness sake.
Look at the older 2012-2015 models - arguably the entry point for many modern developers, the devices that were so good and so reliable it cemented their love for the brand.
There's a difference between pivoting your target demographic, and just making a bad device and hoping professionals won't notice.
I prefer caps lock as control. Although for a while I ran in a hybrid mode where if I hit caps lock and released quickly it was escape. If I held it down and tapped another key it was control.
I’ve always set up my Caps Lock to be both an Esc and Ctrl key, so the touchbar change didn’t cause me any issues. Karabiner Elements is worth looking into if the virtual Esc key bothers you.
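For anyone who wants the tap-for-Escape, hold-for-Control behavior described above, it can be sketched as a single Karabiner-Elements rule (a minimal sketch; the description text is arbitrary, and the rule object goes inside the "rules" array of a file under ~/.config/karabiner/assets/complex_modifications/):

```json
{
    "description": "Caps Lock sends Escape when tapped alone, Control when held",
    "manipulators": [
        {
            "type": "basic",
            "from": {
                "key_code": "caps_lock",
                "modifiers": { "optional": ["any"] }
            },
            "to": [{ "key_code": "left_control" }],
            "to_if_alone": [{ "key_code": "escape" }]
        }
    ]
}
```

Once the file is in place, the rule shows up under the Complex Modifications tab in Karabiner-Elements and can be enabled from there.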
I have one of these in front of me now (been working on it for a little over a week, should have waited, d'oh!) and the virtual escape doesn't seem that big a deal. Even if my finger isn't on the virtual key at all, just in the middle of where the real Escape key would be, it still triggers Esc.
Go ahead and return it. Apple's pretty good about returns/exchanges after new product releases. I think you have 14 days from purchase to make an exchange, so hop on it if you really like the new systems.
Sadly, don't think I can. It's the company's laptop, we put the order in after WWDC assuming it'd be a few months longer before updates, and for various reasons I didn't get to setting this particular one up until recently. We're sadly past the flip date, but honestly I'm pretty happy with it so far.
I want a new one so bad, but I will not buy a TouchBar ever.
Also, the reason for thumbprint readers on phones is that passcodes are difficult to quickly type on phones. On a notebook keyboard it’s easy. I don’t want a thumbprint reader on my notebook.
So the hardware looks very good (maybe the best-looking laptops in the world?). The screen looks very good (high res + great color). The keyboard looks great, and some might even say the hockey-puck mouse looked great.
This is what sells.
I also think that Apple never intended to serve the pro market. They've tried at times, but brands like Microsoft, HP, Dell and Siemens are still too strong.
It's almost as if Apple aren't even basing their hardware strategy on what developers are complaining about on Hacker News.
Apple's market cap is over $900 billion, on its way to $1 trillion; so the strategy of not reacting to every complaint on the internet about their products, Hacker News included, seems to be working for them.
If Hacker News had a built-in prediction market (a la https://www.predictit.org), I would have made a fortune betting against what developers are complaining about on Hacker News.
Apple has never cared what their users wanted and they've never listened to them. Design, like shit, flows downhill. The difference between then and now is that Steve Jobs isn't around to tell them that the shit landing on their heads is really gold. As a result, they're not afraid to offend Steve and they're starting to develop opinions of their own.