Hacker News
The new MacBook Pro is kind of great for hackers (medium.com/ageitgey)
400 points by ageitgey on Nov 26, 2016 | 333 comments

I returned my MacBook Pro 15" earlier today. Here's why:

* Graphics card glitches. I paid close to $4K precisely so I wouldn't have to deal with quality control issues.

* The touchpad is just too large. I found myself resting my palm on it all the time, and sometimes clicking without realizing it. Also, if you like lying down and working (which I do sometimes because of a lower back problem), the size of the touchpad will make you work extra hard to avoid accidental clicks.

* I had the machine for ~10 days and used the Touch Bar less than that. Definitely not worth the money. Hopefully in the future they'll offer the 15" model without it.

* The Boot Camp experience just sucks (this was my primary reason for returning it). Currently, there's no way to gracefully switch between the discrete and integrated GPU, so the battery life is terrible, like two-and-a-half-hour-maximum terrible. gpu-switch doesn't work either. In fact, if you use gpu-switch you'll have to rebuild both macOS and Windows, as the machine will just hang when you try to boot into either.

* Recovery mode has many issues with network connectivity. A few times, I had to tether/connect to my iPhone hotspot for it to go through.

* Sharp edges everywhere.

The specs are very underwhelming too, but I was willing to tolerate lower specs for higher build quality. I actually just picked up an XPS 15 9550 from Micro Center: 2.6GHz, 16GB RAM (expandable to 32GB), 512GB SSD, 4K touchscreen for $1,350 (an open box; new for $1,499).

I don't want to discount your experience by any means. And there's no denying that the price is very steep for the new MBP. But, other than price, I'm really satisfied with the new 15" MBP.

In particular, I do actually really appreciate the larger trackpad. But I'm a heavy user of BetterTouchTool and have always regarded the trackpads as one of the main reasons to get a MacBook. I don't even bother with three-finger drag now thanks to the size of the trackpad.

I think the Touch Bar should be considered for what it is: a replacement for static function keys. Apple of course hyped it like they hype everything. But considered realistically in context I consider it a success. I actually do use it some. Some of the simplest things work the best; for instance, I really like the options presented when taking a screenshot. I also enjoy using it for music control, scrubbing through music, and switching between music sources (including YouTube tabs). Nothing revolutionary, but then again, how could it ever be given what it is?

Your note about three-finger dragging puzzles me. I’d thought now that the trackpad is larger, this particular mode of dragging would be more useful. With smaller trackpads I would constantly run out of space to end the drag without lifting my fingers. That’s why I got used to click-and-holding with my thumb and repeatedly swiping with the index finger to complete the motion. What am I missing here?

You should try the option "Enable Dragging with Drag Lock" in System Preferences > Accessibility > Mouse & Trackpad > Trackpad Options ...

Double-tapping and leaving the finger on the trackpad lets you drag the window or other item until you short-tap again. Drag lock even allows you to lift your finger and continue dragging from a different position on the trackpad.

> What am I missing here?

I don't run out of space anymore. I don't even have to do what you just described (repeatedly swiping with the index finger). I can often just click down with my index finger and drag a window all the way to where I want it without reaching the edge of the trackpad. I could never do this in the past, so I came to rely heavily on repeatedly three-finger-dragging windows around.

I also used to prefer three-finger drag because it was physically hard to click down and hold the click while dragging. But the Force Touch trackpad makes that much, much easier, and it also makes it so that I can always initiate the click anywhere, even at the top of the trackpad.

So it's really the combination of the larger trackpad and the Force Touch design that finally prodded me to stop using three-finger drag.

Thanks for clearing that up. My error was to have only drag operations in mind where letting go mid-way is not an option. So, yes, a dragged window stays where it is. I was thinking about a dragged item and how it bounces back to its original position or ends up in the wrong container if you let go too soon.

Not to rathole on this - but now I'm confused. When I click-drag something on macOS, I can click once, and repeatedly swipe with my index finger, lifting it from the touch pad each time to continue. This is true with every operation I can think of - selecting text, dragging windows, dragging items, etc...

In theory, a three-finger swipe is basically the GUI equivalent of a click and drag, so the behaviors should be identical.

Under what scenario have you found them not to be?

> The specs are very underwhelming too

I got an HP Spectre x360 (16GB, 512GB SSD, i7-7500U) for $999 (open box for $940). Build quality is top notch.

I am a heavy Mac user and I could understand the Apple premium. There was no real competition before, but now that's changed. There are good offerings from Dell, Lenovo, Razer, HP, Asus and many others now.

Where did you get that for $999? I got the exact same build from Best Buy for $1,199 and found it for $1,099 on HP's site.

Best Buy too. Link follows. At the time of purchase it was at that price (discounted $400). Sold out quickly. (http://www.bestbuy.com/site/hp-spectre-x360-2-in-1-13-3-touc...)

That points to the i7-6500U version. The i7-7500U is this: www.bestbuy.com/site/hp-spectre-x360/5617202.p?skuId=5617202 and the least I saw it at was $1,199.

I didn't even notice my mistake until I received the machine. Oh well, USB-C is a nice-to-have but not mandatory (HP blocks eGPUs anyway). Skylake vs. Kaby Lake is a very small difference, except for the graphics.

Except that Best Buy also made a mistake and sent me an i5/8GB/256GB version (silver instead of ash silver), which was promptly returned. I've used the refund to buy parts for a desktop ITX PC on Newegg. I'll be using a Chromebook or whatever when I can't be bothered to sit at a desk.

The Spectre is a beautiful machine though.

What were you using before? I just upgraded from a late 2008 MBP, and the difference is night and day. Computers only get noticeably better roughly every six years these days, so if you were using last year's model, then of course it's going to seem like a ton of money for no real benefit.

From my perspective this thing is at least 2x as fast in real world use, substantially lighter, way better screen and speakers. And the keyboard is awesome and the build quality is outrageously good.

I get that people are upset they can't edit 5K video or whatever, but you can get a desktop computer for that for less than half the price. Obviously they will be even better in three years once Intel has chips appropriate for the MBP that also support LPDDR4, but for now this seems like clearly the best computer on the market and very clearly aligned with where the industry is going.

If you're getting a desktop for half the price to do your real work, why buy the MacBook? How does being better than an eight-year-old laptop make it clearly the best computer on the market?

Some (most) people just don't need the raw power.

For instance, if you are an entrepreneur, any computer will have enough power, and other things become important. An entrepreneur needs to travel a lot, so long battery life is essential, low weight is essential, small size is essential, and having it just work (software-hardware integration) is essential.

You can't move a desktop easily.

As an entrepreneur and engineer I use MacBooks a lot. Heavy loads are handled by servers. In the past I used to compile my own Gentoo (Linux) installs and build my own desktops and servers from discrete components to get the best bang for the buck. Not anymore.

MacBooks are great computers in overall design, if you are careful enough to avoid first-generation designs (that applies to any product from any company). Once they iron out all the bugs, it just works.

So you don't need a powerful machine, but you're still willing to pay a premium for one because it's small with a big battery? That makes even less sense.

I can't speak about the MBP as I have only ever bought Windows machines, but I was in the market for a new laptop for my new software dev business (a one-man startup at the moment) and I looked at a few. Although I should say that my brother is still using his 2008 17" MBP to this day, so they do know how to build them!

My shortlist was the XPS 13, XPS 15, Surface Book and Latitude E7470.

I discounted the XPS first for having too many coil whine issues even after three generations[1]. In addition, I had read about the key travel being short (1.2mm, I think), but it wasn't until I actually tried one in PC World that I realised it's horrific to type on for any length of time. Not something I had experienced in the past. The 4K screen is wonderful, though.

The Surface Book, while having a nice keyboard and sumptuous screen, has a terrible warranty: 1 year of hardware support out of the box, and 3 years was around £350. Even then, all they would do is send you a second-hand replacement and take yours away. I read about some people who had been sent badly scratched replacements even though theirs was perfect. Too risky for a £2000+ machine.

The Latitude E7470 ticked all the boxes: easily expandable, tough, 14", and a screen res of 2560x1440. Also, I found one in the outlet store (scratch and dent) for £800 with a 3-year onsite next-day warranty. I haven't received it yet, but I have a 7-year-old one at home that still runs as a Windows Server 2012 R2 machine, and apart from having crap battery life, it still runs.

I hope your XPS is OK; it's stunning to look at in the flesh. But I didn't want to take the risk that my £2000 machine started whining only to have Dell say "it's by design", and I hated the keyboard, so I stuck with the slightly less glamorous but no less capable Latitude.

[1] - https://www.google.co.uk/search?q=xps+coil+whine

Great move. I keep hearing about small issues here and there on forums and Reddit for this first-gen notebook. Looks like many others are paying a premium to Apple to be their beta testers. I really hope Apple releases a non-Touch Bar version in parallel for a future refresh.

> * The Boot Camp experience just sucks (this was my primary reason for returning it). Currently, there's no way to gracefully switch between the discrete and integrated GPU, so the battery life is terrible, like two-and-a-half-hour-maximum terrible. gpu-switch doesn't work either. In fact, if you use gpu-switch you'll have to rebuild both macOS and Windows, as the machine will just hang when you try to boot into either.

Also a problem on previous models. I was working on my own GPU switching solution but gave up due to lack of time and the fact most of my applications work just fine under OS X.

I'd be interested in doing the same with Debian or a similar Linux, though.

What is an ideal Linux distro for programmers using the XPS 15?


I checked, but I couldn't find out if Dell still has that Sputnik programme they did a few years back, where you could order one of these with Ubuntu LTS preinstalled. I'm wondering if everything "just works" yet.

They do indeed still have the Sputnik program running. Barton George (from Dell) still updates his blog[0] with status info[1].

[0] - https://bartongeorge.io/

[1] - https://bartongeorge.io/2016/10/04/the-new-xps-13-developer-...

They're now called "XPS Developer Edition"

They do, and with Dell being a private company as well as Dell (the man) being a Linux user (supposedly), it'll probably keep being a thing.

elementaryOS of course! :)

The Razer Blade Pro has a different touchpad placement; it's on the right side of the keyboard. I also have some difficulties with the touchpad of my current laptop; I click it all the time.

I need to pick a new laptop for work (dev.) and I'm considering a MBPr 15" from last year or a XPS 15" with an i7, 16GB DRAM and a 256GB SSD.

If you have any heads-ups, I would like to know. Thanks.

Depending on your needs, a 256GB SSD may be cramped; I would do the 512GB if you can. I've filled a 256, but haven't had an issue with 512.

USB-C charging looks great on paper but disappoints when devices don't charge.

Your device may support a subset of the different USB charging protocols:

    * USB 1.1 low power: 5V/100mA
    * USB 2.0 high power: 5V/500mA
    * USB 3.0: 5V/900mA
    * USB BC (battery charging): 5V/1.5A
    * USB Quick Charge 1.0/2.0/3.0: proprietary Qualcomm standard
    * USB PD (power delivery): 5 profiles offering up to 100W (5/12/20V @ 1.5/2/3/5A)
MacBooks also use a nonstandard 15V USB-PD profile.
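For the skim-readers, the maximum wattage of each mode in the list above is just volts times amps. A quick sketch (the table mirrors the list in this comment, not an authoritative spec document):

```python
# Maximum wattage of each USB charging mode listed above (volts x amps).
# Values are taken from the list in this comment, not from a spec table.
USB_MODES = {
    "USB 1.1 low power":  (5.0, 0.1),
    "USB 2.0 high power": (5.0, 0.5),
    "USB 3.0":            (5.0, 0.9),
    "USB BC":             (5.0, 1.5),
    "USB PD top profile": (20.0, 5.0),  # profile 5: 20V @ 5A = 100W
}

for name, (volts, amps) in USB_MODES.items():
    print(f"{name}: {volts * amps:.1f} W")
```

That 0.5W-to-100W spread, all behind the same connector, is the heart of the problem.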

Unrelated to the MacBook, but problems I see with USB-C are:

Chargers may offer cryptographic signatures in the future for authentication against a whitelist at the device.

Second, and most problematic: the MacBook is a good citizen here, but many laptops (HP business series, Dell XPS series) only support USB-C PD with profiles 4 & 5 (20V/3A+). This rules out the car dongle as well as cheap USB power banks.

The connector is always the same, so the customer cannot deduce charger/device compatibility. The experience will suck.

Edit: typos/formatting

In other words, exactly like the early days of USB micro/mini? I distinctly recall having chargers from BlackBerrys that wouldn't charge my first Android phones due to being underpowered. I would imagine over time we'll see something almost identical, only with two unofficial categories instead:

One set of chargers will be for mobile devices and just support the highest standard we see in them.

One will be for laptops and the same - just supports the highest profile for them.

> In other words, exactly like the early days of USB micro/mini?

Overall worse, because USB-C may also be used for other connections (e.g. DisplayPort), adding to the confusion.

Plus all the different cables, which can do different things and may not support specific profiles of some things and others, yet all have the same connectors. Confusing as heck.

This is no harder than the issue of 8P8C connectors.

Depending on the cable configuration, pinout, wall plate and structured wiring system, that 8P8C might be usable (or not) for multiple different types of data networking, from the assorted Ethernet speeds to E1 to Token Ring, or for a serial console, or for delivering power and audio to a remote speaker, or HDMI-over-UTP, or even -48V telephony. And let's not even get started on the only-subtly-different but actually incompatible RJ45 connector, or people sticking RJ11 plugs in 8P8C ports.

And yet the world has coped with this proliferation.

Nontechnical people tend not to deal with multiple use cases for 8P8C connectors in everyday life. Despite being fairly technical, until I googled it I would have called them RJ45.

The generic term is 8P8C for the multitude of uses (and I've seen many of them in telecom); RJ-45 is one specific use: Ethernet cables.

Over time, as you've noted, the vast majority of uses of 8P8C have turned out to be networking. It will be interesting to see if USB-C likewise makes a similar evolution.

RJ45 doesn't refer to Ethernet. In fact, there doesn't seem to be such a thing as RJ45.

There's an RJ45S, but that's definitely nothing to do with Ethernet. Use of the term RJ45 is simply wrong, in any circumstance, as far as I can tell.

There's no term for the use of 8P8C for Ethernet purposes other than well, Ethernet, or its specific profiles ...BASE-T, as far as I can tell.

I almost always hear "RJ45" used to identify the 8-conductor Ethernet connector, female or male depending on the context. It is universally understood, and there is no confusion about it. There is nothing wrong with using it in everyday conversation.

I have never once heard the phrase "8P8C" used to refer to an Ethernet jack. Not once (outside of this thread). But I have heard it used when referring to various 8-pin telco connections; it was a common term of art in the '90s when describing telco installations that used that configuration. When talking about Ethernet, people trying to be specific usually reference EIA/TIA-568B/A.

There are certain words, like "bandwidth", that might technically mean the width of the band (typically in Hz) but have grown over time to refer to data rate as well. And that's cool; language is versatile that way.

This interesting tangent about common parlance for connector names demonstrates another way in which USB Type-C's adoption trajectory is characteristically similar to 8P8C: people are already giving it a technically incorrect common name, "USB-C".

> In fact, there doesn't seem to be such a thing as RJ45.

In fact, language and terms are not set in stone.

Well, not quite.

Neither chargers nor devices need support all charging standards/profiles, so they may disagree on working together. A charger that works everywhere (supporting all USB PD profiles up to 100W) looks outwardly the same as a profile-1-only one, and both may even say "USB PD" in the specs.

Thank god the MacBook accepts the widespread 5V/3A USB-PD power level and even USB BC.
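To make the mismatch concrete, here's a minimal sketch: both sides must share at least one (voltage, current) profile or no charging happens at all. The profile lists below are illustrative assumptions, not taken from any datasheet, and real PD negotiation is more involved:

```python
# Hypothetical sketch of USB-PD charger/device mismatch. Charging only
# happens if charger and device share at least one (voltage, current)
# profile; otherwise the device silently refuses to charge.
def negotiated_watts(charger_profiles, device_profiles):
    """Highest wattage both sides support, or None if no common profile."""
    common = set(charger_profiles) & set(device_profiles)
    return max(v * a for v, a in common) if common else None

power_bank   = [(5, 1.5), (5, 3)]           # a typical 5V-only power bank
xps_like     = [(20, 3), (20, 5)]           # a laptop that insists on 20V
macbook_like = [(5, 3), (15, 3), (20, 5)]   # also accepts 5V/3A

print(negotiated_watts(power_bank, xps_like))      # None: no common profile
print(negotiated_watts(power_bank, macbook_like))  # 15: charges, slowly
```

Same connector, same "USB PD" label on the box; only the intersection of profile lists decides whether anything happens when you plug in.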

Wow, that's more horrible than I thought. Thanks for the summary.

I am honestly befuddled by USB-C. The allure of a universal connector? That's kinda pointless when the cables look mostly the same but support different feature subsets. It's insanity.

It's bad enough that many manufacturers (I'm looking at you, Dell) don't differentiate between USB 2 and 3 Type A. C is so much worse for this.

Instead of a pretty good $1200 13" Air we have an $1800+ version that's lost MagSafe, gained a worse keyboard to be a hair thinner, and requires $200 in dongles to connect to anything.

Seriously, fuck you, Apple.

I don't know why you are getting downvoted (I guess it's the "fuck you" at the end), but I absolutely agree: if a cable fits in a port, it should just work. Anything else is horrible, user-hostile design. Apple sells an LG USB-C display, and if you use the USB-C cable bundled with the MacBook Pro, it doesn't work. And you don't get an error message; it just doesn't work.

Some companies, for example Nintendo, figured this out a long time ago. Notice how with their consoles, if the disc/cartridge fits in the console, the console will always play it, even between generations. The customer shouldn't have to research arcane names and study symbols on cables - if it fits, it should just work. And USB-C is just a mess at the moment.

> Notice how with their consoles, if the disc/cartridge fits in the console, the console will always play it, even between generations.

Well, that's not quite true. Both the Wii and Wii U have a standard-sized disc slot; on the Wii you can insert small GameCube discs into that slot and they'll play, but on the Wii U they won't. On the portable side, 3DS cartridges do have a tab to prevent them from fitting into a DS, but that wasn't the case for the handful of games exclusive to the brief-lived DSi.

Recent Nintendo consoles have also had compatibility issues with standard storage devices. The Wii supported SD cards, but wasn't compatible with SDHC cards, which are almost all cards with a capacity of 4GB or higher. This was eventually rectified with a software update... but the update only applied to the system menu, not to games which could access SD cards themselves, including notably Super Smash Bros. Brawl. The Wii U, for its part, supports storing games on external hard drives, but doesn't provide as much power over USB as most hosts do, requiring the use of a USB Y cable and a separate USB power source even for drives that don't normally require external power.

I plugged my new MacBook Pro into an OWC USB-C hub with the Apple cable and nothing happened. Tried the (annoyingly short) cable that came with the hub and it works.

No clue why.

The reason why is probably that he assumes a negative thing that hasn't happened and then rants against something that isn't a problem, while being a bit of a dick...

I haven't come to a conclusion with respect to those "shiny" new MacBook features yet, but I'm not an Apple user either. Time will tell, though.

I hope Dell will improve its power circuitry in the future and support USB BC and PD 5V/3A. Let's hope it's just that current power chipsets lack those modes because of time to market pressure and Apple is ahead of its competitors here.

I can understand your anger at Apple, I hear the same a lot from design and audio professionals... You may have got some downvotes for that last statement.

Exaggeration and insults don't improve your argument.

Having had the MacBook 12" for a year and a half now and a Nexus 6P for a year, it's really been quite wonderful. I can charge my laptop or phone using the same charger; of course not as fast as with the OEM charger, but wonderful for being on the move. I love that I can use a typical battery pack to charge my MacBook. I really wish everything of mine used USB-C, and it will soon.

Right now it feels wonderful with the OEM chargers, but you do have me worried about buying replacements and accessories in the future.

There will be the cheap Chinese chargers that will suck. Then customers will complain, and Belkin and others will notice and make good chargers with proper marks on the packaging. It won't take too long before customers and shops know what to buy or sell.

All chargers are Chinese, not only the ones which suck. Let's leave cheap shots at other countries out of here, thanks.

I have that car adapter as well as a USB power bank. Neither charges the Dell XPS 13, because it wants 20V/3A. The docs for both chargers and the Dell are light on the USB-PD details. The chargers are great for phones, so I didn't return them.

However, there's not a single car charger or power bank that does USB-PD at 20V/3A at the moment. So sad.

> Chargers may offer cryptographic signatures in the future for authentication against a whitelist at the device.

Who would ever expect Apple to do something like that? Oh wait...

That argument looks like FUD to me. That would be a surefire way to get shunned by the USB-IF and be forbidden from using official USB identification on your products.

Apple is already doing something similar on the new MacBook Pro, though: there is a software block on Thunderbolt 3 devices that haven't also been "macOS certified" by Apple.


No, sorry, the new MacBook Pro sucks for hackers. It's great for prosumers who like gadgets and benefit from USB-C. Hackers do not benefit from a closed box with non-expandable performance.

I get it that prosumers like to think of themselves as "hackers", but ...that's just not how it works. Come on.

In my own (possibly and probably inaccurate) opinion, I feel as though "hackers" aren't the people who need professional gear to do professional (or daily) tasks. They're the ones who can make the most out of as little equipment as possible.

"See that toaster over there? It's been reprogrammed to automatically deposit my cat's food every fourth hour and have it warm as well."

"That 2009 dinosaur of a smartphone sitting in the corner? It's an IP surveillance camera."

"That first generation Xbox, it's powering the zoom feature of the Hubble Telescope."

That last one might have been a bit of an exaggeration, but my point is that something isn't great because it's the latest. Something is great because someone increased its value by using it, or created something of higher value than the equipment used to create it.

> Hackers do not benefit from a closed box with non-expandable performance.

To address this point... some might. But not all will, and I certainly think fewer will than in generations before. I really hope USB-C will be as great as these companies say it will be, but until then I'll happily use my different ports that work as expected.

All professionals need professional gear to do professional work. That's kind of the definition of a professional - someone who can afford the right tools, and knows how to use them to get a job done quickly and competently.

Turning a toaster into a cat feeder is tinkering, not professional hacking. There's nothing wrong with tinkering. But it's the difference between wiring up a Raspberry Pi as a heating controller, and building a company that sells fully licensed and certified heating controllers all over the world with support infrastructure.

One is a hobby project; the other... isn't.

A useful definition of a professional tool is one that lets you forget you're using it because it's so transparently intuitive you never have to think about its needs.

I don't think the 2016 MBP does that. The ports are (literally) a side issue. The problem is more that Apple are thoughtlessly losing their reputation among professionals, because Cook, Schiller and co. don't seem to be thinking hard enough about what they're doing, and don't appear to understand what their professional customers are looking for.

...Which is not something super-thin for the sake of it, or with a gimmicky Touch Bar. It's something expandable, with ports that "just work", no physical or metaphorical rough edges, the option of decent memory (i.e. 32GB) and a reasonable processor speed bump.

This shouldn't be hard or controversial, but for some reason it seems to be beyond Apple's understanding.

I'm hardly a hater. I bought the 12.9" iPad Pro last week, and I'm loving it. But the laptop format is challenging because you either stay conservative or go fully experimental with (say) a dual-display clamshell, or even a touch panel instead of the trackpad.

Half-hearted innovations like a Touch Bar glued onto an ungenerous spec look like gimmicks for the sake of it, not serious attempts to improve professional productivity.

The original meaning of hacker is closer to the guy tinkering. Now it's everyone who works at a startup.

Don't get me wrong, but professionals are the ones who deploy Win 3.11 machines running on embedded i386 cores for big money today. It's more about meeting specs, getting certifications and providing reliability than it is about technical details.

Your definition is more bleeding-edge users who are constantly limited by technology and could justify paying a couple thousand for a 10% increase in performance. This group overlaps with, but is not equal to, the professional users.

Many real professionals will love the new MBP lineup while many more will hate it.

I still don't understand this seemingly rigid idea of what a "pro" needs in a computer. (High performance, who cares about the battery life or form factor.) Surely it depends what your line of work is.

I'm a programmer, mostly web-based apps and related servers. For my usage, I need an SSD (~512GB) and a recent CPU with 16GB+ RAM. The touchpad has been what kept me on the MBP... Looking at the Razer Blade Pro, I love the keyboard layout, but it's totally overkill for my needs.

An integrated GPU with a higher-end i5, a big SSD and lots of RAM for half the price would be more appealing to me... even a lower-res screen would be okay for my needs... love the form factor, though.

Everyone has different needs, as you said.

> I need an SSD (~512GB) and a recent CPU with 16GB+ RAM.

Can you expand on that? I've seen this with folks in the movie/production industry but for webdev it seems like that's more of a want.

What's your daily task load, where an integrated GPU and 32GB of RAM are taxed?

I didn't say an integrated GPU and 32GB are taxed... I said I needed an SSD bigger than 256GB (512GB), and at least 16GB of RAM. However, in many corporate environments, if you need those, you get a hefty machine.

If it's being taxed, it's probably because some idiot spec'd an HDD that pulls resources too slowly. For modern JS dev, you really need an SSD more than anything else, mainly because the build/watch process is tracking many thousands of small files, which is significantly worse on an HDD: 60+ seconds vs. under a second for any change to take effect in the browser. This can be as much as an hour a day wasted. The 5 hours of wasted time over a few weeks cost more than the upgrade to an SSD.

-- edit

As to 512GB: after all the software, that can take 100GB... creative assets well over that, depending on the projects... it's easy to hit 240GB between the OS, software, projects, and assets.

Beyond that, show me off-the-shelf hardware that can be configured with 16GB+ RAM and a 512GB SSD that doesn't have the other stuff I don't need.
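The hours-wasted claim above is easy to sanity-check. Assuming (hypothetically) 20 rebuilds a day at the quoted 60s-on-HDD vs. 1s-on-SSD times:

```python
# Back-of-the-envelope for the HDD-vs-SSD rebuild claim above.
# The rebuild count is an assumption; the per-change times are from the comment.
rebuilds_per_day = 20   # hypothetical: one change every ~25 minutes
hdd_seconds = 60        # "60+ seconds" per change on spinning rust
ssd_seconds = 1         # "under a second" per change on an SSD

wasted_per_day = rebuilds_per_day * (hdd_seconds - ssd_seconds)  # seconds
wasted_hours_3_weeks = wasted_per_day * 15 / 3600                # 15 work days

print(f"{wasted_per_day / 60:.0f} min/day, "
      f"{wasted_hours_3_weeks:.1f} h over three weeks")
```

With those assumed numbers that's roughly 20 minutes a day and about 5 hours over three work weeks, which is in the ballpark of a consumer SSD's price in billable time.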

>The problem is more that Apple are thoughtlessly losing their reputation among professionals, because Cook, Schiller and co don't seem to be thinking hard enough what they're doing, and don't appear to have an understanding of what their professional customers are looking for.

I just wanna know what cable they use when they need to plug their iPhone into their MacBook. I don't seem to recall Apple selling a usb-c to lightning cable.

Well said. A hacker can get by with a Raspberry Pi with wires hanging off it, while the self-proclaimed "pros" whine about memory and performance whilst demanding the latest slick design.

I'm a hacker. I just don't hack on computers, so I'd rather not have to waste time hacking the computer for me to do other hacking on.

I find it weird that the people whining about this usually aren't impaired by it in their day jobs. I hear people shouting about wanting GPUs and multicores and 16GB, and then it turns out they're doing React dev professionally. What do you need that monster for when doing that kind of dev? When I do 3D game dev, ML or image processing/recognition I need something heavy (although... I really only need that locally for the first item on that list). The rest I can do on mostly anything from after 2009... including React dev.

For me (closer to the React guy), I want an SSD and plenty of RAM... a high-end Core i5 with an integrated GPU will do... but in many corporate environments there are only a handful of build options, so you get the uber machine or 8GB RAM with a spinning-rust drive.

I mean, I can "get by" on anything - an iPod Touch that runs Emacs 23, just to pick an example off my desk. When the chips are down, I'll get the job done with it. When the chips aren't down, and I have something more closely approaching my druthers with regard to the tools I use, I'm not going to go for the iPod.

The hacker as reverse engineer benefits a lot from low-level access.

I agree that USB-C might be helpful in that respect, but it totally depends on the drivers. Whether there is a device that I can make hiccup through timing attacks depends on what liberties I have in the driver.

With Bluetooth, for example, you have Ubertooth. If the Bluetooth radio on my laptop were accessible at a low enough level, there would be no need to use other hardware to execute attacks.

I'm just a hobbyist with a JTAG programmer and some digital analysis tools; nothing fancy. A professional reverse engineer is another beast, with microscopes, etc.

A hacker in the form of someone exploiting vulnerabilities in web applications benefits not from a single computer, but from many. I don't think he/she would care about the specs too much.

> They're the ones who can make the most out of as little equipment as possible

... like a $4k laptop?

It's not like Apple is the only company to support USB-C on laptops... in fact, as far as I can tell, every major manufacturer has it on their most recently-released laptops.

> To address this point... some might. But not all will.

How, exactly, would someone... anyone... BENEFIT from closed-box performance? Just because someone might not benefit from a laptop you can upgrade doesn't mean the converse... that they somehow benefit from having one that you can't upgrade.

(side note: I can think of ways... decreased cost, decreased size... but the laptops certainly don't cost less, and any difference in size w/ a 15" laptop is negligible)

Actually, it was an 8,192-processor MasPar at the University of Central Florida that did the image correction for the Hubble Telescope before corrective lenses were installed.

The MasPars were fun to play with. ;) So you're not that far off on your XBox comment!

By this version of what a hacker is, it makes it sound like Apple is saying "This is a sub-par piece of equipment, but you'll make the most of it because you're a hacker who's more productive with less!" I agree with the definition on its own merits.

I think it's funny how signaling things like "expandable performance" and "deep key travel" get tied to being a hacker. The only things you can upgrade even on an expandable laptop are memory and disk, which a hacker is probably maxing out to begin with. (Even expandable laptops are pretty limited these days in how much RAM the chipsets can handle.) And given the high resale value of Macs, your typical Silicon Valley worker will probably spend more on lattes than they'd lose by simply selling their Mac every couple of years and buying a new one.

HP ZBook first gen here: I can also replace the CPU and GPU. I haven't done it and never will, but it's in the user manual.

I upgraded the RAM and swapped the DVD drive for a 1 TB SSD. The HDD is still in there, set to auto-shutdown after 5 seconds. I use it for storing large files I don't need on the SSD but that are handy to have available sometimes, like raw videos from my camera.

I'd like to replace the keyboard with one without the number pad. Possible in theory, but there is no part on the market that fits. The only 15" laptops without a number pad at the time were the Mac and, I think, the XPS, which was overheating. Maybe the problem with the latter is solved now.

What does "hacker" even mean? I don't understand what this has to do with how suitable the MacBook Pro is.

I'm a developer and this laptop is as great as any other, for me. But USB-C has very little to do with that. USB-C is convenient (until it's not) for anyone, regardless of 'hacker' or not.

From what I've understood, the definition of hacker is:

A person who uses computers to gain unauthorized access to data. [1]

So in this context, it's almost entirely controlled through software. One can be a hacker with hardware, but as of now I doubt USB-C is going to make anyone as much of a hacker as a specific type of software might. And this new MacBook has very little to do with changing how real "hackers" might do things.

[1] https://en.oxforddictionaries.com/definition/hacker

I admit I'm guilty of misusing the word, but if I ever get into a serious conversation or argument I'd be sure to use the dictionary definition.

That's not the original definition of the word, or what it means in this industry. Google "MIT hacker" for something closer to what the OP is talking about.

The jargon file [http://www.catb.org/jargon/html/H/hacker.html] is a more appropriate source of definition here. It defines the term "hacker" as follows:

"hacker: n. [originally, someone who makes furniture with an axe]

1. A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary. RFC1392, the Internet Users' Glossary, usefully amplifies this as: A person who delights in having an intimate understanding of the internal workings of a system, computers and computer networks in particular.

2. One who programs enthusiastically (even obsessively) or who enjoys programming rather than just theorizing about programming.

3. A person capable of appreciating hack value.

4. A person who is good at programming quickly.

5. An expert at a particular program, or one who frequently does work using it or on it; as in ‘a Unix hacker’. (Definitions 1 through 5 are correlated, and people who fit them congregate.)

6. An expert or enthusiast of any kind. One might be an astronomy hacker, for example.

7. One who enjoys the intellectual challenge of creatively overcoming or circumventing limitations.

8. [deprecated] A malicious meddler who tries to discover sensitive information by poking around. Hence password hacker, network hacker. The correct term for this sense is cracker.

The term ‘hacker’ also tends to connote membership in the global community defined by the net (see the network). For discussion of some of the basics of this culture, see the How To Become A Hacker FAQ. It also implies that the person described is seen to subscribe to some version of the hacker ethic (see hacker ethic).

It is better to be described as a hacker by others than to describe oneself that way. Hackers consider themselves something of an elite (a meritocracy based on ability), though one to which new members are gladly welcome. There is thus a certain ego satisfaction to be had in identifying yourself as a hacker (but if you claim to be one and are not, you'll quickly be labeled bogus). See also geek, wannabee.

This term seems to have been first adopted as a badge in the 1960s by the hacker culture surrounding TMRC and the MIT AI Lab. We have a report that it was used in a sense close to this entry's by teenage radio hams and electronics tinkerers in the mid-1950s."

>No, sorry, the new MacBook pro sucks for hackers. It's great for prosumers who like gadgets and benefit from USB-C. Hackers do not benefit from a closed box with non-expandable performance.

You'd be surprised. Linus Torvalds, a hacker one would presume, used to have a MacBook Air as his primary laptop and praised it as the best laptop ever made, saying that other companies had failed to produce something as good (though after 2014 he moved to a Chromebook). And the reasons he gave for praising it are exactly what people think Apple puts too much emphasis on: thinness, lightness, and battery life.

Of course that's a single data point (though for more data points you can go to any software conference, where Apple laptops usually dominate among both the speakers and the audience, even though they are less expandable and have less performance than some gaming/desktop-replacement laptops).

An Apple laptop can fit well with some hackers, for at least two reasons, I think:

First, tinkerer != hacker. A hacker can be a tinkerer, but it's not the same thing. There are people who love to hack on specific things, create new stuff, etc., but couldn't care less about dealing with the hardware or customizing their window manager. A lot of the hackers I know in fact tend to be quite minimalistic in those areas.

Second, a hacker isn't necessarily all about raw performance and 8GB graphics cards. A lot of hacker types and great programmers in C or whatever can do wonders with very little hardware.

Now, other kinds of professionals, like 3D artists, number crunchers and such, might definitely want more GPU/RAM. But even for a lot of creative professionals, stable and "what we know" trumps "latest and greatest". Most professional music studios I know, for example, run setups that are two or three generations old, and never jump to the latest OS until a year or two after it's out.

In the photography world, where I dabble (and once did professionally), we have this notion of "measurbators" and "pixel peepers": those who are obsessed with ISO performance, megapixel count, sharpness, synthetic tests of camera gear, and so on, but seldom create any output of particular worth. I guess the same goes for PC people obsessed with benchmarks beyond a particular point, especially if their workflows don't need them. A hacker, in this regard, would be the opposite. This guy is a hacker, photography-wise:


Plus the whole article sounds like an absurd mouthpiece for Apple.

USB is a standard, so now "hackers" are supposed to have a raging lust for standards on Apple hardware?

"Don't worry, the dongles are USB standards too." I don't give a rat's ass whether the dongles are standard or not. I am not lugging around dongles to connect my phone to my laptop.

I use Ubuntu and a OnePlus 3, and USB 3.1 Type-C works like a charm at 10 Gbps. I got 100 really good quality cables for around $2.50 each, and have used up around 25 around my house, office and car (removed link).

I am not spending 1 bazillionton dollars on the same or worse quality Apple cables.

25 USB cables all for connecting your phone to your computer?

> Hackers do not benefit from a closed box with non-expandable performance.

Agreed, and well said. This one line concisely sums up the first and last word on this whole pitch.

After a week with a 13 inch, what I've mainly noticed is how physically unwelcoming it is:

The edges are very sharp, and the air vents on the bottom are right where you grab the laptop to pick it up, which gives it a knife-like feel. Also, by expanding the trackpad far beyond its useful size, there is now no gap between it and the space bar. I've discovered that I have a habit of resting my thumb just below the space bar, so now I tend to bump the pointer by accident. The arrow keys are now a continuous run of keys with no way to orient quickly, unlike before (when the left and right arrows were slightly smaller, making it obvious where the up key was without looking). Finally, the key travel is very short, as you would expect with such a thin laptop.

I do most of my work with an external keyboard and monitor, so it isn't that big a deal to me, but I can see it being hard on people who use their laptop exclusively.

> I have discovered that I have a habit of resting my thumb just below the space bar and now I tend to bump the pointer by accident

This is surprising and disappointing. I really enjoyed using Apple's touchpads with tap to click because their accidental press recognition was good enough that I never really worried about this. I typically had to turn tap to click off when using Linux because I would randomly click while typing.

I wonder why they haven't been able to get it to work as well on the new touchpad.

Just speculating here, but this is precisely the sort of thing Apple generally fine-tunes and fixes with software updates in very short order. I would expect it will be so this time, too.

> I typically had to turn tap to click off when using Linux because I would randomly click while typing.

Palm detection should typically be tweaked through various driver parameters.
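As a concrete illustration: with the X11 synaptics driver (one common Linux touchpad setup at the time; the option names below are from that driver, and the threshold values are just guesses you'd tune per machine), palm detection can be switched on with a snippet like this in `/etc/X11/xorg.conf.d/`:

```
Section "InputClass"
    Identifier "touchpad palm detection"
    MatchIsTouchpad "on"
    Driver "synaptics"
    # Ignore touches that are wider or heavier than a fingertip
    Option "PalmDetect" "1"
    Option "PalmMinWidth" "8"
    Option "PalmMinZ" "100"
EndSection
```

The newer libinput driver handles palm rejection automatically instead of exposing knobs like these, which may be part of why experiences differ so much between setups.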

Not sure why you're being downvoted; I appreciate the advice.

However, I don't think all drivers have palm detection unless I'm mistaken?

Also, unfortunately while I would have enjoyed configuring the system to my personal desires, these days with the limited free time I have, I'd rather be able to just do what I was going to do on my laptop instead of having to spend time tinkering with it to get it right. Apple seems to do very well in this respect since it almost always knows when I want to tap vs when I just have my palm there.

"...what I've mainly noticed is how physically unwelcoming it is: The edges are very sharp..." - I noticed this too on the early Microsoft Surface power bricks (not sure if it applies to the later Surface models). The power bricks had unusually sharp edges; at the time it struck me as unnecessary and not human-centric. I wondered if the designer ever had the opportunity to interact with the tactile experience before it went into production. Has the design process broken down, so that designers no longer see a production prototype before they sign off?

>>The edges are very sharp

Just curious, have you used Mac laptops previously? Because sharp edges are something that has existed since the original MacBook Air.

I do agree they are annoying though. I always have outlines on my wrists after using my laptop on my lap for a while.

Yes, I have the 2014 version of the 13-inch MacBook Pro. The increased sharpness is very noticeable to me, particularly where the underside vents are now placed, creating that knife-like feel when you pick it up.

Sharp edges seem to be a trend that has migrated to other manufacturers as they ape Apple's design. My three-year-old Lenovo and new work-supplied Dell are both unforgiving to type on for the same reason.

>since the MacBook Air.

I'm using a late 2011 MBP and the corners where you open the top are deadly.

My 2007 Macbook had this issue as well, part of why I went with a ThinkPad for my next laptop.

I am disappointed by the Touch Bar. It is such a small and dim screen and a lot of attention is required for interacting with it. There is no tactile feedback, so touch typing is impossible. QuickType and Emojis are useless for me. It might be a good accessibility feature, but QuickType is much too slow and the upward movement of the entire hand/arm interrupts the flow of typing. I think the Touch Bar is great for designers who can benefit from a 'general purpose touch/slider input device' (e.g. for color mixing, navigating timelines, parameter fine-tuning). It might also reduce cognitive load a bit since it reduces the need for memorizing keyboard shortcuts. On the other hand, users who are blessed with a good memory will probably not benefit very much in that regard because pressing a key combination is much quicker than scanning and touching the Touch Bar.

The only really attractive feature of the Touch Bar model is Touch ID. If Apple sold the function-keys model with a Touch ID power button and one more USB port, I would happily buy it. But right now, I am a little bit confused and baffled by Apple's new MacBook Pro product line. I think I am going to wait another generation to see whether Apple gets back on track and whether the Touch Bar stands the test of time.

I think Apple is a little bit lost without Steve.

Most amusing comment in this thread, but in all seriousness I think that's quite true.

The function keys were never used for anything in OSX, as the "culture" is to use key chords instead. So even if it's not used much, I think it's an improvement.

I use the function keys a lot for changing volume, screen/keyboard brightness, and pausing/playing music. I also have Terminal and Firefox bound to F3 and F4, I make heavy use of ESC for all kinds of things (vim, closing UI dialogs), and I occasionally use the function keys for shortcuts such as finding the next occurrence with F3 in Foxit Reader, F10 to compile, F7 to show the desktop, and F6 to switch between Karabiner profiles. For me it would be quite a loss.

They're heavily used in JetBrains IDEs. Lack of them makes OSX unsuitable for multi/cross-platform development.

I bought a new 13" MBP with touchbar and I'm returning it on Monday. I don't like the keyboard and I _really_ don't like the touch bar, and I seem to only get about 3 hours of battery life. I'm going to stick with my early 2014 MacBook Air until Apple figures their stuff out.

Making any judgments like this on battery life after a single day or even a few days of use is VERY unwise.

The main reason is that your system needs to index your entire drive; until that process completes, it will draw MUCH more power than typical use. You should give a new system at least a week to shake out before freaking out about battery life.

I've also noticed the battery life is quite bad. My first one might have been defective, as it only gave about 3 hours of battery life (from the in-computer estimate), which is the same as my 4-year-old 2012 MacBook Pro.

My second 13" Touch Bar machine now reports about 5 hours, which is still not great.

Did you double check that Spotlight isn't still indexing, etc.?

Also, for what it's worth, I love the new keyboard. It took me a few days to get used to it but just yesterday I measured myself typing on it at 130 wpm, the same speed I can do on normal keyboards. It's very satisfyingly clicky.

The internet decided the keyboard is terrible. Did you miss that memo?

...I actually really like it too.

No, I decided that the keyboard is terrible. Does everyone who agrees with you have hands-on experience, while everyone who disagrees got their facts off the internet?

It was more a comment on internet groupthink and how places like Reddit become echo chambers where people complain about things they haven't even tried themselves.

It is a perfectly reasonable opinion to like or not like the keyboard, if you have actually tried it for a while.

I have hands on experience: it's a great keyboard if you're not a clicky button-smasher.

I am not, so... works for me.

I'm guessing you're more heavy in terms of key strike?

Yes, I've been hearing about the battery life from several people now. I was very excited to get one, but I tried it at the store and really didn't like the keyboard. If, in the future, Apple decides to do the whole keyboard as a screen, then I guess I'll just have to always use an external keyboard.

The keyboard feels like a plastic tube – including about the same noise. I'm glad that my student days are over and I don't use my MBP in the library anymore!

I have one of those: Lenovo Yoga Book - the virtual keyboard uses haptic feedback, and I actually like it less than the new MacBook Pro keyboards.

That said, while I feel that the new keyboard is slightly better than the last new keyboard, the sound of it makes me want to rip the keys out - it is very loud in a not-so-good way.

The keyboard immediately made me decide never to buy such a device. It is absolutely terrible.

Same here. I find it, on the other hand, very plausible that some people will like it in theory, but a lot of the good things I heard about the keyboard seem to come from people who just received their computer. I generally take that kind of feedback with a grain of salt, as buyer's remorse bias is now a very well understood phenomenon :D

Just to add some anecdata: I tested the keyboard for 10 minutes with TypeRacer in an Apple Store and I found it to be OK. The clicking certainly does provide enough feedback and since I am used to mechanical keyboards it also does not bother me how loud it is (I actually like loud keyboards). I also liked the short key travel.

I tried it too and I actually don't like the short key travel. It feels lower quality than the old keys.

I remember when I first got my MacBook with the Force Touch trackpad (the one that doesn't physically move but instead uses haptics to simulate the 'click' feeling) and how much I hated it initially. The illusion of the click worked fine, it just wasn't nearly strong enough. A week later, I love it and hate the previous buttons, which just feel mushy and archaic.

Similar story with the iPhone 7 home button, except I've just come to accept it and ignore it, though I wish we still had a proper physical home button.

The smaller battery is disappointing. Yep, the CPU uses less power, but that only counts if your MBP is more or less idle. In the end, footprint matters, not thickness… Apple gave up too much battery capacity, IMHO.

The non-TB version is nice, with a lower-power CPU and a higher-capacity battery, BUT it only has 2 ports instead of 4, which is nothing short of absurd. They really screwed up on this front, IMO.

Hell yeah! I sincerely regard those 2 ports on the cheaper model as simple hardware crippling, much like AV manufacturers punish you with fewer audio/HDMI ports if you go for their entry models.

What's the use case for needing 3 USB ports? Wherever you normally sit (home or work), just get a USB hub/dock and use a single connection to the laptop for keyboard, mouse, charging, and driving a monitor.

When out and about, when do you need 3? I can't ever recall seeing someone at a coffee shop, conference, etc. needing 3 USB ports.

Sometimes I work at home and hook my laptop up to my keyboard, mouse, monitor, and outlet at the same time. Admittedly I don't do this very often but it's nice to be able to do this as I need to.

I think the idea is you would have a USB-C hub that has all of those ports on it, so you come to your desk to work and plug in a single plug and away you go with charging/monitor/keyboard/mouse/ethernet.

3 seems like a minimum to me: power, mouse (yeah, Bluetooth, but the only ones I like require a dongle), and phone. My current laptop has two USB ports, and that covers about 99% of my needs, but it's also frequently full.

Charging + peripherals makes 2 useless and even makes 4 difficult, unless you go with a new TB3 monitor that also provides power.

As I speak I'm using MagSafe + Thunderbolt 2 + HDMI + 2 USB, 1 for peripherals and the other for playing back video.

I don't really mind 2 ports instead of 4, but I wish they put one on each side so I could charge from either side.

Try the non-Touch Bar model. Much lower-power parts (the CPU draws half as much), no Touch Bar of course, a real ESC key, and a larger battery.

I've been on the non-touch 13" for about a week now and I'm really happy with it. I moved over from a 2012 11" MBA. The form factor of the 13" is great for grabbing 30 minutes here and there during travel to do some work.

Battery life is good, and 16GB is a big step up from the 8GB I had on the MBA. I've been running a decent-sized system in minikube with no problems.

The only issue I still seem to have is that Hangouts absolutely obliterates the battery. A one-hour call took the battery life estimate from 9:45 to 1:15!

Been using the non-touch bar version for a little over a week now and I'm averaging 10-12 hours of battery life. This is one great machine.

I bought a 13" MBP with the Touch Bar too, and I am also getting bad battery life, on the order of 3-5 hours depending on the workload. I've ordered a 13" without the Touch Bar in hopes that the lower-wattage CPU, lack of the Touch Bar, and bigger battery will result in much more battery life.

Apple says about 10 hours of battery life on their website. Three hours seems like something is really wrong.

I get 8-10 hours easily on mine. With VSCode, Chrome, XCode and Android Studio open.

With AS? Yeesh. Mine (previous gen) will drain in about 2 hours with that or PyCharm open. IntelliJ stuff is super power-hungry unless you put it in power-save mode (and then it's not even as useful as a "plain" editor like Sublime Text or something).

I mean, I like all the power-hungry features, and I'm usually plugged in - I'd prefer they keep adding useful things rather than worry about my battery. But it does hurt at times :)

Maybe initial Spotlight indexing.

When have companies ever published honest battery life estimates?

My experience with two 2015 rMBPs and an Air has shown that for light work, 10 hours is a completely reasonable estimate, which matches what Apple estimates.

Of course driving an external monitor, streaming video from the Internet, and using your computer for more than just word processing and web browsing, your system will use more power.

Ditto on the 2015 rMBP. The moment you get beyond light work you're looking at 5 hours but for light internet browsing it really does push towards 10. It's awesome.

Well, my almost year-and-a-half-old Dell Inspiron 5558 gives me 5 to 6 hours (it gave 7 to 8 when new) with just web browsing. I don't think that's all that great.

And mind you, I am the kind of person who has managed to accrue about 13 months of hard disk head flying hours according to the SMART stats, so the battery isn't exactly in good condition.

Apple advertises "up to 12 hours" battery life on the Macbook Air 13". I reliably get over 14 with work and web-browsing.

Is the keyboard like the one on the MacBook? I tried it once in an Apple Store and didn't like it either. It feels weird to not really press the keys.

It took me a couple of days to get used to the Macbook keyboard. I had a similar transition time the last time Apple made a major change in their keyboards.

I'm kind of astonished at how many people turn their noses up at different things based on a first impression, even in groups that you might consider early adopters of tech.

> I'm kind of astonished at how many people turn their noses up at different things based on a first impression, even in groups that you might consider early adopters of tech.

Some early adopters of tech don't enjoy enduring regressions, and can see the regression from a mile and a month away.

I'm especially grumpy when I see absurd, hyperbolic marketing used to sell anti-consumer regressions.

Doesn't this just support his point? You're more or less saying that you wrote off the keyboard as a regression at least a month before having the opportunity to even try it.

I've used the keyboard for a week now and have completely adapted to it. It doesn't feel like a regression to me at all.

Maybe keyboards were already a solved problem. People had no desire for change here. I find recent Apple quite out of touch with its market. They push on because it was 'Apple genius' before, without checking whether the crowd still dreams about that. A bit lazy and self-centered.

>>Maybe keyboards were already à solved problem.

This is a terrible attitude for a technologist to have. Cellphones were a "solved problem" before the first iPhone came out. Laptop displays were a "solved problem" before the first Retina MacBook Pro came out. Look where we are now.

Of course, this doesn't mean every change made to an existing technology will be an improvement and/or be embraced right away and by everyone, but I see it as a Good Thing that a major technology player is still experimenting with things that everyone else has written off as "already solved."

On cellphones I can agree; on the Retina display, not at all. This was not just about technology but about how Apple approaches its products. The latest MBP feels too much like technology solving non-problems: the bestest trackpad (too big, "no" palm rest), the thinnest keyboard (no tactile feedback, you need to spend a second making sure your fingers are in the right place), the dynamic Touch Bar (too dynamic, forcing people to leave the content and look down). All this is surrounded by the feeling that what Apple has done since the iPod/iPhone, cross-coupling all their technological improvements along their product line (the iPod helped the iPhone, the iPhone helped the MacBook Air, and so on), is now applied blindly because it worked before and Apple has to occupy the terrain and grab the spotlight.

Smartphones are a regression with respect to being phones. The iPhone is a mobile computer, which wasn't a solved problem when it came out.

I didn't want to say this, but indeed calling someone with a 2016 smartphone isn't quite an improvement. If I were a genius, I'd retrofit my old Motorola V50 with a few sensors and WiFi capabilities.

> It took me a couple of days to get used to the Macbook keyboard

To be (un)fair, you can get used to a rock in your shoe as well. That doesn't mean you should.

Having to "get used" to a keyboard to return to your previous typing ability seems to be a contra-indicator of the new Macs having a good keyboard.

That's reading a lot into my words. I was happy to "get used" to the new keyboard because I thought it was awesome that my laptop had become thinner and lighter... two things that I value more than having a keyboard that might be slightly better.

Thinner and lighter only matter when carrying it around. Keyboards matter when you're using the computer. Most folks who use their computer 8 hours or more a day that I know tend to do more with the keyboard than they ever do lugging it around. So the keyboard really matters, even in small increments.

I'm also going to make what I think is a reasonable assumption and say that a vast majority of folks are going to carry their laptops around in bags. In which case, the 7 ounces saved (for reference sake, this is about the same weight as a 45 watt power brick, minus the extension cord) is nearly meaningless, and the half inch on each dimension will be completely meaningless.

If that were true, 13" wouldn't be the most popular laptop size.

The difference between 13" and 15" for size and weight is much greater than the size and weight savings afforded by the new keyboard. Also, both the 13" and 15" MacBook Pros have the same keyboard.

On a side note, I think that a 13" monitor hits a sweet spot. It fits in a lot of bags, has just enough screen size to be productive without requiring an external monitor, and the performance difference between 13" and 15" computers is usually quite small (if it exists at all).

> The difference between 13" and 15" for size and weight is much greater than the size and weight savings afforded by the new keyboard. Also, both the 13" and 15" MacBook Pros have the same keyboard.

It's a cumulative process. My first MacBook was a white 13.3" polycarbonate Core 2 Duo. The new 15" MBP is a pound lighter, almost half as thick, and only half an inch deeper and an inch wider. Samsung has a 15" laptop that's under 3 pounds, making it very palatable even for people used to a 13-incher.

At what point along that evolution should Apple have stopped striving to get smaller?

Good point, but the price has something to do with it too.

To me that sounds like you're the target market for what was the Air, or what is now the MacBook. The MacBook Pro is their top of the line and shouldn't diminish something as important as the keyboard, which people use all day, for an extra mm off the thickness. They've crossed the line into form over function.

Actually, no. You have to get used to any keyboard that's functionally different from your previous one. It's very common in the mechanical keyboard communities.

You get a rock out of your shoe and it feels like a relief. But to me, going back to a keyboard with a lot of key travel feels like walking around in deep snow.

going back to a keyboard with a lot of key travel feels like walking around in deep snow.

Nicely put!

I tried switching to the 12" earlier this year. I ended up giving up due to the poor performance of the machine relative to my needs, but I did get used to the keyboard; once I did, I was able to type pretty fast on it. (Then again, my work and home setups feature Das Keyboard Pros, one with brown switches, one with blue.)

It is only slightly better

It seems like an ergonomic nightmare.

Similar, but more travel.

When I first saw the Touch Bar, I immediately thought of a plastic, overheating, underpowered piece-of-crap laptop you'd find at any retail store. Literally looking at that Touch Bar makes me think of having to uninstall 20 HP bloatware apps from a family's shiny new craptop.

This entire article is raving about a single, relatively small but important detail that Apple has been notoriously hostile towards and worst at up until now: cross-manufacturer port compatibility. So hostile that they went out of their way to find ridiculous loopholes in their compliance with the EU's Common EPS Memorandum of Understanding on USB-B.

Yes, it's a great feature, but giving Apple so much credit for introducing it is the ultimate irony.

Agreed, and giving Apple this much credit for "look how well these work with my USB-C Android phone!" compounds the irony beautifully.

> ridiculous loopholes in their compliance with the EU's Common EPS Memorandum of Understanding on USB-B

Care to explain? I'm not familiar :)

The EC organized a voluntary Memorandum of Understanding for mobile phone chargers, in which the major phone companies stated their intent to standardize on a common phone charger (depending on who you ask, to avoid the EU making a binding rule about it otherwise).

Apple signed this too, but as we all know didn't actually add a micro-USB-B connector to their phones like everybody else; they just provided an adapter to USB, which was not forbidden, but arguably against the spirit of the memorandum.

Citing Wikipedia: "Some observers, noting Apple's continued use of proprietary, non-micro-USB charging ports on their smartphones, suggested Apple was not in compliance with the 2009 Common EPS Memorandum of Understanding. The European Commission, however, confirmed that all MoU signatories 'have met their obligations under the MoU,'[15] stating specifically, 'Concerning Apple's previous and present proprietary connectors and their compatibility with the agreement, the MoU allows for the use of an adaptor without prescribing the conditions for its provision.'"


I would disagree with their choice if micro-USB wasn't so lousy mechanically. I've had more micro-USB devices fail due to socket failure than anything else.

Be careful with the idea that the charger is just a regular USB-C charger. These laptops will draw 3-4 amps at 20 volts. Most phone chargers are designed for 1-3amps at 5 volts, so they would only provide a trickle of charge to these laptops. Older chargers (and computers ports!) can be damaged by these higher power draws. Also, the USB-C cable the Macs come with is rated up to a hefty 5 amps. Some of the cheap phone cables out there could actually pose a fire danger if they were to handle 5 amps.
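
To put rough numbers on that, here's a back-of-the-envelope sketch (the figures are the rated values mentioned in the comment above, not measurements):

```python
# Back-of-the-envelope power comparison. Values are charger ratings
# quoted in the discussion, not measurements.
def watts(volts, amps):
    return volts * amps

laptop_charger = watts(20, 4.3)  # MBP 15" brick: roughly 20 V at ~4.3 A
phone_charger = watts(5, 2.4)    # a beefy phone charger: 5 V at 2.4 A

print(f"laptop: ~{laptop_charger:.0f} W, phone: ~{phone_charger:.0f} W")
# A ~12 W phone charger can only trickle-charge an ~87 W-class laptop,
# and a cable rated for 3 A could overheat if forced to carry 5 A.
```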

Heh. Was amused by this line

> I’m sure it’s only a matter of time until Amazon is flooded with cheap versions of this idea that tweak it just enough to avoid patent issues. I look forward to buying $3 breakaway USB-C cables in the future.

Has this guy not seen what's going on with Amazon and the cheap USB-C cables that are flooding it[0]? They're literally destroying people's laptops by drawing too much power and frying either end.

[0]: http://www.theverge.com/2016/2/4/10916264/usb-c-russian-roul...

This author doesn't know what he's talking about

"It’s not the Nexus’ fault that my MacBook got fried — it was just doing what it was supposed to do: ask for as much power as it can get. It’s not the MacBook’s fault either — its ports weren’t designed to handle delivering that much juice nor to know that they shouldn’t even try. It is the fault of the cable, which is supposed to protect both sides from screwing up the energy equation with resistors and proper wiring."

This is simply not true. Both the device that is supplying and receiving power should be regulating. The cable should just be "dumb". I've done a bunch of testing on various chargers using cheap USB power meters and 2A USB dummy loads. For instance iPhones/iPads will monitor the voltage of the charger and if it starts sagging will reduce the amperage they draw. (my own testing has only been with "classic" USB type A stuff though)
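
The back-off behavior described here can be sketched as a toy model (illustrative only: real charge controllers do this in hardware, and the 5% sag threshold is invented for the example):

```python
# Toy sketch of "reduce the draw when the supply voltage sags".
NOMINAL_V = 5.0

def adjust_draw(measured_volts, requested_amps, step=0.1):
    """Back off the current we request while the supply is sagging."""
    if measured_volts < NOMINAL_V * 0.95:  # >5% sag: source is overloaded
        return max(requested_amps - step, 0.1)
    return requested_amps

amps = adjust_draw(4.6, 2.0)  # sagging charger -> back off toward ~1.9 A
print(round(amps, 1))
```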

The only cable I've heard of that actually fried anything was one that was wired completely wrong putting power on the data pins or something like that.

Are you familiar with Benson Leung (Google hardware engineer working on the Pixel) work reviewing various USB-C cables and accessories? He's extensively demonstrated that there are USB-C cables on the market that are dangerous.

Yes, he is who I was referring to when I mentioned "The only cable I've heard of that actually fried anything was one that was wired completely wrong putting power on the data pins or something like that."

edit: I will admit I am partially wrong though - checking a sample of his reviews reveals the case of Type A <> C cables, where as a bridge between the standards the cable is presenting as a charger - in those cases, yes it should not just be "dumb". I still maintain that well-behaved chargers should not supply more power than they are capable of, and well-behaved devices should (and many do) monitor their charging environment and back off when the voltage sags.

I fried my Nexus 5 with the original google USB-C cable and charger.

It's not always a bad cable. It can be a small connection problem. Then the USB-C power protocol ramps up its power, similar to a cell phone boosting transmit power on a bad signal. Then it gets hot and catches fire.

Beware if you charge via USB-C in your bed. You could be dead.

Nexus 5 was micro USB.

Perhaps he meant Nexus 5X, which is USB-C.


Only chargers that don't follow the spec can be damaged by the power draw. That is the problem with adapters and USB-A cables that use the wrong resistor: a USB-C device will pull 3A from a USB-A charger that can only provide 2.4A at most.

Also, the charge cable is electronically marked for 5A. Regular USB-C cables only do 3A, which limits them to 60W with USB-PD.

Are the MacBook USB-C ports special in some way that makes them USB-C and also a non-standard charger port?

If not, then they must be compliant USB-C ports and therefore would not be supplying an incorrect voltage. As far as I understand the fundamentals of electricity, the available current doesn't matter: a 5V USB power source that can supply 100 amps would still only supply whatever the device was designed to draw when charging.

But all of that is moot if the MacBook ports are "special" USB-C ports and somehow they negotiate a voltage higher than 5V.

The USB Power Delivery (USB-PD) spec allows devices to negotiate up to 20V. The key word, however, is "negotiate": this is no different from how older USB charging solutions work (start at 5V + 100mA, then negotiate to whatever the client device can handle and the host will allow).

I'm always amused by how uninformed many people are about USB and power supply. If devices can't negotiate something that works for both ends, they just stop. It's a shitty situation to be in if your device fails to charge, but devices that follow the USB-IF rules will never draw more current than the power source can supply, and the power source will never supply more than the device has requested.
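
That negotiation logic can be sketched like this (a toy model: the real USB-PD protocol is a packet-based state machine running over the CC wire, and the charger profiles below are illustrative, not an actual device's):

```python
# Toy model of USB-PD-style negotiation (illustrative only).
def negotiate(source_profiles, sink_request):
    """source_profiles: (volts, max_amps) pairs the charger advertises.
    sink_request: (volts, amps) the device asks for.
    Returns the agreed (volts, amps), or None (stay at safe defaults)."""
    v_want, a_want = sink_request
    for v, a_max in source_profiles:
        if v == v_want and a_max >= a_want:
            # Source grants the request; it never pushes more than asked for.
            return (v, a_want)
    return None  # no usable profile: sink must not draw beyond the default

# Hypothetical profiles for an 87 W-class charger:
charger = [(5, 2.4), (9, 3.0), (20, 4.3)]
print(negotiate(charger, (20, 4.3)))  # laptop gets its full 20 V profile
print(negotiate(charger, (12, 1.5)))  # no 12 V profile offered -> None
```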

So if I plug a dumb device into it that only makes electrical contact, it won't make that negotiation.

So, so long as Apple's MacBook USB charger is compliant, it won't destroy anything USB that I plug into it, so long as those devices are also compliant. Right?

A "dumb" device that doesn't negotiate will get the minimum 5V@100mA that the USB spec allows, dedicated chargers often decide to not drop the power after the negotiation window is over but a proper host device (laptop, etc.) will drop all power if a device fails to negotiate.

So yeah, if you plug something into the Macbook charger it will either not charge (dunno if it supports traditional USB power specs since it's a type-C only charger) or charge as any other charger already does.

Re: "dumb" devices, this is how it's supposed to work, yes, but I don't think I've ever seen a charger that won't happily dump 1A into a resistor wired over the power pins - including brand name (Apple) chargers and yes, the USB ports on a MacBook Pro. I've never seen a modern device supply only 100 mA.

For devices that DO attempt to negotiate, yes in those cases usually you'll see it work according to spec.

I haven't tried USB-C yet, it would be interesting to see if devices have gotten stricter.

Isn't there also a spec for passive power supply over USB, used by chargers. Something about combinations of resistances across the pins to signal current requirements.
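
For plain 5V Type-C (without PD), yes: the source advertises how much current it can supply via the value of a pull-up resistor (Rp) on the CC pin. The resistor values below are the ones in the Type-C spec; the snippet is just a lookup table for illustration, not real detection code:

```python
# Type-C current advertisement: the source's pull-up resistor (Rp) on the
# CC wire tells the sink how much 5 V current it may draw.
RP_TO_AMPS = {
    56_000: 0.5,  # default USB power (what A-to-C cables must use)
    22_000: 1.5,
    10_000: 3.0,
}

def advertised_current(rp_ohms):
    return RP_TO_AMPS.get(rp_ohms)

# Benson Leung's bad-cable findings largely came down to A-to-C cables
# using 10 kOhm, advertising 3 A from ports that can only supply ~2.4 A:
print(advertised_current(10_000))  # 3.0 -- too much for many USB-A ports
```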

USB-C is a collection of standards rather than one standard. As others have pointed out, two USB-C devices can use the same standard yet not operate together.

The standard covers both 20V and 5V power supply. If the device needs 20V and only gets 5V, there is nothing that says it must work, or even inform the user that it, in fact, is not working.

I'll just go ahead and contradict the running opinion and say: I like my 2016 MacBook Pro 15".

I've owned a few Macs and this one is my favorite. Despite my initial impressions, I really like the keyboard. I'm also quite happy with the trackpad which I've found to be excellent as usual and not picking up stray contact.

The only place I'm not "ecstatic" or "pleased" is with the TouchBar - and to be clear, I'm not displeased. I just don't really notice it. It's there doing its thing and I'm using the laptop, doing my thing. Occasionally I need the escape key and tap where I'd expect the escape key to be and while I don't get the tactile feedback it works.

All in all - the TouchBar is a net zero to me. I didn't lose anything by losing my Fn keys but I don't feel I gained anything with the TouchBar except TouchId which is nice.

TL;DR: Overpriced? Definitely. Pleased with product? Yes. TouchBar? No strong feelings. Returning it? No.

I'm finding it really odd that people are praising the new keyboard. I must admit that I have only tested it in the Apple Store, but I find it terrible, the short travel feels really bad to me. Is this something that you just get used to?

Bought my new MBP yesterday. I was expecting to have to get used to the keyboard, but was typing on it just fine straight away. Now I actually prefer the feel to my previous 2012 Air.

I'm not sure - I use a gaming keyboard on my desktop at home (Steel Series) and while it is mechanical with deep travel, you barely have to tap a key to get it to register. That is, you don't need to fully press a key to trigger it. At work I use a CODE keyboard which is also mechanical with a pretty deep press.

Compared to other Mac keyboards? My MBA feels super squishy - it's actually uncomfortable to use now.

All in all - after about 10k words on it, I like it.

If you're curious and in need of an upgrade just buy one. If you don't like it, return it.

Lots of people complained about the original Apple 'chiclet'-style laptop keyboard when it was first introduced circa 2006, for the same reasons. In my experience with many different keyboards over the years, once you get used to shorter key travel it's hard to go back to taller keys.

As far as laptop keyboards go, it's pretty great. The bigger issue is with the palm rejection of the giant touchpad, which is considerably worse than before (it was never a problem previously). This is something that can be solved in software though, so I'm not too worried.

Net zero only if you don't factor in the extra price you had to pay for it.

> Despite my initial impressions, I really like the keyboard.

Have only tried typing a few lines in-store and I also enjoyed it, but it might be personal preference: I use a keyboard with Cherry MX Browns at home and have gotten used to light/low-force actuation.

Getting a maxed out 13'' to replace my 2008 one.

I've had a 13" for almost a week. I think it's great. 95% of my time is spent in iTerm/VSCode/Xcode/Android Studio.

Some remarks:

- Keyboard, especially the arrow keys, took me 2 days to get used to, but now it feels weird typing on an old MacBook. I actually love it and prefer it now.

- The thumb + touchpad thing mentioned elsewhere here was definitely a big problem in the first day or two. It isn't anymore (I guess I got used to it? Not sure, because I didn't consciously try to avoid it).

- USB-C is freaking awesome. I bought an adapter with Ethernet, HDMI, USB 3 and SD that actually replaces the 3 adapters I had to carry around. And because of USB-PD, I only have one cable to plug into the Mac and everything is there, including power.

- I don't miss magsafe as much as I thought I would. Although I would happily buy an adapter if it is thin enough (some are coming).

- The Touchbar is actually pretty great, although since it's a touch screen, the lack of tactile feedback can be annoying at first. Pressing ESC is annoying at first too, but I got used to it and don't mind it now.

- The Touchbar would be an _awesome_ medium for notifications (such as for long-running terminal jobs, etc.)

- I didn't get any of the battery life issues people are talking about. Actually, I get 8-10 hours out of it easily (i.e. I plug it in at the end of the day because I forgot it was unplugged).

- Thinner bezels around the screen somehow make it look bigger (even though the visible area is the same size and the screen/lid itself is smaller).

- HiDPI is freaking great. Finally I can use a 4K monitor smaller than 32" and still get a retina display (1440p HiDPI and other intermediate resolutions up to native 3840x2160 are fully supported).

- It is really thin (not thinner than a MacBook Air, but still). It feels really great.

- Actually, I just noticed that because it is thinner, my wrists don't get hurt by the edges like they used to (the exact opposite of what someone mentioned here).
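
The "1440p HiDPI" point in the list above works roughly like this: macOS renders the desktop at double the chosen "looks like" resolution, then downsamples it to the panel. A simplified sketch (the actual scaling pipeline has more to it):

```python
# Simplified model of macOS HiDPI scaled modes: render at 2x the logical
# ("looks like") size, then downsample to the physical panel.
def hidpi_backing(looks_like):
    w, h = looks_like
    return (2 * w, 2 * h)

panel = (3840, 2160)                   # a 4K monitor
backing = hidpi_backing((2560, 1440))  # the "1440p HiDPI" mode
print(backing, "downsampled to", panel)
```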

Assuming you're a touch typist, how on earth do you manage to use AS without real function keys? Almost everything beyond just typing code requires them.

As a daily user of various JetBrains IDEs, the lack of a proper keyboard makes the new MB Pro inconceivable for me. I'm actually going to have to go Win or Linux for my next machine.

I've never really used the F-keys since my Visual Studio days (F5/F9/F10/F11) when I used to do .NET 2.0 work on Windows. However, with the touchbar, when you press Fn, the touchbar displays the F-keys instantly (you can also choose for them to be displayed by default).

For Android Studio, I also remap the keys to Cmd+something for consistency with Xcode. For instance, I mapped build to Cmd+B, etc.

OK that makes sense.

For those of us who use a variety of JetBrains products, the F-key shortcuts are the consistency. Having them faked on a minuscule screen, rather than being findable by touch, would make the new MacBook unusable for me.

Sad, as I find OS X to be the best in a bleak landscape of OSes, but given Apple's obtuse insistence on knowing what's best for everyone, to the extent of removing critical components of the universal standard keyboard, my current MB Pro will certainly be my last. Ugh, Windows!

Can you share what USB-C adapter you went with?

Sure: https://www.amazon.fr/gp/product/B01IQV5U68/ref=oh_aui_detai...

Although don't use it for 4k, as it is only HDMI 1.4 (4k@30hz). I am not aware of hubs like this with HDMI 2.0 yet. I bought a USB Type C => DisplayPort adapter for the 4k monitor.

I agree with all of this - I have a new 13" also, and I love it!

The new 13" MacBook Pro is actually 12% thinner than the MacBook Air, but it does not have these beveled edges.

Sad to see all these "I got used to it" comments; we spend money to enjoy ourselves, not to make ourselves suffer...

Getting "used to something" doesn't necessarily mean suffering, or compromising. In my case it wasn't at all.

I do enjoy this laptop a lot, thank you very much.

Look, if you want to willfully misunderstand "it's different but you adapt" as anything but "it's not actually a problem" that's up to you...

... but at that point the problem isn't with the tech

We needed to "get used to" typing with our fingers quickly on a touchscreen as well. Now >95% of the people I know are incredibly proficient.

I got used to not having a parallel port. Had to mothball my Zip drive, but it was for the better.

USB-C is awesome. It is the future and in some ways it is good Apple are being the way they are.

But... I wish Apple would put USB-C in the iDevices (honestly, why do the iPhone and iPad use Lightning when USB-C exists??). It is annoying that I can ditch all my cables except that one fucking extra Apple cable.

Time will tell, but I wonder if reliability will be the issue. USB-C has 24 contacts crammed in there. That's a lot of tiny parts for something that gets manhandled and lint packed as much as a phone charger connection.

If I was on the hook to warranty replace $600 devices when the internal connector gets ruined I might take my time jumping on board too.

It isn't like Lightning is super reliable. I have had many official cables fall apart within a year of normal business travel, and at least one port on an iPhone 6 fail, most likely due to something getting in the port (although Apple never stated that, obviously).

I doubt USB-C is that much worse than Lightning. Even if it is, say, 10% less reliable as a cable, it would still be much better to have a USB-C port simply because you are more likely to have a USB-C cable around if everything else uses it.

> at least one port on an iPhone 6 fail most likely due to something getting in the port

I've found a wooden toothpick works wonderfully to clear out accumulated pocket lint from the port every 6 months or so.

You may have just saved me a trip to the Apple Store, thanks. I hadn't realized there was a build up of lint, and cables were failing to "clip" into my phone.

Sadly nothing fixed it and I tried a dozen things.

Totally anecdotal, but I have had a Nexus 6P for over a year now with no issues with the USB-C port, so for me it has been just as durable as Lightning so far :)

> honestly why does the iPhone and iPad use Lightning when USB-C exists??

It didn't exist when Apple started using Lightning.

Because Lightning is significantly thinner, also less wide, and mechanically superior in every way. There's a much firmer "snap" when you plug it in, the connection stays connected, and the port itself is more durable as well, due to not having a thin plasticky tongue in the middle that holds all the contacts.

Lightning is a superior connector design.

Couldn't agree more. I recently used Type C for the first time and it just feels so cheap in comparison. Nowhere near as bad as microUSB but still, there's just something about it.

The only problem I have with their use of Lightning in phones is that they want headphone manufacturers to switch to Lightning...

USB-C feels cheap compared to Lightning or compared to USB-A? The latter would be the more appropriate comparison, I think.

The list of superior proprietary tech is amazingly long. And most of it is long dead. Lightning will hopefully be added to that list.

So if lightning has 'auto reverse' then it's strictly superior right?

Lightning is possibly thinner, and, most importantly, it might emit less RF interference because of the geometry of its connector.

I'd think (but have zero experience here, would love to hear otherwise if I'm wrong) that C would be better for RF interference purposes. C has the contacts inside a solid shroud, lightning has them exposed. Seems like lightning would need extra shielding on the port side, where C takes care of some / all on the cord?

My (crazy) theory is that they will remove all ports from the iPhone soon and rely on wireless charging and data. It may be necessary to achieve Ive's (well-documented) dream of a screen that covers the entire front, edge to edge.

They haven't switched to USB-C because people would be super annoyed if Lightning had only been supported for a couple of years.

It also explains the headphone jack removal. Removing two ports at the same time would make people super mad. And they also need to stimulate the wireless headphone market.

> It may be necessary to achieve Ive's (well-documented) dream of a screen that covers the entire front, edge to edge.

Source on this?

I didn't know this is what Ive wants to achieve, but it's something I've assumed every device manufacturer will be racing towards (even if it is on a niche device to test market reaction) and I've wondered why no one seems to be boldly pushing this?

To not piss people off with changing the port on the iPhone again.

They'll probably change to USB-C eventually though, and the smart time to do that was this year, when they did it on all their other hardware, which is also the year they dropped the headphone jack.

IMHO they should have done it with the iPhone 7, as they ditched the 3.5mm jack, forcing people to buy headphones that will only work with an Apple iPhone or iPad. Hell, not even their laptops have Lightning ports, so those headphones are literally useless for anything else.

I also find it amusing that they released the new MacBook Pro with a 3.5mm jack when they clearly don't need it. It makes even less sense on a laptop than a phone IMHO.

They will do it next year, so they can sell more USB-C-to-Lightning adapters to the people who bought Lightning headphones for the iPhone 7.

I also wish they would adopt this standard, although I can't say I'm looking forward to yet another headphone transition.

I don't get it. It's bad for hackers because of the tepid software updates, an increasingly developer-unfriendly application environment, lack of full touch in an age where every other manufacturer offers it as standard, and awkward meshing with its own ecosystem.

USB-C is pretty great, actually. Still pretty raw for the mainstream tech crowd, but it's not like that for tech-literate consumers.

Yeah, I think this article has got me more excited for USB-C than the new MacBook Pro

Since when do hackers want touch-screens on their laptops? I feel like that's pretty much the least hacker-y feature a laptop can have.

I only speak for myself when I say it's something I would love to have at my disposal.

Seeing my colleague intuitively reach out to the screen to use the Android Virtual Device when we're discussing something convinced me it was actually a nice thing to have.

Testing touch UI design with a mouse pointer is futility embodied.

Mobile and web are touch-native now. It's fantastic for mobile developers to be able to test with an input method similar to the one their users have.

I'm not sure what makes touch not poweruser and "hacker" friendly, but it's not the 80s anymore. Most of us have significant portions of our workload on the web and the web is, by and large, an extremely touch-friendly medium.

A hacker craves not these things.

Hello. This subtle "you seem different from my dated 80s image of hackers so you aren't one" game you're playing is boring, and if you check out the score no one is buying what you're selling.

It's faster to pop up to the screen to press a link and return to the home row than it is to pop to the trackpad and do the same unless the pointer is quite close to target. Both have similar repetitive strain implications.

First time I've ever been addressed with a "Hello" on HN. I rather like it.

I disapprove of your sentiment in the above post but will die for your right to basic civility.

> The new charging block that comes with the MBP looks exactly the same as any traditional MBP charger

Actually not, the convenient little 'arms' for the cord are missing. And the cord itself is rather stiff and not very flexible … and there's no green / red charging status light either. It's OK as a USB-C charger but it looks and feels different from a traditional MBP charger.

The "arms" on MagSafe adapters are actually bad for the cable: wrapping it around them puts undue stress on the cable, which causes it to fray faster. And a stiffer, less flexible cable should also help prevent fraying.

Seems like Apple took a proprietary, easily damaged charging cable and replaced it with a standard, less easily damaged one.

The charging indicator light is a definite loss, though.

As someone who took good care of my MacBook charging adaptors and cables, I found them to be perfectly durable. Those arms are only problematic when one wraps the cable 'immediately', instead of giving an extra loop and then beginning the winding.

MBP chargers fall apart quickly. 'Taking good care' in this context sounds like 'you're holding it wrong'.

I'm not sure what you mean

He's referencing Steve Jobs' email response to the iPhone 4's antenna issues.

>Those arms are only problematic when one wraps the cable 'immediately', instead of giving an extra loop and then beginning the winding.

True, but I've literally never seen anyone do that in public. I do (the 2 times I've wrapped one), one coworker does, but everyone else grabs and wraps as quickly and tightly as possible.

They can be used "safely", but the affordance isn't there. It practically begs to be misused. Bad design in a nutshell.

> there's no green / red charging status light either.

That really sucks.

> And the cord itself is rather stiff and not very flexible

Did Apple finally fix the incredibly shitty and prone to fraying soft rubber on the cables?

Probably too early to tell. Given that they've apparently changed the material, we can hope that one of the reasons is to toughen the sheath.

I'm away from my laptop at the moment so I can't see which brand/model I have exactly, but there are a number of accessories for chargers that I find more convenient than the built-in arms, especially because I use the "long" power cable. It's something like this: https://www.amazon.com/Cable-Organizer-Macbook-Power-Adapter...

Finally, somebody writing a realistic post about the USB-C port situation! So much short-sighted whining, when this really is a great step forward.

Probably because OP has an Android phone.

I'd honestly love to use my laptop's Ethernet dongle with my iPad or iPhone, but that won't happen. Apple has just recently doubled down on Lightning with the Apple TV Remote, the iPhone 7 headphones and the new Magic Trackpad/Mouse/Keyboard trio. For iPad accessories, they've even introduced the all-new Smart Connector instead of a USB-C port on the side of the iPad.

Check this out...

NooQee USB Type C Adapter to Lightning 8-Pin 2-Pack [0]

NooQee Lightning 8-Pin to USB Type C Adapter 2-Pack [1]

[0] https://www.amazon.com/dp/B01M340K3B/

[1] https://www.amazon.com/dp/B01K496YTC/

A USB-C port would not fit on the side of the iPad. So, not really an option.
