* Graphics card glitches - I paid close to 4K so I don't have to deal with quality control issues.
* Touchpad is just too large. I found myself resting my palm on it all the time, and sometimes clicking without realizing it. Also, if you like lying down and working (which I do sometimes because of a lower back problem), the size of the touchpad will make you work extra hard to avoid accidental clicks.
* Had the machine for ~10 days, and used the Touch Bar even less than that. Definitely not worth the money. Hopefully in the future they'll have a 15" option without it.
* The Boot Camp experience just sucks (this was my primary reason for returning it). Currently, there's no way to gracefully switch between the discrete and integrated GPU, so the battery life is terrible - like two-and-a-half-hour-maximum terrible. gpu-switch doesn't work either. In fact, if you use gpu-switch you'll have to rebuild both macOS and Windows, as the machine will just hang when you try to boot into either.
* Recovery mode has many issues with network connectivity. A few times, I had to tether/connect to my iPhone hotspot for it to go through.
* Sharp edges everywhere.
The specs are very underwhelming too, but I was willing to tolerate lower specs for higher build quality. I actually just picked up an XPS 15 9550 from Microcenter. Got the 2.6GHz, 16GB (expandable to 32GB), 512GB SSD, 4K touchscreen for $1350 (an open box; new for $1499).
In particular, I do actually really appreciate the larger trackpad. But I'm a heavy user of BetterTouchTool and have always regarded the trackpads as one of the main reasons to get a MacBook. I don't even bother with three-finger drag now thanks to the size of the trackpad.
I think the Touch Bar should be considered for what it is: a replacement for static function keys. Apple of course hyped it like they hype everything. But considered realistically in context I consider it a success. I actually do use it some. Some of the simplest things work the best; for instance, I really like the options presented when taking a screenshot. I also enjoy using it for music control, scrubbing through music, and switching between music sources (including YouTube tabs). Nothing revolutionary, but then again, how could it ever be given what it is?
Double-tapping and leaving the finger on the trackpad lets you drag the window or other item until you short-tap again. Drag lock even allows you to lift your finger and continue dragging from a different position on the trackpad.
I don't run out of space anymore. I don't even have to do what you just described (repeatedly swiping with the index finger). I can often just click down with my index finger and drag a window all the way to where I want it without reaching the edge of the trackpad. I could never do this in the past, so I came to rely heavily on repeatedly three-finger-dragging windows around.
I also used to prefer three-finger drag because it was physically hard to click down and hold the click while dragging. But the Force Touch trackpad makes that much, much easier, and it also makes it so that I can always initiate the click anywhere, even at the top of the trackpad.
So it's really the combination of the larger trackpad and the Force Touch design that finally prodded me to stop using three-finger drag.
In theory, a three-finger swipe is basically the GUI equivalent of a click and drag - so the behaviors should be identical.
Under what scenario have you found them not to be?
I got an HP Spectre x360 (16GB, 512GB SSD, i7-7500U) for $999 (open box for $940). Build quality is top notch.
I am a heavy Mac user and I could understand the Apple premium. There was no real competition before, but that's changed: there are good offerings from Dell, Lenovo, Razer, HP, Asus and many others now.
Except that Best Buy also made a mistake and sent me an i5/8GB/256GB version (silver instead of ash silver), which was promptly returned. I've used the refund to buy parts for a desktop ITX PC on Newegg. I'll be using a Chromebook or whatever when I can't be bothered to sit at a desk.
The Spectre is a beautiful machine though.
From my perspective this thing is at least 2x as fast in real-world use, substantially lighter, with a way better screen and speakers. And the keyboard is awesome and the build quality is outrageously good.
I get that people are upset they can't edit 5K or whatever, but you can get a desktop computer for that at less than half the price. Obviously they will be even better in three years once Intel has chips appropriate for the MBP that also support LPDDR4, but for now this seems like clearly the best computer on the market, and very clearly aligned with where the industry is going.
For instance, if you are an entrepreneur, any computer will have enough power, and other things become important. An entrepreneur needs to travel a lot, so long battery life is essential, low weight is essential, small size is essential, and everything just working (software-hardware integration) is essential.
You can't move a desktop easily.
As an entrepreneur and engineer I use MacBooks a lot. Heavy loads are done on servers. In the past I used to compile my own Gentoo (Linux) systems and build my own desktops and servers from discrete components to get the best bang for the buck. Not anymore.
MacBooks are great computers in overall design, if you are careful enough to avoid first-generation designs (that applies to any product from any company). Once they iron out all the bugs, it just works.
My shortlist was the XPS 13, XPS 15, Surface Book and Latitude E7470.
I discounted the XPS firstly for having too many coil-whine issues even after 3 generations. In addition, I had read about the key travel being short (1.2mm, I think), but it wasn't until I actually tried one in PC World that I realised it's horrific to type on for any length of time - not something I had experienced in the past. The 4K screen is wonderful though.
The Surface Book, while having a nice keyboard and sumptuous screen, has a terrible warranty: 1 year of hardware support out of the box, and 3 years was around £350. Even then, all they would do is send you a second-hand replacement and take yours away. I read about some people who had been sent badly scratched replacements even though theirs was perfect. Too risky for a £2000+ machine.
The Latitude E7470 ticked all the boxes: easily expandable, tough, 14", and a screen resolution of 2560x1440. Also, I found one in the outlet store (scratch and dent) for £800 with a 3-year onsite next-day warranty. I haven't received it yet, but I have a 7-year-old one at home that still runs as a Windows Server 2012 R2 machine and, apart from crap battery life, keeps on going.
I hope your XPS is OK - it's stunning to look at in the flesh - but I didn't want to take the risk that my £2000 machine would start whining only to have Dell say "it's by design", and I hated the keyboard, so I stuck with the slightly less glamorous but no less capable Latitude.
 - https://www.google.co.uk/search?q=xps+coil+whine
Also a problem on previous models. I was working on my own GPU switching solution but gave up due to lack of time and the fact most of my applications work just fine under OS X.
I'd be interested in doing the same with Debian or similar Linux though.
 - https://bartongeorge.io/
 - https://bartongeorge.io/2016/10/04/the-new-xps-13-developer-...
If you have any heads-ups, I would like to know. Thanks.
Your device may support a subset of the different USB charging protocols:
* USB 1.1 low power: 5V/100mA
* USB 2.0 high power: 5V/500mA
* USB 3.0: 5V/900mA
* USB BC (battery charging): 5V/1.5A
* USB Quick Charge 1.0/2.0/3.0: proprietary Qualcomm standard
* USB PD (power delivery) 5 profiles offering up to 100W (5/12/20V @ 1.5/2/3/5A)
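For a sense of scale, the modes above imply wildly different power ceilings. A quick sketch of the nominal maximum wattage per mode (figures taken from the list; the mode names and dict layout are my own):

```python
# Nominal max power per USB charging mode, per the list above.
# Names and structure are illustrative; figures are the listed ones.
USB_MODES = {
    "USB 1.1 low power":  (5.0, 0.1),   # (volts, amps)
    "USB 2.0 high power": (5.0, 0.5),
    "USB 3.0":            (5.0, 0.9),
    "USB BC":             (5.0, 1.5),
    "USB PD profile 5":   (20.0, 5.0),  # top PD profile, up to 100W
}

for mode, (volts, amps) in USB_MODES.items():
    print(f"{mode}: {volts * amps:.1f} W")
```

That's a 200x spread between the bottom and top of the range, all behind the same physical connector.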
Unrelated to the MacBook, but the problems I see with USB-C are:
First: chargers may offer cryptographic signatures in the future, for authentication against a whitelist on the device.
Second and most problematic:
The MacBook is a good citizen here, but many laptops (HP business series, Dell XPS series) only support USB-C PD with profile 4&5 (20V/3A+).
This rules out the car dongle as well as cheap USB power banks.
The connector is always the same, so the customer cannot deduce charger/device compatibility. The experience will suck.
One set of chargers will be for mobile devices, supporting just the highest standard we see in them. Another set will be for laptops - the same story, just supporting the highest profile for them.
Overall worse, because USB-C may also be used for other connections (eg DisplayPort), adding to the confusion.
Plus all the different cables which can do different things and not support specific profiles of some things and others but they all have the same connectors. Confusing as heck.
Depending on cable configuration, pinout, wall plate and structured wiring system, that 8P8C might be usable (or not) for multiple different types of data networking - from the assorted Ethernet speeds to E1 to Token Ring - or for a serial console, or delivering power and audio to a remote speaker, or HDMI-over-UTP, or even -48V telephony. And let's not even get started on the only-subtly-different but actually incompatible RJ45 connector, or people sticking RJ11 plugs in 8P8C ports.
And yet the world has coped with this proliferation.
Over time, as you've noted, the vast majority of uses of 8P8C have turned out to be networking. It will be interesting to see if USB-C makes a similar evolution.
There's an RJ45S, but that's definitely nothing to do with Ethernet. Use of the term RJ45 is simply wrong, in any circumstance, as far as I can tell.
There's no term for the use of 8P8C for Ethernet purposes other than well, Ethernet, or its specific profiles ...BASE-T, as far as I can tell.
I have never once heard the phrase "8P8C" used to refer to an Ethernet jack - not once (outside of this thread) - but I have heard it used when referring to various 8-pin telco connections; it was a common term of art in the 90s when describing telco installations that used that configuration. When talking about Ethernet, when people are trying to be specific, they usually reference TIA/EIA-568B/A.
There are certain words, like "bandwidth", that might technically mean the width of the band (typically in Hz) but have grown over time to refer to data rate as well. And that's cool - language is versatile that way.
In fact, language and terms are not set in stone.
Chargers and devices need not support all charging standards/profiles, and thus may disagree on working together. A charger that supports all USB PD profiles up to 100W looks exactly the same as one that supports only profile 1, and both may even say "USB PD" in the specs.
Thank god the MacBook accepts the widespread 5V/3A USB-PD power level and even USB BC.
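The charger/device mismatch can be pictured as a simple set intersection: each side advertises the power profiles it supports, and charging only works if they share at least one. A toy sketch (the profile strings and example devices are my own illustrative shorthand, not actual PD negotiation):

```python
# Toy model: charging works only if charger and device share a profile.
# Profile strings and example devices are illustrative, not from the spec.
def compatible(charger_profiles, device_profiles):
    return bool(charger_profiles & device_profiles)

cheap_power_bank = {"5V/3A"}                      # low-power PD only
picky_laptop     = {"20V/3A", "20V/5A"}           # accepts profiles 4 & 5 only
flexible_laptop  = {"5V/3A", "20V/3A", "20V/5A"}  # also accepts the low profile

print(compatible(cheap_power_bank, picky_laptop))    # False: no shared profile
print(compatible(cheap_power_bank, flexible_laptop)) # True
```

The frustrating part, per the above, is that nothing on the connector or cable tells the customer which case they're in.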
I am honestly befuddled by USB-C. The allure of a universal connector is kind of pointless when the cables look mostly the same but support different feature subsets. It's insanity.
It's bad enough that many manufacturers (I'm looking at you, Dell) don't differentiate between USB 2 and 3 Type A. C is so much worse for this.
Instead of a pretty good $1200 13" Air, we have an $1800+ version that's lost features (MagSafe, a worse keyboard) to be a hair thinner, and that also requires $200 in dongles to connect to anything.
Seriously, fuck you, Apple.
Some companies, for example Nintendo, figured this out a long time ago. Notice how with their consoles, if the disc/cartridge fits in the console, the console will always play it, even between generations. The customer shouldn't have to research arcane names and study symbols on cables - if it fits, it should just work. And USB-C is just a mess at the moment.
Well, that's not quite true. Both the Wii and Wii U have a standard-sized disc slot; on the Wii you can insert small GameCube discs into that slot and they'll play, but on the Wii U they won't. On the portable side, 3DS cartridges do have a tab to prevent them from fitting into a DS, but that wasn't the case for the handful of games exclusive to the short-lived DSi.
Recent Nintendo consoles have also had compatibility issues with standard storage devices. The Wii supported SD cards, but wasn't compatible with SDHC cards, which are almost all cards with a capacity of 4GB or higher. This was eventually rectified with a software update... but the update only applied to the system menu, not to games which could access SD cards themselves, including notably Super Smash Bros. Brawl. The Wii U, for its part, supports storing games on external hard drives, but doesn't provide as much power over USB as most hosts do, requiring the use of a USB Y cable and a separate USB power source even for drives that don't normally require external power.
No clue why.
I hope Dell will improve its power circuitry in the future and support USB BC and PD 5V/3A. Let's hope it's just that current power chipsets lack those modes because of time to market pressure and Apple is ahead of its competitors here.
I can understand your anger at Apple, I hear the same a lot from design and audio professionals... You may have got some downvotes for that last statement.
Right now it feels wonderful with OEM chargers, but you do have me worried about the future buying replacements and accessories.
However there's not a single car charger nor a power bank that does usb-pd at 20V/3A at the moment. So sad.
Who would ever expect Apple to do something like that? Oh wait...
I get it that prosumers like to think of themselves as "hackers", but ...that's just not how it works. Come on.
"See that toaster over there? It's been reprogrammed to automatically deposit my cat's food every fourth hour and have it warm as well."
"That 2009 dinosaur of a smartphone sitting in the corner? It's an IP surveillance camera."
"That first generation Xbox, it's powering the zoom feature of the Hubble Telescope."
That last one might have been a bit of an exaggeration, but my point is that something isn't great because it's the latest. Something is great because someone increased its value by using it, or created something of higher value than the equipment used to create it.
> Hackers do not benefit from a closed box with non-expandable performance.
To address this point... some might. But not all will. And I certainly think that fewer will than the generations before. I really hope USB-C will be as great as these companies say it will, but until then I'll happily use my different ports that work as they are expected to.
Turning a toaster into a cat feeder is tinkering, not professional hacking. There's nothing wrong with tinkering. But it's the difference between wiring up a Raspberry Pi as a heating controller, and building a company that sells fully licensed and certified heating controllers all over the world with support infrastructure.
One is a hobby project; the other... isn't.
A useful definition of a professional tool is one that lets you forget you're using it because it's so transparently intuitive you never have to think about its needs.
I don't think the 2016 MBP does that. The ports are (literally) a side issue. The problem is more that Apple are thoughtlessly losing their reputation among professionals, because Cook, Schiller and co don't seem to be thinking hard enough about what they're doing, and don't appear to understand what their professional customers are looking for.
...Which is not something super-thin for the sake of it, or with a gimmicky Touch Bar. It's something expandable, with ports that "just work", no physical or metaphorical rough edges, and the option of decent memory (i.e. 32GB) and a reasonable processor speed bump.
This shouldn't be hard or controversial, but for some reason it seems to be beyond Apple's understanding.
I'm hardly a hater. I bought the 12.9" iPad Pro last week, and I'm loving it. But the laptop format is challenging, because you either stay conservative or go full experimental with (say) a dual-display clamshell, or even a touch panel instead of the trackpad.
Half-hearted innovations like the Touch Bar glued onto an ungenerous spec look like gimmicks for their own sake, not serious attempts to improve professional productivity.
Your definition fits more the bleeding-edge users who are constantly limited by technology and could justify paying a couple thousand for a 10% increase in performance. This group overlaps with, but is not equal to, the professional users.
Many real professionals will love the new MBP lineup while many more will hate it.
An integrated GPU with a higher-end i5, a big SSD and lots of RAM for half the price would be more appealing to me... even a lower-res screen would be okay for my needs... love the form factor though.
Everyone has different needs, as you said.
Can you expand on that? I've seen this with folks in the movie/production industry but for webdev it seems like that's more of a want.
What's your daily task load, where an integrated GPU and 32GB of RAM are taxed?
If it's being taxed, it's probably because some idiot spec'd an HDD that pulls resources too slowly. For modern JS dev, you really need an SSD more than anything else, mainly because the build/watch process is tracking many thousands of small files, which is significantly worse on an HDD: 60+ seconds vs. under a second for any change to take effect in the browser. This can be as much as an hour a day wasted, and the 5 hours of wasted time over a few weeks are more costly than the upgrade to an SSD.
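A quick back-of-envelope on that claim (the number of rebuilds per day is my assumption; the 60s-vs-1s figures are the ones cited above):

```python
# Back-of-envelope: time lost to slow rebuilds on an HDD vs an SSD.
# Assumes ~60 code changes per day; per-rebuild times as cited above.
rebuilds_per_day = 60
hdd_seconds, ssd_seconds = 60, 1

wasted_hours_per_day = rebuilds_per_day * (hdd_seconds - ssd_seconds) / 3600
print(f"~{wasted_hours_per_day:.1f} h/day")       # ~1.0 h/day
print(f"~{wasted_hours_per_day * 5:.0f} h/week")  # ~5 h per working week
```

Even with conservative assumptions, the SSD pays for itself within a few weeks of billable time.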
As to 512GB: after all the software, that can take 100GB... creative assets well over that, depending on the projects... it's easy to hit 240GB between the OS, software, projects, and assets.
Beyond that, show me off-the-shelf hardware that can be configured with 16+GB of RAM and a 512GB SSD that doesn't have the other stuff I don't need.
I just wanna know what cable they use when they need to plug their iPhone into their MacBook. I don't seem to recall Apple selling a USB-C to Lightning cable.
I agree that USB-C might be helpful in that respect, but it totally depends on the drivers. Whether there is a device that I can make hiccup through timing attacks depends on what liberties I have in the driver.
For Bluetooth, for example, you have Ubertooth. If the Bluetooth radio on my laptop were accessible at a low enough level, there would be no need to use other hardware to execute attacks.
I'm just a hobbyist with a JTAG programmer and some digital analysis gear, nothing fancy. A professional reverse engineer is another beast, with microscopes, etc.
A hacker, in the sense of someone exploiting vulnerabilities in web applications, benefits not from a single computer but from many. I don't think he or she would care about the specs too much.
... like a $4k laptop?
It's not like Apple is the only company to support USB-C on laptops... in fact, as far as I can tell, every major manufacturer has it on their most recently-released laptops.
> To address this point... some might. But not all will.
How, exactly, would someone... anyone... BENEFIT from closed-box performance? Just because someone might not benefit from a laptop you can upgrade doesn't mean the converse - that they somehow benefit from having one you can't upgrade.
(side note: I can think of ways... decreased cost, decreased size... but the laptops certainly don't cost less, and any difference in size w/ a 15" laptop is negligible)
The MasPars were fun to play with. ;) So you're not that far off on your XBox comment!
I upgraded the RAM and swapped the DVD drive for a 1TB SSD. The HDD is still in there, set to auto-shutdown after 5 seconds. I use it for storing large files I don't need on the SSD but that can be handy to have available sometimes, like raw videos from my camera.
I'd like to replace the keyboard with one without the number pad. Possible in theory, but there is no part on the market that fits. The only 15" laptops without a number pad at the time were the Mac and, I think, the XPS, which was overheating. Maybe the problem with the latter is solved now.
I'm a developer and this laptop is as great as any other, for me. But USB-C has very little to do with that. USB-C is convenient (until it's not) for anyone, regardless of 'hacker' or not.
A person who uses computers to gain unauthorized access to data. 
So in this context, it's almost entirely controlled through software. One can be a hacker with hardware, but as of now I doubt USB C is going to make anyone as much of a hacker as a specific type of software might. And this new MacBook has very little to do with changing how real "hackers" might do things.
I admit I'm guilty of misusing the word, but if I ever get into a serious conversation or argument I'd be sure to use the dictionary definition.
[originally, someone who makes furniture with an axe]
1. A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary. RFC1392, the Internet Users' Glossary, usefully amplifies this as: A person who delights in having an intimate understanding of the internal workings of a system, computers and computer networks in particular.
2. One who programs enthusiastically (even obsessively) or who enjoys programming rather than just theorizing about programming.
3. A person capable of appreciating hack value.
4. A person who is good at programming quickly.
5. An expert at a particular program, or one who frequently does work using it or on it; as in ‘a Unix hacker’. (Definitions 1 through 5 are correlated, and people who fit them congregate.)
6. An expert or enthusiast of any kind. One might be an astronomy hacker, for example.
7. One who enjoys the intellectual challenge of creatively overcoming or circumventing limitations.
8. [deprecated] A malicious meddler who tries to discover sensitive information by poking around. Hence password hacker, network hacker. The correct term for this sense is cracker.
The term ‘hacker’ also tends to connote membership in the global community defined by the net (see the network). For discussion of some of the basics of this culture, see the How To Become A Hacker FAQ. It also implies that the person described is seen to subscribe to some version of the hacker ethic (see hacker ethic).
It is better to be described as a hacker by others than to describe oneself that way. Hackers consider themselves something of an elite (a meritocracy based on ability), though one to which new members are gladly welcome. There is thus a certain ego satisfaction to be had in identifying yourself as a hacker (but if you claim to be one and are not, you'll quickly be labeled bogus). See also geek, wannabee.
This term seems to have been first adopted as a badge in the 1960s by the hacker culture surrounding TMRC and the MIT AI Lab. We have a report that it was used in a sense close to this entry's by teenage radio hams and electronics tinkerers in the mid-1950s.
You'd be surprised. Linus Torvalds, a hacker one would presume, used to have a MacBook Air as his primary laptop and praised it as the best laptop ever made, saying that other companies have failed to produce something as good (though after 2014 he moved to a Chromebook). And the reasons he gave for praising it are exactly what people think Apple puts too much emphasis on: thinness, lightness and battery life.
Of course that's a single data point (though for more data points you can go to any software conference, where Apple laptops usually dominate among both the speakers and the audience, even though they are less expandable and have less performance than some gaming/desktop-replacement laptops).
An Apple laptop can fit well with some hackers, for at least two reasons, I think:
First, tinkerer != hacker. A hacker can be a tinkerer, but it's not the same thing. There are people who love to hack on specific things, create new stuff etc, but could not care less about dealing with the hardware or customizing their window manager. A lot of hackers I know in fact tend to be quite minimalistic in those areas.
Second, a hacker isn't necessarily all about raw performance and 8GB graphics cards. A lot of hacker types and great programmers in C or whatever can do wonders with very little hardware.
Now, other kinds of professionals, like 3D artists, number crunchers and such, might definitely want more GPU/RAM. But even for a lot of creative professionals, stable and "what we know" trumps "latest and greatest". Most professional music studios I know, for example, have setups 2 and 3 generations old, and never jump to the latest OS until 1-2 years after it's out.
In the photography world, where I dabble (and once did professionally), we have this notion of "measurbators" and "pixel peepers": those who are obsessed with ISO performance, megapixel count, sharpness, synthetic tests of camera gear and so on, but seldom create any output of particular worth. I guess the same would hold for PC people obsessed with benchmarks beyond a certain point, especially if their workflows don't need them. A hacker, in this regard, would be the opposite. This guy is a hacker, photography-wise:
USB is standard, so now "hackers" are supposed to have a raging lust for standards on Apple hardware?
"Don't worry, the dongles are USB standards too." I don't give a rat's ass whether the dongles are standard or not. I am not lugging around dongles to connect my phone to my laptop.
I use Ubuntu and a OnePlus 3, and USB 3.1 Type-C works like a charm at 10 Gbps. I got 100 pieces of really good quality cables for around $2.50 each, and have used around 25 of them across my house, office and car (removed link)
I am not spending a bazillion dollars on the same or worse quality Apple cables.
Agreed, and well said. This one line concisely sums up the first and last word on this whole pitch.
The edges are very sharp, and the air vents on the bottom are right where you grab the laptop to pick it up, which gives it a knife-like feel. Also, by expanding the trackpad far beyond its useful size, there is now no gap between it and the space bar. I have discovered that I have a habit of resting my thumb just below the space bar, and now I tend to bump the pointer by accident. The arrow keys are now a continuous run of keys with no way to orient quickly, unlike before (when the side arrows were slightly smaller and made it obvious where the up key was without looking). Finally, the keyboard travel is very short, as you would expect with such a thin laptop.
I do most of my work with an external keyboard and monitor, so it isn't that big a deal to me, but I can see it being hard on people who use their laptop exclusively.
This is surprising and disappointing. I really enjoyed using Apple's touchpads with tap-to-click because their accidental-press recognition was good enough that I never really worried about this. I typically had to turn tap-to-click off when using Linux because I would randomly click while typing.
I wonder why they haven't been able to get it to work as well on the new touchpad.
Palm detection can typically be tweaked through various driver parameters.
However, I don't think all drivers have palm detection, unless I'm mistaken?
Also, unfortunately while I would have enjoyed configuring the system to my personal desires, these days with the limited free time I have, I'd rather be able to just do what I was going to do on my laptop instead of having to spend time tinkering with it to get it right. Apple seems to do very well in this respect since it almost always knows when I want to tap vs when I just have my palm there.
Just curious, have you used Mac laptops previously? Because sharp edges are something that have existed since the original MacBook Air.
I do agree they are annoying though. I always have outlines on my wrists after using my laptop on my lap for a while.
I'm using a late 2011 MBP and the corners where you open the top are deadly.
The only really attractive feature of the Touch Bar model is Touch ID. If Apple sold the Function Keys model with a Touch ID power button and one more USB port, I would happily buy it. But right now, I am a little bit confused and baffled by Apple's new MacBook Pro product line. I think I am going to wait another generation to see whether Apple gets back on track and whether the Touch Bar can stand the test of time.
The main reason is that your system needs to index your entire drive; this requires MUCH more power than typical use until the process is complete. You should give a new system at least a week to shake out before freaking out about battery life.
My second 13" Touch Bar machine now reports about 5 hours, which is still not great.
Also, for what it's worth, I love the new keyboard. It took me a few days to get used to it but just yesterday I measured myself typing on it at 130 wpm, the same speed I can do on normal keyboards. It's very satisfyingly clicky.
...I actually really like it too.
It is a perfectly reasonable opinion to like or not like the keyboard, if you have actually tried it for a while.
I am not, so... works for me.
I'm guessing you're more heavy in terms of key strike?
That said, while I feel that the new keyboard is slightly better than the last new keyboard, the sound of it makes me want to rip the keys out - it is very loud in a not-so-good way.
Similar story with iPhone 7 home button, except I've just come to accept it and ignore it but wish we still had a proper physical home button.
When out and about, when do you need three? I can't ever recall seeing someone at a coffee shop, conference, etc. needing three USB ports.
As I write this I'm using MagSafe + Thunderbolt 2 + HDMI + 2 USB ports, one for peripherals and the other for playing back video.
Battery life is good, and 16GB is a big step up from the 8GB I had on the MBA. I've been running a decent-sized system in minikube with no problems.
The only issue I still seem to have is that Hangouts absolutely obliterates the battery. A one-hour call took battery life from 9:45 to 1:15!
I mean, I like all the power-hungry features, and I'm usually plugged in - I'd prefer they keep adding useful things rather than worry about my battery. But it does hurt at times :)
Of course, if you're driving an external monitor, streaming video from the Internet, and using your computer for more than just word processing and web browsing, your system will use more power.
And mind you, I am the kind of person who has managed to accrue about 13 months of hard-disk head flying hours according to the SMART stats, so the battery isn't exactly in good condition either.
I'm kind of astonished at how many people turn their noses up at different things based on a first impression, even in groups that you might consider early adopters of tech.
Some early adopters of tech don't enjoy enduring regressions, and can see the regression from a mile and a month away.
I'm especially grumpy when I see absurd, hyperbolic marketing used to sell anti-consumer regressions.
I've used the keyboard for a week now and have completely adapted to it. It doesn't feel like a regression to me at all.
This is a terrible attitude for a technologist to have. Cellphones were a "solved problem" before the first iPhone came out. Laptop monitors were a "solved problem" before the first retina MacBook Pro came out. Look where we are now.
Of course, this doesn't mean every change made to an existing technology will be an improvement and/or be embraced right away and by everyone, but I see it as a Good Thing that a major technology player is still experimenting with things that everyone else has written off as "already solved."
To be (un)fair, you can get used to a rock in your shoe as well. That doesn't mean you should.
Having to "get used" to a keyboard just to return to your previous typing ability seems like a contra-indicator of the new Macs having a good keyboard.
I'm also going to make what I think is a reasonable assumption and say that the vast majority of folks are going to carry their laptops around in bags. In which case, the 7 ounces saved (for reference's sake, about the same weight as a 45-watt power brick, minus the extension cord) is nearly meaningless, and the half inch on each dimension will be completely meaningless.
On a side note, I think that a 13" monitor hits a sweet spot. It fits in a lot of bags, has just enough screen size to be productive without requiring an external monitor, and the performance difference between 13" and 15" computers is usually quite small (if it exists at all).
It's a cumulative process. My first MacBook was a white 13.3" polycarb Core 2 Duo. The new 15" MBP is a pound lighter, almost half as thick, and only half an inch deeper and an inch wider. Samsung has a 15" laptop that's under 3 pounds, making it very palatable even for people used to a 13"-er.
At what point along that evolution should Apple have stopped striving to get smaller?
Yes, it's a great feature, but giving Apple so much credit for introducing it is the ultimate irony.
Care to explain? I'm not familiar :)
Apple signed this too, but as we all know they didn't actually add a micro-USB-B connector to their phones like everybody else; they just provided an adapter, which was not forbidden, but arguably against the spirit of the memorandum.
Citing Wikipedia: "Some observers, noting Apple's continued use of proprietary, non-micro-USB charging ports on their smartphones, suggested Apple was not in compliance with the 2009 Common EPS Memorandum of Understanding. The European Commission, however, confirmed that all MoU signatories 'have met their obligations under the MoU,' stating specifically, 'Concerning Apple's previous and present proprietary connectors and their compatibility with the agreement, the MoU allows for the use of an adaptor without prescribing the conditions for its provision.'"
> I’m sure it’s only a matter of time until Amazon is flooded with cheap versions of this idea that tweak it just enough to avoid patent issues. I look forward to buying $3 breakaway USB-C cables in the future.
Has this guy not seen what's going on with Amazon and the cheap USB-C cables that are flooding it? They're literally destroying people's laptops by drawing too much power and frying either end.
"It’s not the Nexus’ fault that my MacBook got fried — it was just doing what it was supposed to do: ask for as much power as it can get. It’s not the MacBook’s fault either — its ports weren’t designed to handle delivering that much juice nor to know that they shouldn’t even try. It is the fault of the cable, which is supposed to protect both sides from screwing up the energy equation with resistors and proper wiring."
This is simply not true. Both the device supplying power and the device receiving it should be regulating. The cable should just be "dumb". I've done a bunch of testing on various chargers using cheap USB power meters and 2A USB dummy loads. For instance, iPhones/iPads will monitor the voltage of the charger and, if it starts sagging, will reduce the amperage they draw. (My own testing has only been with "classic" USB Type-A stuff, though.)
The only cable I've heard of that actually fried anything was one that was wired completely wrong putting power on the data pins or something like that.
edit: I will admit I am partially wrong, though - checking a sample of his reviews reveals the case of Type A <> C cables, where, acting as a bridge between the standards, the cable presents itself as a charger - in those cases, yes, it should not just be "dumb". I still maintain that well-behaved chargers should not supply more power than they are capable of, and well-behaved devices should (and many do) monitor their charging environment and back off when the voltage sags.
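The back-off behavior described here can be sketched as a small control loop. This is purely illustrative: the function name, thresholds, and step sizes are my own inventions, not anything from the USB spec.

```python
# Illustrative sketch (hypothetical names/thresholds, not from any USB spec):
# a device monitors the charger's output voltage and steps its current
# draw down while the voltage sags below nominal.
def next_current_draw(measured_voltage_v, current_draw_a,
                      nominal_voltage_v=5.0, sag_threshold_v=0.25,
                      step_a=0.1, min_draw_a=0.5):
    """Return the current draw to use for the next control interval."""
    if measured_voltage_v < nominal_voltage_v - sag_threshold_v:
        # Charger output is sagging: back off, but never below a floor.
        return max(min_draw_a, current_draw_a - step_a)
    # Voltage is healthy: keep drawing what we were drawing.
    return current_draw_a
```

Under this toy model, a reading of 4.6 V against a 5 V nominal makes a device drawing 2 A step down, one interval at a time, toward a level the charger can actually sustain.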
It's not always a wrong cable. It can be a small connection problem.
Then the USB-C power protocol ramps up its power, similar to a cell phone boosting transmit power on a bad signal.
Then it gets hot and catches fire.
Beware if you charge via USB-C in your bed: you could die.
Also, the charge cable is electronically marked for 5A. Regular USB-C cables only do 3A, which limits them to 60W with USB-PD.
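The arithmetic behind those limits is just voltage times current. A minimal sketch (the function name is mine; the 3 A and 5 A ratings are the ones mentioned above):

```python
# Why a regular (non-e-marked) USB-C cable tops out at 60 W under USB-PD:
# power = voltage * current, and the cable's current rating caps it.
def max_power_watts(voltage_v, e_marked):
    """Max deliverable power: e-marked cables are rated 5 A, others 3 A."""
    max_current_a = 5.0 if e_marked else 3.0
    return voltage_v * max_current_a

max_power_watts(20.0, e_marked=False)  # 20 V * 3 A = 60 W
max_power_watts(20.0, e_marked=True)   # 20 V * 5 A = 100 W
```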
If not, then they must be compliant USB-C ports and therefore would not be supplying an incorrect voltage. As far as I understand the fundamentals of electricity, current doesn't matter: a 5V USB power source that can supply 100 amps would still only supply whatever the device was designed to draw when charging.
But all of that is moot if the MacBook ports are "special" USB-C ports and somehow they negotiate a voltage higher than 5V.
I'm always amused by how uninformed many people are about USB and power supply. If devices can't negotiate something that works for both ends, they just stop. It's a shitty situation to be in if your device fails to charge, but devices that follow the USB-IF rules will never draw more current than the power source can supply, and the power source will never supply more than the device has requested.
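The rule being described (both ends agree on a profile, or nothing flows) can be sketched as picking the best option in the intersection of what each side supports. This is a toy model of the idea, not the actual USB-PD state machine:

```python
# Toy model of source/sink power negotiation (not the real USB-PD protocol):
# the pair settles on the highest-power (voltage, current) profile that
# both ends support; with no common profile, no power is supplied at all.
def negotiate(source_profiles, sink_profiles):
    """Return the best (volts, amps) pair both ends support, or None."""
    common = set(source_profiles) & set(sink_profiles)
    if not common:
        return None  # no agreement: the device simply doesn't charge
    # Prefer the profile delivering the most power (volts * amps).
    return max(common, key=lambda vc: vc[0] * vc[1])
```

For example, a source offering {(5 V, 3 A), (20 V, 3 A)} and a sink that only accepts (5 V, 3 A) settle on 5 V / 3 A; disjoint sets yield None, which matches the "it just stops" failure mode above.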
So, so long as Apple's MacBook USB charger is compliant, it won't destroy anything USB that I plug into it, so long as those devices are also compliant. Right?
So yeah, if you plug something into the MacBook charger it will either not charge (dunno if it supports traditional USB power specs, since it's a Type-C-only charger) or charge as any other charger already does.
For devices that DO attempt to negotiate, yes in those cases usually you'll see it work according to spec.
I haven't tried USB-C yet, it would be interesting to see if devices have gotten stricter.
The standard covers both 20V and 5V power supply. If the device needs 20V and only gets 5V, there is nothing that says it must work, or even inform the user that it, in fact, is not working.
I've owned a few Macs and this one is my favorite. Despite my initial impressions, I really like the keyboard. I'm also quite happy with the trackpad which I've found to be excellent as usual and not picking up stray contact.
The only place I'm not "ecstatic" or "pleased" is with the TouchBar - and to be clear, I'm not displeased. I just don't really notice it. It's there doing its thing and I'm using the laptop, doing my thing. Occasionally I need the escape key; I tap where I'd expect it to be, and while I don't get the tactile feedback, it works.
All in all - the TouchBar is a net zero to me. I didn't lose anything by losing my Fn keys but I don't feel I gained anything with the TouchBar except TouchId which is nice.
TL;DR: Overpriced? Definitely. Pleased with product? Yes. TouchBar? No strong feelings. Returning it? No.
Compared to other Mac keyboards? My MBA feels super squishy - it's actually uncomfortable to use now.
All in all - after about 10k words on it, I like it.
If you're curious and in need of an upgrade just buy one. If you don't like it, return it.
Have only tried typing a few lines in store and I also enjoyed it, but might be personal preference: use keyboard with Cherry MX Brown at home and got used to light/low-force actuation.
Getting a maxed out 13'' to replace my 2008 one.
- Keyboard, especially arrow keys, took me 2 days to get used to, but now it feels weird typing on an old MacBook. I actually love it and prefer it now.
- The thumb + touchpad thing mentioned elsewhere here was definitely a big problem in the first day or two. It isn't anymore (guess I got used to it? not sure because I didn't try to avoid it)
- USB-C is freaking awesome. I bought an adapter with Ethernet, HDMI, USB 3 and SD that actually replaces the 3 adapters I had to carry around. And because of USB-PD, I only have one cable to plug into the Mac and everything is there, including power.
- I don't miss magsafe as much as I thought I would. Although I would happily buy an adapter if it is thin enough (some are coming).
- Touchbar is actually pretty great, although since it's a touch screen, the lack of tactile feedback can be annoying. Pressing ESC felt odd at first, but I got used to it and don't mind it now.
- Touchbar would be an _awesome_ medium to get notifications (such as long running terminal jobs etc...)
- I didn't get any of the battery life issues people are talking about. Actually, I get 8-10 hours out of it easily (i.e., I plug it in at the end of the day because I forgot it was unplugged).
- Thinner bezels around the screen makes it somehow look bigger (even though the visible area is the same size and the screen/lid itself is smaller).
- HiDPI is freaking great. Finally I can use a 4K monitor smaller than 32" and still get a retina display (1440p HiDPI and other intermediate resolutions up to native 3840x2160 are fully supported).
- It is really thin (no thinner than a MacBook Air, but still). It feels really great.
- Actually, I just noticed that because it is thinner, my wrists don't get hurt by the edges like they used to (the exact opposite of what someone mentioned here).
As a daily user of various JetBrains IDEs, the lack of a proper keyboard makes the new MB Pro inconceivable for me. I'm actually going to have to go Win or Linux for my next machine.
For Android Studio, I also remap the keys to Cmd+something for consistency with Xcode. For instance, I mapped build to Cmd+B, etc.
For those of us who use a variety of JetBrains products, the F-key shortcuts are the consistency. Having them faked on a minuscule screen, rather than being findable by touch, would make the new MacBook unusable for me.
Sad, as I find OSX to be the best in a bleak landscape of OS's, but given Apple's obtuse insistence on knowing what's best for all, to the extent of removing critical components of the universal standard keyboard, my current MB Pro will certainly be my last. Ugh Windows!
Although don't use it for 4k, as it is only HDMI 1.4 (4k@30hz). I am not aware of hubs like this with HDMI 2.0 yet. I bought a USB Type C => DisplayPort adapter for the 4k monitor.
I do enjoy this laptop a lot, thank you very much.
... but at that point the problem isn't with the tech
But... I wish Apple would put USB-C in the iDevices (honestly why does the iPhone and iPad use Lightning when USB-C exists??). It is annoying that I can ditch all my cables except that one fucking extra Apple cable.
If I was on the hook to warranty replace $600 devices when the internal connector gets ruined I might take my time jumping on board too.
I doubt USB-C is that much worse than Lightning. Even if it is say 10% less reliable as a cable it would still be much better to have a USB-C port simply because you are more likely to have a USB-C cable if everything else uses it.
I've found a wooden toothpick works wonderfully to clear out accumulated pocket lint from the port every 6 months or so.
Totally anecdotal but I have had a Nexus 6P for over a year now and no issues with the USB-C port so, for me, it is just as durable so far as Lightning :)
It didn't when Apple started using Lightning.
Lightning is a superior connector design.
The only problem I have with their use of Lightning in phones is that they want headphone manufacturers to switch to Lightning...
They haven't switched to USB-C because people would be super annoyed if Lightning was only supported for a couple of years.
It also explains the headphone jack removal. Removing two ports at the same time would make people super mad. And they also need to stimulate the wireless headphone market.
Source on this?
I didn't know this is what Ive wants to achieve, but it's something I've assumed every device manufacturer will be racing towards (even if it is on a niche device to test market reaction) and I've wondered why no one seems to be boldly pushing this?
I also find it amusing that they released the new MacBook Pro with a 3.5mm jack when they clearly don't need it. It makes even less sense on a laptop than a phone IMHO.
USB-C is pretty great, actually. Still pretty raw for the mainstream crowd, but not for tech-literate consumers.
Seeing my colleague intuitively reach out to the screen to use the Android Virtual Device when we're discussing something convinced me it was actually a nice thing to have.
I'm not sure what makes touch not poweruser and "hacker" friendly, but it's not the 80s anymore. Most of us have significant portions of our workload on the web and the web is, by and large, an extremely touch-friendly medium.
It's faster to pop up to the screen to press a link and return to the home row than it is to pop to the trackpad and do the same unless the pointer is quite close to target. Both have similar repetitive strain implications.
Actually not: the convenient little 'arms' for the cord are missing. And the cord itself is rather stiff and not very flexible … and there's no green/red charging status light either. It's OK as a USB-C charger, but it looks and feels different from a traditional MBP charger.
Seems like Apple took a proprietary, easily damaged charging cable and replaced it with a standard, less easily damaged one.
The charging indicator light is a definite loss, though.
True, but I've literally never seen anyone else do that in public. I do (the 2 times I've done so), one coworker does; everyone else grabs and wraps as quickly and tightly as possible.
They can be used "safely", but the affordance isn't there. It practically begs to be misused. Bad design in a nutshell.
That really sucks.
Did Apple finally fix the incredibly shitty and prone to fraying soft rubber on the cables?
I'd honestly love to use my laptop's Ethernet dongle with my iPad or iPhone, but that won't happen. Apple has just recently doubled down on Lightning with the Apple TV Remote, the iPhone 7 headphones and the new Magic Trackpad/Mouse/Keyboard trio. For iPad accessories, they've even introduced the all-new Smart Connector instead of a USB-C port on the side of the iPad.
NooQee USB Type C Adapter to Lightning 8-Pin 2-Pack 
NooQee Lightning 8-Pin to USB Type C Adapter 2-Pack