I'm not talking about i3/i5/i7, but rather U/Y/H. This letter determines the TDP (thermal design power) the machine is designed to run at. The TDP governs the base clock speed and, just as importantly, the throttling behavior under load.
Processor series TDPs are Y: 4.5W, U: 15W, H: 45W.
The new MacBook Air appears to have a Y series processor, like the MacBook, which means it will be aggressively throttled to keep power consumption and heat generation low.
Practically, that means that the new Air will not be capable of running sustained workloads much above its base clock speed, which makes it unsuitable for many programming-related tasks.
The Pro is still a much better choice for programmers. The 13 is suitable for many things, but the 16, with the H series processor, is really preferred for computationally intensive work.
You can get away with this machine if your workflow primarily involves a text editor and remote servers, but otherwise I would still opt for the Pro.
OEMs can still control load performance; it's not entirely determined by TDP.
Anandtech has a good article: https://www.anandtech.com/show/13544/why-intel-processors-dr...
Depending on OEM settings and cooling capacity, it's possible for Intel CPUs to indefinitely run at greater than their base clocks and TDP.
Why does Intel let OEMs play with that? It's to give OEMs a knob they can play with to balance perf and price for their segment. It lets OEMs cheap out on cooling (or go with new form factors like the tiny GPD models) and go with the low minimums, or have a full system that's capable of higher TDP like Apple's tend to be.
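The knob here is usually described as a pair of power limits: a sustained limit (PL1, typically set to TDP) and a short-term boost limit (PL2), enforced against a running average of power with a time constant tau. A toy Python sketch of that behavior — all numbers (`pl1`, `pl2`, `tau`) are illustrative, not taken from any real SKU:

```python
# Toy model of Intel's PL1/PL2 behavior: the CPU may draw up to PL2
# for short bursts, but an exponentially weighted running average of
# power (time constant tau) must settle at or below PL1, which is the
# OEM-configurable "TDP" knob. Numbers are invented for illustration.

def simulate(pl1, pl2, tau, seconds):
    """Return per-second power draw under a sustained all-core load."""
    avg = 0.0          # running average of power
    alpha = 1.0 / tau  # per-second smoothing factor
    draw = []
    for _ in range(seconds):
        # Boost to PL2 while the average is under PL1, else clamp to PL1.
        power = pl2 if avg < pl1 else pl1
        avg += alpha * (power - avg)
        draw.append(power)
    return draw

trace = simulate(pl1=15, pl2=25, tau=28, seconds=120)
# Early samples sit at the boost limit; later samples settle at PL1.
```

An OEM that ships better cooling can raise PL1 above the nominal TDP (as the parent notes Apple tends to), while a tiny fanless design can lower it.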
On my fanless 12" MB with a 7W-TDP m7-6Y75 (1.3 GHz base, 3.1 GHz max turbo), running a Prime95 torture test stabilizes at the following for the CPU, via Intel Power Gadget: 9.5W, 1.95 GHz, 90°C. (The laptop feels warm to the touch - it only ever feels "hot" under GPU load.) A normal all-thread load stabilizes around 2.5 GHz.
Your post may apply if you're working with 4K+ video files, rendering, and similar activities, but the modern programmer, even one compiling binaries, will be fine on a MacBook Air.
How do I know?
I use a two-generation-old MacBook as my primary personal development machine. I write Go, Rust, Java, and TypeScript using the common toolchains for all those languages.
A perfect example of this is the recent Level1Techs video about compiling an Unreal Engine game on the AMD Threadripper 3990X. Some concrete advice:
1. Often the heaviest part of a programmer's workflow isn't compiling the code, but running automated tests in VMs and containers.
2. Compilers tend to benefit from large caches and good single-threaded performance; multi-threaded performance only makes a difference in limited scenarios.
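Amdahl's law makes the second point concrete: if only part of a build parallelizes (link steps, one huge translation unit), extra cores hit a hard ceiling fast. A minimal sketch — the 0.8 parallel fraction is an invented example, not a measurement:

```python
# Amdahl's law: ideal speedup when fraction p of the work
# parallelizes perfectly and the rest stays serial.

def speedup(p, cores):
    """Ideal speedup with parallel fraction p on `cores` cores."""
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (2, 4, 8, 64):
    print(cores, round(speedup(0.8, cores), 2))
# With p = 0.8 the ceiling is 1/(1-0.8) = 5x, no matter the core count.
```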
I've also heard that storage performance, and in particular storage latency, can make a big difference on compiler performance, but the difference is starker on Windows than on Linux because NTFS is a lot slower than many Linux file systems (EXT4, XFS, etc.).
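One rough way to get a feel for this on your own machine is to time the many-tiny-files pattern a build produces (headers, .o files). Absolute numbers depend entirely on your filesystem and OS, so only relative comparisons between machines mean anything. A sketch:

```python
# Crude microbenchmark for the small-file churn a compiler generates.
# Results are filesystem- and OS-dependent; compare machines, not
# absolute values.
import os
import tempfile
import time

def small_file_churn(n=500):
    """Create, stat, and delete n tiny files; return elapsed seconds."""
    with tempfile.TemporaryDirectory() as d:
        start = time.perf_counter()
        for i in range(n):
            path = os.path.join(d, f"obj_{i}.o")
            with open(path, "wb") as f:
                f.write(b"\0" * 64)
            os.stat(path)
            os.remove(path)
        return time.perf_counter() - start

elapsed = small_file_churn()
```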
The thing is, we as programmers don't have a good mental model for how the software tools we use (IDEs, virtualization/docker, compilers) scale with modern hardware, and very few technology reviewers are producing content that would help us understand.
I don't see Airs as development platforms. They are portable platforms.
I wrote a little about this; it's for Rails but the same ideas apply:
Add some server processes, UI watchers/builders, and my Thinkpad starts to freeze up. In an ideal world, I would have a low-latency remote desktop going, but even in 2020 it seems like a pipe dream.
5ms encoding lag, 10ms network lag, totally workable.
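The arithmetic behind "totally workable": the quoted lags add up to less than a single frame at 60 Hz (decode time is assumed negligible here):

```python
# Back-of-envelope latency budget for a remote desktop session.
encode_ms = 5
network_ms = 10
frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60 Hz

total_ms = encode_ms + network_ms
fits_in_one_frame = total_ms < frame_budget_ms  # True: 15 ms < 16.7 ms
```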
On the flip side, standard-issue laptops are now 16" MacBook Pros with 32GB of RAM, which is pretty serious hardware.
Not to mention previous MacBook Airs were dual-core only.
A lot of use cases involve running a replica of the server environment locally in resource-intensive containers.
I'm surprised no one else operates this way.
The performance gains are immense.
It's immensely flexible and I'm surprised it's not more common.
Also, a cloud VM, if it breaks, is replaced "for free" and "instantly"; fixing a physical machine has a cost and is a delay. Unless you're big enough to keep a spare (probably makes sense for many dozens of machines), it also looks a bit uneconomical from the business's POV.
Never mind that your cloud costs in a year end up being comparable to said workstation.
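A quick way to sanity-check that claim is a break-even calculation; the prices below are invented placeholders, not real quotes:

```python
# Toy break-even: a dedicated workstation vs an always-on cloud VM.
# Both prices are made-up examples; plug in your own numbers.

def months_to_break_even(workstation_cost, vm_monthly_cost):
    """Months of VM rental that equal the workstation's sticker price."""
    return workstation_cost / vm_monthly_cost

# e.g. a $2,400 workstation vs a comparable VM at ~$200/month
months = months_to_break_even(2400, 200)  # 12.0 months
```

Which side wins depends heavily on utilization: a VM you can shut down nights and weekends changes the math.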
That's why I'm tempted to build a custom workstation at home: so that I always have access to the compute resources I need to do my job.
A typical compile for me takes about twice as long compared to colleagues with newer H-series processors, but the individual projects are pretty small so that often isn't a huge deal.
However, once you try and run nearly a dozen docker containers for local testing, things slow down quite a bit. Multitasking while those tests are running is very difficult.
Great machine. I only wish I had waited for the True Tone and keyboard redesign, but those are minor; I've grown to love the keyboard.
I'm still using a 2013 MBA as my primary day-to-day machine. It has an i7 and 8GB of RAM. I've been waiting for the keyboard change before upgrading, but today I see my options are i5 and i7 (same as in 2013). I'd get 16GB of RAM, but I literally have no clue about the chip stuff. I'm assuming we've moved on somewhat from 2013?!
Any help much appreciated! Thanks in advance!
Complicating things recently is the introduction of the Xeon W and i9 lines, but for the most part the Xeon parts add a few more features and the i9s offer higher core counts to compete with AMD.
In broad strokes, in the laptop space, the majority of enhancements since 2013 have been: higher RAM capacity (which Apple, compared to Lenovo etc., has chosen not to pursue), still lower overall power consumption to improve battery life, the move from DDR3 to DDR4 RAM, bigger L1/L2/L3 caches, the usual improvements to embedded graphics to support bigger onboard and external displays, and PCIe enhancements for drives, as NVMe-style drives are what I would now consider the standard.
i3 - Dual Core
i5 - Quad Core
i7 - Quad Core with a slightly higher clock speed.
That is it. Apple has kept it fairly simple most of the time.
Worth noting: the i7 is only very, very slightly faster, as you are limited by thermals. I don't think it is worth the price. If you ever need the processor power, then the current (and future) MBA isn't for you.
And in all honesty, the CPU performance between your 2013 MBA and this MBA isn't all that different if you are getting the dual core. After all, your 2013 CPU had 15W of thermals to play with; this newer CPU only has 9W peak (and 7.5W average).
But the overall package of the MacBook Air is still many times better: screen quality, ports, SSD speed, speakers. Not sure if they have updated the webcam; if not, it will be worse than your 2013 MBA's.
My work won't be processor intensive, just a personal machine. But given how long I kept my old one, I intend to keep this one for a long time, and I'm not sure where the best price:performance tradeoff is. I could argue going low-end and replacing it sooner, or high end, and keeping it forever.
That said, about a year or two ago was also when it just started feeling too slow to be usable (beachballs all the time). I blame web bloat, mostly :-/
I bought it new, so I definitely got my money's worth out of it.
PHP Dev often means some sort of local server running too.
compilers are often compute bound (certainly on SSDs).
web servers are almost never compute bound.
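A crude way to see the difference between those two workloads — the busy loop standing in for a compiler pass, the sleep for a request handler waiting on a socket; both are illustrative stand-ins, and timings are machine-dependent:

```python
# Compute-bound vs wait-bound: process_time() counts CPU time only,
# so the sleeping "web server" call registers almost nothing even
# though it takes 50 ms of wall-clock time.
import time

def compute_bound(n=200_000):
    total = 0
    for i in range(n):
        total += i * i        # keeps the CPU busy, like codegen
    return total

def wait_bound(delay=0.05):
    time.sleep(delay)         # CPU idle; typical of network/DB waits
    return delay

t0 = time.process_time(); compute_bound(); cpu_busy = time.process_time() - t0
t0 = time.process_time(); wait_bound();   cpu_idle = time.process_time() - t0
# cpu_busy is real CPU time; cpu_idle is near zero despite the
# call blocking for 50 ms.
```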
It's OK to geek out about hardware, if that's your thing. But it's not really necessary for most people -- even programmers.
45W will get your laptop pretty warm and noisy and will drain your battery.
I've been looking at the newer gaming notebooks, and I'm waiting for the "real world confrontation" of 10th-generation i7s (especially those with AVX-512) vs Renoir (lots of cores, AVX2 executed in a single op, and lots of cache, though not as much as desktop), since I run some numerics-heavy code and it's not quite clear how they'll stack up against each other.
Why do I run this on a notebook and not a desktop? Because I'm always on the move, and I can pack the notebook and go anywhere I have to and be able to work with or without internet connection (which is sometimes the case).
But I am also interested in the upcoming processor generations.
The P1 comes with a 9th gen i7, I assume they'd eventually get a gen 3 with 10th gen i7 and then I'll be able to compare.
By chance, have you seen the article from Puget Systems on how to use MKL with Ryzen? It explores some of the trade-offs between AMD and Intel for numeric computing.
So put it on a stand and plug it in? Problem solved.
No it isn't any more suitable, because you can put your laptop on a stand and plug it in to achieve the same thing.
HiDPI. Once I'd used a HiDPI screen for a while, I couldn't really tolerate LoDPI displays anymore.
I bought a 2018 15-inch MBP with a 6-core i9, 32 GB of RAM, and a Vega GPU with the idea that it would be my one-stop development machine. The system is terrible, especially considering the outrageous price. It does have sufficient compute power, but the thermals are insanely bad: it's constantly running the fans at 100%, and I can't even tune the power usage. When I run a VM/emulator alongside the IDE, people in the office start turning their heads in my direction because of the fan noise.
And the performance still isn't even close to a mid range desktop machine which would have cost 1/4 the price.
I'm seriously considering selling my MBP, buying this MacBook Air in the top config (I need a macOS machine to develop iOS/macOS apps; otherwise I would go for a Lenovo X1 Carbon), and building a desktop that's always reachable over VPN for heavy lifting. Plus, it can replace my gaming console.
I have a MacBook Pro 2018 and Intel NUC running Linux with exactly the same CPU. In typical C/C++/Rust builds, the Linux NUC is perceptibly much faster than the MacBook.
Didn't dive much deeper, but I'd guess the difference is thermals (despite the NUC having a small enclosure) and much higher system call overhead in macOS.
That said, I still like having a Mac as well, since there are so many great applications that I use frequently. But a sweet spot (as you suggest) is a reasonably powerful MacBook for the things that macOS is good at and a powerful Linux workstation or server for compute.
FWIW, the Y CPU in the Air is set at 9W instead of the usual 4.5W.
For example, the only ThinkPad of that generation's T-series that had one was the T470p which had a significantly different design in terms of batteries, thunderbolt features, etc. to accommodate the significantly larger cooling system. With the very next year's model, all the models got quad-core with their usual low-voltage/low-power (and low battery usage) design.
Does that still hold when using the power adapter?
No? The base model had an i5-8210Y.
I'm not totally sure I agree. I typically do a lot of programming on my MacBook Air. It's simply a box I use to ssh to something else more powerful to do the actual runs.
Yes, if you're running computationally intensive stuff on your MacBook Air you're going to be disappointed, but if what you're mostly doing is typing into a terminal screen then it's probably perfect. And probably even over-powered? :)
This isn't progress. This is the baseline. Apple have gone from bad to OK, and they're celebrating as though they've achieved something amazing.
As far as talking about it being amazing, it's called marketing spin. This is how it works. However, those two sentences do not say anything about it being amazing. They simply focus on the positive features of the keyboard. The two sentences above clearly communicate to Mac users that the company has fixed the problems that people wanted fixed. Did you really expect a bunch of public self-flagellation? They are telling us clearly that they did what we asked for. Perfect.
They tried a new design, which was horrible to use and had a high failure rate. They continued to claim the new keyboard was amazing, and stubbornly continued to use this crappy keyboard long after the problems were apparent.
And now they are touting a "normal" keyboard mechanism as if they've invented something new and wonderful... only Apple could get away with such transparent BS.
What in that says "invented" "new" or even "wonderful?" It seems like you're reading into the text what isn't even there.
IIRC, the peripheral line itself was started with the original Magic Mouse, which was branded as such because it didn’t have separate external actuatable buttons, but rather was just a smooth surface with a multitouch digitizer on the top half + a single actuatable microswitch underneath the shell†. Apple wanted the image of it “magically” figuring out when you left/right/middle-clicked (despite no L/M/R buttons) or scrolled (despite no scroll-wheel.) Also, the “plug in to pair” experience might have contributed to the claimed “magic”—it was a fairly unique approach to pairing at the time.
† Which is a design with some real benefits, like being easily disinfectable, with no crevasses close to the hand for filth and germs to accumulate in. (There is a crevice on the Magic Mouse, but it’s on the bottom, where your hands will never touch it.)
There is also a bit of “magic” in the Magic line of peripherals that’s not in the hardware itself, but rather in the OS: when the Magic line of peripherals—Apple’s first Bluetooth line of peripherals—was introduced, Apple added a feature to macOS where macOS will “train” the Apple EFI firmware to recognize devices paired in macOS itself, such that the firmware will later attempt to connect to such paired devices on boot. This means that e.g. holding Option on your Bluetooth keyboard to select an alternate boot device on an iMac would actually work. Which was kind of necessary, as those are the peripherals iMacs shipped with.
- Magic Mouse
- Magic Mouse 2
- Magic Trackpad
- Magic Trackpad 2
- Magic Keyboard
- Magic Keyboard with Numeric Keypad
"Refined" is also an adjective that has a definition close to "new and wonderful". It's not just a word to bring out an emotional response to make people feel like buying the thing.
Imagining a word to mean what you want it to, and then reacting negatively to that, that doesn't say as much about Apple as it does the observer.
> with impurities or unwanted elements having been removed by processing.
> developed or improved so as to be precise or subtle.
It basically just means improved, which most people would agree with.
When has any tech company ever done that?
People want to hear wailing and gnashing of teeth, which is silly.
Ignore the marketingese and don’t let it bother you so much. The important thing is that they listened to their customers and created a better product as a result.
Anyway, I consider fake reviews to be dishonest... I consider this to be more like "putting a positive spin on it".
I actually liked typing on the butterfly switches, the only thing I did not like was the left/right arrow keys being bigger.
Despite having double the key travel of the butterfly (0.5mm to 1mm), it still felt the same as the butterfly to me. The old scissor had 1.3mm; while that's only a 0.3mm difference, it felt like night and day.
The new scissor design also claims higher stability, although I suspect this has less to do with the design than with the "height" of the key being lower.
It is indeed new, but I don't know if it is wonderful yet. I haven't had a long enough period of time to try and use it.
Somehow I think Apple can handle it. Anyone wasting energy complaining is only doing so because they are waiting to buy again.
It makes as much sense defending them as it does yelling at them about a keyboard.
I think it would be amazing if the most valuable, most design-focused company in the world admitted to everyone that they made a mistake. It would do a lot towards allowing everyone else to make mistakes without beating themselves up over it. After all, if the thousands of specialist engineers, paid billions in salaries, given the best equipment in the world, in a company that really (and I mean really) values design, can make a mistake, then it's kinda OK that your home page looks a little crappy on mobile.
Exactly. In the Steve Jobs days he would have either jokingly admitted there was a mistake or at least acknowledged that people didn't like it (hence admitting there was a problem).
The new Apple put up a big middle finger and didn't act until there were class actions.
> MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro. The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience. The inverted-T arrow keys help you fly through lines of code, spreadsheets, or game environments.
says anything about "fixing problems that people wanted fixed" or "doing what people asked for." That would read:
> We used a lower-travel keyboard with a butterfly mechanism and alternative arrow key layout on recent models, and you said you didn't like it. We listened. The MacBook Air now features a proven scissor mechanism with a return to 1mm travel and classic inverted-T arrow key layout.
They're implying that they've come up with something new, which is a lie. That's not perfect.
So this scissor design with 1mm of travel seems to be something new. The inverted-T arrow keys are not new, but the scissor mechanism itself is: a refinement of previous designs.
What ? No. They are telling us they refined it, not that they fixed their mistake. It's not perfect, it's lame and cheesy.
No, we moved on and do not care. There are many companies that make laptops and keyboards. 5 years too late.
Further, it can take revolutionary change to maintain incremental improvements. Filling HDDs with, say, helium has a lot of knock-on effects hidden by the spec sheet.
"We've gone back to where we were three years ago after a mistake, sorry."
I don't understand what the issue is with Apple trying to sell their products. It sounds like people here are upset that marketing and sales exists, and that they use language to try to make their products seem impressive, and that consumers aren't rational when it comes to buying things.
Yes. We expect adults to own up to their mistakes, so why don't we hold corporations to the same standard and instead just accept corporate bullshit from them?
Sent from my MBP 13" 2018 with sticky spacebar.
New Apple doesn't act. Refuse to listen. Even with the Repair programme they still act as if it was not their fault.
That would be refreshing, I'd respect that.
Yep this is the right move, and as someone else says, this would be worthy of respect.
I think we've all just about had it with corporate bullshit -- and to be sure that says more about this moment in time than anything else.
Apple is consistently guilty of blowing smoke up our collective asses. It would be nice if they could give it a rest and simply be honest. But here we are.
"The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience."
Is any of that false? How do we know?
There's nothing wrong with a company saying "we screwed up." Which Apple has done already. There's also nothing wrong with a company saying, "This keyboard is great," if it is in fact great.
They pretty clearly admitted fault when they offered free out of warranty repairs for keyboard issues.
What I noticed in the image that made me excited about this device is the function keys. If the 16in MBP had function keys, I probably would have purchased one already. I do wonder if the top spec of the new MBA is going to hold up to my usage though.
I don't blame the browser vendors (except maybe that V8 made JS juuuuuust good enough to make something like Node viable). They took a thing that ran slow for everyone, and they made it faster.
I do blame application developers for writing everything for the web just because it's there. There are instances of folks doing better than most in Electron/JS land, but it's still nothing close to the native or even managed Java/C# apps of yesteryear.
Really I should just link to my last Electron rant, seems like they're getting closer together nowadays... https://news.ycombinator.com/item?id=22598148
>MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro.
That saved me from having to google "hey, is the 'new magic keyboard' the thing in the 16" MBP that I've been waiting for in the Air and 13" MBP, or is it something else entirely".
Some people just seem to choose a target for life (like the Favored Enemy of a Ranger from Dungeons & Dragons) and never give it a break, no matter what.
I suspect that even if Apple comes out with the best keyboard that mankind is ever going to make, some of you are still going to be angry about how they removed optical drives 4000 years ago.
- Apple knows that at the very least, some set of vocal people don't like the previous keyboard. They also know that many of their customers had to get repairs, even if they liked the keyboard. Those customers might understand that "butterfly = bad"
- They need to tell people that they've fixed the problem but don't want to do so in a way that says "the last product was bad" (so they can't just say nothing about it)
I think we should also place some fault on other manufacturers for just blindly attempting to do what Apple does without thinking: check out the latest XPS 13. They've implemented the same arrow setup as Apple's butterfly keyboards. And yet, I haven't seen a single review online that criticizes the XPS for this choice.
Then I grew tired of the MacBook fan noise when running Windows 10 and of debugging with the Touch Bar. I ordered a Surface Laptop 3 and immediately realized how important a nice keyboard is to me. It's tactile, it's got enough travel, the keys feel nice. I type with fewer errors and I work faster. Anyone want to buy a 2018 MacBook Pro with 6 cores and 32GB of RAM?
Would be seriously tempted to just buy a new one if I was confident the keyboard wasn't absolute garbage. Typing on it was fine when it worked, but it double-spaces, and a couple of the other keys are now wonky.
Never had a keyboard die on me before this -- Mac or otherwise.
I also tried out the new version of that keyboard on the recently released MBP. It feels almost exactly the same as the old one, just with a slightly shallower depth.
Hopefully that's helpful.
1. Get your keyboard repaired (free, I assume)
2. Sell that thing online
3. Get this one.
I mean, this has a quad-core option, too. It's a no-brainer better machine.
I have the 16-inch, which has the same keyboard mechanism as this. It's 100% an evolution of the previous design. While it is early, it's a proven design and there's no chance of it having any of the issues that the butterfly keyboard did.
I feel like my 16-inch is the computer I intended to buy in 2016.
If you can afford the upgrade cost after you sell your current one, you won't regret it.
Unfortunately, Apple will only replace the butterfly keyboard for 4 years after initial retail purchase of the laptop.
I dread the eventual breakdown of the new butterfly keyboard after the end of February next year, when I'm on my own. I hope iFixIt will start selling DIY repair kits, or the cost to repair at Apple makes such a frequent failure a non-starter (in which case I'll turn it into a fixed-place computer in the house, with an external keyboard).
I wouldn't be surprised if Apple bent that 4-year rule, though. But also realize that Apple will eventually stop repairing all computers just due to the passage of time. And eventually, from a financial perspective, it's more cost-effective to find a working used computer (after heavy depreciation) than to actually do a repair, even if that repair is doable.
I do think that their replacement keyboards have better reliability than the early models. I didn't have any problems after getting mine replaced - I believe I used the replacement keyboard for around two years. But I still wasn't willing to keep the computer longer. I went straight from the 2016 to the 2019 16-inch MacBook Pro.
In any event, it was a nice opportunity for an upgrade. My 2016 was still worth $1000, and I ended up with 4 more cores, double the storage, and an absolutely massive increase in graphics performance. Using the education store and picking up in a state with no sales tax did a lot to make that price more palatable.
"This is the baseline. Apple have gone from bad to OK"
Apple's scissor keyboard is pretty broadly considered the best in the industry by a country mile. Their butterfly mechanism was a bad misstep (I mean, almost indistinguishable from my Yoga 720's, but that's compared to prior Apple keyboards), but saying that they went from "bad" to "OK" is just nonsense.
"and they're celebrating"
Advertising doing what advertising does. So brave of HN to point out that marketing is marketing-y. Are you also telling me that the new car isn't going to make me an adventure-seeking extrovert?
Why is that acceptable? If you lie or misrepresent the truth in almost any other field, you get criticised. But when marketers do it, they're immune. That's just weird.
Even I who gave up on Apple thought, hey what a nice machine.
I almost can't believe how much shit I put up with on a daily basis for over a year. If you replaced a dying 2015 Macbook Pro with a new one, I very much urge you to reconsider getting it fixed at pretty much any price. It is so very worth it.
Apple's 2016-2019 laptops were pretty unusable; I hope they revert the Touch Bar too.
A question: has anybody tried System76 compared to Dell/Apple?
I think this is personal taste, but I just don't love the build as much, or the trackpad, and Ubuntu is acting pretty flaky. The camera and wireless chipsets don't work right out of the box.
There just doesn't seem to be a perfect solution.
I also have a lower-spec'ed 2020 XPS 13 laptop (8GB, 10th-gen Core) that works perfectly across the board (except for the fingerprint scanner, which I wouldn't use anyway).
I love the hardware on both. Absolutely love it -- more than my MacBook Air: no sharp edge where I rest my wrists, touch screen, phenomenal screen (4K, which is higher than Retina, or 1080p, which gives better battery life), etc. The 2-in-1 has a taller screen shape, so you can fit more lines in your coding windows.
If you care about camera, get the laptop. It's also a little bit cheaper -- I got it for about $950 from Costco. The only thing I don't like about the laptop is that the screen doesn't tilt all the way back. The keyboard is better on the laptop instead of the 2-in-1 as well -- more depth of travel, and the Delete key is in the right place instead of offset by the fingerprint/power sensor as on the 2-in-1.
I love the new Dell XPS 13 and think both the 2-in-1 and the laptop are better pieces of hardware than the MacBook Air, and I prefer Kubuntu 19.04 over macOS as well. (I don't have an MBP so can't compare to the Pro.)
Slightly more pricey, but worth it. Typing this on one.
I've not noticed any hiccups or problems at all around encryption.
On the other hand, I've never tried System76, and never heard of Tuxedo, thank you for mentioning!
My only beef is no NVMe, and it stutters a tad. If I could find a 2015 with a 16GB/1TB spec and NVMe...
I'm hoping the glitches in the 16" can be worked out; I'm eager to see how the 14.1" will look...
and I know Apple kremlinology is never the most accurate way to look at things, but boy were the Mansfield/Forstall years great...
The only problem is battery degradation (70%), and in reality it lasts <2 hours. But since I use it at home as a desktop, that's not a problem at all!
I've done this on a '14 13" MBP so I can help if you have any questions.
The latest XPSs look very nice.
The drive management software didn't like GRUB and would complain that there was no drive if GRUB was installed. Many reboots later, it would suddenly say "oh look, a hard drive!" and boot into Windows, removing GRUB in the process. Dell support were worse than useless at diagnosing the problem, let alone fixing it (still unfixed; the machine is now a rather shitty games machine).
Signing 3rd party kernel modules under secure boot isn't difficult, but documentation on it is sparse. So I've kept my notes for next time: https://lawler.io/scrivings/linux-cookbooks/
The Arch guys have a great compendium on the XPS and Linux:
I shrank the 1TB SSD partition in Windows, disabled the proprietary fake-RAID, then booted Fedora's XFCE spin from a USB key. The installer handled everything else--no having to manually tinker with the UEFI partition. Post-install, the XPS's UEFI correctly booted to GRUB, which had successfully detected the Windows bootloader (again, no tinkering like with FC20) and can boot either OS.
Disabling hibernate in Windows and figuring out how to mount the encrypted Windows partition in Fedora was all that remained.
The Linux Kernel 5.5 has a bunch of improvements for the i7-9750H (and other Coffee Lake processors), which is another reason to consider giving Fedora 31 a spin. :)
I had this pleasant experience at the same time I was trying to figure out how to dual-boot FC31 on a Touch Bar MacBook Pro. The short answer is: 80% of the MBP's hardware doesn't work in Linux. Broadcom won't release drivers that are compatible with Linux and cites FCC rules as the reason. Insane. https://github.com/Dunedan/mbp-2016-linux
For grub-uefi, it is just another file on an EFI system partition.
I enjoy this machine and would buy another.
Also, I had a Windows edition, which works pretty badly with Linux. A friend of mine got a new Linux edition with a fixed camera, and I think the perfect XPS for Linux is the 15" with 32GB. That's my alternative #1.
In the end I stuck with my MacBook Pro 2015. Perfectly sized trackpad with zero false positives. Most people put up with a few false positives on the larger trackpads, but I don't see any reason why it has to be this way. The keyboard felt way better; even the new Magic Keyboard on the MBP 16 felt exactly the same as the butterfly to me. The Touch Bar is junk; again, it sometimes freezes, and some people put up with it thinking it's a non-issue.
The old Apple and its users used to be perfectionist. Nowadays a lot of people settle for mediocre.
I just wish they'd make a MacBook Pro Classic. Just throwing in a new CPU would do.
I'm not sure if it was the wifi chipset or the antenna design, and maybe some of the higher end Dells with better radio chipsets would perform better, but I returned the laptop to Dell and I was pleasantly surprised by how easy that was. At least they're doing something right. After that, I picked up a $700 ThinkPad T-series for travel and the reception is great on it.
It's faster than my Air, mostly from going 8 -> 16GB of RAM, and the high-resolution screen is great. But it still feels pretty slow, and IntelliJ can get sluggish at times. The MBP has a 3.1GHz i7, so at least on paper it doesn't seem materially slower than the current generation of processors. Maybe memory speed is the issue? 1867 MHz DDR3 vs 3733 MHz LPDDR4X in the new Air?
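Back-of-envelope peak bandwidth suggests memory could plausibly matter: transfer rate x bus width x channels. The 64-bit bus and dual-channel configuration below are assumptions for illustration, not confirmed specs for either machine:

```python
# Rough theoretical peak memory bandwidth.
# Assumes a 64-bit (8-byte) bus and dual-channel in both machines,
# which is an assumption, not a verified spec.

def peak_gb_s(mt_per_s, bytes_per_transfer=8, channels=2):
    """Theoretical peak bandwidth in GB/s from megatransfers/sec."""
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

ddr3 = peak_gb_s(1867)     # ~29.9 GB/s
lpddr4x = peak_gb_s(3733)  # ~59.7 GB/s
ratio = lpddr4x / ddr3     # ~2x on paper
```

Whether that theoretical 2x shows up in IDE responsiveness is another question; latency and cache behavior matter at least as much as raw bandwidth.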
I miss the 11" form factor, but I guess that's a lost cause.
The 2015 MBP bought me (cheaply) a couple of years while Apple sorted out their keyboard issues, but I'm already looking forward to a replacement. It's just OK.
Overall, I found the speed of this model far exceeded my 2017 MBP from my last job, but it's kinda surprising how frequently the monitors / keyboard / mouse encounter connection issues.
That being said, I feel like I'm being a little picky, especially when I think of how nightmarish it was to get any of this shit to work with my Windows machines. It's the exception rather than the rule that anything would work the first time as expected (monitors / keyboards / mice / printers / programs).
It would be different if it had always been like this with Apple. The older models show that it used to work pretty much flawlessly. It is such a big step backwards and just because other manufacturers are crap at it doesn't make it okay. :)
I feel like I could live with the performance issues. But I'm doing a lot of switching between offices and having external screen/mouse/keyboard not working straight away is such a major pain. Even sometimes during the day after walking off with the laptop and coming back to the desk it doesn't work. It could be my specific setup but then Apple doesn't sell any official docking setups either AFAIK.
- Quad core processor
- Scissor keys
- No touch bar
- I can't adjust the volume without looking at it. Because the touchbar is flat with no haptic feedback when I land on a button, it's hard to remember the exact position of the volume 'button' without looking. Sounds trivial - but combined with point 2....
- The way the volume control expands - it actually moves the 'volume down' button AWAY from your finger, which again requires me to keep looking at the control.
This means that when a loud song comes on, it can take 2-3 seconds to quickly turn the volume down in total. I could do that with one single keypress in half a second or less on a keyboard, without needing to look at the keyboard.
That can also be the difference between missing a key detail from a quiet speaker on a Hangout.
Flashy, but it's a terrible user experience by every metric other than looks, I guess.
(This is clever, but basically undiscoverable unless someone tells you in, for example, a comment on Hacker News, which is how I found out.)
No you can't! There is a pretty long delay. If you move your finger during the delay, nothing happens. Then when it finally decides to switch modes, you have to move your finger again for it to change the volume. Hope you didn't hit the edge of the touchbar yet. Combined with the phantom button presses when using the top row of the keyboard, especially the Siri button, plus other small issues, the whole thing is bafflingly terrible.
Also pro-tip: you can change the buttons that show up in the touch bar. Settings > Keyboard > Customize Control Strip. I swapped out Siri for a "Sleep" button, which is super convenient when I walk away from my desk.
I just timed it at ~580 milliseconds, more than half a second from finger hitting the bar to the time when it stops ignoring touch input. It's easy to slide your finger more than the entire length of the volume bar in that time. It's absurdly bad. It would be weird and pretty lame if they fixed this only on newer models.
Also, where on earth is the escape key??
Different strokes for different folks, but I've never liked using the function keys for debugging. I just click the buttons on the screen. I'm a little surprised they don't have a way to set the Touch Bar buttons up to do that in Xcode though.
Moving the mouse cursor up to the toolbar always seems a lot of travel and swishing around if you're hovering over variables to see their contents in the source code.
I have found the auto/local/all view in Xcode to be a bit dumb and unable to properly expand some template objects in C++ so it's all just an exercise in frustration anyway!
Supporting more options is expensive, so it's understandable that Apple didn't want to give their customers a choice. Still, it seems like a gimmick. And it appeared at the same time as the butterfly keyboard, cementing the notion that Apple had lost its way.
Using Terminal, I use the Esc key a lot for navigating and having a touch bar Esc key is not a great experience since you also don't feel feedback that you're touching the right key.
I've also accidentally hit the touch bar a few times while hovering one of my fingers above it as I press down on one of the number keys.
On a personal note, I've randomly refreshed webpages because I've overreached on the number row and hit the touch bar.
I press the physical button a few more times? I find that 10 times easier.
This could not have happened without the touchbar. This is horrible UX and I will never trust that (work) computer again.
Does that statement strike you as reasonable at all?
- 2 cores
- 1.1 GHz
That's not how base clock works. Base clock is not min clock. Base clock is what the CPU is "guaranteed" to hit under sustained load while staying within TDP; it's the TDP clock. A 4 GHz base-clock CPU will still run far, far below that when idle.
I went from a 6700HQ (2.6 GHz base) to a 10710U (1.1 GHz base) and the difference is definitely there, and it's jarring enough that I kind of regret it. It feels like a huge step backwards, despite the latter CPU being four generations ahead.
This does not seem to be the case for my i7-8565U nor my older m3-something. The 8565U for example feels just as snappy as my desktop i7-6700, except it’ll throttle after about 30s.
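To put rough numbers on the throttling point above, here's a toy Python sketch. The cube-law power model (power growing roughly as the cube of frequency) and every number in it are illustrative assumptions, not vendor specs:

```python
# Toy model: dynamic power scales roughly with f * V^2, and voltage
# tracks frequency, so power ~ f^3. Under a sustained power cap, the
# clock settles near f_base * (cap / TDP) ** (1/3).
# All numbers below are illustrative, not from any Intel datasheet.

def sustained_clock_ghz(base_ghz: float, tdp_w: float,
                        power_cap_w: float) -> float:
    """Estimate the all-core clock a power-capped CPU can sustain."""
    return base_ghz * (power_cap_w / tdp_w) ** (1 / 3)

# A 15 W U-series part with a 1.1 GHz base, briefly allowed 25 W:
burst = sustained_clock_ghz(1.1, 15.0, 25.0)   # short turbo window
steady = sustained_clock_ghz(1.1, 15.0, 15.0)  # long run, cap == TDP

print(f"burst ~{burst:.2f} GHz, steady ~{steady:.2f} GHz")
```

It's only a rough cube-law approximation (real chips bin voltage in steps, and single-core turbo behaves differently), but it shows why OEMs raising the power cap buys clock speed well above base.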
Isn't this completely down to the CPU frequency governor in the OS?
My impression was that this "stepping" is customisable, at least on Linux; I regularly step my CPU manually, even.
I'm not sure what Apple has here, but maybe it's not what you expect.
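For reference, on Linux the governor is exposed through sysfs. A minimal Python sketch for reading it (the paths are the standard cpufreq interface; the function simply returns an empty dict on systems, like macOS, that don't expose it):

```python
# Read the Linux cpufreq scaling governor per CPU from sysfs.
from pathlib import Path

CPUFREQ = Path("/sys/devices/system/cpu")

def read_governors() -> dict:
    """Return {cpu_name: governor} for CPUs that expose cpufreq."""
    governors = {}
    for gov_file in CPUFREQ.glob("cpu[0-9]*/cpufreq/scaling_governor"):
        governors[gov_file.parent.parent.name] = gov_file.read_text().strip()
    return governors

print(read_governors())  # e.g. {'cpu0': 'powersave', ...} or {} if absent
```

Writing a new governor back (e.g. `echo performance | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor`) needs root.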
I had really been wanting to upgrade for Retina & better processor but I knew they would upgrade the processor and fix the keyboard if I waited for 2020... no reason to wait now.
I don't run any crazy fat Docker stacks for my own stuff at home, so this is perfect.
My at-home hobby work is in Golang and Python, and not particularly compute-intensive stuff. Neither of those have huge heavy toolchains.
Really the main thing driving an upgrade from my minimum-spec 2015 Air is the Retina display.
This is good news from Apple as I was not into any of their more recent laptops but I'll probably upgrade to this one
I only wish they had a 11" version but not a deal breaker
Still just waiting on a new 14" MBP.
- 16GB RAM
I was actually surprised by this--when I first started using this computer, I thought for sure I would need to add more RAM, which for the 2018 model is too complicated to do yourself (at least to me it seemed too risky).
I keep a lot of tabs open to look things up, but nothing excessive on that machine. I also run VSCode or Pycharm and would also bring up 5-10 containers at times.
It seriously hurt not only my productivity but also my mood afterwards just by having to put up with it for weeks.
Unless you're a very basic user I don't get why you would settle for 8GB in 2020. 8 gigs of RAM cost basically nothing, it's not worth changing your workflow in the slightest to work around that artificial limitation.
It's bizarre. If everyone used the native toolkits we'd have far less memory usage and everyone (even the memory-constrained) would have a good experience.
Also, these memory hogs do a lot of allocation and deallocation, which is also a problem with interpreted languages. Allocation is the enemy of speed and energy usage; it'll eat into your daily battery life as everything gets interpreted.
With every layer of abstraction added to ease development the hardware requirements go up. You can build things fast, or you can build fast things, doing both is tricky.
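As a contrived micro-example of the allocation point (not a real benchmark): the two functions below compute the same sum, but one allocates a full intermediate list while the other streams values through a generator, and `tracemalloc` shows the peak-memory gap:

```python
import tracemalloc

def with_intermediate(n: int) -> int:
    squares = [i * i for i in range(n)]  # allocates the whole list up front
    return sum(squares)

def streaming(n: int) -> int:
    return sum(i * i for i in range(n))  # generator: no intermediate list

tracemalloc.start()
with_intermediate(100_000)
peak_list = tracemalloc.get_traced_memory()[1]  # peak bytes so far
tracemalloc.reset_peak()
streaming(100_000)
peak_gen = tracemalloc.get_traced_memory()[1]
tracemalloc.stop()

print(f"list peak: {peak_list} bytes, generator peak: {peak_gen} bytes")
```

The streaming version's peak footprint is orders of magnitude smaller; the same pattern at application scale is a big part of the memory-hog story.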
I'm running Linux with an anemic window manager, and with nothing but Chrome and Slack open (20 tabs in Chrome) I am consuming 6 GiB.
If you add Teams, PyCharm and Outlook (Electron) it consumes 9 GiB... Actually, that's also less than I expected.
Well done, PyCharm.
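If you're wondering how numbers like these are measured: on Linux, `free(1)` computes "used" memory as MemTotal minus MemAvailable from /proc/meminfo. A sketch with a made-up sample (swap the sample for `open('/proc/meminfo').read()` on a real system):

```python
# Sample /proc/meminfo-style text; these numbers are invented for
# illustration only.
SAMPLE = """\
MemTotal:       16316412 kB
MemFree:         1203444 kB
MemAvailable:   10028172 kB
Buffers:          402132 kB
"""

def used_gib(meminfo_text: str) -> float:
    """Used memory in GiB, computed as MemTotal - MemAvailable."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        fields[key] = int(rest.split()[0])  # values are in kB
    used_kb = fields["MemTotal"] - fields["MemAvailable"]
    return used_kb / (1024 ** 2)

print(f"{used_gib(SAMPLE):.1f} GiB used")  # with the sample: ~6.0 GiB
```

MemAvailable is the kernel's estimate of memory reclaimable without swapping, which is why it's a better baseline than MemFree: file caches count as "free for new work".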
I think at 16 GB you should be set for at least 5 years. Most people, even a lot of devs on company-issued equipment, are working with 8 GB and complaining about it right here in this very thread.
If you have larger requirements, a lightweight, thin laptop with a teensy fan isn't for you. Even if it had the hardware specs, the physics of heat dissipation don't work for you and you are better off spending the same money for more hardware sitting in a box under your desk. Me and my sore back are eyeing this up, all my computing is done on a cluster anyway.
For comparison, I also have a 2012 Mac Mini at home with an SSD and 16GB RAM, and it's still chugging along pretty well too, although it's noticeably slower than the 2018 model with 8GB RAM.
I'm curious with regard to swapping if that might mean my SSD is going to wear out sooner. Maybe investing in more RAM would be worth it even if I don't feel like I need it.
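On Linux you can actually quantify that swap traffic: the `pswpout` counter in /proc/vmstat is the number of pages written to swap since boot. A sketch (the sample counters are made up; on macOS you'd look at `sysctl vm.swapusage` instead):

```python
def swapped_out_gib(vmstat_text: str, page_size: int = 4096) -> float:
    """GiB written to swap since boot, from the pswpout counter."""
    for line in vmstat_text.splitlines():
        name, _, value = line.partition(" ")
        if name == "pswpout":
            return int(value) * page_size / (1024 ** 3)
    return 0.0

# Made-up counters for illustration; on a real system read /proc/vmstat.
SAMPLE = "pswpin 1024\npswpout 524288\n"
print(f"{swapped_out_gib(SAMPLE):.2f} GiB written to swap")  # 2.00 GiB
```

Compared against an SSD endurance rating of hundreds of terabytes written, occasional swapping is rarely what wears a drive out.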
Such is the accepted wisdom in much of the industry, but I'm a bit of a sceptic on this score. Of course developer time is expensive, particularly if you're in somewhere like the Bay Area where salaries are an extra 0 compared to most of the world. But we live in an era of virtualisation and outsourcing (sorry, "cloud computing") when businesses will knowingly pay many times the cost of just buying a set of servers and sticking them in a rack in order to have someone else buy a much bigger server, subdivide it into virtual servers, and lease them at a huge mark-up. All kinds of justifications have been given for this, many of which I suspect don't stand up to scrutiny anywhere other than boardrooms and maybe golf courses.
There's a nice write-up somewhere, though regrettably I can't immediately find it, of the economics of cloud-hosting an application built using modern trends. IIRC, it pitched typical auto-scaling architectures consisting of many ephemeral VMs running microservices and some sort of orchestration to manage everything against just buying a small number of highly specified machines and getting on with the job using a more traditional set of skills and tools. Put another way, it was the modern trend for making everything extremely horizontally scalable using more hardware and virtualisation against a more traditional vertical scaling approach using more efficient software to keep within the capacity of a small number of big machines. The conclusion was astonishingly bad for the modern/trendy case, to the point where doing it was looking borderline insane unless your application drops into a goldilocks zone in terms of capacity and resources required that relatively few applications will ever get near, and those that do may then move beyond it on the other side. And yet that horizontal scaling strategy is viewed almost as the default today for many new software businesses, because hiring people who are good enough to write the more efficient software is assumed to be too expensive.
There are technical reasons for this, like being able to handle sudden load, but mostly it's ideological. We aren't building companies, we are building stock pumps disguised as the utopian future. If you are wondering what a blue-chip company looks like in tech, they are the ones that own their own infrastructure.
Maybe there is a middle road for cash poor companies, where you keep latent demand in house for the sake of cost and sense, but have some sort of insurance policy with a cloud service to step in if demand surges.
I'll admit that modern text editors and communication software have grown resource hungry, but a lot of that comes from being able to deliver a strong, cross platform experience. I remember desktop Java doing much of the same with just as bad resource usage. Same with applets.
Sure, but that immediately raises the next question of why those VMs are so big...
RAM size slowly increases as well. 4 GB was enough 10 years ago. 8 GB was enough a few years ago. Today I would suggest 16 GB as a bare future-proof minimum, and one can buy 64 GB for a reasonable price.
We still have room for more layers. And it's not only about efficiency, it's also about security. Desktops are still not properly sandboxed; my calc.exe can still access my private ssh key.
Once performance growth really stops, we will start to optimize. Transistor density will double every few years until at least 2030, and AFAIK there are plans beyond that, so probably not soon.
Why should everyone have to afford that RAM?
Don't exaggerate your memory requirements; you would be more than fine with 16 GB.
Fortunately, I have a Dell XPS 15 with 32 GB of RAM, but the second I start up a single VM, one more messaging app, a small handful of Docker containers, or any IDE (of which I'm running none right now), I'm going over 16 GB.
Realistically, most of us on HN probably need around 20-24 GB, but laptops don't come in those increments.
I develop for a living. I use 6 GB including a browser, a VM and an IDE.
Some of you greatly exaggerate the needs. Some workflows require 16+ GB of RAM, but most people complaining about RAM mismanage it or do not understand that caches are not mandatory.