What's happened is that other companies have figured out how to emulate Steve Jobs's playbook that's now several decades old and there's nobody at Apple to write new plays.
Jobs always talked about how Microsoft didn't have taste and when you look at MS products up until around 2005ish in comparison to Apple products, he was right. At some point a lot of business people realized what was making Apple successful - they weren't just selling computers, they were selling a lifestyle, they were selling cool. Oh, you are a creative? Well, shouldn't you have a Mac? It's what Einstein and Gandhi would have used.
Jobs was an absolutely brilliant marketer. Since 2005 other companies have gotten successively better at emulating Apple's design, experience, and marketing. They've distilled what Jobs knew intuitively into a formula that they can iterate on. It's not just MS, it's Dell, it's every medium and high-end manufacturer, it's Google. Please tell me what the difference is between this Pixel commercial and every Apple commercial made within the last decade, with the exception of the logo.
It's the Apple-fication of tech production and marketing, and Apple doesn't have anything that makes it stand out anymore.
That said, here's the truth. Just a week ago I tried out a Surface Book, and within the first few minutes I could immediately tell it was not as polished as a MacBook Pro.
The first thing that stood out is that the screen suffered from a considerable amount of ghosting. It looked great as long as I wasn't actually doing anything. Dragging a window around or even just moving the cursor though revealed clear visual defects.
I also immediately noticed that the trackpad, while high quality physically, struggled with gestures involving multiple fingers. Maybe this is configurable through software, but I just didn't find the trackpad's recognition of gestures to be nearly as good as that of a MacBook Pro.
On the same topic of gestures, doing a four-finger horizontal swipe in Windows 10 to switch between virtual desktops resulted in a very visually jarring animation. The experience just isn't polished like it is in macOS. This surprised me because I use Windows 10 extensively on my home desktop and generally find it very polished, but that's all using a normal mouse, not a trackpad with gestures.
Next I noticed that the laptop's screen was physically wobbling. I suspect this is because the screen is so much heavier than normal due to its unique detachable design. But I found this quite annoying. It was apparent that even just typing on the laptop with it on my actual lap would lead to screen wobble.
All this popped up within minutes before I even tried out any of the laptop's unique features. The version I was looking at had a price tag of $2800. At this price range, I think Microsoft still has work to do. None of this is to say it's a bad product. I was just expecting a bit more from something with such a premium price.
So yeah, taste is one thing, but execution is a different ball game.
I would definitely not want to maintain this codebase, but as an end user, I'm happy.
Windows used to have something like that too, I think it was called the Control Panel.
All twelve of them.
I agree it's likely very few people use it. It can be exactly the right tool for the job in some circumstances.
We're literally trying to be the taste and the execution. I work for Cisco. Curious to hear candid thoughts.
I'm seriously asking. Cisco either lives at the top of a rack that maybe 10 people in a datacenter ever see, much less touch more than once; or it lives as a home router that most people want to hide but still need good wifi signal from.
IMO these are two different use cases. Cisco could execute almost a million times better in the datacenter by embracing better standards and making first boot and reconfiguration easier (I could write a book on this). In other words: stop pushing people toward your proprietary management systems for your gear; make those tools generally applicable, and then maybe people would use them.
On the consumer side, I have less experience with Cisco, but making it dead simple to deal with things like multiple wifi access points would be big, with as little setup involvement as necessary. Apple had a really simple-to-set-up router, though it was generally underpowered and didn't allow for more advanced configurations.
Someone really needs to fund Mikrotik big time.
While it is understandable why the legacy baggage is there and cannot be removed, the end result is a big mess, still to this day. Most (but still not all) first-party software is OK to good. Third-party software (not apps from the store) is a mess, usually either blurry, tiny, or downright broken. The Windows ecosystem being the Windows ecosystem, there are many alternatives to almost everything, and some software is improving. But it is still a big mess.
Apple's transition was a lot smoother due to vertical integration, a much smaller software library, a more dedicated software developer base (or rather, one much more willing to be early adopters of new APIs), and most importantly, a much better designed API without 20-30 years of baggage.
It will be interesting to see people's reactions to the software after jumping ship from macOS to Windows on these fantastic displays. I couldn't stand the mess after a week with Windows 10 on my MBP. (But I do use Windows 10 regularly on my desktop, on a "normal" display.)
Apple's API dates back to the 1980s, just like Windows'. If you want to nitpick, Windows was developed in the early-to-mid 1980s while NeXT was developed in the mid-to-late 1980s.
Also, I am not sure what DPI issues you're facing, but I have run W10 on my rMBP for quite some time and it's been thoroughly pleasant. Perhaps your issues are fuzzy text due to a mixed-DPI, multi-monitor setup?
So, even if a developer is willing to invest the time to rewrite an app to use UWP, the app will lose a lot of functionality. Microsoft has again poorly architected their APIs.
Unlike macOS, where there's no penalty to adopting the latest APIs. It's not as if, to use the Touch Bar, you have to give up access to other capabilities, like recording audio in the background.
More information: https://kartick-log.blogspot.in/2016/08/how-powerful-is-uwp-...
The Linux world is in the same boat. It's even worse there, because a lot of the toolkits don't support high-DPI modes or are buggy, and developers there are even less interested in fixing these issues, or lack the know-how to.
With the emergence of cheaper 4K monitors, perhaps we'll see a change for the better, who knows.
Multi-monitor support is another issue entirely.
The harder part is consistently fixing everything that uses the old Win2000 8pt default font instead of a modern one.
Microsoft's inability to execute is mind-boggling.
Because Apple designed the core OS to handle it much better, and made it easier for developers to support. (That was years in the making, long before the launch of the rMBP.)
I'm not a fan, but you're interpreting the campaign differently than I.
The ones with civil rights leaders were particularly offensive. Yes, lining up for a new laptop is just like apartheid and the march on Selma. I mean honestly, how garish and impertinent can you get.
Apple's rising tide lifted all of us: it forced every other computer manufacturer to take a look at the quality of what they were selling, not just feature lists and requirements.
We've all benefited from this tremendously: Android was quite literally unusable crap until the last few years (cue rooters and apologists), laptop PCs were similarly crap, and you just never got the feeling that someone who cared put these things together.
Now? Everyone is starting to put effort into the little things: the packaging, the presentation, quality of materials, usability, etc. It's a tremendous win. I'm almost tempted to say we owe Apple, but then I remember their coffers are already full.
I imagine there are other stones still left unturned, and the next Apple will be the ones to pay similar attention to detail and craftsmanship in an area where there previously wasn't any.
I'm not going to pretend that Apple did not significantly influence the direction of product design. It absolutely did.
However, you're making it sound like Apple invented product marketing. They didn't. Microsoft and Google are new to the marketing game because they're entering the consumer hardware business, which is a new venture for them. When you're the de facto software solution (Windows, Google Search), you don't need marketing. You don't need to build a brand. Your brand is so ingrained in culture that everyone, everywhere uses it.
That's a much better spot to be in than trying to reach that goal. Google and Microsoft are now going to try to leverage their software platforms to launch hardware in the same way, but that takes marketing.
If you look at the old issues of Byte, you can immediately see the difference. Most of the ads are generic spec/price lists with maybe some crappy hand-drawn graphics. A few are photos and professionally drawn images with some attempts at metaphor, but they invariably look crude, heavy-handed, jokey, or cartoony.
The primary focus in the early Apple ads is the Apple user at home, with his partner. It's the first thing you see before you get drawn into the copy. It's an attempt to sell soft values, not just hardware specs.
This ownership of soft values is a huge difference that most of the industry still fails to get, and which Apple itself seems to be forgetting.
Apple is the only company ever to create technology products that appeal equally to men and women without relying on any traditional gender branding. (At least that used to be true until "rose gold" appeared.)
MS is sort of catching up now, two decades later. But it's not a natural cultural fit for them, so I'm not expecting a stream of great things.
Google is - well, the brand mascot is an android. That's all anyone needs to know.
Jean-Louis Gassée (head of Mac development and later global head of marketing, '85-'90) seems to have been the intellectual force behind this at Apple.
It's good, but nothing I would be surprised to see coming out of HP or Sony.
For what it's worth, Apple created a similarly attention-grabbing hinge on their G4 iMac back in the day.
Yes, but that was 14 years ago.
This line seems so superficial, such fluff. New new? No, Google, I want them hot off the tables of production, not yesterday's cooled phones. Really, is new-newness the factor here?
Sounds more like a drug dealer:
"Hey bro... Need some shit? ... Like, new new shit?"
I work for a 100+ person company. Outside of the admin staff, everyone has souped-up PCs doing a huge load of 3D work. The only people with Apple devices (iMacs and MBPs)? The four 2D graphics designers and the executives. Why? The graphic designers would be in uproar and the execs believe clients expect them to present a certain 'aesthetic'. The MBPs are mainly used for email and powerpoint presentations.
They have execution and commitment.
When I buy a MacBook Pro I can expect to get many years of use out of it. The hardware execution is almost always exemplary and they will continue to support it on the software side during that tenure. Buying a Windows or Android product feels like you're only going to be supported for a year or two and that's it.
Regarding software, Windows XP support lasted 13 years. OSX comes out with a new version every year, and over the last few years each has been buggier than the last. Ever since Mavericks, I wait 6 months into each release for them to fix the load of bugs before I actually upgrade. El Capitan, especially, was just an all-around mess.
I love this computer, but I can still effortlessly criticize both its hardware and its software. These things are simply not what made the MacBook stand out against a sea of black laptops.
The major change is that the laptop and desktop market has matured enough that, indeed, this year's MBP needs to compete with the 2012 MBP and every other laptop.
Before then, just re-releasing the same box with improved internals was enough to sell, hence an ocean of generic laptops where a brand like Apple could rise to the top with a level of polish and attention to detail above the rest.
The market changed, not because Apple lost its mojo, but because you can't sell a laptop on its specs alone anymore, meaning only the Apple way is really profitable now. Apple's and Jobs's genius was doing it before everyone else, allowing them to reap enormous benefits for years.
This year I'm looking at the new MBP to replace my old 2007 MBP. I happen to be happy with the update (unlike many here on HN), but I would have had no problem buying last year's model in case of disappointment. Even a refurb 2012 model like my main machine would have been perfectly fine.
having to buy a 70 dollar charger every year vs having to buy a new laptop every 18 months.
I'll stock up on chargers.
What on Earth kind of violence are you putting your poor laptop through? I haven't had a desktop in about 10 years, and I just bought my third one in that time about a year ago. (The previous one was doing fine actually, but I gave it to my mom)
I showed the video to my wife and she was so impressed that she watched all the videos Microsoft made detailing the development. She's been a Mac user for more than 20 years and it's the first time she's ever looked at a Microsoft product.
Apple during the Steve Jobs era had a reputation for taking risks, for coming up with new products that impressed and that people just wanted.
A lot of people I know were excited by the announcement of new products and tried to watch the keynotes or follow the live coverage. In the past year, none of my friends care about them anymore. I only cared about yesterday's event because I've been waiting to upgrade my MacBook Pro, and the event was lackluster. For better or worse, by not taking risks and not introducing new products, Apple seems less cool than it used to be, and Microsoft showing off the Surface Studio highlighted this. It's a huge blow for Apple in terms of marketing.
(As an off-topic aside, Apple even sucks at doing boring incremental improvements: the iMac, Mac Pro, and Mac Mini are languishing, and the new MacBook Pro 15-inch design choices are nonsensical for professionals.)
You can't imagine how disappointed I was that they top out at 16GB of RAM. I don't use that much memory every day, but sometimes it would be really nice to have. I haven't bothered watching it again, but the announcement gave me the impression that it started at 16, and going to the purchase page was a sad time.
The only innovation I saw yesterday that will be copied really came out of Microsoft, with the Surface Studio and Surface Dial. I'm sure Microsoft would really like to see a surge in high-end desktops. The only worthwhile technical feature Apple added was Touch ID in the Touch Bar, but even then, how often would I use it? The more logical place for Touch ID is actually on the mouse, where my fingers are the majority of the time.
The only commonality between these two is their marketing departments thinking people want to spend $3k+ on products; no reversal there.
As soon as they can pack the power I'm looking for in my laptops into an Apple-style case, I'll be there. Until then I'll be sticking with my boxy black Clevo.
1983: http://cosy.com/language/cosyhard/cosyhard.htm - http://cosy.com/language/cosyhard/ampropn.gif (note: no trackpad - but then, the original PowerBook didn't have one either: https://en.wikipedia.org/wiki/PowerBook_100, and that round trackball is in just the same position as that round lid latch...)
Apple does a great job integrating elements that may have existed before, and is definitely a trendsetter in styling, but they also get credit for creating a lot of elements that actually existed before them. ("Great artists steal.")
iPhone seems in a very similar position. Sure, there are some people who will switch from iPhone to an Android phone. But Apple's biggest challenge seems to be convincing their current customers they need a new phone at all.
I found this out by getting stranded on the side of the road somewhere after I discovered that Uber had auto-upgraded itself in the background, and the new one refused to launch on my old phone.
> If there's a compatible version, a message appears and you can choose Confirm to get the latest version of the app that works for your device
That's not much of a challenge for Apple. There is always a laundry list of features that people want but that aren't necessarily mature (e.g. wireless charging, NFC, waterproofing), and Apple keeps those incubating in its back pocket. Occasionally it pulls one out and slaps it on one of its devices to boost sales.
They also drive sales by adding or removing features from their devices (e.g. Firewire 400/800, USB-C/MagSafe, Lightning) or dropping support for legacy systems.
Mechanical failure is also a common reason to upgrade but Apple seems to be cannibalizing this revenue stream by replacing mechanical devices (e.g. switches, buttons, touchpads) with solid state ones that use haptic feedback. I'm curious how that will play out, will Apple's Taptic engine be the point of failure?
They solved this by just releasing a new color every year.
Not everything is a conspiracy, FFS.
This is what happens in mature product segments. The first iPhone was released 9 years ago, and that was the last real revolution -- a smartphone whose front was pretty much all screen. Everything Apple and everyone else has done since then has been incremental -- better screens, larger screens, better cameras, faster processors, thinner chassis -- but definitely the same paradigm.
The same thing happened to laptops before that. Some of the older laptops had pretty odd and uncomfortable designs. Sometime in the 90s, everyone standardized on the clamshell laptop with a 4:3 screen (later moving to widescreen), a low-profile keyboard, and a central trackpad below that. If you compare a 20-year-old laptop to a new one, you'll see the same incremental changes that happened to phones -- larger, better screens, faster processors, thinner chassis.
I think the bottom line is that the smartphone market has become as mature and predictable as the laptop market.
I actually think that is a good thing. Commoditization of gadgets usually mean lower prices and standards across models.
How did that happen? It used to be that a normal laptop had a 4:3 screen and a ridiculous, huge laptop had a huge, 16:10 screen. Now both of those are gone, and normal laptops have even shorter 16:9 (!) screens. It's hard to think of a more user-hostile progression. I don't want to work in a series of cramped side-by-side windows. I want a screen that can display more than one paragraph of text at once.
Nothing's stopping you from watching it on a 4:3 screen, either, although at that point you've shrunk the image pretty noticeably.
Now the best they can do is a second lens for kinda-bokeh, and a Touch Bar instead of physical keys on the MacBook.
While dumping the headphone jack and magsafe (and HDMI and SD).
Touch ID is cool and all, but in 2016 calling it "amazing" is a bit of a stretch. ThinkPads first got a fingerprint reader more than a decade ago: http://www.technewsworld.com/story/37017.html
As a feature, the ThinkPad fingerprint readers kind of sucked. All you could do was log into Windows with them, if you had the ThinkPad crapware installed. They weren't very reliable, either. I had one, and I gave up on it after a while because typing my password was faster than attempting the fingerprint reader several times, then giving up and typing my password anyway.
Touch ID is built into iOS. You can use it to unlock the phone, and to authenticate yourself for various Apple applications like the App Store and Apple Pay. There are also APIs for third-party applications to accept Touch ID instead of passwords, if the user wants to do that. It is fast and generally reliable. It's light years ahead of the fingerprint reader on the ThinkPads.
This mentality is what resulted in them shipping what is essentially a portable media player that won't accept almost any pair of headphones manufactured in the last 30+ years.
I think Macs are designed very well, even though I question the decisions behind the new MacBooks, and I would prefer to have a little more than 4 USB ports and a thin laptop. I also really like Mac OS, even with its thorns, because its design is much better than whatever version of Linux I would use (not a big fan of Ubuntu's Unity). Most likely, my next laptop purchase will be either a 2015 MacBook Pro or some nice non-Apple product with Ubuntu installed.
I wish we had executed on that wish a year ago, because with the two Microsoft and Apple events, it's clear there is demand for "a Surface Studio, but not with Windows 10". Yesterday people effectively said they stay on Macs because of macOS, but envision switching to PCs because the innovation and specs are much better. Had we worked on that, we could have released a credible alternative to Macs: a distro that embraced the Surface Studio while providing 1. the design, 2. the experience, and 3. the privacy that everyone is looking for in an Apple computer...
As a reminder, the idea of a paid Linux is to fund the open-source community with the same flow of money that Apple and Microsoft get from their OS (at $200/yr), in order to provide the same "red carpet" experience for specific profiles of users (3D workers, graphic designers, or whatever profile we target first). At the market level, one great experience for one type of work would drive adoption of Linux on the desktop. At a more selfish level, the benefit of paying for Linux instead of MS/Apple is that the new improvements are effectively open-source, so we're raising the baseline of what every other distro can do. The way to make people pay is to provide their upgrades only through authenticated PPAs, which means professionals will pay because it's easier, and hackers will redistribute versions on torrents, which we don't mind, because hackers are a benefit to Linux. Besides, even hackers understand the value of funding open-source, so they might still participate. A lot of people would rather pay for open-source than closed-source.
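For what it's worth, the authenticated-repo mechanism already exists in stock APT: you can serve a repository behind HTTP basic auth and have subscribers store credentials in a root-only file. A minimal sketch, with every specific (domain, login, token, repo layout) invented for illustration; only the apt_auth.conf(5) mechanism itself is real:

```shell
# Hypothetical paid repo in its own sources file:
echo 'deb https://deb.example-paid-linux.org/apt stable main' \
  | sudo tee /etc/apt/sources.list.d/paid-linux.list > /dev/null

# Subscriber credentials live outside the sources list, in a 0600 root-only
# file that APT reads automatically (see apt_auth.conf(5)):
sudo install -m 0600 /dev/null /etc/apt/auth.conf.d/paid-linux.conf
printf 'machine deb.example-paid-linux.org\nlogin subscriber@example.com\npassword s3cret-subscription-token\n' \
  | sudo tee /etc/apt/auth.conf.d/paid-linux.conf > /dev/null

# Fetches the paid repo using the stored credentials:
sudo apt update
```

This is the "easier for professionals" path: one credentials file per machine, and every subsequent `apt upgrade` just works.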
I didn't execute on that wish, because I'm not an OS-level person, and I don't have the UX design background necessary for this endeavour. Nor the marketing know-how to execute at a high level. But I really wish someone would do it.
Linux is a wonderful server OS, and in fact I do most of my development inside of a VirtualBox VM running Debian, but in order for me to move to a full Linux workflow, I need to use some proprietary software packages like Microsoft Office and Apple Keynote (while I find LibreOffice Writer to be a suitable replacement for Microsoft Word, Microsoft Excel fits my needs better than Calc, and Impress is behind both PowerPoint and Keynote).
In line with a paid, polished Linux experience, another thing that would be nice for me and other disgruntled Mac users is a Wine-like compatibility layer that allows Linux users to run Cocoa programs. There's already a project called Darling (http://www.darlinghq.org/introduction/) that has some of the basic functionality implemented. If this project had more contributors, then it could develop into a working solution for running my Mac programs.
My dream OS would have a Unix-like foundation (like Linux or FreeBSD) with an interface similar to Mac OS 8/9 (with various features from OS X added like Spotlight, Expose, and the Dock) and with Don Norman's UI advice (http://www.jnd.org/dn.mss/apples_products_are.html) taken seriously.
I'm actually interested in contributing to such efforts toward an alternative OS for disgruntled Mac users; I have experience with systems and kernel programming. If there is enough interest, maybe an alternative OS will materialize.
We all used to love Mac, we all see that Apple has lost its way and seems to be heading more and more confidently in the wrong direction, and that may be just the push the community needs to actually build something.
And what an amazing achievement it would be--an actually open alternative that runs on lots of (powerful) hardware with the beauty of Mac OS (before Lion lol).
I really think it's going to happen because I think a very large percentage of the community now realizes that there is no existing private company heading in the right direction. So we now have to take the steering of the ship into our own hands.
A fully open community "Mac" operating system that runs on any x86-64 hardware would be an excellent thing. I believe the best way of getting there is contributing to the GNUstep and Darling projects so that the underpinnings are fully functioning, as well as working on a Snow Leopard-esque interface.
I have some free time over the next week or so; I'm going to start developing a plan for making this idea a reality!
Who's going to fund the upfront development? "We're going to make a new desktop OS that takes on both Apple and Microsoft" sounds like a hard sell for the VC crowd.
I wonder, though, if there is enough interest in the FOSS community to make such an idea a reality, where the project could be started by volunteers and funded by donations?
So if you got the team that wrote the Apple Aqua UI on board to write it, and have great PR, then maybe. If you get the KDE or Gnome team, no way.
- Design isn't appreciated or valued. A successful OS effort can't be driven only by engineers.
- An obsession with choice as an end to itself, and being unable to say, "This is what we think the best user experience will be. You can't change it."
- Fragmentation with too many APIs, GUI toolkits, and so on.
- Participants focusing on users like themselves (tinkerers and geeks) rather than typical end-users who want to use the device to get their real work done, rather than messing with the system.
Until you have a plan to address all these, throwing more money at the open-source community won't produce a macOS-quality OS.
- Supports the most powerful and popular hardware.
- Yes, focused first and foremost on tinkerers and geeks, who are the main people losing out in the current environment. That's a group that's increasing in size and will likely continue to for the foreseeable future.
How's that plan sound?
Because tinkerers want choice, such as with multiple window managers, sound subsystems or what have you. If you have N window managers, now you need to build and maintain all of them at a high bar, which is N times as much effort as one.
Plus the combinations: window manager X doesn't work with sound subsystem Y.
An awesome Linux would have just one supported window manager, filesystem, sound subsystem, and all the rest.
You will also need one dictator who understands eng, UX, product design, sales, marketing, and so on. The dictator listens to everyone, and people can present information and perspectives and debate as much as they want, but at the end of the day, it's the dictator's decision.
If you do all this, yes, you can succeed, in theory.
I would love to hear your thoughts on the Cinnamon DE. Genuinely curious.
Caveat: I haven't used a Linux GUI in ~3 years, so my opinions might have changed a lot since then.
MacOS has its problems, but it's still many times better than the POS Windows OS. The only other option is Linux/Ubuntu, which is nice; it wouldn't be my first choice, but it's definitely a second choice.
You can't cover a shitty OS with a shiny dress and fool most people who have been burned by it. They can copy all they want; at the end of the day it still has Windows installed...
Looking at the poor experience of other commenters in this thread, it looks like Microsoft hasn't realized that is how Apple has managed to do it, and the Surface devices are really "just another PC" and not first-class Windows hardware.
I'm not against change, but change for the sake of change is annoying.
The idea well must be running pretty dry at Cupertino
What they have done now with the MacBook Pro is exactly the same thing. They are trying to establish USB-C as the one port to rule them all, just like they successfully did with USB-A.
Yes change can be hard and some people like yourself clearly struggle more than others. But change is needed sometimes to push the industry forward.
There is a time to be bold and throw your weight behind a better standard, but there's also a time when doing so makes things inconvenient with little to no benefit.
That wouldn't be so bad if the MacBook were significantly more powerful, but here we have a machine with the same max RAM as the MacBook they sold 4 years ago, while PC laptops are shipping with 4 times as much.
It just seems like Apple is focusing on making a sleek and visually pleasing device, rather than a device that will be most useful to those who want to work on MacOS.
The new MacBook Pro is an update to an existing product, with a large base of current users who already have expectations about what the product should do for them, at a relatively high price.
That's a big difference, and it's the main reason people are so peeved about the total switch to USB-C. Apple should have added USB-C ports to the existing slate of MacBook Pro ports for at least a generation, to help everyone bridge the gap. Nobody cares about the 2 or 3 mm that Apple shaved off the thickness by discarding all those ports for this generation of the product, and anybody who does can go buy the little MacBook instead.
My 2011 MBP has 16GB.
I used my first OSX machine ever earlier this year. I wanted to add some extra keyboard commands, so I had to download a 3rd party program that had to unlock accessibility controls and essentially take full control of the machine. That's absurd to me.
I also hit tons of external display problems. Things wouldn't connect or sync right, desktops not correctly moving to the right screens, no window snapping even though it's a high res screen, etc.
I hit way more beach balls/lags than I do with my modern Windows machines.
Windows 7 was solid, Windows 8 was a bit of a mess (as has been every 1st iteration of Windows [98, 2000, ME, XP, Vista, 7, 8, 10]), but Win 10 is really enjoyable.
It's unfortunate that the BSODs caused largely by shitty drivers in Windows 98/XP have hounded the Windows ecosystem for over a decade, even though the driver verifier has fixed the vast majority of them.
Also note that accessibility must be “unlocked” because it fundamentally adds a security risk: processes that can inspect inputs in arbitrary unknown applications have a lot of power. What macOS does is actually a feature and it prevents one of the “tire fires” in Windows.
Beach balls in my experience are usually produced by problems in drivers, or programs that are unnecessarily complex (like the ones that are 400 MB installs with entire virtualization layers instead of being all native code). The OS can only do so much.
External displays: yes, definitely buggy on Macs these days, and with some obscure settings. Something to try: with at least 3 desktops/Spaces defined (you need to use "+" in Mission Control), you can right-click an application's Dock icon to specify the default Space for that app's windows. These menu commands are not available with fewer than 3 Spaces created.
Well, discoverability certainly has been a problem with Apple software lately.
Take Force Touch as an example. Perhaps the menu items will show up with fewer than three Spaces created if you just press it harder?
Even after that, things never went smoothly. There was some sort of bug where I couldn't enter my Apple ID security answers in the OS dialog box. Like you, I had problems with external displays, especially with waking up a display when the laptop itself woke up. The very first time I tried to restore from a Time Machine backup I got errors. Problems with iTunes and Photos sync. Not to mention more or less subjective issues I had.
I had more problems with this laptop than I've had with any Windows machine.
Because OSX comes with a terminal environment that is almost identical to a GNU/Linux one.
I bought a non-apple keyboard and still can't map home+end keys in all applications without a daemon running.
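For what it's worth, Cocoa applications do honor a per-user key-bindings file, so a daemon-free remap is at least partially possible. A sketch, assuming a stock macOS install (the escape codes are Cocoa's constants for Home and End; this only affects Cocoa text views, which is exactly why it never covers "all applications"):

```shell
# Sketch: remap Home/End to beginning/end of line for Cocoa apps on macOS.
# \UF729 / \UF72B are the Home / End key constants; the selectors are
# standard Cocoa text-system actions. Non-Cocoa apps will ignore this file.
mkdir -p ~/Library/KeyBindings
cat > ~/Library/KeyBindings/DefaultKeyBinding.dict <<'EOF'
{
    "\UF729"  = moveToBeginningOfLine:;
    "\UF72B"  = moveToEndOfLine:;
    "$\UF729" = moveToBeginningOfLineAndModifySelection:;  /* Shift-Home */
    "$\UF72B" = moveToEndOfLineAndModifySelection:;        /* Shift-End  */
}
EOF
```

Changes take effect the next time an application launches; anything built on a non-Cocoa toolkit (a fair number of cross-platform apps) still needs a daemon, which supports the parent's complaint.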
Despite disliking GNOME 3 when it came out, I got used to it and I honestly think it's the best desktop out there.
Also haven't had sleep or hibernation issues in years. Though I've used Thinkpads and Dell XPSs which are probably the best supported laptops for Linux.
Meanwhile, they did nothing to end the long wait for desktop users yearning for an upgrade. I was expecting at least a cursory spec bump for their Mac Mini and iMac lines. I was hoping to replace my 6-year-old Mac Mini sometime soon, but I don't want to do so with a product that's already 1-2 years old.
Still, Apple will not change a thing unless people actually vote with their wallets. If you're really so upset, do NOT buy this damn computer!
If this latest line sees very few sales, they'll get the idea.
That's probably because a lot of the people who are happy no longer bother to read or comment on these types of stories. There's just too much negativity and it's the same for Apple, Microsoft, Google, etc. People with positive opinions are basically no longer welcome to participate in the discussion.
I'd gladly buy any other laptop that was simple and just worked out of the box the way a Mac does. But there aren't any.
You are badly misrepresenting the current state of mainstream Linux distributions. I've done probably 20 Linux installs, and by and large it just works. If there is some necessary proprietary driver, it's almost always as simple as Menu --> Administration --> Driver Manager and then clicking once or twice.
Kernel patches? I wouldn't know how the hell to do that, but somehow I've been using Linux happily for a decade.
For laptops, the clickpads never work well. You have to mess with synaptics settings a lot, and eventually you end up with a config slightly worse than the Windows default and a lot worse than an OSX clickpad. This is coming from someone who researched hardware compatibility and bought a laptop that didn't seem to have any problems.
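For the record, the knobs being fought with here live in the synaptics X driver. A sketch of an override file, assuming that driver is in use; the option names are real synaptics options, but the values are starting-point guesses that have to be tuned per device:

```
# /etc/X11/xorg.conf.d/70-clickpad-overrides.conf (sketch)
Section "InputClass"
    Identifier "clickpad overrides"
    MatchIsTouchpad "on"
    Driver "synaptics"
    Option "PalmDetect" "1"           # ignore accidental palm contact
    Option "TapButton2" "3"           # two-finger tap acts as middle click
    Option "HorizTwoFingerScroll" "1" # enable horizontal two-finger scroll
    Option "VertScrollDelta" "-111"   # negative values give natural scrolling
EndSection
```

Values can be tried live with `synclient` before committing them to the file; even so, this tends to end up exactly where the parent says: close to, but not matching, the macOS feel.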
For Desktops (and laptops * 10), you have a lot of issues if you want to
A) Use CUDA in general for neural networks.
B) Resize a VM's encrypted hard drive after creation.
C) Dual boot with Windows (things like updating Windows, or reinstalling a Linux distro after digging yourself into a hole with the CUDA drivers mentioned above, can wipe GRUB in a way that leaves you unable to boot).
D) Allow hibernation in a dual monitor setup with proprietary drivers.
E) Use a tablet for drawing on a system with multi-monitor setup and proprietary drivers.
F) Mess with Compiz settings too much when you have proprietary drivers.
G) Dual boot with Linux full-disk encryption and non-default partitions on one hard drive and a regular Windows install on another.
H) Dual boot from one disk and encrypt the Linux partition with LUKS while forgoing a swap partition.
Having said all this:
I would still use Ubuntu/Linux because almost all non-.NET/Java tools are easier to use on Linux. It lets you customize your system and code without VM overhead and inconveniences.
The system doesn't get slower over time and nothing unexpected randomly happens (except after updates to graphics drivers).
I'll add one to your list -- dealing with Linux audio. I have a stable music production setup now, but it took me a long time to iron everything out.
I recently had to binary patch a video card driver -- using a patch I found on a forum -- just to get it to start X, after I had specifically bought the hardware (Foxconn, AMD - major brands) because they specifically advertised that they fully support Linux and X, which I had cross-verified with the Linux and X documentation.
Why? The video card was a Radeon model "1234" (I forget the real exact number), but the hardware was a newer minor revision that identified itself as a "1234-A" to the OS in its ID string. Linux (Ubuntu) then simply refused to load the "1234" driver, because "1234-A" was not in the list of approved card ID strings in the driver.
Absolutely nothing on AMD's site, nothing on Foxconn's site, nothing on Ubuntu's sites. Contact tech support? Ha.
Eventually -- after days of googling for a solution -- I found some obscure forum where others were discussing the exact same problem. Someone provided a binary patch ("just load up the video driver in a hex editor and change the following 20 bytes to these other opaque values".) It worked.
That is why I don't want a Linux desktop.
You don't have to make up stuff to make a point; yes, Linux is probably not as polished as MacOS (though I doubt that), but it's much more user friendly now than you make it out to be.
No, I should not have to edit any text files to get the GUI to come up.
What text files are you referring to? I've been using linux for 10+ years and have never had to do this.
I've been holding out because until now all the apple laptops on offer have required huge compromises for me. The macbook is too gutless (especially its GPU, which apparently can't drive big external displays smoothly). The recent airs still don't have retina displays and the old macbook pros are so big and heavy. After owning an 11" air I can't imagine travelling with the previous generation 15" mbp.
The only downside is the exorbitant price. But to replace a machine I've used just about every day for 4 years I can easily justify the cost. And I hope (and expect) the price to drop over the next few years. The air, macbook and retina mbp all started out pretty expensive and then dropped steadily over the lifetime of the design.
Many of the people complaining were never going to buy a mac anyway.
To achieve that, they started by targeting only specific hardware. Android supports various hardware and so does Windows. Do iOS and macOS do the same? No. Because it would compromise the experience, which is tied directly to the value of their brand, which is ultimately what allows them to price their products the way they do.
Apple focused on creating products people want to buy. I am not from a wealthy country, and I have seen people who have put basic needs aside to purchase an iPhone. For a much lower price you could purchase an Android phone, but they did not care. This is the power of a consistent, pleasant experience, something that Microsoft and Google seek to now obtain through the Surface and Pixel respectively. Let's see what happens.
It's so funny how Apple doesn't release a dual-mode touchscreen laptop because they are afraid their iPad sales will tank. So cowardly - unlike Jobs' Apple!
Microsoft got artists with the Surface Pro tablets and pressure sensitive pens, etc.
Was this a coincidence? I'm guessing not but no idea really.
Windows 10 includes binary compatibility: bit-for-bit, checksum-for-checksum identical Ubuntu ELF binaries run directly in Windows, covering all of the Ubuntu user space.
There's a full copy of Ubuntu built in. Windows not working as a Unix-like OS for developers isn't reality anymore.
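A quick illustration of what "all of Ubuntu user space" means in practice: from cmd.exe or PowerShell on a Windows 10 machine with the subsystem enabled, `bash -c` hands a command line straight to the Ubuntu environment (the echoed string below is just an example):

```shell
# Invoke the Ubuntu user space from Windows; the same line is ordinary
# bash, so it behaves identically inside any real Ubuntu shell.
bash -c 'echo "hello from the Ubuntu user space"'
```

Inside that shell, `uname -s` reports Linux even though no Linux kernel is actually running, which is where the sibling comment's caveats come in.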
I'm less than thrilled with Windows right now because just the other day the built-in Mail app started crashing within a few seconds of opening; it's now unable to connect to my Exchange account. After spending 10+ hours trying to diagnose and fix the problem, I've accepted that nothing short of a complete wipe of the machine will fix the problem.
You can pick individual bugs to complain about all day. Every OS has plenty. Sorry to hear about your issues with the mail app though.
If you log into a new user profile, does Mail work? If so, you don't need a machine wipe.
Windows 10 tries. It doesn't run the actual Linux kernel, and it leaves quite a few portions of it unimplemented. There is no boot process, so system services can't run the way they do on a real Linux install. The VFS is buggy and slow. The install sometimes hangs and borks, without a clear way out. It still can't run Ubuntu 16.04. And the biggest disparity: no GUI support.
Are you "Kirkland"? You used the exact same phrasing.
It is early days, but things keep getting better. The 'I need Unix' crowd will soon find that Windows is a much better Unix than macOS ever was.
Cutler must be crying right now.
Now if only Windows Phone wasn't being completely ignored (who knows, maybe it could have been a better Android than Android)...
Treat your upvotes as a currency that you can spend on controversial opinions and phrasing. Accusing down-voters of being 'angry people who don't engage in discussion' is certainly immature and hypocritical of you. Generally speaking, there are dozens of great reasons to down-vote someone and not type a reply.
(irony of using "M$" when you're talking about them not making money is great, btw)
What you're shooting for here is a pun. You want something that looks like a regular character, so @pple is kind of cute, but there are two obvious problems. One, it kind of looks like a direct message to pple, which falls apart. Two, &pple isn't bad, but & doesn't have the association with money, so you don't get the pun.
There are a bunch of currency symbols you can use; there's a list here. Check it out ₩₦€v€t$, then all of your currency puns will be money.
I'm sorry, are you from the past?
I was asking people if they were programmers, and the median reaction was "a programmer? oh lord no, I'm a [marketing/UI/non-technical] person".
It seems the term was co-opted and gutted by some in the startup scene.
(I'm aware there are legitimate hackers in the startup scene, this was just one particular event in one particular city.)
Many talented tech folks move to first-tier Canadian cities, or to first tier U.S. cities, which are zeroth tier by Canadian standards, if they aren't tied down.
They are obviously confusing a hack with a hacker.
Nothing wrong (or right) with any of that. I'm just observing.
I would be quite amused if young "Microsofties" suddenly decided to co-opt "Micro$haft" but with a Richard Roundtree vibe.
Now you've got Google backtracking on privacy and siphoning up user data to create ads that know you a little too well, Amazon as a more threatening Walmart of the digital age, and Facebook trying to make sure that the only way you can experience the online world is through the censored lens of Facebook itself.
It makes all of the once-held fear of Microsoft dominance seem quaint.
At least Apple is still building pretty gear and killing useful ports, just like back in the day.