Why I left Mac for Windows: Apple has given up (char.gd)
695 points by shlema 286 days ago | 759 comments



I used to be a hardcore Windows guy...

Then 10 years ago I got a Mac. I never went back.

But what am I saving money for right now? To build a nice PC again.

Mostly because of the exact reasons in the article.

I have a fondness for Apple... but they have definitely lost their way. First, they were a computer company driven by a man who loved computers ("first" here being the Jobs-return era)... then they became a computer company who also made a phone. Then they became a computer company who also made a phone and a tablet. Then they became a phone company who also made computers and tablets.

Now they are a phone company presiding over the death throes of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "let's put this phone feature on the desktop."

It makes me sad, as a Mac fan. The hardware is getting worse. The decisions are getting dumber every time. I won't buy a laptop without a MagSafe or similar connection; I have kids and animals, and MagSafe has saved a laptop more than once. To remove something that was as core and identifiable a part of their computers was just a stupid move and served no purpose.

They don't listen to the industry or the consumers anymore, they stick their fingers in their ears and pretend to know best.

Jobs was hardheaded, but reasonable. Cook is trying to emulate the hardheadedness but fails to recognize the reasonableness needed to balance it.


I too have become annoyed recently at really basic stuff not working in macOS, like not being able to drag and drop a picture out of the Photos app into Pixelmator to edit. In fact, dragging and dropping a picture out of Photos into any application is completely broken. It's as if the Apple QA team no longer tests that their applications play nice with each other, in a way that has been fundamental to desktop computing since the desktop OS was invented.


At the same time, Windows 10 on my Dell laptop will heat up, spin the fan noticeably, and something called "Service Host: Local System" will consistently use 48-52% of CPU while the laptop is sitting doing nothing at all, with no applications open. I really can't understand what it's doing, and I have a hard time taking it seriously as an operating system.


Check the resource monitor to see which service that corresponds to: it's likely Windows Update since they changed it to mine Bitcoin or something about a decade ago when it went from being I/O to single-core CPU bound.


This is just a guess -- being very late to the conversation -- but in case someone even later is curious, this may be the well-documented case of the .NET Framework recompiling libraries in the background after an update.

It seems like it will never finish, but it does ... eventually. However, the lack of any UI, any indication of its presence or its progress, and also that killing it will just make it come back later are all pretty hostile.

(Google for more info.)
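For anyone who hits this: the .NET native-image compiler can reportedly be run to completion in the foreground instead of waiting for idle time. A sketch, assuming .NET Framework 4.x on 64-bit Windows (adjust the version directory to match your install):

```shell
:: Run from an elevated command prompt.
:: Path assumes .NET Framework 4.x on 64-bit Windows.
cd C:\Windows\Microsoft.NET\Framework64\v4.0.30319
ngen.exe executeQueuedItems
```

It still takes a while, but at least it finishes while you watch instead of haunting your idle CPU.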


I'm not sure what it is but it's amazingly inefficient: I notice it most when downloading VMs from https://modern.ie because as soon as I start a test VM the updater will run and then it's multiple hours + reboots before a browser in the VM is remotely usable.


Is this a serious statement, that Windows Update mines Bitcoin?


No - just my boggled speculation about the kind of work that would tie up a modern CPU for hours installing a few GB of files. (SHA-256 could hash each file in a couple of seconds.)
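If anyone wants to sanity-check the hashing-speed aside, it's easy to measure on a Unix-ish box (numbers are machine-dependent; this just pipes zeros through sha256sum):

```shell
# Time SHA-256 over 256 MiB of zeros; a modern CPU gets through
# this in about a second, which is the point of the aside above.
time dd if=/dev/zero bs=1M count=256 2>/dev/null | sha256sum
```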


An MS engineer posted a lengthy blog post on why this is the case. The ridiculous time comes down to insane system-state and dependency checking for every update, and for each one in history before it, since Windows Update lets you add or remove every little update ever issued for the OS.

The rollup updates that MS moved towards, even in Windows 7 now, are supposed to help with this. However, a completely new solution is really needed to replace this antiquated one, like many other lingering parts of Windows.


Decompression. They probably weighed compression time vs file size and realized that every byte not downloaded is money saved on bandwidth.


No.


Come to think of it, that's actually an interesting monetization scheme. "You can use our software for free, but when your computer idles, we use x% of its computing power for mining cryptocurrency" (or whatever profitable endeavor). It's of course disastrous from an environmental perspective, but still interesting.


And if it doesn't idle enough, we'll make it.


This is very likely it from my experience as well.



It happens on OS X. Kernel freezes as well.


Svchost.exe is a bonkers bit of design: it runs multiple "services" as DLLs loaded into a single process, which goes against the simple, basic separation of concerns that OS-level processes provide. This makes it extremely hard to pinpoint which service is faulty.

Apparently the magic runes are

    sc config wuauserv type= own
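For context, these are standard Windows service commands (run from an elevated prompt; the service name here is Windows Update): `type= own` moves the service out of the shared svchost into its own process, so its CPU usage shows up under a separate PID in Task Manager. A sketch:

```shell
:: Give Windows Update its own svchost.exe process, then restart it
:: so the change takes effect (elevated prompt required).
sc config wuauserv type= own
net stop wuauserv
net start wuauserv

:: List which services are hosted inside each svchost.exe instance.
tasklist /svc /fi "imagename eq svchost.exe"

:: Revert to the shared-process default:
:: sc config wuauserv type= share
```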


Just go into Task Manager, Details tab, right click on the columns and add "Command Line". Now you can see what each svchost does.


That helps a little, but fundamentally the runtime of a module is still "billed" to the svchost.exe process as a whole - so if a particular process is at 100% of a CPU, you can't tell which module is causing the problem.


This kind of bug is not unique to Windows.


Seriously, as I am typing this right now on macOS, "kernel_task" is using 50% CPU for no apparent reason and the fans have been running for > 20 minutes.


> I really can't understand what it's doing

Probably backing up your files to government servers :-)


I used and loved iPhoto, and then I had a great first impression of Photos, until I tried to do the thing you describe: DRAG AND DROP. Apple forums provided an answer (I would call it a workaround) with their usual condescending tone: hold Alt while dragging, and you can at least get a copy of the photo/video of interest into a folder in the Finder.


I just tried it in Photos 2.0 (macOS Sierra 10.12.3) and it works as expected - drag-and-drop from Photos to the Finder copies files into the selected Finder folder.


I recently discovered the detailed and amazingly accessible Aqua Human Interface Guidelines from their golden years have been completely gutted and replaced with the macOS guidelines that cover only the basics. They've abandoned years of accumulated wisdom. Luckily the old versions still float around in PDF form, the first few chapters are just good timeless UI design advice.


http://interface.free.fr/Archives/AquaHIGuidelines.pdf In case anyone else is interested.


Photos overall is a trainwreck. Options are hidden and hard to find, anything beyond "tag this image with your auntie's name" is way harder than it should be... and this without ever mentioning the dreadful "I'll quit when I want to, not when you tell me to" backend process.


I've noticed this as well. I use a "workaround" in the form of Yoink: https://itunes.apple.com/us/app/yoink/id457622435

Yoink seems to trigger a file export in Photos. Why that doesn't happen with other programs is anyone's guess.

I put "workaround" in quotation marks because this is really how I do all dragging and dropping between programs in macOS, since I tend to run stuff in full screen.


Is it even possible to choose a photo out of your photo library from a browser's file upload dialog? I don't think it is, when the photos are in iPhoto. How many steps does it take to get a picture off your camera, into iPhoto, then exported, and finally uploaded to Craigslist?


Scroll down to the 'Media' section in the sidebar.

Select 'Pictures'

There you'll find all your pictures in Photos


Photos should show up as an icon in the sidebar? Rather than navigating to the actual files, I think you're meant to access them this way. It's a good idea, and as I remember it used to work better - these days it can be a bit finicky to find the photo you want.


It is possible and very easy. Just drag the photo from iPhoto to the upload dialog.


You can't copy and paste an image out of Photos, either. It's really quite ridiculous.


1. Open Photos.

2. Select Photo.

3. Edit Menu -> Copy.

4. Open TextEdit.

5. Edit Menu -> Paste.

It is ridiculous. Ridiculously simple.


After some experimentation, it turns out my problem was that I expected - ridiculous, I know - that it would copy image data in some form that's comprehensible to Gimp, Firefox and Thunderbird. So I now rather suspect that Photos is using some new API, that Gimp/Firefox/Thunderbird aren't, and that this new API is, for no particular reason that I can see, not remotely backwards-compatible.

Feels like this forms a data point for the thread - since meanwhile, on Windows, I have a couple of programs in my bin folder that I use fairly regularly that I compiled in 2006.


Yes you can - I've just done it.


Instead of being condescending you could have explained how to do it.


Sorry, it really was a question of selecting a picture and the selecting copy from the menu. I wasn't trying to be an arse.


I know this isn't the point of the comment, but Yoink works perfectly as an intermediary for this. I use it dozens of times a day for various things but this one in particular is a lifesaver.


> really basic stuff not working

Try selecting text in a Windows dialog box, such as the "About" box. It doesn't work, it never did as far as I can tell.


But isn't it sad if MacOS has to be compared to Windows to relativize its flaws? People were using it because they felt it was BETTER than Windows, not because "both are meeh".

Also, arguably, selecting text in a dialog box (instead of copying the whole text) is a much less-used feature than drag & drop of photos.


Ctrl-c works though.


Is dragging a photo to the Desktop and then into the app really a huge deal? Just curious, because that works fine for me.


It's not a big deal, but it's a death by a thousand cuts. These little things that used to be part of the refined experience of OSX are turning the "just works" part of their slogan into "damnit, why can't I"


That's called a workaround. It's not a big deal in the grand scheme of things, but it's not something you'd call a desirable design trait for a desktop app.


And then deleting or moving the photo from the desktop afterwards. So, three steps as opposed to one. For a feature (drag n drop) that's all about convenience.


Ah, that Apple mantra. It's not hard because it's hard, it's hard because you don't do it OUR WAY. Do it OUR WAY and everything is easy.


Buy a breakable USB cable? I've had MacBooks for 10 years, and I'm ecstatic to see MagSafe gone. Apple chargers have always been shit, and Apple laptops have never been compatible with off the shelf battery packs. Moving to USB-C and detachable, replaceable cables, is a huge step forward.

Also, what PC alternative has a MagSafe equivalent?


People complain that Apple uses proprietary ports. They ditched MagSafe for USB-C and people still complained.


I think MagSafe was at the bottom of the list of "bad choices of proprietary ports by Apple".


I can now charge my MacBook with the same cable I use to charge my phone. I can now charge my laptop from both sides. I can trickle-charge my laptop in my car now. It's making my cable life so much simpler. Yes, I have other devices that have yet to catch up to the C spec, but with the leading laptop manufacturer pushing so hard for the C port, I don't think it'll be that long.


MagSafe is the only proprietary port on my pre-touchbar MBP.


Good point, and it's because of the status quo bias. When an alternative is proposed, people look at what they lose, and since we don't like loss, we say, "No, I don't want that".

Thought experiment: say Macs had USB-C for 10 years, and the latest Macbooks switched to MagSafe. There'd be a lot of complaints about Apple locking us in to their walled garden, and how we have to buy a proprietary overpriced charger now, which can't charge other devices, moreover.

One technique I try to follow is an anti-knee-jerk reaction: if everyone has a knee-jerk reaction about something, I remind myself of the advantages of the alternative.

Ultimately, the only true test is time. Wait 2 years and let the emotions cool, and you'll know if the outcry was justified.


> Thought experiment: say Macs had USB-C for 10 years, and the latest Macbooks switched to MagSafe.

Then you would presumably still be able to use the remaining USB-C ports to charge it, with the MagSafe merely an option. That would not get many complaints. If they violated that they would deserve the complaints.


No, in this thought experiment, when they have MagSafe, they charge only via MagSafe.

The thought experiment is intended to be the same as reality, just in the opposite direction.


That's what my last sentence is for. If they go pure-magsafe and get rid of all USB-C charging ability then they deserve the complaints.

But if they add magsafe while also having USB-C charging then it's much better than either option on its own.

And keeping USB-C charging does not require extra ports, because you need the USB-C ports anyway to do USB stuff.


The question isn't whether they deserve the complaints, or about the minutiae of charging, but whether people think that going from X -> Y is bad and going from Y -> X is also bad. If so, they're just being change-averse: ignore them.


In general it's a good thing to think about.

But you were using it to make an argument that the complaints were invalid.

So I'm going to reply to that specific argument, by pointing out that when Apple changes the IO ports to USB-C and enables USB charging, it does not require them to remove the magsafe port.

In other words, apple went X -> Y. But adding Y reused IO ports and could have coexisted with X. Even though Y might be better than X, it was a false dichotomy in the first place.


There's no need for MagSafe if you have USB charging. It's redundant.

It also entrenches the old standard rather than making way for the new one. If you have a laptop with both USB-C and MagSafe, and you need a second charger, you might buy a MagSafe charger if it's cheaper or to use with an older Macbook. Whereas without MagSafe, you'll buy a USB-C charger — the new technology.

If you want to do a transition, you have to go all in. Having both the old and the new port merely delays the transition, causing more pain in the long-term. Get it over with.


That would be a valid argument, except that it implies one port fewer, since you have to use one of the already scarce ports to charge. People like me were waiting for an extra port and got one fewer.


Apple could keep usb-c and MagSafe, but they didn't.


Microsoft's own Surface Book uses something very like MagSafe:

https://www.youtube.com/watch?v=5bnIIuS0New

http://www.cultofmac.com/174993/microsoft-steals-apples-mags...

I've also been using MacBooks since 2006, and I love MagSafe - or at least, MagSafe 1 (haven't tried the more recent MagSafe 2).


Try keeping MagSafe 2 connected in any circumstances other than flat and stable. Sitting on the couch or lying in bed? No way - it constantly disconnects.

Good riddance.


I never had a MagSafe 2 laptop. I love MagSafe 1. I have no idea why they went through with changing it.


Probably something to do with shaving a billionth of a millimetre off the thickness of the MacBooks.


Probably, but they could have changed the adapter so the cable left the connector sideways, like in MagSafe 1.


So that you have to buy new chargers.


Apple owns the patent on MagSafe. That was the one feature I actually liked on my MacBook Pro. I have seen too many chargers wear out, or wear out their ports. MagSafe also made it easier to connect the charger.

Think how nice it would be to have the same functionality with USB ports....


Currently at work we use (mostly T 4xx/5xx series) Lenovo laptops with rectangular power connectors [1]. I've had mine for ~2 years, and it was used for ~a year prior. I connect/disconnect the charger multiple times a day and both ports are as sturdy as new. Others report the same.

[1]: http://shop.lenovo.com/ISS_Static/options/US/images/17912239...


FWIW, some appliance connectors have used magnets for well over a decade in Japan and elsewhere in Asia (it usually looks like https://www.liandung.com.tw/Japan-m/LT-515.html). I owned an electric kettle with one of those.

I don't understand how Apple could patent this and prevent anyone else from implementing it. In any event, there do seem to be 3rd party implementations for USB-C if you're keen on the feature: https://www.macrumors.com/2016/01/04/griffin-breaksafe-magne....


Apple patented MagSafe and how it works, specifically. This doesn't preclude other magnetic breakaway cables if they don't work exactly the same way as MagSafe.


Can you recommend magsafe-like USB cable that works with 2016 MBPs? When I looked for something like this I found only a cable made for the 12" macbook that could not charge my machine.


I pledged on Kickstarter to receive a pair of these (still in manufacturing), but maybe they'd help?

https://www.kickstarter.com/projects/branchusb/magneo-first-...


I don't mind the lack of MagSafe connector on the new MBP. What I do miss however is the green/orange light to know when it's fully charged.

That's something they could have kept (somehow) in the shipped USB-C charging cable.


The Surface Pro 4 line has a MagSafe equivalent (while avoiding the Apple patents) which also allows for distributing ports à la USB-C.


I was worried about the loss of MagSafe too. Now, with almost three months with the new MacBook Pro, I have become very happy with USB-C, and I hope that iPhones and iPads will make the switch too. A USB-C-only world would make so many things easier!


Switching the iPhone to USB-C would definitely be the best choice for the consumer.

It is, however, not the best choice for Apple.

I think it should be pretty obvious what I believe will happen.


If they included a breakable USB cable, most people would have been happy. For some reason they didn't.


If they actually reinforced their cords so they didn't fray and fall apart within 6 months, I would be happy. Their stubbornness about correcting the cord situation is exasperating.


Proper strain relief goes against the 'make everything thin' ethos.


>Also, what PC alternative has a MagSafe equivalent?

The Surface Pro uses a MagSafe style charger.


My Surface Pro 4 has a MagSafe equivalent power supply. Works quite well, albeit not as well as the mac version.


Apple is still a computer company, because phones ARE computers.

For the majority of the population in developed countries, and almost all of the population of developing countries, smartphones are the only computer people have or need.

I agree that Apple have shifted focus away from "making tools for people to create things and solve problems" towards consumption-oriented mobile devices. But those devices are still computers, and they're wildly successful.

If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone as into the entire Mac lineup.


Sure, phones are computers. Also, most people use phones as their primary way of connecting to the Internet.

But what does that have to do with me (or, it seems, the GP poster)?

I want a great computer on which I can do what I usually do on my computer which is mostly programming, but I guess I could be fancy and say "content creation" instead.

The fact that phones are computers has exactly zero relevance when it comes to me choosing a new laptop.

I used to be a hardcore Mac user, because the computing environment was superior to any other choice, but that's not the case anymore so my next computer will be a Linux laptop.


> If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone as into the entire Mac lineup.

True. However, creators and developers (who, in my experience, almost always use a desktop) are important for the iOS platform. Someone has to write those native apps. Therefore it doesn't make sense to ignore them for too long.


Developers flock to the platform with more users. Which means Apple actually does need to focus on iOS to get more people to keep using Macs.

Those of us who are ios developers will keep bitching about it but have no alternative so will keep on using macs as long as their iOS ecosystem is doing great.


While this is generally true, it's a little more nuanced. For example: Android has a larger market share (by a lot, globally), but apps are frequently developed for iOS first. Why? Because it's easier. Device fragmentation plays a large role in that, but part of the cost of a project is dealing with the development ecosystem (good or bad). As Apple makes development less appealing (because of substandard hardware / OSX), Android will seem like a better option.


Developers don't develop for iOS first because it is easier. They develop there first because iOS users pay money for things/apps. Android users do not.


I'm an Android user and I pay for things (?) and apps. Are you saying that I'm the only Android user who pays for apps and "things"? Or is your statement just a gratuitous cliche?


Statistically the average revenue per android user is lower than per iOS user. It's not a binary either/or.

Source: have been a mobile developer for 6 years.


Is the average revenue per user a relevant number though? When it comes to choosing what platform to develop for, wouldn't total revenue be a more important number?


The difference between total revenue and average revenue per user is important, but I think the answer is still iOS.

http://bgr.com/2016/07/20/ios-vs-android-developers-profits-...


It's well understood by many startups and established companies that developing for iOS often takes priority, because there's a larger group of users willing to pay more money than their Android counterparts.


My point is not whether Android or iOS is better. It's that the key to keeping iOS developers around is NOT making a better laptop but putting more effort into iOS so that its market share grows.

No matter how great Xcode becomes, if everyone starts using Android, people will all jump ship to Android, which means many iOS developers will switch to Windows.


Sadly that's more a race to the bottom since Google still don't seem particularly interested in actually making the experience for Android developers any better.


iOS apps are also more profitable than Android apps.


No one makes money selling phone apps.


"Apple's App Store has paid over $50 billion to developers"[0]

[0]http://www.theverge.com/2016/8/3/12371006/app-store-50-billi...


You don't have to sell an app to make money from it


I live in developing countries. It's true that smartphones are the only computer people have or need, which is a problem.

Mobile devices are still consumption devices. It's hard to create complex, multi-layer structures on phones: the screen is too small, the processor is too underpowered... take your pick. But creating a good SaaS or a good UX on a phone is darn tough.

Here comes speculation:

Websites/services in developing countries are painfully bad.

This is due to a number of factors outside technology, but it's also true that universities here graduate people who never grew up with big screens and unwalled gardens, never grew up creating rather than consuming, and don't have a sense of what it's like to go from blank screen to working prototype to polished platform.

Neither do most people in developed countries.

But at least computers are widely available in developed countries so that the x% of kids who have an affinity for these things end up getting started early.

Someday mobile devices will match and surpass desktops and laptops. But that day is not today, and meanwhile developing countries are years behind not only in physical infrastructure, but in online infrastructure as well.


You're correct that phones are computers, however:

Solidworks, AutoCAD, etc. don't run on phones

Photoshop/Lightroom/Illustrator don't run on phones

SPICE, VHDL, Verilog don't run on phones

InDesign and other DTP programs don't run on phones

Emacs/Vi don't run on phones

There are a host of actually-useful programs that are completely unsuitable for running on phone-computers, but that do run well on laptop-computers and desktop-computers.



Yes, but on iOS you still can't compile or run any of the programs you write.


> Solidworks, AutoCAD, etc. don't run on phones

"AutoCAD" is a brand that applies to a large number of different CAD-related programs, some of which are mobile apps.

> Photoshop/Lightroom/Illustrator don't run on phones

Both Adobe Photoshop and Adobe Photoshop Lightroom are in the Google Play Store for Android phones.

> SPICE, VHDL, Verilog don't run on phones

Well, I can find circuit design apps that support a subset of verilog, but this is basically right.

> Emacs/Vi don't run on phones

There are ports of both on the Android App Store, supporting phones.

> There are a host of actually-useful programs that are completely unsuitable for run on phone-computers, that do run well on laptop-computers and desktop-computers.

There may be applications that phones don't have the processing capacity for, or that work best with certain I/O peripherals that phones aren't often used with, sure. But the array of classes of apps that are completely unavailable for phones is smaller than you seem to think.


SPICE, VHDL and Verilog typically run on some 56-core machine in a server room, at least once you are working on a sufficiently complex design.


> For the majority of the population in developed countries, and almost all of the population of developing countries, smartphones are the only computer people have or need.

Not true. I am from India, a developing country. There are 5 working people in my family. None of us can replace our notebooks with our phones. For all of us, the phone is for communication and the notebook is for work.


> because Phones ARE computers.

This is like saying trucks are cars, too.

Different form factor for different purpose.


Yes, this, thank you. The "but it's a computer!" sophistry falls apart the moment you take one look at the user interface. The reason people are starting to get annoyed with Apple is their continued phone-ification of the desktop.

It. Doesn't. Work.


Your first line is true but only in a trivial sense. In this sense Fitbit is also a computer company. That would be a stretch, but not by far.


Wholly agree with you -- in fact I think we're at a turning point in smartphones catching up to being full fledged computers.

I've written some thoughts about this previously, if you would care for a detailed elaboration: https://guan.sg/apples-2017/


Thanks, that is a nice write-up. I especially like "Imagine a world where the way desktops are meant to function is by plugging your ultraportable into any available set of peripherals: monitor, keyboard and mouse. The interchangeability and convenience of that necessarily implies that the brand and type of monitor / keyboard / mouse you plug your ultraportable into doesn’t matter, as long as they’re technically compatible." I have written on the same topic, and that is the future I see. Microsoft had the right idea with Continuum, and I expect Apple, Microsoft, and Google to all try to make this work on the mobile side while leaving it to 3rd-party manufacturers to develop USB-C-compatible monitors that act as a hub for keyboard, mouse, USB drives, etc.


Thanks! Would love to read, if you have a link!


> Imagine a world where the way desktops are meant to function is by plugging your ultraportable into any available set of peripherals: monitor, keyboard and mouse.

I'm skeptical that this will work for everyone.

Everything is tradeoffs. When "must be tiny" is the top priority, performance and battery life necessarily suffer.

There will always be people - from gamers to video producers - whose top priority is performance. And for computer geeks generally, the weight difference between a laptop and a phone is not compelling, but having four times as many cores would be.

The portability difference between a mainframe and a laptop is immense - it changes your working life. The portability difference between a laptop and a phone is much smaller. It means you can do things spontaneously, because you can always have the phone with you. But if you're planning to work, the difference is minimal. Especially if the smaller form factor means you have to plan to have peripherals wherever you're going.

Eg, I can take my laptop to a cafe or a park and work. I couldn't do that with an "ultraportable" unless I bring my own peripherals. So it's actually less portable for the situations I care about.


Saying phones ARE computers is being completely pedantic.

Of course they are. So are microwaves at this point. So are toilets at this point.

I fail to believe that you just "missed" the real point, that the form factor and the user interface are (and need to be) quite different from a phone to a desktop.


> If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone as into the entire Mac lineup.

Consider:

- iPhones will suck if developers stop making software for them.

- You need XCode to make software for iPhones

- XCode only runs on MacOS

If developers start abandoning Mac en masse, that's going to hurt iOS a lot.

Also, if photographers, videographers, animators, designers, audio engineers, etc start championing Windows or Linux as their platform of choice, the average person will follow eventually. "All the pros I know use X" persuades a lot of average people.

And if people aren't using Mac, there's a lot less reason to use iPhone as opposed to Android.

I hate to use the word "synergy", but that seems to be what they're risking here.


No, phones are TVs, books, consoles, etc. Typical computer usage, at a guess, is an Excel file. I would wager that the majority of business usage is some kind of document editing, and a phone, by design, will never be able to do this.


> Phones ARE computers

Computers that need jail-breaking, though.


I switched to Mac at about the same time as you. I can't see myself ever going back to Windows but if Adobe ever ports their application suite to Linux, I'd switch immediately.


I'm on a similar timeline as well. Thought I'd never go back, but have you checked out Windows 10? The gap has closed a lot.


I think the gap has increased even more with Windows 10.


I can't agree here. Windows 8 was not usable as a web developer, for me. Windows 10 is something I don't mind using as my experimental box (I still use OSX for day-job tasks).


Windows 8 was worse than Windows 10, which is worse than 7. I should have specified: to me, Windows 8 and 10 are both poor, both much worse than 7. To be honest, I had an EASIER time with Windows 8 (company-provided laptop) than with Windows 10.

Windows 7 was the pinnacle windows experience for me and it has gotten worse ever since.


What about Windows 7 was so great? The Start menu? Windows 10 has a 5-second boot-up time, almost all the annoyances people complain about online can be configured away, and it is insanely stable (I haven't had it crash on me once yet).


It took a familiar UI and improved it. It added features in an intuitive way. And many of the features of windows 10 could've easily been added to 7.

> Windows 10 has a 5 second boot up time

Man, I wish. My 10 system never boots up that fast. Meanwhile my Windows 7 desktop takes 15 seconds to boot from an SSD. Those 10 seconds just aren't that much of a feature for me, especially since my laptop/desktop are typically in sleep mode anyway.

> most all annoyances anyone has online can be configured away,

Yes, because we should have to do work to eliminate baked-in ads and processes that share my information with who knows whom.

Also, you can't even intuitively FIND settings. The Control Panel has some settings the Settings app doesn't have and vice versa. It's a mess. Why can't they all be in one place? On the Mac? One place.

I can't even get Windows 10 to update. Instead I have to constantly kill a rogue update process that decimates resources because I can't get a basic update to download and install properly.


Well, it crashed on me. Not even a blue screen of death, just spontaneous reboot.


This is one of those annoyances: probably not a crash, but one of those infamous 3am forced reboots, while you're playing CSGO no less. This is one of those things you need to google and configure away.
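For reference, the usual "configure away" fix for those forced reboots is a documented Windows Update group policy, not a third-party tool. A minimal sketch (assumes an elevated Command Prompt; the policy value is the documented `NoAutoRebootWithLoggedOnUsers` setting and may be ignored on Home editions):

```shell
REM Run from an elevated Command Prompt: tell Windows Update not to
REM auto-reboot while a user is logged on (documented AU policy value).
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoRebootWithLoggedOnUsers /t REG_DWORD /d 1 /f
```

Updates still install; the machine just waits for a manual restart instead of rebooting out from under you.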


By "configured away" you surely mean installing third-party tools full of adware until one nearly does what you want.


That's quite an assumption


Thanks, I'm quite good at those.


I didn't necessarily mean that the gap has closed just due to improvements at Microsoft ...


Check out my Windows-based node dev environment:

http://i.imgur.com/nJhqlvX.jpg

I am far more productive now than I was on OSX. Switched over when the Windows Subsystem for Linux came out (you can see my ZSH shell running in the corner there)

Jelly yet?


>far more productive now than I was on OSX

Really? In what ways?

This is interesting to me, because the main reason I'm a mac zealot (for all but le screengames) is because it's mostly indistinguishable from working on a linux box, without any of the negative aspects of running linux on a workstation. Any time I try to do work on my Windows machine it feels very handicapped. WSL helps, but it's still one more layer of abstraction.

SSH keys/git always feel like a hassle on Windows, rsync was slow as fuck for me on WSL (but works fine in Cygwin), any kind of scripted behavior is a hassle. Package management outside of WSL is pretty annoying, choco is okay, but nothing compared to brew/apt.

At best I could see myself being equally productive on Windows, but not without significant effort.


I'm on a beast of a machine that I could only dream of when I was stuck inside the Apple ecosystem.

Granted, there were a few annoyances I had to find workarounds for, but nothing insurmountable; we are devs, after all. There was a lot I had to relearn the-windows-way, old habits I had to drop, new ones I had to adopt. But with experience comes expertise, and you get used to the new way. I now have that handicapped feeling when I use OSX: death by a thousand restrictions. Funny thing, I never noticed them before; I assumed that's just how things were.

You don't know how far you can push and customize an OS until you make it your main environment and force yourself to stick to it until you overcome and find your new workflow. Pussyfooting around with Windows while you use OSX on your main machine is not how you test the OS for fitness, nor how you change old habits. You have to go in it with an open mind, you're not going to find OSX in Windows, you WILL have to relearn new ways of doing things, then you have to dive into the deep end of the pool head-first and attempt to hit your stride.


I understand your point about having to give Windows an honest shot - I bought a Win10 license some months ago. It's fine.

Until I wanted Emacs.

Until I wanted to uninstall software.

Until I wanted to set up a VM in VirtualBox (set up a Win2k12 Server VM tonight, and it simply failed to boot 3 consecutive times, then booted up in a "repair" mode, then failed to boot again, only to finally boot on the 5th or 6th try. Wonderful).

Until I wanted a damned clock that keeps time (always have to manually stop/start automatic time in the options).

I really want to like Windows but I can't. It feels like it's going out of its way to annoy me. Booting up a fresh Debian install feels so much better... I just feel like I have to understand how things work much more. Linux is a sharp tool, and Windows feels like a clunky bicycle.


The only actual _windows_ problem of those you listed is that uninstalling software is a hassle. That said, it isn't a walk in the park on Mac either.

Can't use Emacs on Windows? Use any other editor. Can't set up a VM in VirtualBox? That's (maybe) a VirtualBox problem.

If you go on Windows expecting the same environment as OSX or whatever OS you're on, you're going to have a bad time. Booting up a fresh Debian install feels better because you know what to do already.

Booting up a fresh Debian install is likely to be an almost impossible undertaking for anyone else without reading some kind of guide, if you want to set up a proper dev environment.

There are problems on any platform you choose; you're probably just subconsciously sidestepping them in your process of setting up, while the Windows ones stick out to you.

Example problems I notice on Mac that are fine on Windows: Docker is extremely slow (I need to run it inside a Linux VM for any kind of proper development); window management is horrible; installing software.

Now, I'm no advocate for any OS, I love running Arch Linux with i3wm, I love OSX and I love Windows. I don't see any reason at all to hate any OS, I can setup my dev environment on practically any platform I could want with little or no difference. The only things that change are the things around my environment, the simplicity of i3wm, the task bar on OSX etc.

In my opinion Linux is the outlier here, the one that provides the greatest change in environment; not a bad one, mind you, just a difference. Mac and Windows are mostly interchangeable; I can switch between them with little overhead.


Alright, I was feeling quite whiny yesterday. My opinion of Windows is not that bad - I bought a license, which is enough said :)

I don't prefer OS X over OpenBSD or Debian. What I like is that I can mostly just hop from one to the other without doing a context-switch. Things (mostly) work as I expect them to from one box to the other. That's not true for me on Windows (but that is to be expected).

I learned about computers on Windows, from 95 to Vista (briefly touched it and then left for Unix). I used to memorize countless contextual menus and options and paths between each, so that I would see how to solve a problem when it arose and could diagnose it without access to a computer. I still do not have the same ease with Unix.

What I have gained by using Unix is real knowledge about how computers actually work, not only how the OS itself is built. And in my anecdotal experience, typical users of Windows (at work, college and among friends) consistently understand and know less about computers than typical users of Linux do. That is true of Mac users in general too, but the effect is less pronounced than with Windows; most Mac users that I know have a basic understanding of the command line.

But take this for what it is: personal experience.


> Can't use Emacs on Windows? Use any other editor.

But... to Emacs users, there is no "any other" editor :)

// That being said, it's possible to run a native Windows build of Emacs, I remember doing that at some point. It wasn't overly nice though, and you have to delve into the whole msys/mingw/cygwin thing.


Stackoverflow answer for how to run Emacs on Windows' WSL.

http://stackoverflow.com/questions/39182483/how-to-use-x-win...
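The linked recipe boils down to running the Linux build of Emacs inside WSL and displaying it through a Windows-side X server. A minimal sketch (assumes an X server such as VcXsrv or Xming is already installed and running on Windows; exact package names may vary by distro):

```shell
# Inside WSL (Bash on Ubuntu on Windows):
sudo apt-get update
sudo apt-get install -y emacs    # Linux build of Emacs, not a Windows port
export DISPLAY=:0                # point X clients at the Windows-side X server
emacs &                          # GUI Emacs frame rendered by that X server
```

Putting the `export DISPLAY=:0` line in `~/.bashrc` makes it stick across sessions.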

It's like they don't even try. After a lifetime of looking down at Windows users, most Mac users go into Windows looking for reasons to validate the way they already feel towards it (irrational hatred) and don't allow themselves to like anything about it. They hit a couple of bumps in the road and quit in frustration and use those as excuses for their decision to retreat back into their comfort zone.

Don't be that guy.

I am proud to say that I can fully configure a dev environment and work on any major OS out there. Linux (Debian and Redhat based), OSX, and Windows.

Currently, I'm glad I'm no longer bound to OSX (deprecated OS, overpriced hardware) or limited to Linux (no gaming and no Adobe suite). I'm on a constantly evolving OS run by a forward-thinking company, on hardware upgradable through the next decade. It can only get better from here. I got no worries.


> Can't use Emacs on Windows? Use any other editor.

I don't understand what you are trying to say there.


My point was that not having Emacs on Windows is a non-problem because the editor market is oversaturated. It's not like the Adobe suite, where any other product is almost a downgrade, so running a system that can't run Adobe products would be a liability.

Not being able to run Emacs as a reason not to use Windows is like saying; Damn! PulseAudio doesn't work on Windows, guess I'm back to Ubuntu.


I am not the biggest fan of Emacs, but for what it is, this is more on par with people saying "I don't move to Linux because it does not run Adobe products."

There is no alternative to Emacs, if GIMP does not count as an alternative to Photoshop.


If that's true, then not being able to run Adobe software in Linux is not a reason to not use Linux.


I run Emacs on Windows just fine. Cygwin Emacs + the Cygwin X server works pretty well. Bash on Ubuntu on Windows is better by some measures, though. You might like that one.


This window mess is how you work more productively? Maybe try something tiled?


I don't actually work with all the windows opened or even on the same desktop, they are arranged like that for the benefit of the screenshot.


Looks very awesome. What term app is that?


That's WSL+ZSH running in ConEmu (w/ Solarized Dark color theme).
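For anyone wanting to reproduce a setup like that: early WSL builds ignored the login shell set with `chsh`, so the common workaround of the era was to hand interactive sessions over to zsh from `~/.bashrc`. A sketch (assumes zsh is already installed inside WSL, e.g. via `sudo apt-get install zsh`):

```shell
# ~/.bashrc inside WSL: if this is an interactive terminal and zsh
# exists, replace bash with zsh for the rest of the session.
if [ -t 1 ] && command -v zsh >/dev/null 2>&1; then
  exec zsh
fi
```

The `exec` matters: it replaces the bash process instead of nesting a second shell, so exiting zsh closes the terminal as expected.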

I also have this theme installed on Windows 10, which is why everything is dark including the ConEmu's titlebar: http://www.cleodesktop.com/2016/08/after-dark-cc-theme-for-w...

I spend long hours staring at the screen; I need a dark theme for eyesight-preserving reasons. I had a dark theme on OSX until Apple decided to give a massive middle finger to the theming scene when they released El Capitan. The built-in dark mode just doesn't cut it. Yet another reason I'm glad I'm on Windows now.


Yes, exactly; agreed on Adobe opening their SaaS cloud to be used on Linux machines. I don't understand their hesitation, but I also don't see it happening any time soon.


>> The new "features" every cycle are more "lets put this phone feature on the desktop"

What is it missing? It's been a while since I really wanted a feature in the desktop OS. All of the 'phone' features (besides launchpad) have made my life a lot easier (continuity, handoff, notification centre, even Siri from time to time).


It's missing high-performance graphics drivers, in the form of either OpenGL or Metal.

It's missing proper support for eGPU cards, which is currently the only way to add a high performance GPU to any currently produced Mac.

Blender, for instance, can't even make use of the GPU in my MacBook Pro because the OpenCL support in macOS is not good enough.


Mac laptops never had that.


I don't miss features; I just miss refinement.

As an app developer who wanted to build an app that challenged what a Mac can do with regards to productivity, I had to basically leave the Mac App Store because they made it impossible for me to make it work in Sandbox mode.


Shipping Unix tools (like Bash) from this decade would be a start.

Trying to get modern Unix software to run on OSX is getting more and more painful. Since development is the main activity I do on a laptop, this is quite important.


UNIX being part of OS X is a historical accident, Apple just wanted to save themselves from Copland's failure.


It was the main reason a lot of people (myself included) moved to OSX from Linux. I honestly believe that its Unix compatibility was one of the main reasons Macs got popular with developers in the first place.


Macs were already popular with developers.

Apparently, as someone who favors the Xerox PARC way of thinking, I am not a developer, since I don't embrace the UNIX religion.


That's not what I said. I never meant to give the impression that “developers” necessarily means Unix users. A lot of them are though, definitely enough to be a noticeable fraction. The number of people in this very thread proclaiming the greatness of the Linux subsystem in Windows 10 should confirm this.

You have to admit that around 2003 or so there was a huge influx of technical users to the Mac. My proposition is that a lot of those users were developers moving from Linux (or from an unhappy Windows life, wishing they were using a Unix-based operating system).

I base this on the general sentiment at the time, and I was one of the people who made the switch. I'm sure you can find old posts from me on Slashdot talking about how great the switch was.

Now I'm on Hacker News, talking about doing the exact opposite. I'm moving away from OSX because the Unix experience has become really bad.


I agree, but in these discussions that subset of developers just gets packaged into "developers" as a whole, as if there weren't anything else.

Personally I know UNIX since Xenix days, do have a vast experience across UNIX variants, use it in some form in many projects, but rather use OS X and Windows environment.

Strange as it may seem, some of us are actually happier with the GUI developer-tooling culture of OS X, Windows, Android, and iOS.


My experience with hardware that comes pre-loaded with Windows is that it is utter crap. Or so it seems - it works fine once you remove Windows and install Linux for example. I have kept none of my previous Windows machines. None. They all decay so fast, BSODs after months of use, constant reboots just because things don't act right, general instability, sluggishness, etc. I have only recently built myself a new Linux tower. Much more pleasant.

OTOH, ALL Apple hardware that I have bought (multiple iPads, iPhones, iPods, Macbook Pros, iMacs) still works as advertised and is in good shape (I always buy and use Apple Care). I have just recently begun to use my Linux desktop more because my 2009 MBP is starting to feel a bit sluggish.

People can whine all they want about Apple (I personally will not buy the new MBPs, wait until next year's model). The truth is they've set the bar so high that anything less than perfection (which IMO is mostly what they gave us during the last decade) is seen as unacceptable. The same cannot be said of Windows - at least according to my experience since Windows 95. I recently bought a brand new SSD and a license for Windows 10. One day after installation, Windows crashed and had to "repair" itself. To be fair, it has been running flawlessly since.


Do you just roll into WalMart and buy the first $300 HP you see?

That's literally the only way I can see your anecdotal experience being truthful.

The PC marketplace is open, like the Android one. There's a lot of cruft to sift through, but it should only take about 1-2 hours of research to find the best laptop at any given time and any given price range for any given use-case.


No. ~1200 PCs and laptops. I understand that it seems like I'm exaggerating. I'm not. Either I have been incredibly unlucky (or stupid in my choices), or Mac hardware is of better quality.

But you're right that it's anecdotal.


>First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era) ... then they became a Computer company who also made a phone. Then they became a computer company who also made a phone and a tablet. Then they became a phone company who also made computers and tablets.

>Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "lets put this phone feature on the desktop"

I don't know much about this area, other than the little I have read, but didn't MS do somewhat the same thing - "lets put this phone feature on the desktop" - with Windows 8? Not sure if they reversed that in Windows 10. Interested to know.


Windows 8 was ridiculous, and one of the things that gave MS a big wake-up call. It's now a different company. Windows 10 is leaps better than 8 in about every way (not without its own issues, of course). Disclosure: I am a MS MVP. MS will listen to me in almost every area I care about.


Interesting. Yes, though I've never used Win 8 (and don't want to), from the little I had read, it seemed like the idea of trying to merge the mobile and desktop paradigms was a bad idea from the start, and doomed to fail.


Windows 8 didn't merge the mobile and desktop paradigms, it put the mobile paradigm over the top of the desktop.

The apps stuff (Windows Runtime) was actually quite well done for a first try, but it was unfamiliar to Win32 users and got in the way. (Only one click away, but people are impatient and resist change.)

Windows 10 does an excellent job of merging apps and traditional programs, though it's not strictly correct to equate apps and mobile. Windows 10 is a mobile operating system that runs apps from an app store, but you can resize apps and run them seamlessly alongside traditional Win32 programs.

You can still run Windows 10 in tablet (Windows 8/8.1) mode if you want to. I don't know anyone who does.


I, strangely enough, sometimes run Windows in tablet mode and use it as a tiling window manager. Works surprisingly well. Try it sometime.


> Cook is trying to emulate the hardheadedness but fails to recognize the reasonability needed to balance that.

I don't think Cook is hardheaded. I think he has no idea how to run a product company. Pretty much everyone from the average HN user to Larry Ellison predicted this outcome. Cook is logistics / supply chain etc, Ballmer was sales, neither had any business at the top of their respective companies. Jobs did it on purpose, in my opinion, because he believed the single most important function initially was to keep Apple running smoothly and to complete the iPhone boom (ie the big product for the next decade was already in place). The only question is how long Apple will stay in the Cook era before getting a product leader replacement.


Death throes, not death throws, fwiw.


FWIW, I have a 2016 13" nTB and have had a few trip-over-the-power-cord incidents. The USB-C cord snaps right out, almost like MagSafe, for me... the laptop will get a nice tug and move an inch, but the cord comes out even on a slick countertop.


I had a similar experience with the MacBook 2015. Yes, the laptop did move, but less than 2" before the cable detached.


The original MagSafe design ( http://www.iclarified.com/images/news/9014/31392/31392-500.j... ) was the most ideal design out of all of them, in my opinion... I have seen MacBooks slide 20-30cm with the newer designs without the MagSafe connector disengaging, all because of the direction of pull.

The only problem was that the original MagSafe design was prone to fraying, and it probably didn't look as sexy as having the cord at a right angle to the computer.


> I wont buy a laptop without a magsafe or similar connection

Check this out: https://griffintechnology.com/us/breaksafe-magnetic-usb-c-po...


I own one of these. It comes close, but it's just not the same.

I use my laptop, closed, hooked up to an external monitor. Nowadays, I spend a lot of time on video conferences. The number of times that my laptop has shut down mid-conference because I tapped the cable under my desk just enough for the connection to drop for a split second (combined with Apple's -- let's call it "interesting" -- decisions around power management) in the last week is way too high.


Careful: BreakSafe is rated up to 60 watts (20 volts @ 3 amps), and has been designed and tested to meet USB-C power standards.


I categorically refuse to pay more money to buy something to add on to an already overpriced laptop that sticks out from my computer's form factor and that should be built in.

Fortunately for me I got a new laptop not long before this new abomination came out. If Apple doesn't correct this gigantic mistake, the next time I get a new laptop it won't be made by Apple.


Man you would absolutely hate to work any job that requires a serial connection then. Because let me tell you about the serial-to-USB adapters I've used... I wish I had the luxury of saying that I categorically refuse to pay more for something that sticks out of my computer that I think should be built in.


If those are permanently-emplaced USB to serial adapters and you have problems with reliability, I recommend checking out Digi.com for their "Serial Servers." We have a couple of customers who've been running faxes from VMs to external modems over Digi One SP units for more years than I care to think about, and they're quite simply rock solid.


I do embedded dev; at any time I have 2-6 serial-USB adapters hooked up to my MBP. They are all plugged into an external 10 port USB hub though, would be a little harder to do it otherwise.

Anyway, back to your point, I most certainly do NOT want DB-9 or DB-25 ports on my laptop. ;)


The operative clause there is "should be built in". And of course I can say "should be" because it has been for years now. Literally just yesterday magsafe saved my computer from being thrown off the desk. The regressions in overall user experience with this latest MBP mean that I'm no longer willing to pay Apple the premium that I've been paying them.


No, the operative clause is "I think". You're talking about your opinion, while simultaneously telling me that my opinion is worth less than yours just because it's different.

It'd also be great if I could have a VGA output, but that ship has long since sailed, hasn't it? It'd also be great if I could have more than two USB ports, but I have a 2015 MacBook Pro. To get four USB ports I'd have to upgrade to the 2016 MacBook Pro. See? To me, the 2016 is better than that one I have. It lets me plug in more adapters, adapters that I've been used to carrying for years. I carry four with me to every client site I visit, and you don't see me whining about it on the Internet.

Regressions in user experience? It's double the number of ports! It's a god damn lifesaver!


The number of people who trip over power cords is almost definitely much larger than the number of people who need VGA output, or need four USB ports (given the easy availability of USB hubs).


I say the OS is getting worse; the quality of the software is nowhere near what it used to be. It's clear that they need to put in more effort and polish Mac OS.

Apple laptops are still successful for now. They sold wagons of the latest MacBook Pro, but that is not going to last if they don't fix the OS ASAP.


> First, they were a computer company driven by a man who loved computers ("first" here is the Jobs return era) ... then they became a Computer company who also made a phone.

You missed the most important stages in between: when they became a company that made cute, candy-colored computers that would match your Volkswagen New Beetle, then when they made a totally awesome mp3 player you could get to match it.


Good riddance to magsafe. I've had two laptops become unusable because the magsafe connection became flaky and wouldn't charge unless I fiddled with it until it rested just so (even contact cleaner wouldn't help). There was no hope of it lasting more than a few hours without constant tending by me to get the connection just right again. I've never in my life had any other computer that failed in that way.

Moving to USB-C was the right choice. Charge cable goes bad? Buy another one for $15.

Everything else is spot on. Apple quality is fast going downhill.


Many years ago, before the age of the iPhone, I had a meeting with about thirty people at Apple headquarters at the request of Steve Jobs. That was my very first visit to Apple HQ.

After half a day of conversations, as we left the building, my very first words to my team were: "Apple Computer is no more. This is a marketing company. They might as well stick their logo on washing machines, microwaves and refrigerators. They know how to market them to be cool and they'll sell millions of them"

Then came the iPhone.

And, yeah, they could have filled homes with Apple appliances. Not sure why they didn't go there. So easy. Not saying it would have been right, but they had the opening and the mindless following to make every home "Apple Cool".


> I wont buy a laptop without a magsafe or similar connection

Not a big deal nowadays; my mid-2014 ThinkPad has a similar connection (it won't ever drag the laptop when pulled). I'd think most laptops would have such connectors by now, MagSafe-like or otherwise?


Which adapter exactly are you referring to here? I have a 2015/2016 ThinkPad Yoga for work, and I don't think it's the same as, or even similar to, the MagSafe adapter. MagSafe comes off with even the gentlest-but-firm tug, even at an angle. I can't say the same for my ThinkPad.


> the consumers anymore, they stick their fingers in their ears and pretend to know best.

They are though. Consumers will pay for hardware with batteries that die after a year and a half. They'll pay for vendor-lock in and to be locked into the walled garden experience. They'll pay for hardware that will be locked out of the walled garden and unsupported by vendor updates after 4-5 years. They'll pay extra for accessories that give basic functionality to their electronics. They'll pay a massive premium for accessories that are subpar for their asking price if they're endorsed by celebrities.

I'm not a fan of newer Apple hardware and software, but we're in the minority.


Agree. Have you noticed that their phones also suck now? I am locked into their ecosystem, but I can tell the quality of software is just not there any more.


> Jobs was hardheaded, but reasonable. Cook is trying to emulate the hardheadedness but fails to recognize the reasonability needed to balance that.

I think we can be sure that under Jobs, they would not be in a situation where you cannot charge the current iPhone model with the current MacBook without a dongle. "Oh, it's not a dongle, it's just a cable." -> Get out of my face.


I'm in the same boat. I switched around 2008-2009 to jump on the iOS app wave, but am more interested in Windows since 10 and Azure.

I still think their rMBP 15s are all around the best you can buy. I haven't tried the Touch Bar version yet, though. Also, OS X (or whatever) on a rMBP is still great for surfing the web. The zoom with the trackpad is flawless.


You can try it on any Mac with https://github.com/bikkelbroeders/TouchBarDemoApp . However, the Touch Bar does not seem all that useful to me, and I think that the previous model (2015) is all around a better buy: higher-performance CPUs, standard USB 3-A, MagSafe...


"death throes"


> Now they are a phone company who presides over the death throws of an amazing operating system that is going to be killed off to make it more like a phone. The new "features" every cycle are more "lets put this phone feature on the desktop"

It's "death throes."


> Then they became a phone company who also made computers and tablets.

You're describing the changing world we're living in. Apple is merely reflecting it.


> I wont buy a laptop without a magsafe or similar connection

Do Windows laptops have it, or you're going to use a 2014 MacBook Air forever..?


Recall that Apple Computer Inc became Apple Inc January 2007.


Under Steve Jobs's leadership, no less.


> I wont buy a laptop without a magsafe or similar connection

What about wireless charging? I see MagSafe as deprecated; only the MacBook Air has it now.


Wireless charging is horribly inefficient...IIRC there aren't even any that go over 10W.

Magsafe may be deprecated by Apple, but that's a step backwards in functionality. One cannot deny that using USB C for charging is inferior in terms of protecting the device.


Thought the same thing, but then saw this:

https://www.kickstarter.com/projects/branchusb/magneo-first-...

Gotta say the idea of a single USB C adaptor carrying power, data, and having a general purpose mag-safe like adapter is pretty compelling.


That looks massive though.


Massive?

.. there's no pleasing some folk.


Given it's intended to be a permanent fixture in the side of your laptop, at the size of a (keyboard) key I'd say it's pretty big, yes.

Add that to the one-and-a-half key-sized optionally permanent part (unless you're going to buy one for each cable, it's fixed until you need the safety feature) and it's a huge thing sticking out of the side!

To be honest, I question the usefulness anyway. I've pulled my MacBook Air off a table (accidentally) by its MagSafe power cable before, and trying it deliberately now, it only works with vertical movement. Side-to-side, or directly 'out', it just moves my laptop, even as short & sharp as I can make the tug.

If I stepped directly on the cable it would be such a directly-out movement; if I tripped, a side-to-side one. Unless my laptop was positioned perfectly such that the MagSafe hung over the edge of the table, and I trod directly down, I can't imagine how it would have its intended effect.

I use it all the time for un/plugging deliberately, but I can live without that, and as above with this Kickstarter product without buying several I'd have to leave the whole dongle plugged in anyway, so any deliberate dis/connection would be a normal un/plug motion.


> intended to be a permanent fixture

Are you sure about that? I've ordered two and I've never thought about leaving it in: for me, it's purely a normal connector except that it could "break" safely.


How many devices did you break because they did not have MagSafe-like functionality? I did not break a single one.

MagSafe is a differentiating feature of Apple's with little to no daily life-saving rationale for its existence. It's there to be different, not to be better.


3. 3 laptops, broken when I wasn't even there, from a family member or child or pet running by, snagging the cable and ripping it out. 2 were just rendered unchargeable; the third shattered the screen. I'm far from alone in that, as well. It's rather presumptuous on your part, don't you think?


Those computers are like $3000. Even if it only saves 10% of MacBooks from an untimely end, it's still $300's worth of insurance.
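The expected-value arithmetic behind that "insurance" framing, as a quick sketch (both numbers are the commenter's guesses from above, not data):

```python
# Expected value of a break-away connector as insurance:
# probability of averting a total loss times the machine's price.
price = 3000.0    # rough MacBook Pro cost claimed in the comment
p_saved = 0.10    # guessed fraction of machines the connector saves
print(price * p_saved)  # -> 300.0
```

The sibling reply's counter-guess of under 1% would put the same figure below $30, which is why the disagreement over the probability matters more than the price.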


10% is a really high guess. I would suggest less than 1%.


Wireless charging needs to be strangled in the crib. As long as we're burning carbon for power, we have no business charging wirelessly until it's as efficient as wired.


You know what fans (the real ones) do? They try to know more about the thing they are fans of, and to understand it. Your comment shows nothing of the kind.


Sorry, but who wants to be some lame "fan"? We want the best tool for the job, to get that job done, and we're willing to pay for it. The tools are no longer up to scratch or functioning as expected, and we are voicing our concern.


I couldn't agree with you more on this statement. Said exactly as I feel.


[flagged]


We've banned this account for repeatedly violating the HN guidelines and ignoring our request to stop.


I'm sorry, I re-read GP and I'm still unclear what you're talking about. For as long as I've been using Apple it has always been because of the superior experience. It's with regret I have to say I am let down by (in particular) their software quality recently. I've gotten to the point where I'm fairly certain my next phone won't be an iPhone. I can't... understand your line of reasoning... that as an Apple user I should have some loyalty to them even if what they're selling me doesn't fulfil my needs. I'm really interested in knowing where you're coming from, but your snarky tone is making that difficult.


>But recently, I realized I’d gotten tired of Apple’s attitude toward the desktop. The progress in macOS land has basically been dead since Yosemite, two years ago, and Apple’s updates to the platform have been incredibly small.

So, like creating a full-blown new programming language (Swift)? Or a full new filesystem (AFS)? Integration of cloud storage directly into the desktop? Siri on the desktop? Saving RAM through memory compression? Continuity to transfer work across desktop and mobile (and different desktops) seamlessly? All things added in the last few years, with some of them still ongoing.

Sure -- they've totally abandoned it /s.

>Take a look at Sierra: the only feature of note is Siri, which is half-baked as it is, and the things that did get ported over from iOS are half-done too.

That's hardly "the only feature of note". But even so, I wouldn't want Apple to continue to change much in OS X, except refining things.

>and so I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro.

So, you're merely 3 years on the platform, but have an opinion on how Apple "pivoted its attention" based on just a couple of OS releases? Because I've been here since 10.2, and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).

If it discussed the state of the Mac Pro and Mac Mini, the post would actually have a leg to stand on...


>So, like creating a full blown new programming language (Swift)?

Byproduct of Apple's work on mobile.

> Or a full new filesystem (AFS)?

Byproduct of Apple's work on mobile.

> Siri on the desktop?

Byproduct of mobile again! (By the way, ask Siri on Mac to set an alarm or interact with homekit!)

But, I take your point. I tried to mostly detail Apple's lack of attention for developers, but perhaps missed the emphasis on that there in the post. Microsoft is really trying with developers and it's readily apparent.


"Byproduct of mobile." If you need evidence:

Swift appeared in iOS first and still is not great on the desktop at all.

APFS (not AFS) is the default in iOS, and still in feature preview and pretty unusable in 10.12. It's "expected" to be released for the desktop in 2017.

Siri on the desktop is mostly useless, as said here.


Right - that's simply a result of what I say in the post about Apple focusing on iOS first, then letting the byproduct of that filter down into OS X. That's an end result of the Mac team being merged into the iOS team. Maybe it's not bad, but it just means slower overall progress.


For APFS, I don't buy that it has anything to do with prioritizing one operating system or the other. APFS wasn't ready when Sierra was released, which is why it was included as a beta with limited functionality. It seems to be ready now, so it will be included in the next version of iOS (10.3), which is still in beta, and presumably the next major release of macOS.

The only difference is that it's being released in a minor update to iOS but will probably wait for a major update to macOS. But that makes sense, because macOS allows the user to customize partitions and filesystems, and allows running other operating systems - including Boot Camp, which Apple itself must provide drivers for, and Linux, which is waiting on Apple releasing documentation for their filesystem, as they have promised to do. By contrast, iOS has the same partition layout on every single device and no cross-OS compatibility concerns. Thus APFS on macOS requires more work and has greater risk of failure. For both reasons, regardless of Apple's priorities, it would be logical to start with iOS. Even if Apple did add stable APFS support to a minor macOS update, they probably want to auto-convert users to APFS eventually, but doing so in a minor update would definitely piss off technical users; they could make it opt-in to start with, but it's easier to just wait.

The next version of macOS is only a few months away; really not a big deal.

(I agree that Apple seems to be prioritizing iOS in some areas; I just don't think APFS is one of them.)


>Swift appeared in iOS first

Which makes perfect sense. We talk to our phones all the time anyway, and hands-free is crucial. For desktop, not so much.

>APFS (not AFS) is the default in iOS, and still in feature preview and pretty unusable in 10.12. It's "expected" to be released for the desktop in 2017

Which makes perfect sense. A constrained environment without an exposed filesystem like iOS is easier to convert to a new FS. A full blown desktop OS not so much. That's why it needs much more testing and development to deliver the latter.

But they ARE doing this testing and development.

(Btw, APFS is not "the default in iOS" yet. It will be when 10.3 is released -- it is still in beta atm).


I agree it makes sense.

But it also indicates that Apple isn't really designing for the Mac as the author points out.

I mean, Siri is the biggest feature Apple is touting for Sierra (http://www.apple.com/macos/sierra/). And as you point out, it's something that doesn't even make all that much sense on the Mac.


I may be in the minority here, but I find Siri just about as annoying as the old Clippit thingie in Microsoft Office. It's the first thing I turn off.


>But it also indicates that Apple isn't really designing for the Mac as the author points out.

Maybe they don't but I don't see them doing anything major on the iOS side either. Both platforms are quite mature by now anyway.

(And Apple was never about revolutionary new designs and jumps. Back in the day of the iPod and early MBPs etc, we cheered and waited anxiously for at best incremental changes -- now it has USB, now it has a different touch wheel, now it has a color screen, now it has wifi, now it does video, etc, year over year).

>And as you point out, it's something that doesn't even make all that much sense on the Mac.

Yes, but people (and pundits) have been asking for it all the same to appear on the Mac anyway.

And "talking to your computer" has been a thing from the times of 50's sci-fi stories even.


I'm sure they'll figure it out at some point, but the mystique of Apple only releasing something when it was "ready" has been over for a while now. So not only did it come later, it was less evolved than the Windows implementation a year prior. Cortana felt useful from day one for various tasks. Siri? Only in a subset of cases.


Siri is a humiliating embarrassment. There is no other way to describe it. The only time I ever use it is for comedic relief, or to demonstrate the obvious superiority of either Google's, Microsoft's, or Amazon's implementations. To Apple, it's just another in a long series of marketing-driven features rolled out into some perceived white space in the competitive landscape and then abandoned. It's ridiculous that something so potentially game-changing to enable non-technical users, which is theoretically the market Apple thinks it owns, has gotten such short shrift at Apple, even in the face of growing (and superior) competition.


> The only time I ever use it is for comedic relief, or to demonstrate the obvious superiority of either Google's, Microsoft's, or Amazon's implementations.

Curious here - can you give some examples? Siri seems to work better (on friends' devices) than "OK Google" on my Moto X.


I don't know how well Cortana works, but Siri works very well for English. It is passable in Swedish for every-day language (like "påminn mig att köpa mjölk ägg och smör" will be fine) whereas anything remotely technical or heaven forbid English will be a jumbled mess. Google is understandably leaps and bounds ahead of Apple, and it's not surprising that there is essentially nothing in the field of machine learning coming out of Apple, whereas Google is basically _the_ ML company on Earth right now.


>Apple was never about revolutionary new designs and jumps.

http://theoatmeal.com/comics/apple


Yeah our desktops are just the thing we use to make those, no big deal really


> APFS (not AFS) is the default in iOS

Well, it will be when that beta iOS is released.

Besides, it makes sense for any sort of file system to be released first in such a limited manner.


Uh, you realize they make more phones than desktops, right? By like a factor of 10 at least :)

So the limited manner would be to release it to desktop?


In what sense is Swift not great on the desktop at all?


I think a pretty good summary is: about 3 years ago, Microsoft realized it had a couple of failed strategies.

1) Catering to casual users. The Xbox One launch is probably the best example. They wanted to become the home media center, so the entire launch and discussion was centered around sports streaming and social media. Plus Kinect.

2) Desperately trying to get a foothold in mobile.

I think they realized people weren't going for either. Sony focusing on games and hardware killed it initially in sales, and still kind of does. Apple and Android had two corners of the market and MS could make a space for a third.

They pivoted, targeted "professionals", and that's what you see in their approach for Xbox and PC. I think it's overall good, but saying Apple abandoned developers is sort of desperate fear mongering.

Apple's strategy has always been pretty consistent: focus a lot of resources on making a good ecosystem. More recently it seems that's changed to "focus a lot of resources on making the best systems, and allow for a great ecosystem", with sponsoring third-party monitors and Apple's home automation front.

I think it's still a pretty good strategy, but with Windows catching up on some "pro" features, like their own premium desktops and laptops, and adding in old features like "workspaces", people jump to say Apple "abandoned" them when they're still kind of sticking to the same successful strategy.


If you'll look even further back in the Mac OS history, you'll see that they didn't really care about developers back then. Their entire focus was media and publishing with education on the side. I think it hurt them in the long run as they've always lagged behind with software and gaming and they only managed to make some headway in the PC market when they devoted resources towards making life easier for developers.

I've always liked Mac OS better than Windows, and even though Windows 10 is finally a usable system for development, I still don't trust Microsoft. I've been burned too many times: a promising system gets hobbled by OEMs adding their own adware and other BS, by Microsoft caving in to the bean counters and hobbling their systems in order to fight Chinese piracy, and by some C-suite manager's bright idea to waste resources on some "Synergy" play instead of just making the OS, and other products, better.

At the core, you still have an ecosystem where the people making the computers can't make enough money on their own because Microsoft and the chip makers are squeezing them. Which makes for a sorry experience in the long run, even if Apple has its own problems.


I've been an enterprise developer on Windows since XP (now on 10) and I don't think they've come very far in terms of what's going on with your dev PC. I still wouldn't want to use my dev machine for anything but development because of how much shit the .NET dev environment does to your system.

Don't get me wrong, Azure is blazing amazing, and so is .NET in general, but when your enterprise IDE still hasn't found a way to function properly when the "Documents" folder is on a network share, you've still got a ways to go.


This is the entire reason I do the majority of my development in a VM. So much software needs to be installed, so deeply integrated into Windows, that I don't want any of it on my daily workhorse.


By that logic, isn't iOS a byproduct of OS X? I think people are overblowing the situation. I still remember Snow Leopard being the best desktop OS I have ever used, but Sierra is still far, far ahead of Windows, and I realized that just when I sat in front of a computer with Windows 10 on it.


> Microsoft is really trying with developers and it's readily apparent.

Which has been their approach for a very long time - let's not forget that Ballmer speech. From what I have heard, Apple puts minimal effort into Xcode; the iPhone lead dev here was throwing around all sorts of cuss words trying to get CI set up.


Xcode is a sometimes excellent, sometimes shaky experience, but was still far ahead of Android Studio until the past 6 months or so. The API design of iOS is also still second to none, and part of the reason I continue to develop for Apple platforms. I use Xcode Server for CI without issue; another team uses Jenkins happily. Apple's hardware can be fairly neglected, but the experience of being a developer in their ecosystem is still very pleasant overall.


WSL is also a byproduct of their failed attempt to run Android apps on Windows Phone, and it's the most interesting thing they've released recently, at least for me.


"Byproduct of mobile" doesn't really say much -- if anything.

They support all three on desktop systems, and they have features and APIs that only make sense on desktop systems too.


It says a lot.

The desktop isn't a phone. Forcing phone features into a desktop and slowly turning the desktop into iOS is not a good thing.

We were just fine without an app store. Just fine. I can't think of any of the "use this mobile feature on your desktop" additions that provide any real value...


>Forcing phone features into a desktop and slowly turning the desktop into iOS is not a good thing.

Siri, maybe, but there's nothing about an FS or a programming language like Swift that makes them a "phone feature". Same for memory compression (which only exists on the Mac IIRC) and continuity (which would be useful even if it only worked from Mac to Mac).

If anything Apple has adamantly refused to "force phone features into a desktop", and only ports stuff that makes sense -- that's sort of what Microsoft did, trying to merge e.g. tablet and desktop platforms in the same UI.

>We were just fine without an app store. Just fine.

Well, I'm much much better WITH an app store. Why wouldn't I want one? I might want some more features out of the Mac App Store (like the ability to demo an app) but I very much want to have it.

And in no way is an app store a "mobile feature". App stores, (and app repos) existed way before they appeared on mobile phones, for one.


> Same for memory compression (which only exists on the Mac IIRC) and continuity (which would be useful even if it only worked from Mac to Mac).

Windows 10 has memory compression.


And Linux too since Dec 2012.


Huh? Is that enabled automatically, or do I need to flip a switch?


And of course, Android (Cyanogenmod had it in 2010 or so)


"On the Mac" as opposed to "also on iOS".


> We were just fine without an app store

You were just fine. Don't speak for everyone.

Finding applications on the internet is a dangerous endeavour if you are inexperienced. Many people get tricked by popups suggesting their computer is broken and they need to download a "cleaner app". Many people get tricked by fake Download buttons. Many people just use whatever the first link on Google is.

And then once they are tricked they then proliferate their passwords, email addresses, credit cards etc all over the internet.

App Stores IMHO are a must have for most people.


If the Mac allowed the "store" to have more than one source, allowed usage via CLI and GUI, and didn't require a 30% cut, most of your software would probably come from the store, and it would be a good thing.


They'd have to remove the account requirement too. Just `appstore install foo`.


Maybe they could just offer a way to download apps via the internet, and then open some kind of image container and move a file to the Applications folder.


This is inherently inferior, as you are requiring the user to make the distinction between safe and unsafe software each and every time they install software - something users are notoriously bad at.

Additionally, it makes it hard for two apps to share required libraries, meaning each must bundle whatever it requires, AND either updates are manual (read: nonexistent) or each app includes its own nagging update mechanism.


An account could be required for installation of non-free software while still allowing installation of free software without one.


I'm a professional developer, I don't want a new toy language. I'm buying a powerful laptop, I don't want cloud storage. I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem. I don't want compressed memory, I want 32GB of RAM. And while I'm at it, I want a laptop with a real keyboard, not keys that have so little travel they feel like a touchscreen.

In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.


>I'm a professional developer, I don't want a new toy language.

First, there's nothing "toy" about Swift. It's a full-featured language on par with Rust, Go, etc.

Second, whether some random web dev or embedded C dev or Java dev wants Swift or not is irrelevant to its intended audience and utility. It's not for "professional developers" in general, it's for Mac/iOS application developers, and thus, for Mac/iOS users (that benefit as a side-effect from developers having a modern/better language to implement stuff).

>I'm buying a powerful laptop, I don't want cloud storage.

So? Millions of users do want it, judging from the huge popularity of Dropbox, Google Drive, MS implementation of the same, etc.

>I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem.

Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+ and Apple obliged.

And whether you can "dual boot" or "share drives" is orthogonal to whether OS X has a new native filesystem. You can still format the rest of the hard disk with another FS to dual boot, and you can still share drives with other systems via NTFS, exFAT, etc.

>I don't want compressed memory, I want 32GB of RAM.

And I want a pony unicorn. But until Intel delivers boards that accept 32GB of low-power RAM of the kind that goes into Apple laptops, we're not gonna get it. And Intel gives late 2017 / early 2018 as the date for those. (Existing PC laptops with 32GB of RAM merely sacrifice battery life by using regular, higher-power RAM -- and even those are few and far between.)

(And of course compressed memory helps whether you have 16 or eventually 32GB of ram)

>In short, Apple is spending a lot of effort making features that I, as a professional developer and 20 year Mac user, don't want.

Well, as a 15 year Mac user and 20 years professional developer, I do want those things.

I don't particularly care for the touch strip though -- I'd prefer it to be OLED physical buttons giving both tactile feel AND changeable inscriptions.


> So? Millions of users do want it, judging from the huge popularity of Dropbox

Yet just about everyone uses a local FS, with the cloud for backup or sharing of a small subset. Giving us an odd cloud-first strategy seems more suited to a phone or severely storage-constrained devices.

> Again, irrelevant. Lots of Mac users, and tons of pundits HAVE asked for a "new proprietary filesystem" to replace HFS+

I seem to remember that most of that conversation was about ZFS. They even got a long way into the ZFS port before abandoning it. HFS+ has been getting long in the tooth for years, so a better FS is long overdue! So now we're getting APFS, which explicitly doesn't checksum user data[0]! With current storage sizes, that's disappointing, to understate it hugely.

> until Intel delivers boards that accept 32GB low-power RAM

It's funny, if Apple had not significantly reduced the Wh of MBP batteries in the latest generation we could easily have had both. Probably a longer overall life. Teardowns show a lot of empty space around current batteries.

[0] http://dtrace.org/blogs/ahl/2016/06/19/apfs-part5/#apfs-data


>Giving us an odd cloud first strategy seems more suited for a phone or severely storage constrained devices.

What "odd cloud first strategy"? Cloud is just ANOTHER option, not a "first" or privileged one. To the point that Apple also includes a whole new local filesystem (in beta) with Sierra.

>Seem to remember that most of that conversation was of ZFS. They even got a long way into the ZFS port before abandoning.

Oracle bought Sun and poisoned the area with patents and threats.

>It's funny, if Apple had not significantly reduced the Wh of MBP batteries in the latest generation we could easily have had both.

No, we really couldn't. At best we'd have 10% or so more battery space. The impact of the RAM (drawing power whether you use all 32GB for a task or not) is much larger.


To comment on Apple's "cloud first" strategy: while they do provide local file storage in their APIs, their guidelines suggest avoiding an option for storing local files, since users "expect all of their files to be available on all of their devices." (https://developer.apple.com/ios/human-interface-guidelines/i...)


> Yet just about everyone uses local FS with cloud for backup or sharing of a small subset.

In most cases, not because they want to, but because that's how current software works. Both Dropbox and Google Drive focus on having their own special folder that gets synced; syncing the OS desktop and documents folders is possible but only with special configuration (on both Windows and macOS).

Cloud first makes perfect sense no matter how much or how little storage you have. It prevents you from losing data if your device is lost or damaged, and if you have multiple devices it allows accessing all your data from any device. If anything, its usefulness depends more on internet connection speed.

I do think Apple's specific cloud storage offerings could be improved for large devices. Currently the highest tier of iCloud storage is 2TB, which costs $20/mo; that's only twice as large as my MacBook Pro's SSD, and fairly expensive. But it's the same price/GB as both Google Drive and Dropbox, so it can't be that much of a ripoff...


> You can still format the rest of the hard disk on another fs to dual boot, and you can still share drives with other systems in NTFS, xFAT, etc.

Until that other OS has support for the new proprietary filesystem, it won't be able to read it. Since Apple and Microsoft categorically refuse to implement any of the many featureful existing filesystems, one is stuck with archaic NTFS (with no file permissions) or FAT (with less than 4gb files) to keep data.


Why do you think there’s no file permissions in NTFS?

In NTFS, each file or directory can have arbitrary access control list specifying granted and denied permissions for users and groups.

Those lists are typically inherited down the directory hierarchy, but that inheritance can be stopped.


>Since Apple and Microsoft categorically refuse to implement any of the many featureful existing filesystems, one is stuck with archaic NTFS (with no file permissions) or FAT (with less than 4gb files) to keep data.

exFAT, which I already mentioned, is supported and doesn't have a 4GB limitation. You can have files up to 128 pebibytes (which should be enough for everybody: that's ~144 petabytes).


> which should be enough for everybody

I've heard these words so often that they've lost any meaning to me. :)


Glad you agree the keyboard sucks


> First, there's nothing "toy: about Swift. It's a full featured language on par with Rust, Go, etc.

A language with no promises of stability counts as a "toy" in my mind (I understand that this can be subjective, just providing my opinion).

Say what you want about Objective-C, but at least Apple stuck with basic syntax decisions.


That kind of oversimplifies things. From my reading of the Swift mailing list, many things you would consider "basic syntax decisions" will have a profound impact on the stabilisation of the ABI.

Migrating is a pain, but I'll take the Swift migration pain over the maintenance of old Objective-C code any day. Even on large projects.


As a Linux/Windows developer, Swift remains a toy language to me, as it cannot be used to develop useful applications for Linux and Windows, which is where I make all of my money.


>As a Linux/Windows developer

As a Linux/Windows developer you are not the target audience (which I already addressed).

Besides, just because you can't use it, it doesn't make it a toy language, just a language that doesn't have support (or full support) for Linux.

A toy language has a specific meaning; it's not a generic word for "a language I can't use professionally on my platform/industry".

I can't use Forth to make web apps either, which is where I make all of my money, but it's not a toy language by any means.


Yes, it's wrong to be dismissive like that. But no, "toy language" does not have a specific meaning. It is a slang insult, not a proper technical term.


As a Linux/Windows developer, you probably are not the primary target audience of Swift.

As a developer for Apple platforms, am I right to consider C# or Java toy languages, too?


There are many Java apps for Mac and Apple used to provide its own implementation of the JRE.


There are thousands of Apple and Android C# apps (see Xamarin).


Which, again, is beside the point. The parent was mocking the idea that just because a language doesn't support a platform it can be called a "toy language".

The C# he mentioned wouldn't be a toy language even if there were 0 Apple and Android C# apps. It would just be a totally professional Windows language with no Android/iOS support.

C# was professional even when it only supported Windows (and partially FreeBSD), before Mono came along...


> C# was professional even when it only supported Windows (and partially FreeBSD), before Mono came along...

As well as the decade-plus when Mono existed but wasn't owned by Microsoft, making cross-platform support merely incidental or even a negative from Microsoft's perspective.


Seems like a bit much to say that it's a toy language because it can't (currently) be used to write Linux or Windows applications. There's probably tens of thousands of developers at least making a living developing Swift applications. I bet there's more people making money writing Swift than Haskell and Lisp combined.


>First, there's nothing "toy" about Swift. It's a full featured language on par with Rust, Go, etc.

That's some amazing Kool-Aid you've got there. But I'll grant you that it's on the same level as a bunch of other new languages that also aren't in general use for serious long-term products yet.

>But until Intel delivers boards that accept 32GB low-power RAM of the kind that goes into Apple laptops, we're not gonna get it.

Yeah, wouldn't it be great if Apple made this happen? That's the kind of innovation I want to see.

As for the bulk of the rest of your points: developers and common Mac users have different priorities, and your opinions are not mine. It will take time to find out who better represents professional developers.


>Yeah, wouldn't it be great if Apple made this happen? That's the kind of innovation I want to see.

Adding something with marginal utility for the majority of their users (since even pros can do just fine with 16GB) just because some insignificant minority runs multiple VMs on their Macs?

That's not the kind of innovation Apple was ever, ever, interested in.


You want Apple to make their own CPUs?


No, I want Apple to do what they've done in the past and get Intel to do something for them. http://mrob.com/pub/comp/mac-models.html#intel_special


> I want the possibility to dual boot or share drives with other operating systems, I don't need a new proprietary filesystem.

Then you're in luck. APFS isn't proprietary, as Apple's website claims that "Apple plans to document and publish the APFS volume format specification when Apple File System is released for macOS in 2017." So there will be nothing preventing drivers from being written for other operating systems, though of course it will require someone to actually do the work. In return you get a FS that improves on HFS+ in reliability, performance, and features (volumes). If there is enough interest I imagine it might even compete with btrfs for use as a root volume for Linux installations… though there probably won't be.

You may criticize Apple for not open sourcing their implementation of APFS, though I'm holding out hope they will do so eventually. An open source implementation would definitely be useful as a starting point for drivers for other operating systems, but to be fair, it would still require someone to spend the time to port it. Apple's HFS+ driver is open source as part of xnu, but Linux hfsplus doesn't support journaling which was added in 2002. That's not Apple's fault.


>APFS isn't proprietary, as Apple's website claims that "Apple plans to document and publish the APFS volume format specification

Great. ... anyone else still waiting for the open FaceTime API announced in public on stage by Steve Jobs?


>Great. ... anyone else still waiting for the open FaceTime API announced in public on stage by Steve Jobs?

Only people who missed the part where some other company took them to court over a BS patent on it and won, so they couldn't do it after all.

Meanwhile, they HAVE open-sourced other stuff, including Swift, so it's not like they have some big history of backtracking on open-sourcing promises...


That was six years ago :)

The rumor is that Apple changed its mind due to the VirnetX patent lawsuit, which forced it to switch FaceTime from P2P to relaying all calls through a central server. But who knows if that's the whole story. I'd still like to see an open FaceTime.


What about patents?


I haven't heard any statements suggesting that APFS is patented or that Apple wants to charge for licensing it, but it's possible. We'll see what happens when 10.13 is released...


> I'm a professional developer, I don't want a new toy language

Why do you think Swift is a toy language?


I'm way more than three years with the Mac platform, and I have similar concerns. The impression I get is that Apple are not particularly interested in the computer market as opposed to the mobile (phone/tablet) market.

I think it is a question of emphasis. You've focused on the OS, but not all the innovation was in the OS itself. Remember when they added iLife? That was a big deal. Software you had to pay for on Windows was bundled free, and it was good. Everyone used iPhoto, iMovie was nice, GarageBand was awesome, although probably not everyone needed it.

(Good lord, just checked, iLife debuted (minus GarageBand) on 10.1. Evidently, I've been using MacBooks longer than I realized.)

The sense that they have an exciting vision and plan for the future of how people use and relate to the computer is gone. That sort of vision and the innovation that goes with it is reserved for the mobile space now.

They are still fine computers, particularly the notebooks, but the mobile products are driving the process now. I do think that is a shame.


> Integration of Cloud storage directly to the desktop?

This was done extremely poorly, though. Among the big sync providers (Dropbox, Sync, Box, Google), iCloud Drive is easily the worst.

The iOS app looks like something an intern cooked up. Hardly any iOS apps can read or write from the drive. There's no way to share anything in your drive. There's no way to share anything into your drive, either.

While Apple introduced the ability for your Documents folder to be stored in iCloud, macOS has completely lost control of the Documents folder in a very Microsofty way: Mine has turned into a rubbish heap of various application data that I didn't ask for. Savegames, virtual machines, "My Music", "Microsoft User Data", etc. — it's so unusable that I ended up hiding the entire folder and creating a new one that I could use for, you know, documents.

The iCloud services that are hidden away from the user seem pretty great, but the way Apple decide to expose the Drive is terrible. It's almost like they don't want users to actually use the drive.


> "My Music", "Microsoft User Data", etc.

Neither of these are Apple's doing - obviously in the case of "Microsoft User Data", and I don't know what "My Music" is but Apple's software puts music in ~/Music. (Heh, it seems I have a "My Music" in Documents, as well as "My Documents", "My Pictures", "My Games"… not sure whether this is Wine or VMware's doing, or whether they're there for the same reason as on your system.)

Applications are not supposed to use random subdirectories of Documents (that's what Library/Application Support is for), and in the case of sandboxed Mac App Store apps, can't. But I don't know what you want Apple to do about random unsandboxed third party apps deciding to do it anyway. I guess if the Mac App Store had been more of a success, more apps would be sandboxed, but still…

I guess it would be nice if macOS made it easy to sync custom (existing) folders to iCloud rather than just desktop and documents. But if you're starting from scratch, you can just create the folder under iCloud Drive and drag it to Finder favorites. If you're a CLI user, add a symlink in the home directory or whatever.
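As a minimal sketch of that symlink approach (the `com~apple~CloudDocs` path is where macOS keeps the local copy of iCloud Drive; "Projects" is just a hypothetical folder name for illustration):

```shell
# Local folder backing iCloud Drive on macOS
ICLOUD="$HOME/Library/Mobile Documents/com~apple~CloudDocs"

# Create the folder inside iCloud Drive, then link it into the home directory
# so it behaves like any other top-level folder
mkdir -p "$ICLOUD/Projects"
ln -sfn "$ICLOUD/Projects" "$HOME/Projects"
```

Anything dropped into `~/Projects` then syncs like the rest of the drive.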

I agree iCloud Drive on iOS is really inconsistent and awkward, and it would be nice if you could share folders with other users. I don't know what you mean by "no way to share into your drive", though. You can add arbitrary files to iCloud Drive from the share menu (and you can access arbitrary files from the iCloud Drive app).


My Music etc. are made by Apple. My Music is GarageBand, I believe. My Videos would be iMovie, I believe. I think savegames end up in Documents because they are considered to be "documents" as opposed to app state (which would go under ~/Library/Application Support), but I'm not sure what the Apple guidelines are.

"Microsoft User Data" is apparently from earlier versions of Office. Newer versions will look for it in ~/Library/Preferences, and you can actually move the folder there to get rid of it. [1]

Here [2] is my Documents folder. It's 100% crap that I did not ask for.

As for sharing: This discussion was about macOS, although the same limitation applies to iOS: There's no way to share an iCloud Drive file directly. If you do right-click -> Share -> Email, you get an attachment that you need to send, even though the file is already reachable through the cloud. Every other competitor (Dropbox etc.) adds a "Copy Link" option to the Finder context menu.

[1] https://www.atulhost.com/remove-microsoft-user-data-from-doc...

[2] http://i.imgur.com/PvXbmJk.png


The current version of GarageBand saves projects under ~/Music/GarageBand by default, and iMovie uses an opaque "iMovie Library" in ~/Movies. I'm not sure about older versions, although from a quick Google it seems unlikely to me that they used those directories.

Apple's documentation[1] implies that applications are not supposed to use the documents directory themselves (only when the user selects it in the file picker).

[1] https://developer.apple.com/library/content/documentation/Ge...


> Take a look at Sierra: the only feature of note is Siri, which is half-baked as it is, and the things that did get ported over from iOS are half-done too.

I agree 100% with your point. If you want pointless changes to the OS that means you have to relearn a lot just to keep doing what you were doing, then yes, Microsoft is what you want.

If you want small incremental improvements that don't move around common functions for no reason, then stick with Mac OS.


>>and so I was tempted away in early 2013 when Apple released its second-generation 15" Retina MacBook Pro.

>So, you're merely 3 years of the platform, but have an opinion on how Apple "pivoted its attention" regarding OS released based on just a couple of OSes? Because I've been here since 10.2 and most releases weren't about breakthrough features, but refinement and minor changes (often regressions).

Exactly. Sigh. Kids these days. It's like the complaint about WebKit being the new IE from someone who never actually experienced web development during the IE era, who falsely claimed the IE era meant IE7+, and then later acknowledged he wasn't even doing web development in the IE7 days! Anyway...

There are lots of new fans in the Apple ecosystem since the iPhone. And without the RDF from Steve, they quickly lose sight of Apple's greatest strength: Apple is the master of iteration! Very rarely will you see a completely new product category introduced. They just continue to iterate, sometimes taking two steps forward and one step back.

I agree Apple have spent less time on the Mac, and especially the desktop in general, but most of the points in the article were not convincing.


Amazing how people forget RAM compression and Metal. The latter prolonged my Air's life by two years!


Actually, I care more about OS performance than new features. Of course, the only reason I don't want new features is because Apple refuses to release a screenless iMac where you can upgrade the RAM and hard drives easily.


Isn't that what a Mac Pro is? Screenless and upgradable hardware.


It's screenless, but I wouldn't call it upgradable when you compare it to the old metal Mac Pro


All of the things you mention as proof of Apple caring about the desktop are things I don't give a shit about. I needed _none_ of the things you list. Nor do any of the people I work with.

True, there probably isn't much the average user would like to change in OSX. But it would be nice if stuff that was broken got fixed and stuff that was wonky got improved.


Er, isn't APFS still essentially vaporware at this point?


On iOS Apple switched it to being the default update quietly in a point release: https://arstechnica.com/apple/2017/01/ios-10-3-will-be-apple...

It actually shows marked performance improvements, but that's because it focuses on performance on the type of storage an iPhone uses.


>but that's because it focuses on performance on the type of storage an iPhone uses.

If that's solid state storage, then it's the same storage all Macs will also use in the future (and most models now).


No? Pretty sure with the initial release they specifically said that they were taking their time in testing it thoroughly, but that they wanted some initial feedback from people. It's been in a preview mode available in macOS for just shy of two years, if I remember correctly, but they explicitly said from the beginning that they didn't advise people to use it for their main OS partitions or anything important for a while.


No, it has essentially already shipped with Sierra (only in demo mode for developers), and it's already en route to becoming the official iOS filesystem with the final iOS 10.3 release (currently in beta).


No, it's almost ready. It's already in use in iOS 10.3, and signs point to a full rollout this year.
