I feel it's getting worse as we get pushed more and more to vertically integrated platforms (Pixel, iPhone, Surface, game consoles, Kindle+Fire etc). We still have Linux, thankfully - at least as long as hardware manufacturers let it run on their computers.
Linux is thankfully totally emancipated from this now. Desktop computing is no longer an exclusive x86 hegemony: it's fringe for now, but ARM, POWER, and RISC-V exist as alternatives, with purchasable hardware that can run several Linux distros today.
If Intel or AMD did anything to lock their consumer offerings exclusively to Windows, there is an escape hatch. And helping Microsoft with that doesn't really align with their business interests anyway: they just want to sell hardware, and both already employ substantial development teams to support their hardware under Linux. AMD just got a bunch of money from me this year, upgrading to their latest platforms, all for their ongoing commitment to Linux support.
You've got 3 areas:
1. Producers actively preventing you from running custom software (iOS, TVs, game consoles, some Android systems, ...).
2. Platforms providing default software and leaving an official way to replace it (pixels and a few other androids, ...).
3. Producers who mostly don't care what you run. (PCs)
It matters to me that things are as close to the 3rd category as possible. But let's not lump 1 and 2 together - there's a huge difference.
(Just imagine using a computer from 1993 in 2000, for comparison...)
There are no longer any supported 17" Mac laptops.
As a ThinkPad-wielding Linux user I'm looking at all these other people and wondering wth they're up to on their increasingly leased computer platforms.
Whatever that is, I want nothing of it.
What would happen to indie filmmaking if every TV and movie projector threw up a scary warning screen, requiring a cumbersome manual override, before playing anything that wasn't certified by the MPAA?
If you develop webapps, would you be OK with not being able to get a trusted SSL certificate unless an anonymous reviewer at Mozilla was satisfied with your UI?
Apple is intolerably paternalistic in how it tells people how to use their hardware, and it is continuously puzzling to me how much positive press it still gets, particularly in communities like HN, which have the 'hacker' part right in the name.
I'm not the biggest fan of windows but at least windows has, for the most part, left me alone and let me install software on my machine, same for Linux obviously. Not even Google and Android tell me what apks to run on my phone other than giving me a warning.
It's not super intuitive but it isn't hard for power users and normal users generally shouldn't be just running unsigned apps anyway.
The iPhone and Apple's slow and steady OS convergence would like to have a word with you.
> I'm not the biggest fan of windows but at least windows has, for the most part, left me alone and let me install software on my machine
No, they don't, but the article covers two different topics: notarizing apps for macOS desktop (for which you have to pay the developer fee, where previously there was no need to), and distributing apps through the iOS app store (for which you have to pay the developer fee and lose creative control). Presumably the app store is relevant because there is no other way to distribute apps for iOS.
If you try to notarize an app, you chase links for a while until you get to https://help.apple.com/xcode/mac/current/#/devac02c5ab8, and then to get a Developer ID, you chase links for a while until you get to
If it's free, it's pretty well hidden, like US tax prep companies' legally mandated "Free File".
But there is still a slight overhead of work compared to a 64-bit-only platform. So Apple, one of the richest companies in the world, decided to cut the cost, once again.
I came to believe that every decision Apple makes regarding its products, hardware and software, is about cutting costs at every corner and maximizing profit.
Switch from standard Intel to an in-house ARM processor? No need to pay Intel anymore...
These new keyboards? Well, maybe it's about the slim design, but having used them for 3 years, having had to remove some of the keys occasionally, and comparing them with those of the MacBook Pro 2015 I bought on eBay 2 months ago (I had to convert the QWERTY keyboard to an AZERTY one by swapping some of the keys), I can say the quality of the materials and the manufacturing cost of these two keyboards are not the same.
Additionally, don't you have to sign a contract as a developer to use the notarization service?
Today it is a layer on top. But my hope is that Apple comes to peace with non-AppStore distribution, with automated notarization and waiving of the $100 fee for open source, students, etc.
Mac OS X was such a joyous platform to learn to program on. But macOS is throwing up more and more obstacles to native apps. When you squeeze a balloon, the air goes to the other side.
That doesn't override the need for free expression. Software is art, and no one should have the ability to ban a piece of art. However, it makes sense to design systems which take these risks into account.
So by default, Macs are shipped in a "safe mode" that only runs vetted software, and anyone who wants to run less-safe code—and accepts the risk—can type four words into a Terminal window and go along their way.
I really don't think a single Terminal command is too much to ask—it doesn't take long, and it's a good litmus test. If you can't open a Terminal window, you probably don't understand the risks involved in running untrusted code.
† Outside of extreme scenarios involving zero day exploits.
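For concreteness, the "four words" alluded to above map to a single `spctl` invocation on current macOS releases; a hedged sketch, guarded so it degrades gracefully on non-Mac systems (the disable command is shown as a comment rather than run):

```shell
# Sketch of the Gatekeeper escape hatch described above (macOS only).
# "sudo spctl --master-disable" is the four-word override that allows
# running un-vetted software; --status merely reports the current state.
if command -v spctl >/dev/null 2>&1; then
  spctl --status                  # prints "assessments enabled" or "assessments disabled"
  # sudo spctl --master-disable   # opt out of Gatekeeper (shown, not run)
  # sudo spctl --master-enable    # restore the default "safe mode"
else
  echo "spctl unavailable: not running on macOS"
fi
```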
Have you ever used the Google Search Console? Your website is indexed well only if you fix the issues that they find with your site.
I was recently told that my buttons were too close together when on a mobile device....
How much should you have to pay for unlimited bandwidth, hosting, downloads, user ratings, reviews, feedback, crash logs, showing up in searches, the chance to get featured, and other features like CloudKit etc.?
> Apple is demanding what amounts to a non-trivial amount of creative control over the visual appearance of games
Where did you get that from?
How is that different from Steam, Nintendo, Microsoft and Sony not allowing certain types of content on their platforms? For example, you can't show explicit pornography in your game and expect it to be approved anywhere.
As for not allowing "blue screens of death, simulated error, glitch art, brokeness" that TFA complains about:
Have you seen those "YOUR PC IS INFECTED BY 42 VIRUSES!! CLICK HERE TO CLEAN SYSTEM32! PERMANENT HARD DISK DELETION IN 2 SECONDS!!!" ads?
What if apps start doing that for fun then charge in-app purchases to clean the in-game viruses?
Would you want the job of being an arbiter over whether something like that is malicious deception which preys upon user naïveté, or just a cute quirky joke?
As others have already pointed out, I'm talking about the fact that Apple charges $100/year to make your software runnable on a Mac, regardless of whether you want to use their app store.
> Where did you get that from?
From the blog post that we're all supposedly discussing: "The next reviewer saw the “blue screen of death” art and said that it conflicts with copyright (it’s Microsoft) as well as simulated error not being allowed. So I changed that too. After that it got arbitrarily rejected for another reason… so at this point I changed so much about the game that it wasn’t even my work anymore. It didn’t reflect any of the things that were important to my work. Simulated error, glitch art, brokeness, historic UI’s are deeply important to me because they reflect a computer history. None of that was allowed. I had to change my games."
> How is that different from Steam, Nintendo, Microsoft and Sony not allowing certain types of content on their platforms?
Steam is a free service that doesn't prevent me from running non-Steam software on my computer.
As for your other examples, I don't think it's different. I think restrictions on user freedom are made worse, not better, by the fact that lots of companies are engaging in them.
Go work with Purism if you want to ship whatever
What a shock: Apple is also nickel & diming for services as hardware sales atrophy. This is business 101.
If we don’t like how businesses work we should stop working for them.
Paying for beer & leaving speech free never works. We end up talking about supporting paid beer. Free speech becomes all about what we buy & sell.
If you want a free society that doesn't end up in a nickel & dime system of inequality, a test should be: is life free as in speech & beer? Organized around resource distribution, sure, but free of coercion to support paid speech all day (where, to be viable in the eyes of society, work must earn money).
That’s the DNA of our culture. These are not random things. This is what we and Apple talk about all day: financial justification for behavior.
That’s a complete corruption of free speech.
You seem to think that 30% of earnings is not enough for this ... so yeah, how much should you have to pay for that? 50%? 75%? 100%?
The point is there are always going to be limits to what you can release on any platform. Does that count as creative control over developers?
If there truly are no restrictions on what you can publish on Steam, then my initial list of examples is wrong, but that still doesn't make Apple the sole company with restrictions for their platform.
Those are things Apple provides for its users' sake, not for developers. The amount I should have to pay to allow users to run my app on their machines, is and always has been ZERO.
You're in luck!! The amount you have to pay to allow users to run your app on their machines, is still ZERO.
Whether you can make them trust you enough to manually allow your app to run without notarization, is another matter.
It's about money. They want to close down the platform like iOS, as they said a few years ago, in order to bring the mobile business model to macOS by forcing everyone to install everything through the App Store.
Now there is notarization, but it will not last. They're just boiling the proverbial frog slowly, step by step. In 2 or 3 years, it will also disappear.
It's not only the game but a shitload of little utilities that will get lost with notarization.
Once users get used to not having as much third-party software from outside the App Store as before, except big names like Mozilla or Google, they will stop notarization altogether and impose a 30% fee for every piece of software sold on the platform.
With the system partition permanently set as a read-only device, as they are currently testing on Catalina (even if you can still mount it read-write for now), macOS will be locked down like iOS. The only way to install software outside the Mac App Store will be to jailbreak it thanks to a "security flaw".
As I said, it's just about money.
Only technically naive users will stay on the platform, and developers wishing to take advantage of them with software paid for through a subscription model.
You don't have to pay the $100/year to run the apps distributed on your website but you do if you don't want them to get a message that says it's from an unverified developer. That seems to me like the system is working exactly as intended.
Not for me, the end user. Fewer developers releasing applications for my Mac means fewer options for me.
You can still release apps without notarizing them. You just can't get around the warning message that the application hasn't been verified because, well... it won't have been verified. You can still release it and users can still run it.
If I really want to run an app that is not notarized and I trust its developers, I can always manually allow it to run.
I'm one of those developers, and I don't appreciate being strawman'd and dismissed.
Now, granted, that statement can't possibly cover every single reason that someone might have for leaving Apple's platforms but those don't seem relevant when having a discussion about this specific context which is what the article and these comments are about.
The fact is that nothing that's been done changes the ability for people to run software as a developer. The difference is whether the end users trust you directly or whether they expect Apple to verify that trust for them. Both scenarios still allow them to run your software.
If you want to bring more clarity and point out where I'm arguing a straw man, I'd appreciate it.
1. You can run any unsigned or unnotarized software from any developer by right-clicking the app, selecting "Open", and then clicking the "Open" button in the scary warning. It's really simple. I wish people would stop pretending that it's really hard to run unsigned software.
2. Apple does not "curate" or "review" software distributed outside the Mac app store. The notarisation process runs some automated malware checks, and that's it. The goal is to block malware, not to limit the content you put on your computer.
3. Notarisation takes less than ten minutes. It's easy to automate, and you can also do it manually if you want. You can staple the notarisation ticket to the app, but you don't have to; macOS will look up the ticket via a web service if you don't staple it. The documentation sucks, but that's the only bad thing you can say about it.
4. Apple does review stuff you submit to the Mac app store, but fortunately it's entirely optional to submit stuff to the Mac app store, there's nothing stopping you from releasing software outside the app store without any review.
5. Apple did end support for 32bit apps, which sucks, and I don't have anything good to say about that.
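To make points 1 and 3 above concrete, here's a hedged sketch of the command-line equivalents (macOS only; `Example.app` and the `AC_PROFILE` keychain profile are placeholder names, not real artifacts):

```shell
# Per-app alternative to right-click > Open (point 1): downloaded apps carry a
# com.apple.quarantine extended attribute; inspecting or removing it is enough.
APP="/Applications/Example.app"   # hypothetical app bundle path
if [ -d "$APP" ] && command -v xattr >/dev/null 2>&1; then
  xattr -p com.apple.quarantine "$APP" 2>/dev/null || true  # inspect the flag
  # xattr -dr com.apple.quarantine "$APP"  # strip it (shown, not run)
else
  echo "macOS-only example: app bundle or xattr not present"
fi

# Automating notarisation (point 3) with Apple's notarytool, roughly:
#   xcrun notarytool submit Example.zip --keychain-profile "AC_PROFILE" --wait
#   xcrun stapler staple "$APP"   # optional: staple the ticket to the app
```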
It reminds me a lot of the uproar on Windows when shortcuts were indistinguishable to an application from the actual folder. When MS changed this so that the end result was the same but the applications were also aware of whether they followed a shortcut or not, there was a bit of outcry because it broke a few apps that relied on the file system not knowing the difference. In other words, these apps built their entire functionality around an unintended bug and then cried foul when that bug was fixed.
In this case, at least, the author just seems upset because they can't continue to create things that are immediately abandoned afterwards. They've become reliant on being able to constantly create new work.
The complaint sounds too much like someone building a sand castle on the beach and then complaining when the tide has washed it away.
You can build apps and distribute them without a developer account. But macOS will show a scary warning to your users.
That's why I wanted to post my clarifications. I don't think that the article is technically wrong, but it seems that people have not read it closely and just came to weird conclusions.
And then the update after High Sierra happened and killed it outright. And as much as I'd rather boot into macOS for non-gaming items, because my monitor hooks into the eGPU, I boot into Windows for anything aside from Xcode, which is pretty heavy. I could stick to High Sierra, but that isn't exactly an option.
It truly is a shame that we couldn't just be allowed to change a flag or confirm a prompt to maintain business as usual. As such, this may be my last Apple branded PC and will just use it as a build machine in the future whenever I inevitably upgrade.
Maybe Nvidia and Apple will work together on Metal drivers, but quite frankly, it is the edgiest of edge cases, so I imagine the chances of that are minuscule. That isn't to say I don't somewhat understand their reasoning: they basically appear to be scaling back the number of areas they can't safeguard or would have to support (non-Metal drivers and 32-bit, respectively). But at that point it seems like they should provide something users can do to bypass it.
> Isn't this just a matter of opening up the project, changing one line and recompiling? Should take five minutes, it's not really a big deal?

Yes and no. The 64-bit change itself is small, but they change enough other things every few months that recompiling against new versions of the libraries doesn't simply work. You get a few linker errors and have to look up the new names for a couple of functions. Or there's a new element in one of the libraries with the same name as one of my variables, so I have to find+replace to change its name. And then you run it and find that it's in portrait mode, squished into half of the screen, so you have to look up what changes they've made to how screen orientation works and change a few more lines. [...] It adds up.
Sure, you can work around it, but from the developer's perspective their game has just been designated obsolete for no reason that Apple couldn't have fixed themselves.
If I had to do this every time to run an app, I'd throw my PC out of the window. How are you people OK with that?
As a user I say suck it up. Notarization and 1st party QA will make users lives so much easier. I have yet to see people run into serious issues with either that weren't doing things that entitlements/signing were designed to prevent.
The $100/year dev license kinda sucks if you just want to hack around and distribute code, but it also goes a long way to stopping people from creating spam accounts and evading bans. But it's a pretty trivial burden, if you're distributing software professionally and can't cough up $100 in revenue in a year... maybe try your hand at something else or just go the amateur route?
I also don't get complaining about obsoleting 32 bit. Use obsolete software on an obsolete OS in a VM like the rest of us, we all hate supporting things until the end of time.
But how do you even go the amateur route? You still won't be able to distribute what you make.
A big part of my childhood was making games, and sharing them on shareware sites in the mid 90s. It sucks that kids today aren't going to be able to do that. There was no way I could afford $100 as a 10 year old.
I predict this change will cause kids to embrace open source more and distribute their projects in a way that can be shared and reused.
But for now, non-notarized executables don't fail, they just need to be enabled by an admin account through permissions.
macOS might have bugs due to neglect, bad priorities, too-fast deprecations, etc.
But all the extra restrictions (notarization, sandboxing, dropping 32-bit, etc.) are in the right direction for a consumer platform (i.e. not for devs): making computers easier and more secure, like appliances that just do what you bought them for. And they will be part of any/all platforms going forward (including Windows, and new platforms like Fuchsia).
As for Linux, it has existed for 2.5 decades, it has been free since forever, and it's still not adopted by the masses. It's not the direction most people want.
I'm arguing against your comment: "You surely can’t be oblivious to how this has been a process of boiling the frog slowly — almost certainly they will eventually be permanent-blocked like on iOS".
And I'm saying that this is a good thing in some ways (whereas "boiling the frog slowly" doesn't just point to the graduality of the process, but also implies it's bad).
I don't disagree on the graduality or that we will "eventually see it evaporate" (it being various current more open abilities).
*Games I worked on: https://livegame.show/play/ride_v3 | https://livegame.show/play/pewpew_v6 | https://livegame.show/play/bubbleshooter3
I will say though, Apple does seem to almost purposefully hold back the web on iOS. I suppose this makes native iOS apps, which must pay Apple 30% of their revenue, more appealing to users. On iOS, the Firefox and Chrome apps aren't allowed to include their own browser engines. They have to just wrap Safari webviews.
On Android, the Firefox app is allowed to include its own browser engine. An engine that supports adblocking extensions like uBlock Origin.
Just don't support ~~MacOS~~ [e: iOS, my b], this is what I mean by going the "amateur" route. If you don't care about distributing software professionally, then just don't deal with the headaches and costs.
For all the major means of distribution the problem isn't not enough apps, the problem is too much garbage - so the direction of improving end-user experience is more filtering of apps and developers, not easier access to distribution.
Many users agree with Apple.
But those developers are going: "No those users are wrong and should join my mob to make Apple let me do whatever I want to users' systems."
Oh, and they need to authorize it from an admin account. So, yeah. Not sure there are a lot of people out there that dumb. Unless you're some well known downloaded product like Blender or something, I would think you would have a hard time getting people to run your software.
At the same time, to be completely fair, you probably should have a hard time getting people to run your software. That's pretty much exactly how a lot of malware is distributed.
From an effort-to-income perspective, macOS isn't worth it.
Creative Suite stuff, which actual pro creatives use, on the other hand, has been 50-50 Mac/PC sales, even though the Mac has 10% or less of the desktop market.
That's how many creatives are on the platform.
If Krita can't tap into them, it's not the platform that's at fault...
Affinity, for example, does fine, as does Pixelmator...
It may be a wonderful program now, I don't know, but I do remember KDE's initial efforts on the Mac -- and they were... I originally wrote "just awful," but I'll amend that to a weaker "not very good." I haven't actually seen any desktop GUI program whose primary platform is Linux really take off on the Mac, with the possible exception of LibreOffice. We have so many good cheap-to-free alternatives.
In fact I used "webkit" before it was webkit and before it was cool: I used Konqueror as my main browser (I didn't care for it being not compatible with many sites at the time, as I cared mostly for simpler, text-based sites which worked fine).
I remember all the "KDE Office" apps, various drawing programs, etc. They were, and many still are, in a perennial semi-finished state, in a way that Windows and OS X equivalents are not. Krita has done well on that front, and a couple of others, but it took many years of not-great releases. And on Windows/OS X they are non-starters as ports...
... As a user, statements like that – about any platform – honestly just make me want to avoid anything from the developers who make them.
If a developer is going to attack users, the users are better off on Apple's side.
Apple has introduced Catalyst in Xcode -- this translates iOS apps to macOS, compile once and run on two platforms.
The Mac App Store has been notorious for not getting as much love as the iOS one, but this change might reverse the direction, as there will be a lot more apps coming to the Mac App Store now that developers can leverage their iOS dev ecosystem.
I'm pretty sure once macOS has switched to ARM, the system partition will be read-only and you will not be able to install apps outside of the App Store unless you jailbreak thanks to a "security flaw".
This is about whether a 3rd-party application downloaded from a 3rd-party website can be run. It used to be that you only needed to pay to get onto the distribution and advertising channel; now you need to pay not to make your users jump through special hoops on an admin account.
It's not implausible to suggest that in the future, that loophole will be removed as well. Apple does not want you to download apps that they can't personally vet.
The complaint isn't that you need to pay to get into their store. Of course you do, that isn't surprising to anyone. It's about whether distribution channels exist outside of their store.
As to why that's a problem? To quote the author:
> It really upsets me because the iOS App Store policies basically prohibited art or anything interesting (nudity, glitch, error art...temperamental depending on who reviewed you for approval).
> It was hell for me to get things approved, and even when I got them approved I had to change so much about my work it really wasn't my work anymore. A platform holder shouldn't dictate so much about someone else's work.
Apple has been consistently behind the curve when it comes to recognizing the artistic potential of games. On a cultural level, we don't want them to be a gatekeeper for a medium. As a company they aren't responsible enough or smart enough for that job.
Gatekeeper was added in 2012, and required the user to either take steps (right click open, or disable gatekeeper via the Gui or command-line) to run apps not signed by a developer ID, which costs $99/yr.
Eventually Apple removed the option to disable Gatekeeper via the GUI due to developer abuse (e.g. games, including Minecraft, walking children through globally disabling security settings rather than signing their apps).
Notarization doesn't add the requirement of signing, nor does it change that you need a developer ID for $99/yr.
The change is that in each release of Mac OS this process is getting harder to avoid and is starting to include additional requirements. "You need to sign apps" becomes "we need to run an automated review on our servers for every release you make." This matters because a lot of the original criticism around the original Gatekeeper was dismissed via the argument that it was just an optional security feature that was turned on by default. It wasn't like Apple was going to block 3rd-party marketplaces, this just made Mac OS safer for computer illiterate users.
With Catalina, people are starting to suspect Apple does not want unvetted apps on the Mac in any context, and it seems more likely that future macOS versions will continue the trend of having stricter, more burdensome vetting requirements.
It is increasingly harder to take advocates at face value when they say that 5 years from now I'll be able to run a 3rd-party app on my Mac that didn't go through a manual review. Which might sound great to some people, some users love the app store.
As to why I think that could be a problem for creative mediums, see my parent post. It may not matter if Gatekeeper begins to live up to its name, because creative mediums may just migrate off of Mac. But if Mac becomes a platform too big to ignore, or if other platforms like Windows follow suit, then I think creative mediums will suffer.
Why is Apple blackmailing the industry over malware protection instead of making it a core part of the OS experience? I'd rather have malware protection than losing my Esc key or redoing the app window background textures yet again.
What do you offer to earn my permission to allow your app?
You mean like games released last year which require heavy GPU usage? That's neither obsolete, nor a great experience.
One will do. I'm sure this won't take you long. Even if you can cite one, all that will prove is that the developer was so incredibly stupid that they ignored a full decade of warnings about how 32-bit software would be deprecated soon.
How is it not obvious?
As a user, what does $100 a year even accomplish? It is nothing more than a racket.
It is clear from the post that Apple's platforms are hostile towards any sort of digital executable art.
They deprecated 32 bit support almost a decade ago. Software has a lifespan and always has, to run obsolete code you need to jump through hoops to get an obsolete target running. This isn't new.
Why is 8 years self-evidently "long enough"? Banks have code from the 1960s still running, and plenty of people have industrial hardware much older than that. Many of my household appliances are older.
It's only "obsolete" code because Apple has decided it is, it didn't rot.
The only reason "old code" still works in banks is because untold man hours are spent wrapping around it, working around it, and testing it.
A long time ago, some exec saw that some old thing needed updating because the world has changed. There are new regulations, new customers types, new products, new strategies...
A bunch of people get hired. A bright future of next thing is promised. It gets everyone excited and devs start cutting little chunks away off the old thing, making a new thing. Now there are 2 things to maintain, but that's ok, some day new thing will become next thing.
But the exec leaves (probably to another bank) before the next thing became a thing. Another exec comes in, declares the new thing is crap because it doesn't quite get the job done. A next next thing is planned. The previous new thing is now the old new thing. It will be removed, someday. Just not today, because it's _a_ thing now. old new thing will be sunset when next next thing arrives.
Many many years and several execs later, there are many things maintained by a small army of devs and testers. These things work in really strange, archaic, or even stupid ways. Nobody knows how (or worse, why) the entire system works. The most knowledgeable people only know how it _behaves_.
But it's there, still working. And your money is in that thing of things. Many peoples' money are in it.
That's not necessarily a good thing.
That said, I don't agree with such an argument. It seems like seeking out technical excuses for crappy user outcomes. In another thread on this topic I said: in any job there are parts that are fun and others that are not so fun but necessary. They should try to keep customer apps working.
The Mini was the last Mac to make the switch to 64-bit, in "Mid 2007". So a little over 12 years.
For comparison, the Macintosh was on M68K from 1984 to 1994 (10 years), and on PowerPC from 1994 to 2006 (12 years). IOW, the Mac has been on 64-bit x86 for as long as they've been on any CPU architecture ever.
Another fun fact: there was less than 10 years between the last Apple IIe they sold and the first Power Mac G5.
The 32 bit support is actually a bit of a red herring here; the bigger issue for some is Apple dropping support for Carbon, a 32 bit only API made to ease porting of classic MacOS apps from the 80s and 90s to OS X, and itself deprecated in 2012.
I guess Apple could have created a 64-bit version, but it also seems reasonable to expect software developers to have rewritten affected software in that timespan. Carbon was originally a compromise for companies not wanting to learn an entire new OS (NeXTSTEP) and API/libraries when updating their apps for OS X; the fact some apps still need it today is kind of mind-boggling.
Then, I suppose the Win32 API has been stable-ish for far longer.
There is lots of great modern art, but none of it makes older art "obsolete".
If a painting sits in a moldy basement for a decade, don't be surprised at what's happened when someone finally goes to check on it.
In other words it's not about the overhead, it's about the reliance on x86.
It seems kind of silly to block x86 modes on x64 chips when the backwards compatibility needs of Windows pretty much ensure "eternal" x86 modes. It's going to be a while before x86 modes disappear on Intel or AMD chips, and there's evidence to support that, like the Itanium (IA-64) failure, and the fact that Windows on ARM committed to emulating x86 (but not x64) on ARM. x86 is likely going to be a lasting "lingua franca" ISA. Right now I'd almost put more money on x64 being retired before x86, and if anything, from that perspective Apple is more tightly coupled to an ISA than before this choice.
That really depends on what their long-term goal is. If the speculation & rumors about an eventual ARM transition have any merit at all, then sunsetting 32-bit support now can help them smooth that transition, since they'll have a lot less work to do to implement whatever Rosetta-esque dynamic binary translator they would need.
It seems unlikely that Apple could do in secrecy what Microsoft + Qualcomm could not do (very publicly), but I guess all things are possible.
This is kinda where I'm at with this. Surely anyone even remotely garnering an audience could raise this amount annually. It's a decent cost to an individual but to even a humble audience, it's peanuts.
If you can't raise even that, then IMO your game being lost is of little consequence. Not everything farted out by every indie developer is a lost masterpiece, sometimes stuff is lost because it lacks the traction to merit archival.
When I was in high school working on the weekends, that would have been two weeks' wages, which I had other stuff to spend on. And that's from a privileged background; I know plenty of folks whose weekend/summer jobs paid towards rent/bills for their families.
That said, Apple probably doesn't care about the teenagers or indie devs out there who want to make a little game for their platform. However, I'd also say to those folks: if you're strapped for money, don't develop for Apple devices. Go make a doohickey on a Windows or Linux box.
Agree. But this isn't some lost nobody. This person has a Wikipedia Page [https://en.wikipedia.org/wiki/Nathalie_Lawhead], is an established game developer and artist, and lives in California. She's won multiple awards and she owns at least one Mac.
Like come on. The refusal to drop a hundred bucks on something that is apparently so important to her reeks of good old fashioned stubbornness, or at worst, entitlement, but even then, she has a significant audience. Crowdfund it, cough up and stop whining.
Like I don't entirely agree with the $100/year model being the only option from Apple, but we're all paying that particular piper. Why is she so special that she shouldn't be subject to the same?
Also, some people may look up to Lawhead as a role model as a programmer/artist but not have awards and lots of $.
By this logic the cost of anything that you can't own and use ad infinitum is $infinity.
> Also, some people may look up to Lawhead as a role model as a programmer/artist but not have awards and lots of $.
What does that have to do with her ability or willingness to pay?
And those communities succeeded, and produced interesting games/genres, primarily due to the lack of barriers to entry (in the sense that you, e.g., bought Doom & SC for the game, and then realized there was an editor attached that came free).
Annoyingly, they never learned to share their source code and assets, but otherwise.
Today indie gamedev has become more and more a miniaturized version of the AAA industry, and that's a terrible shame.
Apple is not opposed to free games.
> Today indie gamedev has become more and more a miniaturized version of the AAA industry, and that's a terrible shame.
I'm sorry, but this comparison is ludicrously hyperbolic. Firstly, the Mac is hardly the platform for free and open gaming relative to the topic at hand, because the Mac itself has a high barrier to entry, being so damn expensive just for the hardware. Secondly, the total expenditure to publish a game for free -- not to develop it, because ANYONE can develop on a Mac for precisely no money -- is $100 for the dev license.
And, additionally, I see no reason the games couldn't be distributed as source code for free for people to build and run themselves. Is it as elegant as App Store? No, obviously, but that's what the $100 is paying for.
I'm sorry I just don't see why this is a problem. Mac has always been a bit of an exclusive platform, which makes it both of higher quality, but also does indeed introduce barriers to entry. If those barriers are too much for you, then don't do it, you're still spoiled for choice for distribution channels that aren't the Mac App Store.
A lot of teenagers who are learning to program right now are definitely not going to be able to pay $100 a year to distribute toy programs they're making for fun. Heck, a lot of adult developers won't pay that. Apple has just made Linux significantly more appealing for all of these people.
I own a MacBook Air at home (and a Linux desktop). I know this will definitely motivate me to go back to a Linux laptop for my next purchase. This move isn't just hostile to users, it's hostile to developers, hostile to the people creating value for the platform.
For a user, right-click-open is easier than installing Linux.
I've unfortunately come to the same conclusion.
The last time I was forced to use a Mac laptop, I hated every minute of it... not because it wasn't smooth or didn't work well, but because I had fewer and fewer freedoms to manipulate the system.
The only thing that has no built-in off switch is TCC ("this app would like permission XXX"), although without SIP the ability to nullify it with code is theoretically at your disposal. I wish someone would make a MacForge plugin...
The point is a) about Apple's hostility b) what to expect from your average macOS user as a developer (most will have no idea what SIP even is, and then telling them to disable SIP is borderline irresponsible).
Regular software shouldn't run into SIP. It will run into Gatekeeper if the developer doesn't explicitly notarize/sign it, but I don't think it's irresponsible to tell users how to bypass Gatekeeper.
I think we're in a very, very narrow window of time where Apple is not popular enough that dropping them severely limits an app's user reach, but is still trying to push itself as a platform where creative things can happen -- a tool for graphics professionals.
I'm cautiously supportive of this move, even though I use a Mac occasionally.
Then they'll discover the apps which are available for them.
Your app can either be one of them or, unless you're massively popular, people aren't even going to know that your app is missing.
If Krita goes, there's not going to be any shortage of painting apps for Apple devices.
And also obviously, if literally only one app moved off of Mac, nothing would happen.
So what's the threshold? If Blender stopped supporting Mac, would anyone here reconsider using them as a development platform? If Photoshop dropped Mac, would anyone here notice or make a purchasing decision based on that?
Is it significant that Photoshop is being shipped for iOS now -- do we expect that professionals will take the platform more seriously because of that?
The point that you're trying to make and the point I'm trying to make is effectively the same:
Apple platforms have no shortage of creative apps, have never had a shortage, and their library of apps and games is always increasing, not decreasing.
If Krita abandons Mac, do you think it will hurt Krita more, or Apple, or users?
What's more likely: Will people sell off their Mac and buy a PC just to keep using Krita, or would they rather find one of the many other painting apps available for Macs?
Regardless, we're kind of arguing past each other. If Krita abandons Mac and no one else does, then Mac users won't care. If a bunch of indie games abandon Mac, and then Krita does, and then something like Audacity or Blender does, and then it starts to spiral from there where Open Source and hobby developers at large decide this crap just isn't worth putting up with for a project they release for free, then users will care.
It seems your main objection is just that you don't think that's likely. My assertion is, "Apple won't court developers until they start losing apps." Your assertion is, "Apple is not going to start losing apps." I don't see those two claims as being incompatible, and I don't think your assertion is worth arguing about given that we can just wait a year or two and see what happens.
> do you think it will hurt Krita more, or Apple, or users?
My broader assertion here is that we're in a narrow period where most people aren't using Mac computers, even for creative work -- so developers have a lot more freedom to make these decisions. I don't know the stats for Krita, but it wouldn't surprise me if the number of Mac users was small. I know that as a game developer and Open Source developer, I'm not going out of my way to support Mac. It wouldn't be worthwhile.
Other Open Source developers have different approaches and opinions on that front. Some of them are a lot more charitable than me.
Apple has been the underdog in the numbers game since 1980. Macs have always had a smaller percentage of worldwide computer users, compared to Windows, but higher than Linux etc.
But that number has always been increasing. There are over 100 million Mac users, many of them doing creative work. Isn't it the second most-used desktop OS?
If someone is going to casually dismiss even a million people, I don't know what to say to them.
That is, the owner is no longer in control of his machine, Apple is. And if the machine will no longer listen to its owner, can it really be said to be his?
It’s all baby-steps, sure, but you have to draw the line somewhere. And for some people, that line was here.
2. Instead of dropping macOS support, why not charge Mac users? Since you've already paid the developer fee, why not move Krita to the Mac App Store? Tell users that it's a cost of supporting Catalina.
3. You could continue to offer an un-notarized legacy version for free.
Abandoning your Mac users is more likely to just make them move to other apps.
If you feel your userbase is not techie enough to know how to manually run un-notarized apps, they probably won’t be able to switch operating systems just to use your app either.
We're not that far removed from a time when every non-nerd's personal computer was a cesspit of viruses and "addon search bars". It was very, very bad. It'd be much worse now, given how much more sophisticated the malware scene is and how much more connected devices are (always-on Internet wasn't even the norm in the early days of the every-computer-has-some-trojans-and-BonziBuddy era).
I don't know. And that part scares me.
The future is probably 10,000x as many total computing devices, most of which will be fairly locked down and not as hobbyist-programmer friendly as we've been used to but much better & safer for most folks—but also 100x as many tinkering-friendly computers, and much cheaper than they were "back in the day", and 1,000x the encouragement and free or cheap resources to learn to program as we had. My concern level that current trends will make programming anything other than more accessible, generally, to kids than it was back in the good ol' days, approaches zero. Even if the percentage of computing devices that are tinkering-friendly drops.
Apple really needed a compatibility mode similar to how Windows let you simulate old environments. Not always successfully, but at least there was a chance.
That notarization fee/process (I haven't read beyond this story) does look to be a burden beyond what should be reasonable.
Until then, I'm completely in favor of putting all kinds of hurdles between uninformed users and executing random executables downloaded from the internet.
That said, if it required something like a "qualified electronic seal" I think the cost would be substantially more (600 EUR a year?), and Apple might turn around and start charging a notarization fee to recoup their infrastructural costs.
Want to play experimental itch.io games? Open a Terminal and type:
sudo spctl --master-disable
Some people will say this is a bad solution because it makes your computer less secure, but, like, you can't have it both ways. Either you allow experimental software and accept the risks involved, or you don't. Maybe don't play experimental games on the same machine you use for important work.
I don't. As a user, I don't need Apple arbitrarily allowing big players' apps through their approval process while they hold smaller developers to a stricter standard. That will stifle innovation.
If security is the touted excuse, macOS already has sandboxd which can be used with arbitrary apps that aren't in the App Store.
Linux addresses this with Snaps, Flatpaks and AppImages, which all use various layers of containers, kernel namespaces and isolation to provide a sandboxed environment for apps.
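As a rough sketch of what that sandboxing looks like in practice with Flatpak (the app ID below is a placeholder, and this assumes Flatpak is installed):

```shell
# Show the sandbox permissions an app was packaged with
flatpak info --show-permissions org.example.PaintApp

# Tighten them for this user only: revoke home-directory access
flatpak override --user --nofilesystem=home org.example.PaintApp

# Undo the per-user override later if needed
flatpak override --user --reset org.example.PaintApp
```

The key point is that the user, not a central gatekeeper, decides what an app may touch, and can tighten or loosen that per-app at any time.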
And sandboxing is completely orthogonal to the fine-grained revocation that notarization allows for. A sandboxed app could still be malicious: say, a weather app that asks for access to your Contacts ostensibly to show weather at your friends' location, but also uploads all the Contacts info to a malicious tracking service. With notarization, this app could have its notarization revoked once it's discovered.
Apple is rejecting anything by developers that don't pay them $100 a year, stifling competition in the process.
Apple has a history of conveniently rejecting apps if the rejection is in their financial interest.
I think by "user" they meant the average user, who by my estimation is not super technical and mostly sticks to larger apps anyway. Are smaller developers actually being held to a stricter standard than larger developers on notarization, though?
PCs and the WWW are massively bigger mass markets, not walled, and it's not like PC users are geniuses. Unless you mean that Apple specifically caters to even less technical people.
I've never had any issue with macOS software acquired outside the MAS and I've never heard of any non-technical user with a problem either.
Notarization is not about security, it's about the iOSification of the Mac.
Do Trojans not exist at all on Mac? Honest question (I have certainly seen them on PCs; on Linux I worry more about packages).
When Zoom was found to have a serious security issue, Apple stepped in and blocked execution of the older versions of Zoom.
This would not be possible if malware just mutated to avoid detection. For this reason they want to attach a verified developer identity to applications, something backed by an individual's physical address or business records. You pay for this verification, and get a certificate to sign your applications.
New this year, they added a notarization service. This fixes some issues with signatures expiring, but is also built so that Apple scans the application for malware before signing.
The scanning is new, but the developer id requirements have been in place since 2012.
If you distribute an unsigned app, the user will by default not be able to open it. You can set an exception as easily as selecting 'open' from the context menu and then saying you will allow the app to run.
You can also disable both the malware list and gatekeeper in general.
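To make those knobs concrete, here's a sketch of the commands involved (the app path is a placeholder; macOS only):

```shell
# Per-app: the browser tags downloads with a quarantine attribute;
# clearing it makes Gatekeeper skip this app entirely
xattr -d com.apple.quarantine ~/Downloads/MyGame.app

# System-wide: check, disable, or re-enable Gatekeeper's assessment
spctl --status
sudo spctl --master-disable
sudo spctl --master-enable
```

The right-click-open route from the comment above does the same thing as the per-app command, just through the Finder UI.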
Note this is all separate from distributing in the Mac App Store, where you may run into additional policies around requirements for sandboxing, branding, use of private functions and frameworks, etc.
On Linux, package signing is typically direct trust. Your distribution's packagers are trusted by default, and you can manually choose to trust a packager outside the distribution.
I don't think Linux distributions have any mechanism to deal with malware after the fact, however.
I know it's anecdotal, but I doubt there is any objective data out there.
I don't use app stores. Never have. And it's for this reason. I do not favor the control coming. If my computer will not run an executable on command, it really is not my computer. Useless and definitely not something I trust.
Actually I do grab free things off Google play, but I side load as often as I do that.
I think we're gonna see fairly modern systems added to the category of retro computing. It will be a system that just runs programs developed by others. Imagine that!
You don't really - you can use Electron and other frameworks to build it on a different OS, and just package it for macOS.
At work, I do this for in-house command-line tools. I have automated unit tests and integration tests, and all the compilation is done on a CI server. And if a problem slips through automated testing, well, then somebody will ping me on Slack and I'll fix it.
I only need to take out my MacBook Air to debug something once every year or two.
We may actually need to start a conversation at work about whether we want to continue supporting Macs internally. We could notarize our own CLI tools, but we also rely on lots of open source CLI tools, and I understand that all of those will eventually need to be notarized, too.
I can’t test on a mac, as I don’t have the money for one, so I build the binaries with Travis CI automatically, and never test them myself — I also can’t notarize them, obviously.
this is not hypothetical; I literally just received a GH issue update asking if we can use community funds for the $100 ransom^Wfee.
It's too expensive to pay just to distribute games that will probably be refused on the App Store anyway because they're GPL.
Apple is just killing freedom while continuing to take advantage of free software (and free developers): BSD, the Mach kernel, a lot of Unix tools... even on projects started by Apple, like LLVM, Apple takes full advantage of third-party contributions.
A previous build of VLC was pulled from the store because someone claimed they had copyright on some of the code - because they did not want VLC in the store.
That said, Apple's membership-levels page features "software distribution outside the Mac App Store" as a benefit of the paid level: https://developer.apple.com/support/compare-memberships/
I also see users asking for 32-bit Windows versions of games; there are still a lot of people running old computers with bad Internet connections, and there's no good reason not to package a basic game for them if it's possible.
That said, I agree with the article’s author that devs should be free to distribute whatever they want without warnings and without having to update annually. Not every piece of software gets maintained but it can still be useful, especially for artistic works. Let the App Store be the place where Apple certifies safety and leave the rest to the user.
The author of this piece is 100% correct: none of these changes have anything to do with security. It's all about control. Apple, like most companies, just wants to control everything, whether relevant or not. The end goal is to control exactly what software is run on their platforms, as clearly evidenced by the linked articles. I highly doubt that we'll be able to run any non-Apple-approved tools in a year or two.

They are turning OS X into iOS, and have already renamed it macOS to let people know it's just going to be a walled garden of shit. The only reason to ever purchase an Apple computer was OS X. Now that OS X is turning into a shitty version of iOS, that last reason is gone. And that's before we get into the garbage hardware.

I simply don't understand why anyone would want to support this platform going into the future. It's a platform that stands for censorship. Every component on the platform is designed around it. It's no longer a general purpose computing platform. In the future, it will be even less of one. All in the name of "security" and "privacy." Yup, save the children. But plenty of smart people fall for such stupidity.
→» Users CAN run un-notarized apps if they WANT TO. «←
You just have to make us trust you as a developer.
Telling your users to do that (+ any other steps if needed) might take less effort than shaking pitchforks at Apple (though they're certainly deserving of ire in other areas.)
If it's important to you for users to trust your app, then there are ways to do that. One of those ways is paying the $100. There are other ways outside of that, though.
And it should be tough. The hurdle needs to be high enough to frustrate malicious actors getting people to run random executables they downloaded from the Internet.
A developer who doesn't want to put any effort into gaining trust doesn't sound like someone I should allow to run non-sandboxed code on my machine anyway.
To reiterate: You don't have to notarize, if your users trust you enough to manually allow your app to run.
I recently was looking to get a new machine for music production and ended up building a Ryzen PC with Windows 10. It cost me the same as an i7 Mini but it is many times more powerful, expandable, and silent. Thank god I moved away from Logic years ago.
Windows is not as pleasant to use as macOS, but it's just as stable. A lot of software seems to run better there (eg: Firefox, Chrome).
For dev work I will keep using my iMac and 2014 MBP with Mojave, at least for the foreseeable future. Unless something dramatic happens at Apple I will most likely end up moving to Windows or Linux in a couple of years when these machines die.
Would I see any issue here? I suspect everything I write or "install" via curl/bash/etc will be fine. Eg, if it's a downloaded binary in my $PATH, I expect it to continue to work.
So if that's the case.. why is Apple being so harsh about all of this? I feel like malware will just move to curl scripts and all Apple succeeded at was making normal Apps more difficult to develop.
I imagine some might argue that users will be somewhat informed, and will know not to run stuff in a Terminal. Cool.... but, how many times have we seen non-technical folks run stuff in the Windows command prompt? They can be walked through a malware installation process a thousand times over.
So is Apple going to lock the entire platform down? Are they going to limit my ability to run binaries? My own and others? Because if they don't, this all feels pointless. And if they do, I'll _be forced_ to switch to Linux.
I don't like you these days Apple. What is wrong with you.
The goal is to prevent users from running untrusted applications or mistrusted applications like trojans. If you can convince the user to click through a security warning or run commands from the terminal, then the user themselves have taken responsibility for evaluating trust.
The current system does depend on the user having appropriate 'spidey sense' to what the developer is asking them to do - the yardstick is a certain level of informed consent. If malware driving the user through terminal prompts becomes a significant user problem, that may unfortunately lead to the system being locked down further.
FWIW, other more serious changes (such as disabling SIP) go beyond a terminal command or requiring a password, to requiring you to reboot onto the rescue partition. Here, they are requiring a much higher degree of informed user consent.
I believe Catalina will also do periodic checks to make sure the binary hasn't been modified.
If you don't distribute software off of your own machine, you do not have to think about signing at all.
The rust installation manager and cargo do not set the quarantine attribute, so there is nothing for quarantine to check when binaries are run.
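That matches how the quarantine mechanism works: it's the downloading application (e.g. a browser) that sets the extended attribute, so anything fetched with curl never gets flagged. A sketch (the URL and filename are placeholders):

```shell
# Fetch a binary with curl -- no quarantine xattr is attached
curl -LO https://example.com/sometool

# Listing extended attributes shows no com.apple.quarantine entry
xattr -l sometool

# So it runs without any Gatekeeper prompt
chmod +x sometool && ./sometool
```

Compare that with dragging the same file out of Safari or Chrome, which do opt in to setting the attribute on what they download.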
However, the evolution of platform protection slowed down as it became clear this was an amazing revenue generator. Incentives became blurred. A new type of malware showed up aimed purely at collecting your data, and our protections haven’t kept up, because that’s no longer the goal.
At this point, Apple does this by being a nearly completely consumer focused company. If you aren't the average consumer, they are going to toss you to the curb someday.
But it's $99/year to be a part of the Apple Developer program which covers all platforms.
> ”Apple’s vision involves us constantly updating work, constantly adding to our games, constantly paying to exist here, even when some of this stuff is done. Often when a game is done, it’s done. Games aren’t a service. It’s like asking for a director to keep updating a movie, or for a musician to keep changing their song so it can keep running.
Decisions like this erase our history.”
On the contrary, you chose an ephemeral medium for your art. That choice, like making sandcastles, has consequences.
I don’t understand how someone can’t afford a $100/year expense on something related to their occupation.
However, the evolution of platform protection slowed down as it became clear this was an amazing revenue generator. Incentives became blurred, and it’s why you have so much malware on the App Store masquerading as casual games.
* once notarized, always notarized. You don't have to continue to pay for old apps to work
* You can notarize old apps, same as new apps
* It is very easy: one command-line invocation, and that's that. There is also a GUI wrapper for it that our users like. But in the end, it is very easy to do.
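For reference, a sketch of what that command-line flow looks like (the signing identity, keychain profile name, and paths are placeholders, and this assumes a paid Developer ID plus Xcode's command-line tools; Apple's tooling has also changed over the years, with altool giving way to notarytool):

```shell
# Sign with a hardened runtime using your Developer ID certificate
codesign --sign "Developer ID Application: Example Dev (TEAMID123)" \
         --timestamp --options runtime MyGame.app

# Zip the app and submit it for notarization, waiting for the verdict
ditto -c -k --keepParent MyGame.app MyGame.zip
xcrun notarytool submit MyGame.zip --keychain-profile "notary-profile" --wait

# Staple the ticket so the app verifies offline
xcrun stapler staple MyGame.app
```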
Apple: Censors and controls everything you do on your computer.
Linux: Neither of the above. It's your computer.
The choice seems clear to me.
It strikes me that as a developer if you can't afford $100 and just want to make games, why not make them in the safety of the browser sandbox?