The future of my games on Apple and what this means for art games (nathalielawhead.com)
211 points by tomgp 24 days ago | 259 comments



We've heard the warnings about the rise of "proprietary computing", where everything you run on a computer must be approved by the policies of hardware and software manufacturers.

I feel it's getting worse as we get pushed more and more to vertically integrated platforms (Pixel, iPhone, Surface, game consoles, Kindle+Fire etc). We still have Linux, thankfully - at least as long as hardware manufacturers let it run on their computers.

https://boingboing.net/2011/12/27/the-coming-war-on-general-...


> at least as long as hardware manufacturers let it run on their computers.

Linux is thankfully totally emancipated from this now. There is no longer an exclusive x86 hegemony over desktop computing - it's fringe for now, but ARM, Power, and RISC-V exist as alternative options, with currently purchasable hardware that can run several Linux distros today.

If Intel or AMD did anything to lock their consumer offerings exclusively to Windows, there is an escape hatch. And it doesn't really align with their business interests to help Microsoft do that - they just want to sell hardware, and both already employ substantial development teams to support their hardware under Linux. AMD got a bunch of money from me this year upgrading to their latest platforms, all for their ongoing commitment to Linux support.


Pixels are among the few phones with custom firmware loading explicitly enabled and exposed by Google. Sideloading apps is also one switch away. It could be even better, but I really don't think it belongs in the same category as the iPhone and game consoles.


Are we supposed to pretend it’s something other than vertical integration because you happen to prefer their approach?


I don't understand why you'd pretend anything.

You've got 3 areas:

1. Producers actively preventing you from running custom software (iOS, TVs, game consoles, some Android systems, ...).

2. Platforms providing default software and leaving an official way to replace it (pixels and a few other androids, ...).

3. Producers who mostly don't care what you run. (PCs)

It matters to me that things are as close to the 3rd category as possible. But let's not lump 1 and 2 together - there's a huge difference.


What does any of that have to do with vertical integration?


In a way, it's fortunate that hardware performance improvements have all but ground to a halt; according to online benchmarks, the CPU in my nearly 7-year-old (3rd-gen Intel M-series) ThinkPad is only something like 50% slower than the most recent comparable models. (This is partly because after the 3rd gen, they all switched to U-series CPUs, which were significantly slower.) This way, even if approved-OS-only hardware suddenly becomes the universal norm in the same way that chiclet keyboards did around then, I imagine I will still be set for a long time.

(Just imagine using a computer from 1993 in 2000, for comparison...)


On the Mac, the oldest computers supported by High Sierra and Catalina (the only supported OSes) are the 2012 vintage (plus older Mac Pros). Older devices (expensive, premium devices!), which run fine and have specs matching today's cheapest new computers, cannot run a supported macOS.

There are no longer any supported 17" Mac laptops.


> We still have Linux, thankfully - at least as long as hardware manufacturers let it run on their computers.

As a ThinkPad-using Linux user, I'm looking at all these other people and wondering wth they're up to on their increasingly leased computer platforms.

Whatever that is, I want nothing of it.


A lot of people here are fixating on the $100/year fee -- and don't get me wrong, that's a problem -- but ignoring another significant part of the blog post: the fact that Apple is demanding what amounts to a non-trivial amount of creative control over the visual appearance of games that can be distributed on macOS/iOS. Just like the other restrictions on user freedom, this is a ratchet that will only get tighter as time goes on.

What would happen to indie filmmaking if every TV and movie projector threw up a scary warning screen, requiring a cumbersome manual override, before playing anything that wasn't certified by the MPAA?

If you develop webapps, would you be OK with not being able to get a trusted SSL certificate unless an anonymous reviewer at Mozilla was satisfied with your UI?


>Apple is demanding what amounts to a non-trivial amount of creative control over the visual appearance of games that can be distributed on macOS/iOS

Apple is intolerably paternalistic in telling people how to use their hardware, and it is continually puzzling to me how much positive press it still gets, particularly in communities like HN, which have the 'hacker' part right in the name.

I'm not the biggest fan of Windows, but at least Windows has, for the most part, left me alone and let me install software on my machine; same for Linux, obviously. Not even Google and Android tell me what APKs to run on my phone, beyond giving me a warning.


Apple lets you run your own applications; you just have to right-click the program and then select Open. A warning will pop up, but you can dismiss it and run the app.

It's not super intuitive, but it isn't hard for power users, and normal users generally shouldn't just be running unsigned apps anyway.


> Apple lets you run your own applications you just have to right click the program and then select run.

The iPhone and Apple's slow and steady OS convergence would like to have a word with you.


That doesn't work for iOS devices. Regular Joe User can't really use an app that wasn't approved by Apple. Jailbreaking is far too difficult for any non-techie person and deployment with TestFlight is far too limited in scale, and also could be easily blocked by Apple.


    I'm not the biggest fan of windows but at least windows
    has, for the most part, left me alone and let me install
    software on my machine
The problem is that it's equally fond of letting everybody else install software on your machine. See also all the BonziBuddies every tech has had to put down in their relatives' browsers over the years...


I'm a little confused. Do they exert creative control over things that are submitted for notarization? I was under the impression that only apps submitted for the App store are "reviewed", while notarization is a fully automated process that only does a basic malware check. I do think that charging $100 for access to notarization is preposterous, but it's not the same as having to submit to App Store guidelines.


> Do they exert creative control over things that are submitted for notarization?

No, they don't, but the article covers two different topics: notarizing apps for macOS desktop (for which you have to pay the developer fee, where previously there was no need to), and distributing apps through the iOS app store (for which you have to pay the developer fee and lose creative control). Presumably the app store is relevant because there is no other way to distribute apps for iOS.


But notarizing apps for macOS is free?


Not according to Apple.

If you try to notarize an app, you chase links for a while until you get to https://help.apple.com/xcode/mac/current/#/devac02c5ab8, and then to get a Developer ID, you chase links for a while until you get to https://developer.apple.com/support/purchase-activation/

If it's free, it's pretty well hidden, like US tax prep companies' legally mandated "Free File".


Huh. When they talked about notarization at WWDC it sounded like it'd be available as a free service for anyone, but I might be misremembering.


To get a Developer ID, I believe you sign a contract with Apple.


Who knows what Apple will decide next year. Despite announcing that Apple computers are switching to ARM in 2021 or 2022, making Intel apps obsolete in the near future, they announced they would stop supporting 32-bit apps... Every other OS platform still supports 32-bit applications: Windows, BSD, Linux. It's no big deal to run a 32-bit application on a 64-bit platform.

But there is still a slight overhead in work compared to a 64-bit-only platform. So Apple, one of the richest companies in the world, decided to cut the cost, once again.

I came to believe that every decision Apple makes regarding its products, hardware and software, is about cutting costs at every corner and maximizing profit.

Switch to in-house ARM processors from standard Intel? No need to pay Intel anymore...

These new keyboards? Well, maybe it's about a slim design, but having used them for 3 years, having had to remove some of the keys occasionally, and comparing them with those of the 2015 MacBook Pro I bought on eBay 2 months ago (I had to convert the QWERTY keyboard to an AZERTY keyboard by switching some of the keys), I can say the material quality and manufacturing cost of these 2 keyboards are not the same.


This sounds much like the hand-wringing over the PPC to x86 shift. I had just acquired a 17" MacBook back then too. Color me pissed... but that's Apple for ya (insert-shrug-emoji)


Notarization implies permission.

Additionally, don't you have to sign a contract as a developer to use the notarization service?


Notarization is automated and similar in principle to something like Let's Encrypt. It could serve as a backing off from the push towards sandboxing and app review.

Today it is a layer on top. But my hope is that Apple makes its peace with non-App-Store distribution, with automated notarization and a waiver of the $100 fee for open source, students, etc.

Mac OS X was such a joyous platform to learn to program on. But macOS is throwing up more and more obstacles to native apps. When you squeeze a balloon, the air just goes somewhere else.


An indie movie can't install ransomware on your TV. An eBook can't corrupt your Kindle to spy on you.†

That doesn't override the need for free expression. Software is art, and no one should have the ability to ban a piece of art. However, it makes sense to design systems which take these risks into account.

So by default, Macs are shipped in a "safe mode" that only runs vetted software, and anyone who wants to run less-safe code—and accepts the risk—can type four words into a Terminal window and go along their way.

I really don't think a single Terminal command is too much to ask—it doesn't take long, and it's a good litmus test. If you can't open a Terminal window, you probably don't understand the risks involved in running untrusted code.
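
For the curious, the command in question is presumably Gatekeeper's global override (an assumption on my part; the parent doesn't name it, but the command itself is real):

    # Allow apps from anywhere (re-adds "Anywhere" to the
    # Security & Privacy preference pane on older macOS versions):
    sudo spctl --master-disable

    # Restore the default Gatekeeper policy:
    sudo spctl --master-enable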

-----

† Outside of extreme scenarios involving zero day exploits.


>If you develop webapps, would you be OK with not being able to get a trusted SSL certificate unless an anonymous reviewer at Mozilla was satisfied with your UI?

Have you ever used the Google Search Console? Your website is indexed well only if you fix the issues that they find with your site.

I was recently told that my buttons were too close together when on a mobile device....


> A lot of people here are fixating on the $100/year fee -- and don't get me wrong, that's a problem

How much should you have to pay for unlimited bandwidth, hosting, downloads, user ratings, reviews, feedback, crash logs, showing up in searches, the chance to get featured, and other features like CloudKit etc.?

> Apple is demanding what amounts to a non-trivial amount of creative control over the visual appearance of games

Where did you get that from?

How is that different from Steam, Nintendo, Microsoft and Sony not allowing certain types of content on their platforms? For example, you can't show explicit pornography in your game and expect it to be approved anywhere.

As for not allowing "blue screens of death, simulated error, glitch art, brokeness" that TFA complains about:

Have you seen those "YOUR PC IS INFECTED BY 42 VIRUSES!! CLICK HERE TO CLEAN SYSTEM32! PERMANENT HARD DISK DELETION IN 2 SECONDS!!!" ads?

What if apps start doing that for fun then charge in-app purchases to clean the in-game viruses?

Would you want the job of being an arbiter over whether something like that is malicious deception which preys upon user naïveté, or just a cute quirky joke?


> How much should you have to pay for unlimited bandwidth, hosting, downloads, user ratings, reviews, feedback, crash logs, showing up in searches, the chance to get featured, and other features like CloudKit etc.?

As others have already pointed out, I'm talking about the fact that Apple charges $100/year to make your software runnable on a Mac, regardless of whether you want to use their app store.

> Where did you get that from?

From the blog post that we're all supposedly discussing: "The next reviewer saw the “blue screen of death” art and said that it conflicts with copyright (it’s Microsoft) as well as simulated error not being allowed. So I changed that too. After that it got arbitrarily rejected for another reason… so at this point I changed so much about the game that it wasn’t even my work anymore. It didn’t reflect any of the things that were important to my work. Simulated error, glitch art, brokeness, historic UI’s are deeply important to me because they reflect a computer history. None of that was allowed. I had to change my games."

> How is that different from Steam, Nintendo, Microsoft and Sony not allowing certain types of content on their platforms?

Steam is a free service that doesn't prevent me from running non-Steam software on my computer.

As for your other examples, I don't think it's different. I think restrictions on user freedom are made worse, not better, by the fact that lots of companies are engaging in them.


Look, the point stands: it's a non-free (as in speech and beer) platform.

Go work with Purism if you want to ship whatever

What a shock that Apple is also nickel-and-diming for services as hardware sales atrophy. This is business 101.

If we don’t like how businesses work we should stop working for them.

Paying for beer & leaving speech free never works. We end up talking about supporting paid beer. Free speech becomes all about what we buy & sell.

If you want a free society that doesn't end up in a nickel-and-dime system of inequality, the test should be: is life free as in speech and beer? Organized around resource distribution, sure, but free of coercion to support paid speech all day (where, to be viable in the eyes of society, work must earn money).

That’s the DNA of our culture. These are not random things. This is what we and Apple talk about all day: financial justification for behavior.

That’s a complete corruption of free speech.


> How much should you have to pay for unlimited bandwidth, hosting, downloads, user ratings, reviews, feedback, crash logs, showing up in searches, the chance to get featured, and other features like CloudKit etc.?

You seem to think that 30% of earnings is not enough for this ... so yeah, how much should you have to pay for that? 50%? 75%? 100%?


This is not about the macOS store, but about notarizing off-store apps.


But that doesn't include any 'creative control over the visual appearance of games'.


FYI, Steam doesn't even have content restrictions on the marketplace now. There's a ton of adult games on there. So not the best point.


Yes, Steam allows lewds, but - and I'm not trying to move the goalposts, just reiterating the core point - can you get away with something that hatefully targets a racial or social group, etc.?

The point is there are always going to be limits to what you can release on any platform. Does that count as creative control over developers?

If there truly are no restrictions on what you can publish on Steam, then my initial list of examples is wrong, but that still doesn't make Apple the sole company with restrictions for their platform.


> How much should you have to pay for unlimited bandwidth, hosting, downloads, user ratings, reviews, feedback, crash logs, showing up in searches, the chance to get featured, and other features like CloudKit etc.?

Those are things Apple provides for its users' sake, not for developers. The amount I should have to pay to allow users to run my app on their machines, is and always has been ZERO.


> The amount I should have to pay to allow users to run my app on their machines, is and always has been ZERO.

You're in luck!! The amount you have to pay to allow users to run your app on their machines, is still ZERO.

Whether you can make them trust you enough to manually allow your app to run without notarization, is another matter.


It's not about trust. Apps ran on the Mac for 40 years or so without any malware epidemic like the one Windows suffered.

It's about money. They want to lock down the platform like iOS, as they said a few years ago, in order to bring the mobile platform's business model to macOS by forcing users to install everything through the App Store.

Now there is notarization, but it will not last. They're just boiling the proverbial frog slowly, step by step. In 2 or 3 years, it will also disappear.

It's not only games but a shitload of little utilities that will get lost with notarization.

Once users get used to not having as much third-party software from outside the App Store as before, except big names like Mozilla or Google, they will stop notarization altogether and impose a 30% fee on every piece of software sold for the platform.

With the system partition permanently set as a read-only device, as they are currently testing on Catalina (even if you can still mount it read-write for now), macOS will be locked down like iOS. The only way to install software outside the Mac App Store will be to jailbreak, thanks to some "security flaw".

As I said, it's just about money.

Only technically naive users will stay on the platform, along with developers wishing to take advantage of them with software sold through subscription models.


They are not even talking about the App Store. The change is that you now need to pay $100/year even if you distribute outside the App Store.


If I want to distribute my app on my own website, I need Apple's approval and to give them $100 every year to do so.


Only if it's important to you and your users that they aren't told that no one has vetted the app except you and them.

You don't have to pay the $100/year for users to run the apps distributed on your website, but you do if you don't want them to get a message saying the app is from an unverified developer. That seems to me like the system is working exactly as intended.


> That seems to me like the system is working exactly as intended.

Not for me, the end user. Fewer developers releasing applications for my Mac means fewer options for me.


That's not what's happening, though. The same number of developers can release their applications. The only difference would be that some applications will say that they come from a source that's known to distribute reliable software.

You can still release apps without notarizing them. You just can't get around the warning message that the application hasn't been verified because, well... it won't have been verified. You can still release it and users can still run it.


This thread is full of developers considering or actively abandoning macOS as a platform. That means fewer options for end users like me.


This thread is also full of people arguing with them. Meanwhile for end users like me, I see only benefits if Apple keeps the bar high.

If I really want to run an app that is not notarized and I trust its developers, I can always manually allow it to run.


I would not develop for mac with this policy.


With respect, then don't. I don't see why your unwillingness to establish trust with your end users is an issue. There are multiple ways you can do that and you don't seem to want to. If that's the case, I don't want to use your apps anyways.


Mostly because they're imagining what the worst-case scenario in the future is. They're looking at it completely from their own perspective, which is admittedly more challenging for them, while ignoring that it makes the experience for their users 1000x easier, better, and safer.


> Mostly because they're imagining what the worst-case scenario in the future is.

I'm one of those developers, and I don't appreciate being strawman'd and dismissed.


I'm doing neither of those things. Unless you have some kind of evidence that shows that Apple is going to prevent software from running if it's not notarized, you're imagining the worst-case scenario and reacting to that.

Now, granted, that statement can't possibly cover every single reason someone might have for leaving Apple's platforms, but those don't seem relevant to this specific context, which is what the article and these comments are about.

The fact is that nothing that's been done changes the ability for people to run software as a developer. The difference is whether the end users trust you directly or whether they expect Apple to verify that trust for them. Both scenarios still allow them to run your software.

If you want to bring more clarity and point out where I'm arguing a straw man, I'd appreciate it.


This article mixes up a lot of issues and is not very coherent. Here are some important clarifications:

1. You can run any unsigned or unnotarized software from any developer by right-clicking the app, selecting "Open", and then clicking the "Open" button in the scary warning. It's really simple. I wish people would stop pretending that it's really hard to run unsigned software.

2. Apple does not "curate" or "review" software distributed outside the Mac app store. The notarisation process runs some automated malware checks, and that's it. The goal is to block malware, not to limit the content you put on your computer.

3. Notarisation takes less than ten minutes. It's easy to automate (a minimal sketch follows this list), and you can also do it manually if you want. You can staple the notarisation ticket to the app, but you don't have to; macOS will look up the ticket via a web service if you don't staple it. The documentation sucks, but that's the only bad thing you can say about it.

4. Apple does review stuff you submit to the Mac app store, but fortunately submitting to the Mac app store is entirely optional; there's nothing stopping you from releasing software outside the app store without any review.

5. Apple did end support for 32bit apps, which sucks, and I don't have anything good to say about that.
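
For point 3, a minimal sketch of the automated flow using the tools that ship with Xcode (altool and stapler); the bundle ID, Apple ID, and keychain item here are hypothetical placeholders:

    # Zip the signed app and submit it to Apple's notarization service.
    ditto -c -k --keepParent MyApp.app MyApp.zip
    xcrun altool --notarize-app \
      --primary-bundle-id "com.example.myapp" \
      --username "dev@example.com" \
      --password "@keychain:AC_PASSWORD" \
      --file MyApp.zip

    # Once the service reports success (poll with
    # `xcrun altool --notarization-info <RequestUUID> ...`), optionally
    # staple the ticket so offline Macs don't need the web lookup:
    xcrun stapler staple MyApp.app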


I would also argue that the "quirkiness" of what the OP is attempting, which makes it such a difficult edge case, is unnecessary too. Take their simplest example: two executables that generate files the user has to manually share between two folders. That is still completely doable if the author just simplified the execution. If it's really just a game, and nothing more, they should be able to make it one executable that opens its own sandbox containing two fake executables that do whatever they need to do to make the game work. There's literally no reason, other than gimmick and novelty, for this game to run as two separate executables that actually have access to your Documents folder (or whatever folder on the drive the "game" happens under).

It reminds me a lot of the uproar on Windows when shortcuts were indistinguishable from the actual folder to an application. When MS changed this so that the end result was the same but applications were also aware of whether they had followed a shortcut, there was a bit of an outcry because it broke a few apps that relied on the file system not knowing the difference. In other words, these apps built their entire functionality around an unintended bug and then cried foul when that bug was fixed.

In this case, at least, the author just seems upset because they can't continue to create things that are immediately abandoned afterwards. They've become reliant on being able to constantly create new work.


Also, even keeping the 'separate folders with generated files' gimmick, it could do exactly that without permissions checks by putting the stuff in `$HOME/Library/Application Support/AppNameGoesHere` and opening the folders under there for the user.
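
As a sketch of that (keeping the original's hypothetical "AppNameGoesHere" name; Application Support is the conventional location for app-generated files):

    # Create the two gimmick folders somewhere the app may write freely...
    mkdir -p "$HOME/Library/Application Support/AppNameGoesHere/folder-a"
    mkdir -p "$HOME/Library/Application Support/AppNameGoesHere/folder-b"

    # ...and reveal them in Finder so the player can shuttle files around.
    open "$HOME/Library/Application Support/AppNameGoesHere"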


Exactly.

The complaint sounds too much like someone building a sand castle on the beach and then complaining when the tide has washed it away.


You do need to get a paid developer account, though, and Apple has announced that in the next version of macOS, bypassing notarization is going to get more difficult.


You only need a paid account if you want to sign and notarize your app.

You can build apps and distribute them without a developer account. But macOS will show a scary warning to your users.


Where was this said?


The article addresses your first item directly and doesn't actually say what your second one claims it says. It's not clear to me how three and four are responsive to the article. You might be able to clarify your clarifications if you give it a closer re-read.


While the article may technically address these points, it does so ineffectively, judging by what people post here. At the time I posted my comment, all the top comments here on HN were complaining about how Apple is limiting artistic expression or how hardware makers decide which software we can run.

That's why I wanted to post my clarifications. I don't think that the article is technically wrong, but it seems that people have not read it closely and just came to weird conclusions.


Maybe you should have responded to those comments? Your 'clarifications' of the article are really inaccurate and end up misrepresenting it. You can still edit your comment to correct that.


All good responses. If I had to add one gripe about Apple's direction (besides the 32-bit kill-off), I'd add the Metal-only drivers post-High Sierra. I bought my MacBook Pro specifically for eGPU use, and my preferred GPUs are from Nvidia (using a GTX 1080). There was some degree of finagling to get it to work, but it could be done.

And then the update after High Sierra happened and killed it outright. As much as I'd rather boot into macOS for non-gaming items, because my monitor hooks into the eGPU, I boot into Windows for anything besides Xcode, which is pretty heavy. I could stick with High Sierra, but that isn't exactly an option.

It truly is a shame that we couldn't just be allowed to change a flag or confirm a prompt to maintain business as usual. As such, this may be my last Apple-branded PC, and I will just use it as a build machine in the future whenever I inevitably upgrade.

Maybe Nvidia and Apple will work together on Metal drivers, but quite frankly, it is the edgiest of edge cases, so I imagine the chances of that are minuscule. That isn't to say I don't somewhat understand their reasoning: they basically appear to be scaling back the areas they can't safeguard or don't want to support (non-Metal drivers and 32-bit, respectively). But at that point, it seems like they should provide something that lets users bypass it.


Regarding the 32-bit thing, wouldn't it be in any game developer's interest to rebuild the game for newer 64-bit systems? I would want my game to be amenable to historical preservation, which is the reason I write my own engines in lower-level languages.


If they can, certainly. Assuming the developer still exists, and they still have the source code, some idea of the build environment, and they want their game to still be accessible. And even then:

https://mightyvision.blogspot.com/2017/04/868-hack-update.ht...

> Isn't this just a matter of opening up the project, changing one line and recompiling? Should take five minutes, it's not really a big deal? Yes and no. The 64-bit change itself is small but they change enough other things every few months that recompiling against new versions of the libraries doesn't simply work. You get a few linker errors and have to look up the new names for a couple of functions. Or there's a new element in one of the libraries with the same name as one of my variables so I have to find+replace to change its name. And then you run it and find that it's in portrait mode, squished into half of the screen, so you have to look up what changes they've made to how screen orientation works and change a few more lines. [...] It adds up.

Sure, you can work around it, but from the developer's perspective their game has just been designated obsolete for no reason that Apple couldn't have fixed themselves.


>You can run any unsigned or unnotarized software from any developer by right-clicking the app, selecting "Open", and then click the "open" button in the scary warning. It's really simple.

If I had to do this every time to run an app, I'd throw my PC out of the window. How are you people OK with that?


You only need to do that the first time you launch an app.


As a developer mainly on macOS, I feel the author's pain. There are also some legitimate gripes about Apple locking down their platform in a way that is ultimately hostile to users down the road.

As a user, I say suck it up. Notarization and first-party QA will make users' lives so much easier. I have yet to see people run into serious issues with either unless they were doing things that entitlements/signing were designed to prevent.

The $100/year dev license kinda sucks if you just want to hack around and distribute code, but it also goes a long way towards stopping people from creating spam accounts and evading bans. And it's a pretty trivial burden: if you're distributing software professionally and can't cough up $100 in revenue in a year... maybe try your hand at something else or just go the amateur route?

I also don't get complaining about obsoleting 32 bit. Use obsolete software on an obsolete OS in a VM like the rest of us, we all hate supporting things until the end of time.


> just go the amateur route?

But how do you even go the amateur route? You still won't be able to distribute what you make.

A big part of my childhood was making games, and sharing them on shareware sites in the mid 90s. It sucks that kids today aren't going to be able to do that. There was no way I could afford $100 as a 10 year old.


The amateurs will have to start distributing source. Back when I was a kid making GameMaker games on the Mac in the late 90s, I would sometimes collaborate with other developers, and we would just send the source files around, which were completely usable. But of course we wanted to distribute executables as if we were legitimate companies, even though our users were typically other kid developers. We had fake company names and everything; if we had been emulating open-source projects instead of cloistered corporations, we could've gotten more done together.

I predict this change will cause kids to embrace open source more and distribute their projects in a way that can be shared and reused.


Well you could write your game in JS and deliver it over the web. Flash games were all the rage when I was a kid.

But for now, non-notarized executables don't fail, they just need to be enabled by an admin account through permissions.


You surely can’t be oblivious to how this has been a process of boiling the frog slowly — almost certainly they will eventually be permanent-blocked like on iOS.


You surely can't be oblivious that, user-wise (security, ease of use), this is a process of slow improvement...

macOS might have bugs due to neglect, bad priorities, too-fast deprecations, etc.

But all the extra restrictions (notarization, sandboxing, dropping 32-bit, etc.) are steps in the right direction for a consumer platform (i.e. not for devs): making computers easier and more secure, like appliances that just do what you bought them for. And they will be part of any and all platforms going forward (including Windows, and new platforms like Fuchsia).

As for Linux, it has existed for 2.5 decades, it has been free since forever, and it's still not adopted by the masses. It's not in the direction most people want.


I'm not sure who you're arguing with, but the poster I was replying to noted the status quo is that you can run unnotarized executables. To your point, that should not be taken to mean that we won't eventually see it evaporate. On the contrary, we should assume it will.


>I'm not sure who you're arguing with

I'm arguing against your comment: "You surely can’t be oblivious to how this has been a process of boiling the frog slowly — almost certainly they will eventually be permanent-blocked like on iOS".

And I'm saying that this is a good thing in some ways (whereas "boiling the frog slowly" doesn't just point to the graduality of the process, but also implies it's bad).

I don't disagree on the graduality or that we will "eventually see it evaporate" (it being various current more open abilities).


To kill the analogy - developers are unambiguously the frog. You're arguing about whom we should care more about, the frog or those having frog for dinner. That has no bearing on whether the frog being boiled is a bad thing, but it certainly is from the vantage point of the frog ;)


WebGL isn't fully supported on iOS, so there goes your nice little web game.


Last year I was building web games* with Phaser. To better support iOS, I just changed the renderer to `canvas` when an iOS user agent was detected. In the end, the games worked pretty well on iOS devices.

*Games I worked on: https://livegame.show/play/ride_v3 | https://livegame.show/play/pewpew_v6 | https://livegame.show/play/bubbleshooter3
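
The renderer switch amounts to a few lines; a minimal sketch, assuming Phaser 3 (the dimensions and scene are placeholders):

    // Crude user-agent sniff for iOS devices.
    const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);

    const game = new Phaser.Game({
      // Fall back to the canvas renderer on iOS; elsewhere let
      // Phaser pick WebGL if it's available.
      type: isIOS ? Phaser.CANVAS : Phaser.AUTO,
      width: 800,
      height: 600,
      scene: {
        create() { /* game setup goes here */ },
      },
    });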

I will say, though, that Apple does seem to almost purposefully hold back the web on iOS. I suppose this makes native iOS apps, which must pay Apple 30% of their revenue, more appealing to users. On iOS, the Firefox and Chrome apps aren't allowed to include their own browser engines; they have to just wrap Safari webviews.

On Android, the Firefox app is allowed to include its own browser engine - an engine that supports ad-blocking extensions like uBlock Origin.


I think this is an issue with Phaser. I've also made a bunch of web games with various frameworks, and WebGL works fine on iOS in many other contexts. The JS port of cocos2d is WebGL-compatible on iOS.


If you're an amateur and can't afford $100 dev license, why are you targeting a premium mobile platform that you'd need to test on?

Just don't support ~~MacOS~~ [e: iOS, my b], this is what I mean by going the "amateur" route. If you don't care about distributing software professionally, then just don't deal with the headaches and costs.


Last I checked MacOS wasn't a premium mobile platform... it's a general computing platform for desktops and laptops.


I was replying to the comment that WebGL isn't fully supported on iOS.


Consider, though, that Flash was a security dumpster fire that coincidentally ran as fast as something unburdened by security concerns, and that the almost certainly pirated full Adobe suite used by a young beginner developer was installed by some sketchy exe that would in no way be notarized by any sort of authority.


You would think... but they already made it so even root can't delete binaries from /usr/bin, and you can't disable that setting (you have to boot into a special mode).


Thing is, for the common scenario of "safe enough for random users to install", we want some barrier to entry. The market where the random creations of a ten-year-old are acceptable isn't the App Store; it's something like Roblox's environment for user-created content, or some other sandbox that is (a) isolated from the rest of the system and (b) makes it clear that the content is likely not to meet even the (already low) standards of the generic store.

For all the major means of distribution the problem isn't not enough apps, the problem is too much garbage - so the direction of improving end-user experience is more filtering of apps and developers, not easier access to distribution.


That is exactly what some developers just can't seem to get:

Many users agree with Apple.

But those developers are going: "No those users are wrong and should join my mob to make Apple let me do whatever I want to users' systems."


I hypothesize "fantasy consoles"* are potentially fertile ground for young developers. They impose a lot of restrictions, but to someone starting out that means fairly low limits to understanding the basics of the environment (as opposed to something like UE4). They're also cheap or free. And some of them (like PICO-8) host user-created games.

*https://github.com/paladin-t/fantasy


You can still do that, you just have to find users dumb enough to authorize the running of an application downloaded randomly from the internet.

Oh, and they need to authorize it from an admin account. So, yeah. Not sure there are a lot of people out there that dumb. Unless you're some well known downloaded product like Blender or something, I would think you would have a hard time getting people to run your software.

At the same time, to be completely fair, you probably should have a hard time getting people to run your software. That's pretty much exactly how a lot of malware is distributed.


Or just drop Apple and macOS. Even for an application like Krita, made for creative people, macOS is a laughably small platform, with a very small user base, bringing in next to nothing money-wise and nothing contribution-wise, and it tries to force everyone to use proprietary libraries like Metal instead of properly supporting OpenGL and Vulkan.

From an effort-to-income perspective, macOS isn't worth it.


For Krita maybe.

Creative Suite stuff, which actual pro creatives use, on the other hand, has been 50-50 Mac/PC sales, even though the Mac has 10% or less of the desktop market.

That's how many creatives are on the platform.

If Krita can't tap into them, it's not the platform that's at fault...

Affinity, for example, does fine, as does Pixelmator...


This pretty much nails it. I don't know anyone on macOS who uses Krita, but that's not because macOS has such a tiny userbase that there's no reason for anyone to support it -- it's because Krita has neither measurable marketshare nor mindshare on the Mac.

It may be a wonderful program now, I don't know, but I do remember KDE's initial efforts on the Mac -- and they were... I originally wrote "just awful," but I'll amend that to a weaker "not very good." I haven't actually seen any desktop GUI program whose primary platform is Linux really take off on the Mac, with the possible exception of LibreOffice. We have so many good cheap-to-free alternatives.


I used KDE as my primary desktop for a few years circa 2000-2004.

In fact I used "WebKit" before it was WebKit and before it was cool: I used Konqueror as my main browser (I didn't mind it being incompatible with many sites at the time, as I cared mostly about simpler, text-based sites, which worked fine).

I remember all the "KDE Office" apps, various drawing programs, etc. They were, and many still are, in a perennial semi-finished state, in a way that the Windows and OS X equivalents are not. Krita has done well on that front, as have a couple of others, but it took many years of not-great releases. And on Windows/OS X they are non-starters as ports...


> for creative people, macOS is a laughably small platform, with a very small user-base, bringing in next to nothing money-wise, nothing contribution-wise

... As a user, statements like that – about any platform – honestly just make me want to avoid anything from the developers who make them.


The platform matters. Mac is intentionally expensive and exclusionary. They won't even sell their expensive OS to run on perfectly functional low cost computers. If you are going to blame the victims for that, I don't want you as a user either.


Apple got to where they are because users generally agree with them.

If a developer is going to attack users, the users are better off on Apple's side.


>From an effort-to-income perspective, macOS isn't worth it.

Apple has introduced Catalyst in Xcode -- this translates iOS apps to macOS: compile once, run on two platforms.

The Mac App Store has been notorious for not getting as much love as the iOS one, but this change might reverse the direction, as there will be a lot more apps coming to the Mac App Store now that developers can leverage their iOS ecosystem.


I don't know if the Mac App Store will bring better apps and greater enthusiasm than the App Store business model does now, but you can be sure of one thing: the notarization program will not last. The goal is to lock down the platform like iOS and force every user to get their apps through the App Store and nowhere else.

I'm pretty sure that once macOS has switched to ARM, the system partition will be read-only, and you will not be able to install apps from outside the App Store unless you jailbreak thanks to a "security flaw".


Yes, absolutely, poor people should not be able to try to develop their skills and attempt to make apps.


They can make apps and develop these skills; however, the apps listed for random end users in the App Store shouldn't contain all these experiments. This is about access to a particular distribution and advertising channel, which should come after you've already developed the skills and are ready to offer something serious, in the context of which a $100/year investment is trivial.


Notarization isn't the same as the App Store.

This is about whether a 3rd-party application downloaded from a 3rd-party website can be run. It used to be that you only needed to pay to get onto the distribution and advertising channel; now you need to pay to keep your users from jumping through special hoops on an admin account.

It's not implausible to suggest that in the future, that loophole will be removed as well. Apple does not want you to download apps that they can't personally vet.

The complaint isn't that you need to pay to get into their store. Of course you do, that isn't surprising to anyone. It's about whether distribution channels exist outside of their store.

As to why that's a problem? To quote the author:

> It really upsets me because the iOS App Store policies basically prohibited art or anything interesting (nudity, glitch, error art...temperamental depending on who reviewed you for approval).

> It was hell for me to get things approved, and even when I got them approved I had to change so much about my work it really wasn't my work anymore. A platform holder shouldn't dictate so much about someone else's work.

Apple has been consistently behind the curve when it comes to recognizing the artistic potential of games[0]. On a cultural level, we don't want them to be a gatekeeper for a medium. As a company they aren't responsible enough or smart enough for that job.

[0]: https://www.theverge.com/2013/1/16/3879194/apple-app-store-g...


I fail to see the (new) change.

Gatekeeper was added in 2012, and required the user to either take steps (right-click open, or disable Gatekeeper via the GUI or command line) to run apps not signed by a Developer ID, which costs $99/yr.

Eventually Apple removed the option to disable Gatekeeper via the GUI due to developer abuse (e.g. games, including Minecraft, walking children through globally disabling security settings rather than signing their apps).

Notarization doesn't add the requirement of signing, nor does it change that you need a developer ID for $99/yr


If Apple considers walking users through the process of disabling Gatekeeper to be developer abuse, then I don't see how anyone can claim that notarization or the developer fees are optional.

The change is that with each release of Mac OS this process is getting harder to avoid and is starting to include additional requirements. "You need to sign apps" becomes "we need to run an automated review on our servers for every release you make." This matters because a lot of the criticism around the original Gatekeeper was dismissed via the argument that it was just an optional security feature that was turned on by default. It wasn't like Apple was going to block 3rd-party marketplaces; this just made Mac OS safer for computer-illiterate users.

With Catalina, people are starting to suspect Apple does not want unvetted apps on the Mac in any context, and it seems more likely that future Mac OS versions will continue the trend of stricter, more burdensome vetting requirements.

It is increasingly hard to take advocates at face value when they say that 5 years from now I'll be able to run a 3rd-party app on my Mac that didn't go through a manual review. Which might sound great to some people - some users love the app store.

As to why I think that could be a problem for creative mediums, see my parent post. It may not matter if Gatekeeper begins to live up to its name, because creative mediums may just migrate off of Mac. But if Mac becomes a platform too big to ignore, or if other platforms like Windows follow suit, then I think creative mediums will suffer.


Good lord, didn't anybody read the article? Or at least skim it?


Notarization is a requirement for distributing your content outside the app store. Within it the requirements are even higher.


Most Mac App Store apps are pure crap: cheap apps by Chinese or Indian developers, or others from developing countries. A few apps are good but sell at inflated prices. Game prices are outrageous; games sell for half the price on Steam.


My users and I both already paid an eye-watering price to get our Macs. Why should I have to pay to distribute my non-malware app, which might even be free and open source, because my users want to defend against other people's malware?

Why is Apple blackmailing the industry over malware protection instead of making it a core part of the OS experience? I'd rather have malware protection than lose my Esc key or get the app window background textures redone yet again.


Why should I let you run non-sandboxed code on my machine?

What do you offer to earn my permission to allow your app?


If you are poor you probably aren't going to start with a Mac.


> I also don't get complaining about obsoleting 32 bit. Use obsolete software on an obsolete OS in a VM like the rest of us

You mean like games released last year which require heavy GPU usage? That's neither obsolete, nor a great experience.


Please cite one game—a single one—from last year which is 32-bit only and "requires heavy GPU usage".

One will do. I'm sure this won't take you long. Even if you can cite one, all that will prove is that the developer was so incredibly stupid that they ignored a full decade of warnings about how 32-bit software would be deprecated soon.


I'd be happy to tell you what it was, but I'm on holidays away from my laptop/steam account.


I'm not even sure it's possible to acquire hardware on which to compile and test a demanding game and not have a 64-bit dev toolchain. You'd have to go to extra effort just to avoid 64-bit, and for what purpose? A trivial performance improvement for an app that is resource-intensive while also not needing 4 GB of memory?


Notarization will not last. In 2 or 3 years max, they'll play the security card again and stop the notarization program. Why do you believe they want to bring macOS and iOS together through the same processor family and the same development API? They want to bring the iOS business model to macOS.

How is that not obvious?


"As a user... The $100/year dev license kinda sucks if you just want to hack around and distribute code"

As a user, what does $100 a year even accomplish? It is nothing more than a racket.


You don't need to pay to run your own code or distribute un-notarized code.


Apple is a trillion-dollar company. It can easily support running 32-bit code indefinitely, perhaps with a download for 32-bit libraries and/or some sort of containerization. As a platform holder, it has the responsibility to do so. Failing to do that is a clear abdication.

It is clear from the post that Apple's platforms are hostile towards any sort of digital executable art.


It's not about "could" but about "should." Apple hasn't sold a computer with a 32 bit CPU in what, 8-9 years?

They deprecated 32 bit support almost a decade ago. Software has a lifespan and always has, to run obsolete code you need to jump through hoops to get an obsolete target running. This isn't new.


Yes, it is about should, and saying "it's a bit old" doesn't justify it.

Why is 8 years self-evidently "long enough"? Banks have code from the 1960s still running, and plenty of people have industrial hardware much older than that. Many of my household appliances are older.

It's only "obsolete" code because Apple has decided it is, it didn't rot.


> Banks have code from the 1960s still running

The only reason "old code" still works in banks is because untold man hours are spent wrapping around it, working around it, and testing it.

A long time ago, some exec saw that some old thing needed updating because the world has changed. There are new regulations, new customers types, new products, new strategies...

A bunch of people get hired. A bright future of the next thing is promised. It gets everyone excited, and devs start cutting little chunks off the old thing, making a new thing. Now there are 2 things to maintain, but that's OK; someday the new thing will become the next thing.

But the exec leaves (probably to another bank) before the next thing became a thing. Another exec comes in, declares the new thing is crap because it doesn't quite get the job done. A next next thing is planned. The previous new thing is now the old new thing. It will be removed, someday. Just not today, because it's _a_ thing now. old new thing will be sunset when next next thing arrives.

Many many years and several execs later, there are many things maintained by a small army of devs and testers. These things work in really strange, archaic, or even stupid ways. Nobody knows how (or worse, why) the entire system works. The most knowledgeable people only know how it _behaves_.

But it's there, still working. And your money is in that thing of things. Many people's money is in it.

That's not necessarily a good thing.


"Long enough" is for certain a relative term. I don't think we should be making the comparison to banking software/appliances though. Banks are notoriously slow and I would think are more concerned with correctness than new features. Given how often we see complaints about archaic banking software I'm not sure if 1960's code is a good thing but it has longevity by design. Apple does not need to play by these rules and as an end user I'm ok with that. I think Apple should optimize for the 97-99% of average users even if it is at the expense of a few and I see this change in that lens. Besides, the users who use 8+ year old software probably have the know-how and the will to jury-rig a solution to get around this.


Not true! The Intel CPUs in today's Macs let you seamlessly mix 32-bit and 64-bit processes. Windows 10 has no trouble managing both types of processes.
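
For what it's worth, on platforms that still ship a 32-bit userland, producing and running both widths side by side is trivial; a sketch, assuming a multilib toolchain (Linux/Windows - current macOS dropped the i386 target from Xcode):

    # Build the same source as 64-bit and 32-bit binaries.
    clang -m64 hello.c -o hello64
    clang -m32 hello.c -o hello32   # needs the 32-bit libc installed

    # Both run side by side under the same 64-bit kernel.
    ./hello64 && ./hello32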


The argument I have heard people make is that some ABIs are changing in 64-bit land, such as syscalls or how AppKit talks to the window server, so they would have to keep some 32-bit frameworks frozen in time in some places and update them in others. Maintaining 32-bit versions of all that is a nonzero cost.

That said, I don't agree with that argument. It seems like seeking out technical excuses for crappy user outcomes. In another thread on this topic I said: in any job there are parts that are fun and others that are not so fun but necessary. They should try to keep customers' apps working.


If you want long-term support, buy Windows or get Linux. The Mac is for cutting-edge, beautiful experiences of the future, not the legacy of the past. It's a trade-off. Supporting old toolkits makes your Mac less pretty.


If that were true, maybe they should get rid of ancient relics like the Mach kernel. But honestly you sound like you believe Apple craps rainbows, or something. They aren't better than everybody else by virtue of self identity. They make lots of mistakes just like any other group of mortals.


> Apple hasn't sold a computer with a 32 bit CPU in what, 8-9 years?

The Mini was the last Mac to make the switch to 64-bit, in "Mid 2007". So a little over 12 years.

For comparison, the Macintosh was on M68K from 1984 to 1994 (10 years), and on PowerPC from 1994 to 2006 (12 years). IOW, the Mac has been on 64-bit x86 for as long as they've been on any CPU architecture ever.

Another fun fact: there was less than 10 years between the last Apple IIe they sold and the first Power Mac G5.


It's been longer than that. Core 2 Duos are 64-bit chips, and there were Core 2 Duo Macs released in late 2006, so it's been almost 13 years.

The 32-bit support is actually a bit of a red herring here; the bigger issue for some is Apple dropping support for Carbon, a 32-bit-only API made to ease porting of classic Mac OS apps from the 80s and 90s to OS X, and itself deprecated in 2012.

I guess Apple could have created a 64-bit version, but it also seems reasonable to expect software developers to have rewritten affected software in that timespan. Carbon was originally a compromise for companies not wanting to learn an entire new OS (NeXTSTEP) and its APIs/libraries when updating their apps for OS X -- the fact that some apps still need it today is kind of mind-boggling.

Then, I suppose the Win32 API has been stable-ish for far longer.


What's wrong with Carbon? CLI tools still work 40 years later. Tk/Qt/Swing programs still work 20+ years later. Why not update and recompile Carbon for 64-bit?


A work of digital art is not "obsolete", and trying to pretend that it is is capitalist garbage.

There is lots of great modern art, but none of it makes older art "obsolete".


While I agree, I think there's an important caveat missing. To me, digital art needs to be self-contained. Digital art, unlike physical art, shouldn't really rely on anything else to present itself; it's why digital media has standards. A .jpg should, in theory, be self-contained and require only a device to view it. A game presenting itself as digital art, however, is tied to a specific platform and software that may not run in the future for any number of reasons. It needs to be feature-locked and self-contained so that it can be archived. Comparing modern/older art to modern/older digital art isn't fair or equivalent, in my opinion.


Digital art, just like physical art, needs to be maintained.

If a painting sits in a moldy basement for a decade, don't be surprised at what you find when you finally go to check on it.


I've always wondered, what exactly is the overhead on running 32-bit code on a 64-bit machine? Why is support eventually dropped? I'm sure there are good reasons but I've no idea what they are.


I believe there is a little bit of overhead for the OS when it comes to mapping virtual addresses and dealing with different pointer sizes, but IMO the main reason Apple wants to ditch 32-bit support is so their OS isn't tightly coupled to an ISA that has 32/64-bit modes.

In other words, it's not about the overhead; it's about the reliance on x86.


If ISA decoupling is the goal then they should be mandating everyone write to a VM like the CLR or JVM (or variants like Dalvik).

It seems kind of silly to block x86 modes on x64 chips when the backwards-compatibility needs of Windows pretty much ensure "eternal" x86 modes. It's going to be a while before x86 modes disappear from Intel or AMD chips, and there's evidence to support that, like the Itanium (IA-64) failure and the fact that Windows on ARM committed to emulating x86 (but not x64). x86 is likely to remain a lasting "lingua franca" ISA. Right now I'd almost put more money on x64 being retired before x86, and if anything, from that perspective, Apple is more tightly coupled to an ISA than before this choice.


> ...if anything from that perspective Apple is more tightly coupled to an ISA than before this choice.

That really depends on what their long-term goal is. If the speculation & rumors about an eventual ARM transition have any merit at all, then sunsetting 32-bit support now can help them smooth that transition, since they'll have a lot less work to do to implement whatever Rosetta-esque dynamic binary translator they would need.


Microsoft and Qualcomm together supposedly just invested tons of money in making sure that x86 emulates well on ARM64, but apparently had little success emulating x64 on ARM64. If the speculation and rumors about an ARM transition have merit, it seems backwards to sunset x86 and not x64.

Which it seems unlikely that Apple could do in secrecy what Microsoft + Qualcomm could not do (very publicly), but I guess all things are possible.


> ...if you're distributing software professionally and can't cough up $100 in revenue in a year...

This is kinda where I'm at with this. Surely anyone even remotely garnering an audience could raise this amount annually. It's a decent cost to an individual but to even a humble audience, it's peanuts.

If you can't raise even that, then IMO your game being lost is of little consequence. Not everything farted out by every indie developer is a lost masterpiece, sometimes stuff is lost because it lacks the traction to merit archival.


devil's advocate: $100 is a lot of money to spend on a hobby/side gig for a lot of people. Like some of us can blow more than that just on IDEs/Git clients in a year, but for some people it's a legitimate hurdle to cross.

When I was in high school working on the weekends, that would have been two weeks' wages, which I had other stuff to spend on. And that's from a privileged background; I know plenty of folks whose weekend/summer jobs paid towards rent/bills for their families.

That said, Apple probably doesn't care about the teenagers or indie devs out there who want to make a little game for their platform. However I'd also say to those folks - if you're strapped for money, don't develop for Apple devices. Go make a doohickey on a Windows or Linux box.


> devil's advocate: $100 is a lot of money to spend on a hobby/side gig for a lot of people.

Agree. But this isn't some lost nobody. This person has a Wikipedia Page [https://en.wikipedia.org/wiki/Nathalie_Lawhead], is an established game developer and artist, and lives in California. She's won multiple awards and she owns at least one Mac.

Like come on. The refusal to drop a hundred bucks on something that is apparently so important to her reeks of good old-fashioned stubbornness, or at worst, entitlement. But even then, she has a significant audience. Crowdfund it, cough up and stop whining.

Like I don't entirely agree with the $100/year model being the only option from Apple, but we're all paying that particular piper. Why is she so special that she shouldn't be subject to the same?


If it's always OK to pay an extra $100/yr on top of all the other Apple purchases, then the market price is actually $infinity.

Also, some people may look up to Lawhead as a role model as a programmer/artist but not have awards and lots of $.


> If it's always OK to pay an extra $100/yr on top of all the other Apple purchases, then the market price is actually $infinity.

By this logic the cost of anything that you can't own and use ad-infinitum is $infinity.

> Also, some people may look up to Lawhead as a role model as a programmer/artist but not have awards and lots of $.

What does that have to do with her ability or willingness to pay?


I'm very curious why you're getting downvoted by a large enough number of people to be de-emphasized but no one has responded to you.


The problem is that the best indie games communities in my experience were those that revolved around free games. Newgrounds, wc3/sc custom maps, etc. Every now and then someone would run off and get a job based on their games, or turn a decent profit from adverts or selling the game, but it was largely a non-monetary enterprise. The monetary incentive was mostly along the line of a vague hope of working at a AAA shop down the line.

And those communities succeeded, and produced interesting games/genres, primarily due to the lack of barrier to entry (in the sense that you e.g. bought Doom & SC for the game, and then realized there was an editor attached that came free).

Annoyingly, they never learned to share their source code and assets, but otherwise it worked well.

Today indie gamedev has become more and more a miniaturized version of the AAA industry, and that's a terrible shame.


> The problem is that the best indie games communities in my experience were those that revolved around free games.

Apple is not opposed to free games.

> Today indie gamedev has become more and more a miniaturized version of the AAA industry, and that's a terrible shame.

I'm sorry but this comparison is ludicrously hyperbolic. Firstly, because the Mac is hardly the platform for free and open gaming: relevant to the topic at hand, the Mac itself has a high barrier to entry, being so damn expensive just for the hardware. Secondly, because the total expenditure to publish a game for free (not to develop it; ANYONE can develop on a Mac for precisely no money) is $100 for the dev license.

And, additionally, I see no reason the games couldn't be distributed as source code for free for people to build and run themselves. Is it as elegant as App Store? No, obviously, but that's what the $100 is paying for.

I'm sorry I just don't see why this is a problem. Mac has always been a bit of an exclusive platform, which makes it both of higher quality, but also does indeed introduce barriers to entry. If those barriers are too much for you, then don't do it, you're still spoiled for choice for distribution channels that aren't the Mac App Store.


I'm also thinking of just dropping macOS as a supported platform for Krita: https://krita.org/en/item/first-notarized-macos-build-of-kri...


It seems to me that Apple is shooting itself in the foot very hard with moves like this. At the end of the day, developers are creating value for your platform. If you treat them like shit, they will leave.

A lot of teenagers who are learning to program right now are definitely not going to be able to pay $100 a year to distribute toy programs they're making for fun. Heck, a lot of adult developers won't pay that. Apple has just made Linux significantly more appealing for all of these people.

I own a MacBook Air at home (and a Linux desktop). I know this will definitely motivate me to go back to a Linux laptop for my next purchase. This move isn't just hostile to users, it's hostile to developers, hostile to the people creating value for the platform.


Even if you can easily afford it, the pay-to-publish model should be setting off warning bells. Vanity publishing has been a real thing for a very long time, and normally isn't something that people consciously want to be doing.


This doesn't make much sense. Notarization is for the benefit of users, not developers. To an aspiring young dev, the right-click-opening Mac user market offers more users than the Linux market.

For a user, right-click-open is easier than installing Linux.


I develop about a half a dozen open source utilities, and own Macs. I read the writing on the wall and switched to Linux as my daily driver, because once you're locked in to a closed platform, it's easier to pay the toll than it is to switch.

I've unfortunately come to the same conclusion.


Please do. Apple's behavior is not acceptable.


Seconded.

The last time I was forced to use a Mac laptop, I hated every minute of it... Not because it wasn't smooth or didn't work well, but because I had fewer and fewer freedoms to manipulate the system.


FYI, if you turn off SIP you can do quite literally anything. Rewrite random memory, install random unsigned kernel extensions, replace the kernel with a version you compiled yourself, the works.

The only thing that has no built-in off switch is TCC ("this app would like permission XXX"), although without SIP the ability to nullify it with code is theoretically at your disposal. I wish someone would make a MacForge plugin...
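For reference, this is roughly how SIP is toggled, as far as I know (it has to be done from the Recovery OS, not a normal session):

    # Reboot into Recovery (hold Cmd-R at boot), open Terminal, then:
    csrutil disable

    # Back in a normal session you can confirm the current state:
    csrutil status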


I think the point isn't about you or me - we're perfectly capable of bypassing this (for now).

The point is a) about Apple's hostility b) what to expect from your average macOS user as a developer (most will have no idea what SIP even is, and then telling them to disable SIP is borderline irresponsible).


GP said that they as a user disliked using macOS because they "had less and less freedoms to manipulate the system." If GP wants more freedoms on their machine, GP should go ahead and grant their own wish.

Regular software shouldn't run into SIP. It will run into Gatekeeper if the developer doesn't explicitly notarize/sign it, but I don't think it's irresponsible to tell users how to bypass Gatekeeper.


Please consider the third option: ignore notarization, and ask users to let the app through Gatekeeper.


Apple won't court developers until they start losing apps.

I think we're in a very, very narrow window of time where Apple is not popular enough that dropping them severely limits an app's user reach, but is still trying to push itself as a platform where creative things can happen -- a tool for graphic professionals.

I'm cautiously supportive of this move, even though I use a Mac occasionally.


People are going to keep buying iPhones and iPads and Macs and Apple Watches and Apple TVs anyway.

Then they'll discover the apps which are available for them.

Your app can either be one of them or, unless you're massively popular, people aren't even going to know that your app is missing.

If Krita goes, there's not going to be any shortage of painting apps for Apple devices.


Obviously there is a limit to how many apps you can lose without people abandoning your platform. Windows Phone failed largely for that reason, and Mac has historically suffered for its lack of game support. People don't just blindly buy devices, there is a threshold where they care about support.

And also obviously, if literally only one app moved off of Mac, nothing would happen.

So what's the threshold? If Blender stopped supporting Mac, would anyone here reconsider using them as a development platform? If Photoshop dropped Mac, would anyone here notice or make a purchasing decision based on that?

Is it significant that Photoshop is being shipped for iOS now -- do we expect that professionals will take the platform more seriously because of that?


> Is it significant that Photoshop is being shipped for iOS now -- do we expect that professionals will take the platform more seriously because of that?

The point that you're trying to make and the point I'm trying to make is effectively the same:

Apple platforms have no shortage of creative apps, have never had a shortage, and their library of apps and games is always increasing, not decreasing.

If Krita abandons Mac, do you think it will hurt Krita more, or Apple, or users?

What's more likely: Will people sell off their Mac and buy a PC just to keep using Krita, or would they rather find one of the many other painting apps available for Macs?


I think I'm more bullish on Krita's future than you are.

Regardless, we're kind of arguing past each other. If Krita abandons Mac and no one else does, then Mac users won't care. If a bunch of indie games abandon Mac, and then Krita does, and then something like Audacity or Blender does, and then it starts to spiral from there, where Open Source and hobby developers at large decide this crap just isn't worth putting up with for a project they release for free, then users will care.

It seems your main objection is just that you don't think that's likely. My assertion is, "Apple won't court developers until they start losing apps." Your assertion is, "Apple is not going to start losing apps." I don't see those two claims as being incompatible, and I don't think your assertion is worth arguing about given that we can just wait a year or two and see what happens.

> do you think it will hurt Krita more, or Apple, or users?

My broader assertion here is that we're in a narrow period where most people aren't using Mac computers, even for creative work -- so developers have a lot more freedom to make these decisions. I don't know the stats for Krita, but it wouldn't surprise me if the number of Mac users was small. I know that as a game developer and Open Source developer, I'm not going out of my way to support Mac. It wouldn't be worthwhile.

In terms of user benefit, I don't feel like Open Source developers have a burden to develop for platforms that are inconvenient. If Mac users really want an Open Source app to support their platform, any one of them can fork the code and pay the notarization fee themselves.

Other Open Source developers have different approaches and opinions on that front. Some of them are a lot more charitable than me.


> My broader assertion here is that we're in a narrow period where most people aren't using Mac computers, even for creative work

Apple has been the underdog in the numbers game since 1980. Macs have always had a smaller percentage of worldwide computer users, compared to Windows, but higher than Linux etc.

But that number has always been increasing. There are over 100 million Mac users, and many of them are doing creative work. Isn't it the second most-used desktop OS?

If someone is going to casually dismiss even a million people, I don't know what to say to them.


Implemented notarization for Corona SDK without any problems. Honestly, no idea what the fuss is about.


Apple is increasingly making it impossible to run software not approved by Apple on machines supposedly owned by those who bought them.

That is, the owner is no longer in control of his machine, Apple is. And if the machine will no longer listen to its owner, can it really be said to be his?

It’s all baby-steps, sure, but you have to draw the line somewhere. And for some people, that line was here.


What is the increase here? Gatekeeper has been in place and has required a Developer ID since 2012.


1. You can ask your users to manually allow Krita to run without notarization.

2. Instead of dropping macOS support, why not charge Mac users? Since you've already paid the developer fee, why not move Krita to the Mac App Store? Tell users that it's a cost of supporting Catalina.

3. You could continue to offer an un-notarized legacy version for free.

Abandoning your Mac users is more likely to just make them move to other apps.

If you feel your userbase is not techie enough to know how to manually run un-notarized apps, they probably won’t be able to switch operating systems just to use your app either.


This was heartbreaking to read. I don't understand how anyone can defend not being able to run software on their own devices, unless it was approved by Apple.


It's the kind of thing I definitely want for every non-technical person in my life. Which is most of them.

We're not that far removed from a time when every non-nerd's personal computer was a cesspit of viruses and "addon search bars". It was very, very bad. It'd be much worse now given how much more sophisticated the malware scene is, and how much more connected devices are (always-on Internet wasn't even the norm in the early days of the every-computer-has-some-trojans-and-BonziBuddy era).


If I couldn't have written code on my first computer, would I have been the programmer I am today? Would I even have been a programmer?

I don't know. And that part scares me.


Learning programming's easier even on an iPad than it was on, say, mid-90s desktops, overall. Granted earlier systems that booted straight to a programming environment forced you to grapple with code to do much with them, but they were a hell of a lot less accessible to the average person than a Raspberry Pi, say, price-wise. Even the crappy ones were a lot more expensive than that, especially inflation-adjusted.

The future is probably 10,000x as many total computing devices, most of which will be fairly locked down and not as hobbyist-programmer friendly as we've been used to but much better & safer for most folks—but also 100x as many tinkering-friendly computers, and much cheaper than they were "back in the day", and 1,000x the encouragement and free or cheap resources to learn to program as we had. My concern level that current trends will make programming anything other than more accessible, generally, to kids than it was back in the good ol' days, approaches zero. Even if the percentage of computing devices that are tinkering-friendly drops.


I have not had the unidentified application issue that the author posted about, but a full two thirds of the games in my Steam library are flagged as not compatible with Catalina and most likely never will be.

Apple really needed a compatibility method similar to how Windows lets you simulate old environments. It wasn't always successful, but at least there was a chance.

That notarization fee/process (I have not read beyond this story) does look to be a burden beyond what should be reasonable.


Most of the games in my Steam library are flagged incorrectly, e.g. they still have a 64-bit binary and still run.


I think individual users can still disable the checks (how else can you develop software if you can't execute what you compile?)


The complete loss of control over your own devices is being gradually normalized.


When macOS is completely unable to run unauthorized executables I'll revisit this comment.

Until then, I'm completely in favor of putting all kinds of hurdles between uninformed users and executing random executables downloaded from the internet.


It’s not about being approved by Apple. It’s about a digital signature that proves the identity of the author.


Can users install apps on their iPhones, unless they were approved by Apple? That's the future of software freedom and user liberties on macOS.


Ok, can I use eIDAS? That is surely stronger than whatever identity validation Apple is doing. (I know for sure.) And it doesn't cost $100.


That might be worth pursuing within the EU.

That said, if it required something like a "qualified electronic seal" I think the cost would be substantially more (600 EUR a year?), and Apple might turn around and start charging a notarization fee to recoup their infrastructural costs.


The article mentions a "quality control process", so it's not only about the author being authorized. I view this as a first step to turn desktop computers into locked down iOS devices.


Currently that amounts to an automated malware scan[1] - the closest analog I can see is how Firefox requires browser addons to be submitted for approval even if not distributed through addons.mozilla.org

[1]: https://developer.apple.com/documentation/security/notarizin...


Same thing. Apple has to add an approval stamp. Why does it matter if I download spyware that's been signed or not? It's a control mechanism, the water starts out cool before it boils.


To the satisfaction of Apple, not to the user.


You can't get the digital signature without Apple's approval.


Clearly it's about more than that; the article itself explains why.


I agree with the sentiment expressed in this article, but notarization is not the hill I'd want to die on.

Want to play experimental itch.io games? Open a Terminal and type:

    sudo spctl --master-disable
It takes all of five seconds, and it's permanent. It might take less technical users a bit longer, but entering text into a Terminal window really isn't difficult, and technophobes aren't using itch.io.

Some people will say this is a bad solution because it makes your computer less secure, but, like, you can't have it both ways. Either you allow experimental software and accept the risks involved, or you don't. Maybe don't play experimental games on the same machine you use for important work.


I hope we can all agree that notarization is pretty valuable to users. Macs are a mass market product and most users are not technical. Also, to create software for Apple devices you need... an Apple device. Isn't the cost of the device a bigger burden than the $100/yr dev program license?


> I hope we can all agree that notarization is pretty valuable to users.

I don't. As a user, I don't need Apple arbitrarily allowing big players' apps through their approval process while they hold smaller developers to a stricter standard. That will stifle innovation.

If security is the touted excuse, macOS already has sandboxd[1] which can be used with arbitrary apps that aren't in the App Store.

Linux solved the security problem with Snaps, Flatpaks and AppImages, which all use various layers of containers, kernel namespaces and isolation to provide a sandboxed environment for apps.

[1] https://developer.apple.com/library/archive/documentation/Se...


Can you point to evidence that Apple is rejecting anything from being notarized? I haven't seen any.

And sandboxing is completely orthogonal to the fine-grained revocation that notarization allows for. A sandboxed app could still be malicious: say, a weather app that asks for access to your Contacts ostensibly to show weather at your friends' location, but also uploads all the Contacts info to a malicious tracking service. With notarization, this app could have its notarization revoked once it's discovered.


> Can you point to evidence that Apple is rejecting anything from being notarized? I haven't seen any.

Apple is rejecting anything by developers that don't pay them $100 a year, stifling competition in the process.

Apple has a history of conveniently rejecting apps if the rejection is in their financial interest[1].

[1] https://www.theverge.com/2019/5/31/18647249/wwdc-apple-paren...


As a user, I don't need Apple arbitrarily allowing big players' apps through their approval process while they hold smaller developers to a stricter standard.

I think by "user" it was meant the average user which by my estimation is not super technical and mostly sticks to larger apps anyways. Are smaller developers being held to a stricter standard though than larger developers on notarization?


Less competition is bad for the entire market, not just power users.


Don't overgeneralize. I don't agree, for instance. I want to run whatever I want in the computer I bought.


And Apple has stated that they're fine with that: you can disable SIP and Gatekeeper. But for 95+% of Mac users, who only want to run software from reputable sources, these are good steps.


> Macs are a mass market product

PCs and the WWW are massively bigger mass markets, not walled, and it's not like PC users are geniuses. Unless you mean that Apple specifically caters to even less technical people.


> I hope we can all agree that notarization is pretty valuable to users

Nope.

I've never had any issue with macOS software acquired outside the MAS and I've never heard of any non-technical user with a problem either.

Notarization is not about security, it's about the iOSification of the Mac.


> I've never heard of any non-technical user with a problem either.

Do Trojans not exist at all on Mac? Honest question (I have certainly seen them on PCs; on Linux I worry more about packages).


They do exist, and this Gatekeeper system is what is responsible for preventing malware.

When Zoom was found to have a serious security issue, Apple stepped in and blocked execution of the older versions of Zoom.

This would not be possible if malware just mutated to avoid detection. For this reason they want to attach a verified developer identity to applications, something backed by an individual's physical address or business records. You pay for this verification, and get a certificate to sign your applications.

New this year, they added a notarization service. This fixes some issues with signatures expiring, but is also built so that Apple scans the application for malware before signing.

The scanning is new, but the developer id requirements have been in place since 2012.

If you distribute an unsigned app, the user will by default not be able to open it. You can set an exception as easily as selecting 'open' from the context menu and then saying you will allow the app to run.

You can also disable both the malware list and gatekeeper in general.
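To make that concrete, roughly how this looks from the command line (the app path is just an example):

    # Is Gatekeeper assessment currently on?
    spctl --status

    # Would Gatekeeper allow this particular app to run?
    spctl --assess --verbose /Applications/SomeApp.app

    # Turn assessment off entirely (not advisable for most users):
    sudo spctl --master-disable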

Note this is all separate from distributing in the Mac App Store, where you may run into additional policies around requirements for sandboxing, branding, use of private functions and frameworks, etc.

On Linux, package signing is typically direct trust. You can manually choose to trust a packager who isn't trusted by your distribution (which is trusted by default).

I don't think Linux distributions have any mechanism to deal with malware after the fact, however.


They probably exist, but in my 12 years using macOS I've never had a virus problem nor heard of anyone having one. Most people around me use macOS (coworkers, friends, and family).

I know it's anecdotal, but I doubt there is any objective data out there.


No, it isn't. It's pretty valuable for Apple as a precursor to forbidding things like non-webkit html renderers even for applications outside the app store, though.


Apple has no such restriction even for iOS (HTML renderers).

However, you cannot technically implement a JavaScript JIT since they will not give you a security entitlement to create and execute arbitrary code. You would need to either leverage JavaScriptCore, or use a (drastically slower) interpreted mode for JavaScript.


The fee isn't the issue. It is the ongoing treadmill of endless updates just to exist. That's a big deal.

I don't use app stores. Never have. And it's for this reason. I do not favor the control that's coming. If my computer will not run an executable on command, it really is not my computer. Useless and definitely not something I trust.

Actually I do grab free things off Google play, but I side load as often as I do that.

I think we're gonna see fairly modern systems added to the category of retro computing. It will be a system that just runs programs developed by others. Imagine that!


> Also to create software for Apple devises you need... an Apple device.

You don't really - you can use Electron and other frameworks to build it on a different OS, and just package it for macOS.
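For instance, with a tool like electron-packager (a rough sketch; the app name and output directory here are made up, check the tool's docs):

    # Build a macOS app bundle from a Linux or Windows machine:
    npm install --save-dev electron-packager
    npx electron-packager . MyApp --platform=darwin --arch=x64 --out=dist/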


how do you package an Electron app for MacOS for free?


I don't know about the specifics of Electron but there are some tools to create MacOS installer packages on Linux[1]. I've done this to distribute a game written in the Renpy engine, it's a bit convoluted but it works. I'm not sure if app notarization is possible though (probably not), it's been a while since I dealt with this stuff and notarization wasn't required at the time.

[1] https://gist.github.com/SchizoDuckie/2a1a1cc71284e6463b9a


Are you saying that you’re ready to put your name on an app you’ve shipped but never tested yourself? This is the ultimate Fuck You to users: to show that the developer simply doesn’t care about quality.


> Are you saying that you’re ready to put your name on an app you’ve shipped but never tested yourself?

At work, I do this for in-house command-line tools. I have automated unit tests and integration tests, and all the compilation is done on a CI server. And if a problem slips through automated testing, well, then somebody will ping me on Slack and I'll fix it.

I only need to take out my MacBook Air to debug something once every year or two.

We may actually need to start a conversation at work about whether we want to continue supporting Macs internally. We could notarize our own CLI tools, but we also rely on lots of open source CLI tools, and I understand that all of those will eventually need to be notarized, too.


In poorer countries many Devs might not be able to afford the newest $800 iPhone for their potential career start.


I’ve had users who requested a mac version of a tool I develop and provided testing for it.

I can’t test on a mac, as I don’t have the money for one, so I build the binaries with Travis CI automatically, and never test them myself — I also can’t notarize them, obviously.


let's say you're an open source app, and develop on linux using electron. there are some people that would like to run your app on a mac, but you don't have one. you can set up github CI so that your app gets built into an appimage, that's a little work, but not much. should you now also be required to pay for the luxury of sending one or two people an app? should they pay for it?

this is not hypothetical; I literally just received a GH issue update asking if we can use community funds for the $100 ransom^Wfee.


I don’t understand. I keep seeing developers complain about the $100/year developer fee but I also keep seeing strong indications that’s not required for notarization.


I have a certificate which is valid until 2022 (valid for 5 years, generated in 2017 after paying $100). I distribute only free games (GPLv3) that I sign with this certificate. For Catalina, I can't notarize because I don't pay each year (my certificate is still valid, so it's pointless to pay when I distribute only free software)... But I'm fucked by Apple for the notarization.

It's too expensive to pay just to distribute games that would probably be refused on the App Store anyway because they're GPL.

Apple is just killing freedom and continues to take advantage of free software (and free developers): BSD, the Mach kernel, a lot of Unix tools... Even on projects started by Apple, like LLVM, Apple takes full advantage of all third-party contributions.


FWIW, Apple doesn't care if apps being distributed contain code licensed under the GPL. They care whether you have the appropriate rights to distribute the app on their store under their terms - and that is asserted by you.

A previous build of VLC was pulled from the store because someone claimed they had copyright on some of the code - because they did not want VLC in the store.



Every single story about it features a comment saying "I thought you only needed a free developer account to notarize apps?" with no replies.

That said, Apple's membership-levels page features "software distribution outside the Mac App Store" as a benefit of the paid level: https://developer.apple.com/support/compare-memberships/


You don’t have to keep paying $100/year. Once a thing is notarized, it’s notarized. Also, anyone can submit any software for notarization; it’s not proof of ownership or identity.


Ah, so you need to pay $100/year to provide your users with security patches and bugfixes for an open source app.


Some developers here forget that there are game developers who make free stuff, hobby games. These developers often get requests to also package for Linux and Mac because the engine used is cross-platform. These people can't afford to buy a MacBook and pay $100 yearly for the one possible user who would like to try their free game.

I also see users asking for 32-bit Windows versions of games; there are still a lot of people running old computers with bad internet connections, and there is no good reason not to package some basic game app for them if it is possible.


Every Macbook/Mac Pro should come with a free developer account that remains active as long as the purchaser of the machine is using the laptop. Doing so would further distinguish “pro” from non-pro machines, would make notarization less like an annual tax, and would thus encourage rather than discourage such malware-free certifications.

That said, I agree with the article’s author that devs should be free to distribute whatever they want without warnings and without having to update annually. Not every piece of software gets maintained but it can still be useful, especially for artistic works. Let the App Store be the place where Apple certifies safety and leave the rest to the user.


Much like charities know that the best source of donations is previous donors, Apple knows that the best source of revenue is previous customers who've already shown a willingness to open their wallets to pay extra for the Apple experience.


This article reminded me of the other major reason I've left OS X, besides five years of shitty Mac hardware, which has been covered to death: shitty Mac software. Apple's software has always been on the lowest levels of quality; however, this generally excluded OS X. It's clear to me that Apple intends to remedy that and has already started with this system protection bullshit that prevents running programs and resets itself on upgrades and possibly at other times randomly (I assume it does so just to fuck with users and show them who's the real boss and owner of their computer).

The author of this piece is 100% correct: none of these changes have anything to do with security. It's all about control. Apple, like most companies, just wants to control everything, whether relevant or not. The end goal is to control exactly what software is run on their platforms, as clearly evidenced by the linked articles. I highly doubt that we'll be able to run any non-Apple-approved tools in a year or two. They are turning OS X into iOS and have already renamed it macOS to let people know it's just going to be a walled garden of shit. The only reason to ever purchase an Apple computer was OS X. Now that OS X is turning into a shitty version of iOS, that last reason is gone. And that's before we get into the garbage hardware. I simply don't understand why anyone would want to support this platform going into the future. It's a platform that stands for censorship. Every component on the platform is designed around it. It's no longer a general purpose computing platform. In the future, it will be even less of one. All in the name of "security" and "privacy." Yup, save the children. But plenty of smart people fall for such stupidity.


Let me just place this signpost in the bandwagon's way:

→» Users CAN run un-notarized apps if they WANT TO. «←

You just have to make us trust you as a developer.


Should the user click on "Cancel" or "Move to trash" to tell Apple that they trust a developer?


I have to menu-click » "Open"

Telling your users to do that (+ any other steps if needed) might take less effort than shaking pitchforks at Apple (though they're certainly deserving of ire in other areas.)


Good luck with that if you have a non-technical user base. Gaining trust when the first thing they see is a big warning telling them this software shouldn't be trusted will also be really tough.


Then what's the problem? That trust should be worth the $100. The message from Apple doesn't say that it's untrustworthy just that it's not trusted by Apple. If users didn't trust Apple, that wouldn't make a difference.

If it's important to you for users to trust your app, then there are ways to do that. One of those ways is paying the $100. There are other ways outside of that, though.


> Good luck with that if you have a non-technical user base. Gaining trust when the first thing they see is a big warning telling them this software shouldn't be trusted will also be really tough.

And it should be tough. The hurdle needs to be high enough to frustrate malicious actors getting people to run random executables they downloaded from the Internet.


You could do it all passive-aggressive like Krita: https://krita.org/en/item/first-notarized-macos-build-of-kri...

A developer who doesn't want to put any effort into gaining trust doesn't sound like someone I should allow to run non-sandboxed code on my machine anyway.


A developer who doesn't want to pay $100/year and spend countless hours going through Apple's arbitrary approval and notarization process, you mean.


And so we reach the end of the for loop: https://news.ycombinator.com/item?id=21506948

:)

To reiterate: You don't have to notarize, if your users trust you enough to manually allow your app to run.


Developers have been worshiping mobile OSs for more than a decade. This is the endgame, and don't say we haven't been warned about it. We've been burned by walled gardens way too many times to have an excuse. Phones are much less free platforms than desktops. If developers desert phones, people will follow.


I've been in love with OSX/macOS since I switched during the Vista fiasco but I feel it's a sinking ship these days.

I recently was looking to get a new machine for music production and ended up building a Ryzen PC with Windows 10. It cost me the same as an i7 Mini but it is many times more powerful, expandable, and silent. Thank god I moved away from Logic years ago.

Windows is not as pleasant to use as macOS, but it's just as stable. A lot of software seems to run better there (eg: Firefox, Chrome).

For dev work I will keep using my iMac and 2014 MBP with Mojave, at least for the foreseeable future. Unless something dramatic happens at Apple I will most likely end up moving to Windows or Linux in a couple of years when these machines die.


Dumb question - as a dev on OSX (not yet migrated to Catalina), I compile and run my own code all the time. I also install a ton of binaries, like the Rust installation manager.

Would I see any issue here? I suspect everything I write or "install" via curl/bash/etc will be fine. E.g., if it's a downloaded binary in my $PATH, I expect it to continue to work.

So if that's the case.. why is Apple being so harsh about all of this? I feel like malware will just move to curl scripts and all Apple succeeded at was making normal Apps more difficult to develop.

I imagine some might argue that users will be somewhat informed, and will know not to run stuff in a Terminal. Cool.... but, how many times have we seen non-technical folks run stuff in the Windows command prompt? They can be walked through a malware installation process a thousand times over.

So is Apple going to lock the entire platform down? Are they going to limit my ability to run binaries? My own and others? Because if they don't, this all feels pointless. And if they do, I'll _be forced_ to switch to Linux.

I don't like you these days Apple. What is wrong with you.


> So if that's the case.. why is Apple being so harsh about all of this? I feel like malware will just move to curl scripts and all Apple succeeded at was making normal Apps more difficult to develop.

The goal is to prevent users from running untrusted applications or mistrusted applications like trojans. If you can convince the user to click through a security warning or run commands from the terminal, then the user themselves have taken responsibility for evaluating trust.

The current system does depend on the user having appropriate 'spidey sense' to what the developer is asking them to do - the yardstick is a certain level of informed consent. If malware driving the user through terminal prompts becomes a significant user problem, that may unfortunately lead to the system being locked down further.

FWIW, other more serious changes (such as disabling SIP) go beyond a terminal command or requiring a password, to requiring you to reboot onto the rescue partition. Here, they are requiring a much higher degree of informed user consent.


When you download an app through a browser, it will set a quarantine file system attribute. By default, clearing this bit requires the app to be notarized. The user can also right-click open to allow an app to run, add a third-party certificate as trusted, or disable Gatekeeper altogether from the command-line.

I believe Catalina will also do periodic checks to make sure the binary hasn't been modified.

If you don't distribute software off of your own machine, you do not have to think about signing at all.

The rust installation manager and cargo do not set the quarantine attribute, so there is nothing for quarantine to check when binaries are run.
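For what it's worth, you can inspect and clear that attribute yourself (the path is just an example):

    # Show the quarantine metadata on a downloaded app:
    xattr -p com.apple.quarantine ~/Downloads/SomeGame.app

    # Remove it recursively so the bundle runs without the Gatekeeper prompt:
    xattr -d -r com.apple.quarantine ~/Downloads/SomeGame.app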


I kind of blame UX and bad incentives for this. Most of these attempts at curating installation all started as well intended means of keeping malware out of the OS. If you think it’s bad today, you forgot what it was like 20 years ago.

However, the evolution of platform protection slowed down as it became clear this was an amazing revenue generator. Incentives became blurred. A new type of malware showed up aimed purely at collecting your data, and our protections haven’t kept up, because that’s no longer the goal.


Companies are in the business of making money. Once they find out they can make more money by not supporting your use case, they will toss you to the curb.

At this point, Apple does this by being a nearly completely consumer focused company. If you aren't the average consumer, they are going to toss you to the curb someday.


Does the notarization licence cover the iOS store, or do we need to buy both licences?


Notarization costs nothing.

But it's $99/year to be a part of the Apple Developer program which covers all platforms.


It specifically says that "Software distribution outside the Mac App Store" requires "Apple Developer Program" which costs 99 USD here: https://developer.apple.com/support/compare-memberships/


Can you share a link to the web page where one can create a notarization account for free?


https://developer.apple.com/support/compare-memberships/ says that distribution outside the Mac app store is a paid feature. Who's wrong, Apple or you?


This is both the most compelling and poetic argument in the article, and the most wrong:

> ”Apple’s vision involves us constantly updating work, constantly adding to our games, constantly paying to exist here, even when some of this stuff is done. Often when a game is done, it’s done. Games aren’t a service. It’s like asking for a director to keep updating a movie, or for a musician to keep changing their song so it can keep running. Decisions like this erase our history.”

On the contrary, you chose an ephemeral medium for your art. That choice, like making sandcastles, has consequences.


“I was looking into the requirements and I cannot afford $100 a year.”

I don’t understand how someone can’t afford a $100/year expense on something related to their occupation.


If they pay the expense, will their income increase by more than $100 per year? (not even counting the value of the hours it will take to notarize and renotarize apps)


There are no guarantees but, statistically, Mac and iOS users are far more likely to make purchases than all their other counterparts combined.


I kind of blame UX for this. Most of these attempts at curating installation all started as well intended means of keeping malware out of the OS. If you think it’s bad today, you forgot what it was like 20 years ago.

However, the evolution of platform protection slowed down as it became clear this was an amazing revenue generator. Incentives became blurred, and it’s why you have so much malware on the App Store masquerading as casual games.


So, I make a game engine, Corona SDK. It is an indie game engine. No issues with notarizing occurred. I really don't understand what all the fuss is about:

* once notarized, always notarized. You don't have to continue to pay for old apps to work

* You can notarize old apps, same as new apps

* It is very easy. One command-line command (roughly sketched below with Apple's own tools). That's that. There is also a GUI wrapper for it, which our users like. But in the end, it is very easy to do.
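For anyone curious, with Apple's stock tooling the flow looks roughly like this (bundle id, Apple ID, and file names are placeholders; Apple's docs have the details):

    # Submit a zipped app for notarization:
    xcrun altool --notarize-app \
        --primary-bundle-id "com.example.mygame" \
        --username "dev@example.com" \
        --password "@keychain:AC_PASSWORD" \
        --file MyGame.zip

    # After approval, staple the ticket so the app verifies offline:
    xcrun stapler staple MyGame.app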


Sure, the article even links to a page describing the process: https://www.molleindustria.org/blog/notarizing-your-flashair....


Windows: Spies on you and shows you ads.

Apple: Censors and controls everything you do on your computer.

Linux: Neither of the above. It's your computer.

The choice seems clear to me.


One thing I don't see people talking about on here is... HTML5. Probably the greatest number of individual indie games are games made to be played in the web browser.

It strikes me that as a developer if you can't afford $100 and just want to make games, why not make them in the safety of the browser sandbox?


What if it's a graphically/resource intensive game that would necessitate running it natively? Imagine trying to run an Unreal Engine equivalent in webgl for example.


Hehe, hopefully it's just more fodder for an anti-trust case 1-2 years from now.


Apple has lost its way.


Apple is exactly on its way, as it's always been. You've just been crushed under its wagon.

