Notarizing can be seen as just another cost of targeting a platform. So do it. You'll need at least one machine anyway to do basic testing on, so use that: get the cheapest iMac or whatever you require. Mac hardware isn't a real issue.
I found the statistics a bit strange as well. Sales were 2% linux, 4% mac, 94% windows. Yet support was divided 30% linux, 50% mac and 20% windows. This suggests linux is more problematic per sale than mac. Yet he has no issue with linux. If this was me, I'd be wanting to know exactly why linux/mac was generating so many tickets. The answer will likely help the overall product.
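A quick way to see that per-sale claim is to normalize each platform's support share by its sales share (percentages are the ones quoted above):

```python
# Support load relative to sales share, per platform.
# Figures are the percentages quoted in the comment above.
sales   = {"linux": 2, "mac": 4, "windows": 94}   # % of sales
tickets = {"linux": 30, "mac": 50, "windows": 20} # % of support tickets

load = {p: tickets[p] / sales[p] for p in sales}
for platform, r in sorted(load.items(), key=lambda kv: -kv[1]):
    print(f"{platform}: {r:.2f}x its sales share in support tickets")
# Linux comes out highest (15x), Mac next (12.5x), Windows lowest (~0.21x),
# which matches the observation that Linux is more problematic per sale.
```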
Personally, you need to jump through so many hoops to release software that none of these thus far mentioned should be big enough to stop you. The real reasons should be: is there a demand in this target market; will that demand generate sufficient profit?
As I'm starting my own journey on this I see all the platforms and app stores have various issues. Everything has tradeoffs. The article's author needs to come up with better arguments. I just don't see his reasons as relevant in comparison to the other much larger costs associated with development.
I'd actually recommend targeting multiple platforms just to force bugs to come out. Different platforms / compilers see and expose different issues in your codebase. So far I'm quite happy to have cross-platform support even when I don't necessarily release on those platforms yet.
I find art, sound and game assets are more expensive than an Apple dev account or even a Mac computer. Others have written in more detail on this. Even generic business costs exceed the subscription/computer costs he's stating as barriers.
All the small issues here show, together, that Apple indeed doesn't give a shit about gaming or about interoperability with other platforms. They want developers to be faithful to their walled garden and their technologies.
Especially if it's only 4% of your sales. If 4% of your sales doesn't cover the cost of a Mac, code signing, and the cost of support on the Mac then it's a _net negative_ - you'll make more money not being on the Mac.
> ...64 bit is likely your default target anyway - has been for a while. So, killing 32 bit is not a real issue.
If you depend on 32-bit software, no, it is a real issue. Updating an app and dependencies to all be 64-bit can be a pretty big issue. Not everyone is on a modern stack.
> ...linux is more problematic per sale than mac. Yet he has no issue with linux.
If I were on a PC I would have less issue with linux too, simply because I don't need to buy special hardware or switch computers to deal with those tickets.
> If this was me, I'd be wanting to know exactly why linux/mac was generating so many tickets. The answer will likely help the overall product.
Will it? If they're platform issues, they're not improving the product, just getting it to run.
> Personally, you need to jump through so many hoops to release software that none of these thus far mentioned should be big enough to stop you.
You don't know this guy's resources though. That's like telling someone who's already come 25 miles that 5 more aren't much. If they're in a car, sure. If they're running, every mile is a lot.
> The real reasons should be: is there a demand in this target market; will that demand generate sufficient profit?
I think the fact only 4% of sales come from macOS means there isn't much demand or profit, right?
> As I'm starting my own journey on this I see all the platforms and app stores have various issues. Everything has tradeoffs. The article's author needs to come up with better arguments.
The arguments are good. You yourself just said it. Everything has tradeoffs, and the tradeoffs of developing for macOS are just not worth this person's time.
> I just don't see his reasons as relevant in comparison to the other much larger costs associated with development.
Except do you really know his costs, and the resources he has to cover said costs?
> I'd actually recommend targeting multiple platforms just to force bugs to come out. Different platforms / compilers see and expose different issues in your codebase.
Wait, what? What's the point of killing bugs for a platform you don't support? Maybe when I develop for the web I should test in IE 5. It'll surface so many "bugs" in my site.
I can point to multiple bugs revealed just by having multiple platform support. Other devs I've spoken to have also had this experience. Different platforms have different tools available.
A computer worth $1600 is dwarfed by the cost of a business name, software tools, insurance, utilities, etc. Art, sound, etc. are also much higher costs.
I'm a one-man indie dev. I'm guessing I know a little bit more about this issue. Perhaps not. But I do manage to pay the bills, so I must know something.
You wouldn't if Apple didn't prohibit using VMs. If you want to test compatibility with the actual hardware you probably should have more than one machine anyway.
> This suggests linux is more problematic per sale than mac.
Or that Linux users are more likely to report their problems. IIRC there recently was an article on HN by a game developer who supported Linux in part to get quality bug reports.
Nitpick: Apple doesn’t completely prohibit VMs; it prohibits running macOS VMs on non-Apple hardware, and puts restrictions on what you can run them for. https://www.apple.com/legal/sla/docs/macOS1014.pdf:
”to install, use and run up to two (2) additional copies or instances of the Apple Software within virtual operating system environments on each Mac Computer you own or control that is already running the Apple Software, for purposes of: (a) software development; (b) testing during software development; (c) using macOS Server; or (d) personal, non-commercial use.”
I think one intended use for that is for testing your software with older or beta OSes, but “already running the Apple Software” to me, implies running the same version in the VM as on the host.
I also think they don’t just allow running in a VM because it would make running hackintoshes perfectly feasible, losing them significant hardware sales.
Before, you needed to sign with a "developer ID distribution" certificate if you wanted to avoid having to tell users to "ctrl-click DMG then click open" to bypass gatekeeper. This certificate required a paid developer account.
Now you can still distribute un-signed, un-notarized software and tell users to ctrl-click open to bypass gatekeeper. All this does is require notarization (Apple running malware analysis) to have it run without a ctrl-click.
Split the build process so that you can click "version 1.2.3" and have it notarized (this is doable in Jenkins for example)
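For example, the notarize-and-staple step can be scripted into a CI job; a minimal sketch using the 2019-era `altool` interface (the app name, bundle ID, and credentials are placeholders, and newer Xcode versions replace `altool` with `xcrun notarytool submit --wait`):

```shell
#!/bin/sh
# Sketch of a scripted notarization step; assumes the app is already
# signed with a Developer ID certificate. All identifiers below are
# placeholders for your own values.
set -e

# Zip the bundle for upload.
ZIP=MyGame.zip
/usr/bin/ditto -c -k --keepParent "MyGame.app" "$ZIP"

# Submit to Apple's notary service.
xcrun altool --notarize-app \
  --primary-bundle-id "com.example.mygame" \
  --username "dev@example.com" \
  --password "@keychain:AC_PASSWORD" \
  --file "$ZIP"

# After the service reports success, staple the ticket so
# Gatekeeper can verify the app offline.
xcrun stapler staple "MyGame.app"
```

Jenkins (or any CI) can run this as a post-build step keyed off the version tag.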
* lack of vulkan support
* porting code to the apple SDK
* cost/benefit for small indie titles
Pointing out how many people logged bugs doesn't mean the OS is buggy; it simply means you've written shitty code. Another odd point with OP is that he's already made the effort to start supporting it. Dropping support is always more expensive than simply fixing some issues, since you've already made the investment.
Most developers seem to have rational decisions behind their support. This post seems like they're jumping on the Apple hate bandwagon.
2. All the existing tooling (not the game, but the build process) will need to be 64 bit to run on this new hardware. So the build needs to be re-worked
3. People on Mac require more support, which is harder because the dev is not a Mac user.
But there’s some FUD here:
1. There is no review step. Notarization is an automated process that takes about an hour. It doesn’t require a substantive change to your build process. You just submit your build, get a notary receipt file back from Apple about an hour later, and “staple” it to your build using a single tool. It’s not burdensome.
2. The system requirements for preparing a build for notarization are pretty liberal if you look at actual hardware usage. Of course some machines fall off the bottom, but it’s not like they’re trying to drive revenue with this. It’s like a teardrop in the ocean.
3. All of this is even less burdensome on developers who follow any kind of process for testing. You would already have a capable machine being used for testing, and would already be seeing delay between your builds being prepared and your distributions going out. Apple’s part of notarization could happen while testing is in process, meaning zero delay. If this doesn’t apply to you, fine. But it probably means you weren’t treating the platform all that seriously in the first place.
As defined by who?
1. Submit build
2. Wait an hour
3. Retrieve notary receipt
4. Staple to correct build
5. Upload build
1. Upload build
Isn't burden the thing you're actually getting paid for when you sell the game?
If adding 2 steps to your build process is such a burden that it results in you not supporting an entire platform, you're either lazy or looking for an excuse.
Wait, it's either code sign and you don't get root or don't code sign and get root?
Why does macOS work this way? Why is there no "no code signing and no root, just run as the user" option?
Longer term I imagine that OS X will simply have a non-overridable sandbox that tightly restricts what any unnotarized app can do. E.g. if you aren’t notarized you get access to your own container and no other part of the file system.
Alternatively you could just sign your code properly and update it for hardware that’s existed for 15 years.
You’re doing a great disservice to the engineers at google if you think the actual release work is so short that a few minutes for notarization is a problem.
The notarization process potentially reducing the number of builds he can push to users in a day from 6 to something less seems like a feature, not a negative.
The only time I've seen anyone try to seriously game on the Mac was a friend of mine that built a Hackintosh with a (brand new at the time) nVidia GeForce 1080ti. He couldn't even maintain 60+ FPS on Rocket League. So there's no wonder why games that require that level of GPU performance don't sell well on the Mac.
The headline feels like it should be “Sorry macOS users, it doesn’t make sense for me to support macOS”. The rest seems like a bit of clickbaity hyperbole and throwing in a bunch of semi-related arguments.
Now try to integrate it with your CI, because the feedback that the notarization is complete is sent via email.
(FWIW: I don’t have first-hand experience, though I still have an Apple developer license. Probably won’t renew.)
There is a utility (written in Java) that you run through "xcrun" that can upload, check status, fetch and "staple" the result. It usually turns around within 15 minutes for me, but I haven't measured the exact time. (I'm just polling every five minutes.) Check the "log" file, because it can successfully notarize and then report in the log that your signatures are all screwed up.
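That polling loop can be sketched in shell against `altool`'s status query (the request UUID and credentials below are placeholders):

```shell
#!/bin/sh
# Poll notarization status until it leaves "in progress".
# The UUID comes from the original --notarize-app submission;
# all values here are placeholders.
REQUEST_UUID="00000000-0000-0000-0000-000000000000"

while true; do
  STATUS=$(xcrun altool --notarization-info "$REQUEST_UUID" \
      --username "dev@example.com" \
      --password "@keychain:AC_PASSWORD" 2>&1 |
    sed -n 's/^ *Status: //p' | head -n 1)
  [ "$STATUS" != "in progress" ] && break
  sleep 300   # poll every five minutes
done

echo "final status: $STATUS"
# Even on success, fetch and read the LogFileURL output -- it can
# report broken signatures despite a successful notarization.
```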
Details here if anyone needs to do this:
> After uploading your app, the notarization process typically takes less than an hour. When the process completes, you receive an email indicating the outcome. Additionally, you can use altool with the notarization-history flag to inspect the status of all of your notarization requests
Anyway, even if we say 10.14, I wouldn't call 2012 models fairly new.
Edit: take my words with a grain of salt-- I'm not an experienced developer.
For games in particular there should be better sandboxing options.
Since you have to sign with a key that you've paid Apple to authorize, Apple has payment details that connect you and the signed binary together. In addition, the membership has license conditions that in turn allow Apple to sue you.
The intent is a deterrent effect.
Though this morning I realized I misunderstood the point of the grandparent here. He was saying you can have a signed binary that is still malicious. I thought they were talking about what Apple does in the case of unsigned binaries.
Our OSes all date back to a time when security simply was not such a concern. Nearly every binary runs with something close to root. A modern OS would be least-privilege all the way.
Btw web browsers show that it is possible to run untrusted code locally in a sandbox and do so fairly safely.
Why? What is the nature of these support requests? Are they specific to your game? Could it be because your game was not properly made for macOS to begin with?
Maybe, just maybe, the problem is developers refusing to actually move forward.
It’s especially galling given game devs are the group that most prominently complains about OS X being behind the times.
That aside, Apple manages to do this (and I can’t imagine your game approaches the complexity of an entire OS), so if you can’t make your software work on a 64-bit system then the problem is your code, not Apple.
I'd be very surprised if there's a 32-bit Mac that'll run their game acceptably well. I'd be surprised if it has ever actually supported 32-bit Macs. So why was it 32-bit in the first place?
I recognize MS insisted on screwing the market by selling 32- vs 64-bit as separate versions of Windows for many years, but even then, surely most Windows machines have been running 64-bit by default for a decade?
But seriously: if you made a new piece of software in the last 15 years that only works in 32-bit, then you made a choice to target an obsolete platform with worse performance (again, game developers complaining about perf while only building 32-bit, throwing away 10-20% of performance for no reason, strikes me as hollow).
My conclusion has been that when Macs first switched to Intel there was a combo problem: Intel’s mobile chips were still mostly 32-bit (remember, Intel dropped the ball on 64-bit), and more importantly, while PPC Macs were capable of 64-bit (the G5 cheese grater), the majority of the desktop software was iffy, coupled with things like Flash etc. that were 32-bit.
Also pretty much everyone else - I just checked the status of all the apps present on my Mac (System Report) and the only 32-bit apps on here are QuickTime Player 7 (long-deprecated) and a bunch of garbage Adobe background update processes.
1) Apple charges a flat rate of $100/year for notarization access. Steam and Mac App Store take 15-30%. What additional percentage of Mac sales (either subscription and/or one-time) is being spent on this annually by this developer?
$100/year & 100 units/year = $1.00/unit
$100/year & 1000 units/year = $0.10/unit
$100/year & 10000 units/year = $0.01/unit
EDIT: Steam indicates that this game has sold 0-20,000 units (max 168 concurrent players), between May 2015 and September 2019, so using a straight flatline average of the best-case scenario for sales, the cost to date per unit is:
$500/5y & 23,000 units/5y ≈ $0.02/unit
2) Notarization can be added to command-line build scripts and Makefiles and third-party processes, so that it is simple to ensure that builds are notarized as part of releases to Steam. What, if any, engineering obstacles in the developer’s build process blocked this integration?
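Putting numbers on point 1, the flat fee amortizes like this (unit counts are the rough Steam estimates above):

```python
# Amortizing Apple's flat $100/year developer fee across units sold.
def cost_per_unit(annual_fee_usd, years, units):
    return (annual_fee_usd * years) / units

print(cost_per_unit(100, 1, 100))    # $1.00/unit at 100 units/year
print(cost_per_unit(100, 1, 1000))   # $0.10/unit at 1,000 units/year
print(cost_per_unit(100, 5, 23000))  # ~$0.02/unit for the 5-year best case
```

Compare that to Steam's 15-30% cut per unit, which scales with price rather than vanishing with volume.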
“A cheaper and simpler process is better for developers and users, and therefore the absence of both cost and complexity is universally the best solution, without regard for any other factors.”
That is not a widely agreed-upon assumption. Since the post relies on this same assumption-by-framing approach, its arguments are weakened by their dependence on an unstated assumption. Answering my questions would force that assumption to be considered openly - and potentially challenged.
Is it better for users that Apple is doing this, regardless of the extra cost and time it assigns to developers?
That’s the question that should be being asked here. Unfortunately, it is not.
I know I’ll probably catch downvotes for it, but as someone in security I think this is a good thing for the average user. I know it makes it more of a walled garden, but I also know that the harder it is to run untrusted code, the better for the average user.
The “terrible burden” is a couple of extra button clicks if working in Xcode or you could automate it with a script. I did the latter for the project I work on and it’s been fine.
That probably drives a lot of folks to Linux and that’s okay. I’m for diversity in platforms but I’m also for strong protections for the users.
Apple has gotten away with eliminating user freedoms time and time again under the guise of security. In many scenarios Apple's attitude towards its end-users increases perceived security while causing significant hurdles for actual security, such as how iOS no longer allows you to turn off Bluetooth or WiFi from the Control Center. Somehow I've never really seen an explanation for why this is better for security for the vast majority of people.
For 99.99% of users, notarized apps are a positive. For the 0.01% who need to bypass it, there's ctrl-click.
When someone like the Omni Group or Panic writes a similar screed, then it will be newsworthy. This? Hardly...
Since Steve died, every decision at Apple has been anti-dev and anti-user.
The price/performance of the hardware is awful. Apple is the current tech leader in planned obsolescence. Old hardware is iCloud locked by default with no option to contact the registered owner to tell them that they forgot to unlock it. Now they want to extend that practice to laptops. The developer documentation is universally awful. The version changes in Swift are so drastic and frequent that you can't find solutions for the current version because the Internet is polluted with information about previous versions. They are glacially slow to fix bugs. They have removed basic networking functionality covered in the RFCs. They instantly kill backgrounded apps in a mad race for style over function.
I. Could. Go. On.
TL;dr I hate Apple with a burning passion and only inertia keeps me in the Apple dev world.
Microsoft has an unhealthy obsession with backwards-compatibility due to their enterprise market base. Apple has the opposite problem, where they consistently remove legacy features (in many cases, features that are not even "legacy" in nature but required for normal usage scenarios) without giving users a chance to catch-up. Sometimes this is warranted, as it is with removing optical drives, other times, it is not -- such as dropping 32-bit support when neither Windows nor Linux have any intention of doing so in the near future.
This philosophy has been accelerating in absurd fashion lately, whether it's dropping the headphone jack from the iPad Pro, the "butterfly keyboard," or leaving the MacBook Pro with no USB-A ports or SD card slot -- two extremely well-adopted technologies that are still ubiquitous nearly half a decade from the release of the all USB-C MacBook Pro.
And this chain of side-stepping an obvious issue went on for well over a decade?
I know it’s human behaviour to delay things until we absolutely have to act, like we all learned in school doing the less important homework first. But in software there are always dueling pressures to modernize while supporting backwards compatibility.
A 10-year timeframe is sufficient here; beyond that, if you want to use old stuff, it’s on the user or the publisher to use some virtualization solution to provide long-term support.
Since the 64-bit x86 instruction set was implemented to sit on top of the legacy 32-bit one, it is literally impossible for x86 hardware to cease supporting 32-bit into the future. So we have Apple here making the choice to cease software support of this backward-compatibility feature already baked into the CPU, to save a few bucks or whatever.

There is no performance benefit to abandoning 32-bit. Your Mac won't be faster on Catalina because 32-bit support is gone, despite what Apple is trying to imply in their marketing, because the CPUs they use are still built from the ground up to support 32-bit. A negligible-by-today's-standards amount of RAM and disk space used by the system will be conserved, but that's really it.

So Apple, the wealthiest software company ever, has decided to stop funding software support of a feature already present in the hardware they are selling you at ungodly markups, thus screwing over both customers out of legacy app support, and indie devs with costly extra refactoring work that in many cases offers no perceivable benefit to end users. This is why it is infuriating to be an indie dev supporting Mac right now.
As for my specific case: my software, under the hood a combination of several 32-bit and 64-bit processes, still runs happily on Windows, and most of my customers are on Windows, particularly large organizations that purchase site licenses and sponsor new features or customization projects. But in order for it to be 100% 64-bit, so much of it would have to change due to old dependencies as to require rewriting a huge chunk of it from scratch. This is my technical debt to bear.

When it became apparent a few years ago that Mac would phase out 32-bit support, I began work on a full 64-bit rewrite, but I've been sidetracked as customization projects and consulting gigs from Windows-only customers continued to pour in. With limited resources as a small company, I had to prioritize the projects that added tangible features to my software right now, at the expense of progress on the 64-bit rewrite.

I continue to work on the rewrite so that I can keep supporting the Mac platform, but it sure feels like a whole lot of work to port features to 64-bit that won't bring any noticeable improvement to end users! In the meantime I feel terrible that Mac users who upgrade won't be able to use my software until I complete the 64-bit version. But it just never made sense for a company of my scale to devote resources to racing to complete a 64-bit rewrite for the sake of those 30% of sales. These are the kinds of choices a small business must make, and thus is the position I find myself in as an indie Mac developer.
I hope this provides an understandable real world answer to the question of "how can any app under active development in 2019 still be relying on 32-bit?".