Sorry macOS users, but Apple has gone too far for some of us devs (gridsagegames.com)
142 points by samcat116 on Sept 29, 2019 | 114 comments



As someone who recently worked in game dev I'm not sure why 32 bit was even a factor as 64 bit is likely your default target anyway - has been for a while. So, killing 32 bit is not a real issue.

Notarizing can be seen as just another cost of targeting a platform. So do it. You will need at least one machine anyway to do basic testing on. So use that. Get the cheapest iMac or whatever you require. So, Mac hardware isn't a real issue.

I found the statistics a bit strange as well. Sales were 2% linux, 4% mac, 94% windows. Yet support was divided 30% linux, 50% mac and 20% windows. This suggests linux is more problematic per sale than mac. Yet he has no issue with linux. If this were me, I'd want to know exactly why linux/mac was generating so many tickets. The answer will likely help the overall product.

Personally, I think you need to jump through so many hoops to release software that none of the issues mentioned thus far should be big enough to stop you. The real reasons should be: is there a demand in this target market; will that demand generate sufficient profit?

As I'm starting my own journey on this I see all the platforms and app stores have various issues. Everything has tradeoffs. The article's author needs to come up with better arguments. I just don't see his reasons as relevant in comparison to the other much larger costs associated with development.

I'd actually recommend targeting multiple platforms just to force bugs to come out. Different platforms / compilers see and expose different issues in your codebase. So far I'm quite happy to have cross platform support even when I don't necessarily release on those platforms yet.

I find art, sound and game assets are more expensive than an apple dev account or even a mac computer. Others have written in more detail on this. Even generic business costs exceed the subscription/computer costs he's stating as being barriers.


What surprises me is the fact that the author doesn't talk about the craziest of Apple's recent choices: the deprecation of OpenGL. This is much worse than the deprecation of 32 bit and, IMHO, it's just Apple giving up on being a relevant gaming platform. This, plus the lack of nvidia drivers and of support for Vulkan.

All the small issues here show, together, that indeed Apple doesn't give a shit about gaming and about interoperability with other platforms. They want developers to be faithful to their locked garden and their technologies.


> All the small issues here show, together, that indeed Apple doesn't give a shit about gaming and about interoperability with other platforms. They want developers to be faithful to their locked garden and their technologies.

Especially if it's only 4% of your sales. If 4% of your sales doesn't cover the cost of a Mac, code signing, and the cost of support on the Mac then it's a _net negative_ - you'll make more money not being on the Mac.


Completely agree. These are real issues with MacOS and Apple. More generally, if you're writing productivity apps on iOS then Apple is also your competitor. Plenty of devs have warned me not to write iOS apps that line up with Apple's existing offerings.


I'm not a game dev but just as a developer I do think a lot of these assumptions are a little unfair. You can't call things "not a real issue" just because they're not a big issue to you.

> ...64 bit is likely your default target anyway - has been for a while. So, killing 32 bit is not a real issue.

If you depend on 32-bit software, no, it is a real issue. Updating an app and dependencies to all be 64-bit can be a pretty big issue. Not everyone is on a modern stack.

> ...linux is more problematic per sale than mac. Yet he has no issue with linux.

If I were on a PC I would have less issue with linux too, simply because I don't need to buy special hardware or switch computers to deal with those tickets.

> If this were me, I'd want to know exactly why linux/mac was generating so many tickets. The answer will likely help the overall product.

Will it? If they're platform issues, they're not improving the product, just getting it to run.

> Personally, I think you need to jump through so many hoops to release software that none of the issues mentioned thus far should be big enough to stop you.

You don't know this guy's resources though. That's like telling someone, if you've already come 25 miles, 5 miles aren't that much more. If they're in a car, sure. If they're running, yes it is: every mile is a lot.

> The real reasons should be: is there a demand in this target market; will that demand generate sufficient profit?

I think the fact only 4% of sales come from macOS means there isn't much demand or profit, right?

> As I'm starting my own journey on this I see all the platforms and app stores have various issues. Everything has tradeoffs. The article's author needs to come up with better arguments.

The arguments are good. You yourself just said it. Everything has tradeoffs, and the tradeoffs of developing for macOS are just not worth this person's time.

> I just don't see his reasons as relevant in comparison to the other much larger costs associated with development.

Except do you really know his costs, and the resources he has to cover said costs?

> I'd actually recommend targeting multiple platforms just to force bugs to come out. Different platforms / compilers see and expose different issues in your codebase.

Wait, what? What's the point of killing bugs for a platform you don't support though? Maybe when I develop for web I should test in IE 5. It'll surface so many "bugs" in my site.


IE 5 is a ridiculous strawman example you bring up in bad faith to drive your point. You know it's ridiculous because you specifically list an ancient version that is obviously a bad target. This does not refute what I wrote.

I can point to multiple bugs revealed just by having multiple platform support. Other devs I've spoken to have also had this experience. Different platforms have different tools available.

A computer worth $1600 is dwarfed by the cost of a business name, software tools, insurance, utilities, etc. Art, sound, etc. are also much higher costs.

I'm a one-man indie dev. I'm guessing I know a little bit more about this issue. Perhaps not. But I do manage to pay the bills, so I must know something.


> You will need at least one machine anyway to do basic testing on.

You wouldn't if Apple didn't prohibit using VMs. If you want to test compatibility with the actual hardware you probably should have more than one machine anyway.

> This suggests linux is more problematic per sale than mac.

Or that Linux users are more likely to report their problems. IIRC there recently was an article on HN by a game developer who supported Linux in part to get quality bug reports.


”You wouldn't if Apple didn't prohibit using VMs.”

Nitpick: Apple doesn’t completely prohibit VMs; it prohibits VMs not running on Apple hardware that is already running the OS, and puts restrictions on the purposes for which you can run them. https://www.apple.com/legal/sla/docs/macOS1014.pdf:

”to install, use and run up to two (2) additional copies or instances of the Apple Software within virtual operating system environments on each Mac Computer you own or control that is already running the Apple Software, for purposes of: (a) software development; (b) testing during software development; (c) using macOS Server; or (d) personal, non-commercial use.”

I think one intended use for that is for testing your software with older or beta OSes, but “already running the Apple Software” to me, implies running the same version in the VM as on the host.

I also think they don’t allow running in just any VM because that would make running hackintoshes perfectly feasible, losing them significant hardware sales.


Notarization and needing a paid developer account don't change the requirements for running software on OSX relative to the previous version.

Before, you needed to sign with a "developer ID distribution" certificate if you wanted to avoid having to tell users to "ctrl-click DMG then click open" to bypass gatekeeper. This certificate required a paid developer account.

Now you can still distribute un-signed, un-notarized software and tell users to ctrl-click open to bypass gatekeeper. All this does is require notarization (Apple running malware analysis) to have it run without a ctrl-click.
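For the curious, you can check how Gatekeeper will treat a given build with the stock spctl tool (the app path here is a placeholder):

  # Ask Gatekeeper whether it would allow this app to run; the verbose
  # output also names the reason (e.g. an accepted Developer ID source).
  spctl --assess --verbose MyGame.app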


I imagine Steam doesn't allow Gatekeeper bypasses. Still, notarization is designed to be easy to script: https://developer.apple.com/documentation/xcode/notarizing_y... - so it really shouldn't be an issue.
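For anyone who hasn't done it, a rough sketch of what that scripting looks like with the 2019-era tools (the bundle ID, Apple ID, and file names below are placeholders, and the app-specific password is assumed to be stored in the keychain as AC_PASSWORD):

  # Upload a zipped build for notarization (placeholder identifiers).
  xcrun altool --notarize-app \
    --primary-bundle-id "com.example.mygame" \
    --username "dev@example.com" \
    --password "@keychain:AC_PASSWORD" \
    --file MyGame.zip

  # Once Apple reports success, staple the ticket so Gatekeeper
  # can verify the app offline.
  xcrun stapler staple MyGame.app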


The only issue is that it can take an hour or more - https://news.ycombinator.com/item?id=21110622 - and polling for the status to finalize would waste build minutes on your CI.


Well, but you don't need to notarize all builds, just those that went through internal QA.

Split the build process so that you can click "version 1.2.3" and have it notarized (this is doable in Jenkins for example)
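As a rough sketch of the gating (assuming QA approval is marked by pushing a vX.Y.Z git tag; notarize.sh is a hypothetical wrapper around the xcrun steps):

  # Only hand QA-approved, tagged builds to the notarization service.
  if git describe --exact-match --tags HEAD >/dev/null 2>&1; then
    ./notarize.sh dist/MyGame.zip  # hypothetical helper script
  fi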


My limited experience notarizing a hello-world React Electron app took 10 to 15 minutes. Not that much, but a large increase from the 1-minute build time before.


In my (limited) experience, for non-GB-sized apps, the turnaround time is closer to five minutes.


Can you hand it off to be notarized and then end that job, and have a hook to pick up when that's done?


Yikes. If you can't figure out how to make your code 64-bit safe, or how to make code signing work, you are effectively requiring your users to give you root access. Completely unacceptable for a game.


Completely agree. The developer account is a great way to disincentivize malicious apps from being made. I would accept the following reasons for not supporting MacOS:

* lack of Vulkan support
* porting code to the Apple SDK
* cost/benefit for small indie titles

Pointing out how many people logged bugs doesn't mean that the OS is buggy. It simply means you've written shitty code. Another odd point with OP is that he already made the effort to start supporting it. Dropping support is always more expensive than simply fixing some issues, as you've already made the investment.

Most developers seem to have rational decisions behind their support. This post seems like they're jumping on the Apple hate bandwagon.


1. You also need to pay apple a subscription, update your hardware, change your build process. I think there is also a review step during the build.

2. All the existing tooling (not the game, but the build process) will need to be 64 bit to run on this new hardware. So the build needs to be re-worked

3. People on Mac require more support, which is harder because the dev is not a Mac user.


The developer has every right to support or abandon whatever platforms they want.

But there’s some FUD here:

1. There is no review step. Notarization is an automated process that takes about an hour. It doesn’t require a substantive change to your build process. You just submit your build, get a notary receipt file back from Apple about an hour later, and “staple” it to your build using a single tool. It’s not burdensome.

2. The system requirements for preparing a build for notarization are pretty liberal if you look at actual hardware usage. Of course some machines fall off the bottom, but it’s not like they’re trying to drive revenue with this. It’s like a teardrop in the ocean.

3. All of this is even less burdensome on developers who follow any kind of process for testing. You would already have a capable machine being used for testing, and would already be seeing delay between your builds being prepared and your distributions going out. Apple’s part of notarization could happen while testing is in process, meaning zero delay. If this doesn’t apply to you, fine. But it probably means you weren’t treating the platform all that seriously in the first place.


> It’s not burdensome

As defined by who?

  1. Submit build
  2. Wait an hour
  3. Retrieve notary receipt
  4. Staple to correct build
  5. Upload build
Sure sounds more burdensome to me than:

  1. Upload build
Edit: Lol, people arguing with reality. It may or may not be a good thing, but arguing that it doesn’t add any burden to the developer seems hard to me.


Wouldn't you have jenkins (or whatever your build system is) do that all for you?


Possibly, but even modifying your Jenkins script is a ‘burden’. Apparently their polling mechanism also makes that a bit painful.


Not necessarily.


You know what's also a burden? Writing a good game. So I'll just write a shitty one to decrease burden.

Isn't burden the thing you're actually getting paid for when you sell the game?

If adding 2 steps to your build process is such a burden that it results in you not supporting an entire platform, you're either lazy or looking for an excuse.


Your last point pretty much sums it all up. The dev seems to be basically looking for an excuse to drop the platform.


As far as I can tell they never supported it in the first place. Entire thing is really weird.


Do these guys complain this much about releasing games on consoles too?


Ironically, releasing on consoles got easier over the years. Patch certification cost (and possibly the entire requirement for indies) got dropped and programs like ID@Xbox allow you to release your game for free.


Probably, consoles aren't mentioned at all in the blog post or on the game's website.


Consoles have a user base large enough to be worth the effort.


That’s a fundamentally different platform I think.


In some ways, it isn't that different; consoles are just computers with some platform-weirdness. But I do actually agree insofar as I can reasonably target NT, Darwin, and Linux from one machine (so long as it's a Mac or I'm willing to break the Apple EULA), but targeting consoles, so far as I know, requires purchasing additional hardware.


Isn’t this every game ever until a few years ago? The only OS where I’ve ever been bothered by ‘this code isn’t signed’ has been OSX.


In what way is lack of 64 bit support + lack of code signing equal to elevated permissions?


> you are effectively requiring your users to give you root access

Wait, it's either code sign and you don't get root or don't code sign and get root? How retarded is macOS? Why is there no "no code sign and no root, just run as the user"?


I'm pretty sure on Linux if you wanted to run Quake3 you had to run as root due to needing direct access to the GPU. Is Linux "retarded" too?


Plenty of coding practices from 1999 are completely unacceptable today.


Yes. That is why Apple requires you to get your code signed.


Please don’t use that term in the way you’re using it. It’s offensive and unnecessary.


I'm quoting the guy above me. Go tell him.


Two wrongs don't make a right.


Or: you don’t codesign and users ignore your app in favor of the one that doesn’t require rebooting to recovery mode to disable core platform security features, so that chrome, etc can break your machine.

Longer term I imagine that OS X will simply have a non-overridable sandbox that tightly restricts what any unnotarized app can do. E.g. if you aren’t notarized you get access to your own container and no other part of the file system.

Alternatively you could just sign your code properly and update it for hardware that’s existed for 15 years.


The analogy falls apart for Chrome, since it's safe to assume Apple either gives scanning priority to Google or has outright whitelisted their account (since Chrome updates so frequently and there's opportunity for Google to throw money at Apple)


Notarization takes minutes and is presumably automated, so why wouldn’t Chrome do it?

You’re doing a great disservice to the engineers at Google if you think the actual release work is so short that a few minutes for notarization is a problem.


Does Chrome release multiple builds a day? Scanning usually takes less than an hour.


I certainly don't blame the developer for not supporting macOS, but the reasons he outlines in his post aren't particularly compelling to me. It seems to me that porting his game over to macOS would itself be a larger undertaking and a more valid complaint than the ones he's listed.

The notarization process potentially reducing the number of builds he can push to users in a day from 6 to something less seems like a feature, not a negative.


The first commercial downloadable game I released back in 2015, the one that let me become a full-time indie developer, has done roughly 25% of its revenue over the last four years on Mac. This is adding together the % of copies sold on Steam to Mac users and the Mac App Store, where my game was featured on the main page for three months. Notarization, 64-bit support, and all that aside, the audience on Mac that you can sell to is massive, especially since a lot of developers just ignore the platform - I'm fully expecting to support as many macOS devices as I can for my next game.


I feel like there's also something of a snowball effect here because in certain genres the available set of games are good enough to rarely need to reboot into Windows. All the Paradox strategy games, X-COM, a variety of indie games built on Unity, etc. It feels like the only thing I'm ever consistently missing out on is FPS games, and even then there's been workable ports of stuff like Deus Ex.


Anything requiring more than Intel integrated graphics tends to not work well on the Mac. The vast majority of Mac models can't even take a dedicated GPU upgrade, so you're stuck with whatever Apple soldered to your motherboard and at their mercy for graphics driver upgrades.

The only time I've seen anyone try to seriously game on the Mac was a friend of mine who built a Hackintosh with a (brand new at the time) Nvidia GeForce GTX 1080 Ti. He couldn't even maintain 60+ FPS in Rocket League. So it's no wonder that games requiring that level of GPU performance don't sell well on the Mac.


What is your game?


So the major points are dropping 32-bit support and Apple forcing you to sign your executable? The author claims that you need to pay Apple for the signing. As far as I know this is not true. You just need to pay for being on the application stores ...


I assume this refers to having an Apple Developer account, which is USD 99 a year. Some of the points may be valid - support costs versus expected revenue, etc. - but mostly these points seem really weak.

The headline feels like it should be “Sorry macOS users, it doesn’t make sense for me to support macOS”. The rest seems like a bit of clickbaity hyperbole and throwing in a bunch of semi-related arguments.


You have to pay for the signing. You also have to wait several minutes for the notarization process to finish, and submit every program to Apple.


All malware checkers must read the app binary to be effective. Apple forces the app to be scanned once, not one million times on one million desktops.


Apple also forces every single app for macOS to be submitted to them. Seeing as how Apple has historically been willing to compete with the applications on their platform, that seems to at least have the potential to be a problem.


If they want to use your compiled binary to compete with you somehow, buying one copy per app they want to put out of business seems like a better option than setting up a whole notarization scheme to trick you into uploading it to them.


I paid hundreds of dollars for a Windows code-signing certificate. The Apple developer program is a relative bargain. If $99/year is stopping this developer, I’m not really sure what to say.


Notarization is free and in my experience it can take as little as 60 seconds for it to complete.

https://developer.apple.com/documentation/security/notarizin...



You have to pay the $99/yr to be in the developer program to get access to the notarization service.


Also I heard that you need fairly new Mac hardware to actually use the notarization service.

Now try to integrate it with your CI, because the feedback that the notarization is complete is sent via email.

(FWIW: I don’t have first-hand experience, though I still have an Apple developer license. Probably won’t renew.)


I've scripted this for work, but it does have to run on a Mac (for me, a late-2013 MacBook Pro), which would make it difficult to do in the cloud.

There is a utility (written in Java) that you run through "xcrun" that can upload, check status, fetch and "staple" the result. It usually turns around within 15 minutes for me, but I haven't measured the exact time. (I'm just polling every five minutes.) Check the "log" file, because it can successfully notarize and then report in the log that your signatures are all screwed up.
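For anyone scripting the same thing, the loop amounts to something like this (a sketch only; the request UUID comes from the upload step's output, and the credentials are placeholders):

  # Poll notarization status every five minutes until it leaves
  # "in progress", then staple the app.
  UUID="$1"  # request UUID printed by the --notarize-app upload
  while true; do
    STATUS=$(xcrun altool --notarization-info "$UUID" \
      --username "dev@example.com" \
      --password "@keychain:AC_PASSWORD" | awk '/Status:/ {print $2; exit}')
    [ "$STATUS" != "in" ] && break  # "in progress" prints as "in" here
    sleep 300
  done
  xcrun stapler staple MyGame.app  # and remember to check the log file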

Details here if anyone needs to do this: https://developer.apple.com/documentation/xcode/notarizing_y...


Actually, that feels like a bigger issue: does all of this have to run on a Mac? That basically precludes build farms, since Apple doesn't make the Xserve line anymore and doesn't play nice with virtualization. I guess you could build a farm of desktops...


I’ve long given up on supporting macOS.


You’re not kidding about the email. It’s email or manual polling, no webhooks. Strange. Well, I can suddenly imagine a lot of CI servers will be running SMTP. ;-)

> After uploading your app, the notarization process typically takes less than an hour. When the process completes, you receive an email indicating the outcome. Additionally, you can use altool with the notarization-history flag to inspect the status of all of your notarization requests

https://developer.apple.com/documentation/xcode/notarizing_y...



I am not a Mac developer, but this doc https://developer.apple.com/documentation/security/notarizin... claims you need 10.13 or above, which seems to run on 2009-2010 hardware. I know the latest version of Xcode 10 requires 10.14, but even that is about ~2012 hardware and newer. I think the upcoming 10.15 also has the same hardware requirements as 10.14.

Anyway, even if we say 10.14, I wouldn't call 2012 models fairly new.


If it's anything like iOS dev, you need Apple hardware to access the tools to sign your builds. You can't just run OS X in a VM.


You could potentially outsource the signing to a CI service that has the requisite Apple hardware though.


Well, you can... using hacked VMware and Hackintosh tools, it's just illegal :)


64-bit support first arrived in Leopard in 2007. Everyone has had plenty of time.


That's what I don't understand about this. He's using 32-bit deprecation as one of his main reasons for dropping macOS support, yet his decision to exclusively use the older architecture arguably makes the user experience worse. Of course they would like new features, but a modern 64-bit foundation should not be out of the question. It's almost like Microsoft has incentivized this by supporting the older architecture. I know that probably isn't the case, but it seems like this developer should modernize his program.

Edit: take my words with a grain of salt-- I'm not an experienced developer.


Also, there was only one set of Macs, released in 2006, which had a 32-bit Intel processor, as the transition to 64-bit happened only a year after the Intel transition. We're not merely long past the point where you have no excuse for not supporting 64-bit; we're long past the point where it made any sense to support 32-bit. Even in 2010, if you started a port to macOS and decided to target 32-bit, you would be making a bad decision.


I’m glad Apple is locking down binaries. It’s terrifying what malicious binaries are capable of.

For games in particular there should be better sand boxing options.


Signed binaries don't stop malicious binaries. Signing can help prevent the spread of malicious binaries once they've been identified as malicious, but it does nothing to stop the initial mayhem.


I don't think that the signed binaries are intended to prevent malicious binaries but rather to create an evidence trail once a malicious binary has been submitted.

Since you have to sign with a key that you have paid Apple to authorize, Apple has the payment details that connect you and the signed binary together. In addition, the membership has license conditions that in turn allow Apple to sue you.

The intent is a deterrent effect.


That’s not true. Apple prevents unsigned binaries from being executed at all.


That’s not true. You can bypass that.


Well of course it can be bypassed, but it certainly doesn’t “step in” at some point either as was suggested. It blocks all unsigned binaries regardless of any damage they’ve caused.

Though this morning I realized I misunderstood the point of the grandparent here. He was saying you can have a signed binary that is still malicious. I thought they were talking about what apple does in the case of unsigned.


The fact that they are capable of that is an OS flaw. Signing and all the rest is repairing a leaky dam with chewing gum.

Our OSes all date back to a time when security simply was not such a concern. All binaries effectively have near-root access. A modern OS would be least-privilege all the way.

Btw web browsers show that it is possible to run untrusted code locally in a sandbox and do so fairly safely.


> Of course Mac support request ratios will vary by game due to architecture and player base, but there’s little question that they’re higher in general, ...

Why? What is the nature of these support requests? Are they specific to your game? Could it be because your game was not properly made for macOS to begin with?


In the minds of Apple, anything not programmed solely for the Mac is 'not properly made for macOS'. It's shades of "you're holding it wrong" again.


Seriously, OS X went 64bit 12 years ago. x86 has been 64bit for longer.

Maybe, just maybe, the problem is developers refusing to actually move forward.

It’s especially galling given game devs are the group that most prominently complains about OS X being behind the times.

As for “needlessly” dropping 32bit: if your poorly written 32bit app starts up you necessarily force OS X (and the user’s machine) to pull in 32bit variants of almost the entire platform. Thanks for that. It also means Apple has to maintain 32bit versions of every framework. I can speak from experience here: maintaining a 32bit version of javascriptcore was a massive amount of work. Every new JS feature needs duplicated JIT work, and needlessly complicates the entire JSC codebase.

That aside, Apple manages to do this - and I can’t imagine your game approaches the complexity of an entire OS - so if you can’t make your software work on a 64bit system then the problem is your code, not Apple.


The number of 32bit Intel Macs is surprisingly small, too. Core Solo/Duo models ran from January to November 2006 (bar one Mac mini model that survived into 2007).

I'd be very surprised if there's a 32bit Mac that'll run their game acceptably well. I'd be surprised if it ever actually supported 32bit Macs. So why was it 32bit in the first place?


It's a Windows game. People are asking him about the possibility of a Mac port.


Ok, but if the game is only a few years old why didn’t it compile for 64bit?

I recognize MS insisted on screwing the market by selling 32 vs 64 bit as separate versions of Windows for many years, but even then surely most Windows machines have been running 64 by default for a decade?

But seriously: if you made a new piece of software in the last 15 years that only works in 32bit, then you made a choice to target an obsolete platform with worse performance (again, game developers complaining about perf while only building 32bit, throwing away 10-20% perf for no reason, strikes me as hollow)


I mean, I agree with you completely, but what I've found is that a lot of games, whilst 64-bit themselves, often have a lot of dependencies that, for whatever reason, still have 32-bit binaries in there somewhere.


I occasionally wonder why Mac/ia32 existed as well.

My conclusion has been that when Macs first switched to Intel there was a combo problem - Intel's mobile chips were still mostly 32bit (remember Intel dropped the ball on 64bit), and more importantly, while PPC Macs were capable of 64bit (the G5 cheese grater), the majority of the desktop was iffy, coupled with things like Flash, etc. that were 32bit.


> That aside, Apple manages to do this

Also pretty much everyone else - I just checked the status of all the apps present on my Mac (System Report) and the only 32-bit apps on here are QuickTime Player 7 (long-deprecated) and a bunch of garbage Adobe background update processes.


Yup, complaining about deprecating 32bit seems to primarily be a choice by game devs who care about frame rates, but will happily throw away the 10-20% perf win that you get on x86_64


This article fails to address two key points that could significantly weaken its core arguments if answered. Those points and the questions that arise from each are:

1) Apple charges a flat rate of $100/year for notarization access. Steam and the Mac App Store take 15-30%. What additional percentage of Mac sales (either subscription and/or one-time) is being spent on this annually by this developer?

$100/year & 100 units/year = $1.00/unit

$100/year & 1000 units/year = $0.10/unit

$100/year & 10000 units/year = $0.01/unit

EDIT: Steam indicates that this game has sold 0-20,000 units (max 168 concurrent players) between May 2015 and September 2019, so using a straight flatline average of the best-case scenario for sales, the cost to date per unit is:

$500/5y & 23000 units/5y = $0.02/unit

2) Notarization can be added to command-line build scripts, Makefiles, and third-party processes, so that it is simple to ensure that builds are notarized as part of releases to Steam. What, if any, engineering obstacles in the developer’s build process blocked this integration? https://developer.apple.com/documentation/xcode/notarizing_y...


The argument from the article is that in the non-Apple ecosystem the ongoing fee is 0, which is smaller than all three of your listed examples. Also, as opposed to "simple command-line build scripts", in the non-Apple ecosystem there is no script required to release versions. I don't see the argument weakened, but rather strengthened.


If Windows also required code signing as a deterrent to distributing malware, we'd see similar prices (depending on the code-signing reseller; currently the cheapest is $59/year for 3 years). Instead, we have a huge third-party antivirus market that only deals with malware after the fact. Notarization is Apple's approach to pre-screening applications, and its detection will certainly keep improving.


Your argument is based on an unstated framing assumption:

“A cheaper and simpler process is better for developers and users, and therefore the absence of both cost and complexity is universally the best solution, without regard for any other factors.”

That is not a widely agreed-upon assumption. Since the post relies on this same assumption-by-framing approach, its arguments are weakened by their dependence on an unstated assumption. Answering my questions would force that assumption to be considered openly - and potentially challenged.

Is it better for users that Apple is doing this, regardless of the extra cost and time it imposes on developers?

That’s the question that should be being asked here. Unfortunately, it is not.


Does Apple have to sign games distributed through Steam? I thought that only applied to their own app store.


Mojave and Catalina added more restrictions around non-App Store programs where you need to sign and notarize.

I know I’ll probably catch downvotes for it, but as someone in security I think this is a good thing for the average user. I know it makes it more of a walled garden, but I also know that the harder it is to run untrusted code, the better for the average user.

The “terrible burden” is a couple of extra button clicks if working in Xcode or you could automate it with a script. I did the latter for the project I work on and it’s been fine.

That probably drives a lot of folks to Linux and that’s okay. I’m for diversity in platforms but I’m also for strong protections for the users.


Letting megacorporations decide what you do and do not get to do with your personal property isn't really what I would count as a "good thing" in any circumstance.

Apple has gotten away with eliminating user freedoms time and time again under the guise of security. In many scenarios Apple's attitude towards its end-users increases perceived security while causing significant hurdles for actual security, such as how iOS no longer allows you to turn off Bluetooth or WiFi from the Control Center. Somehow I've never really seen an explanation for why this is better for security for the vast majority of people.


You are still able to run unsigned (and now unnotarized) software. It requires holding down a single key when first launching it. I’m pretty sure the trade off here - to prevent the average, relatively uninformed user - from downloading and running random software is very much a worthwhile trade off for malware protection etc.


Sorry, but this reads like "the seatbelt and airbag in my car prevent me from fully utilizing my personal property".

For 99.99% of users, notarized apps are a positive. For the 0.01% who need it disabled, press the CTRL button.


You can still bypass Gatekeeper with Ctrl-click; it's less Apple deciding what users do with their property and more Apple making it less likely users get mad that their Mac got some malware (assuming notarization means it catches a good amount of malware before it's sent into the wild)


Don't buy a Mac then. It's not like they're forcing you.


“Starting October 14th, 2019 Steam will require all new macOS Applications to be 64-bit and notarized by Apple.” (https://partner.steamgames.com/doc/store/application/platfor...)


> Beginning in macOS 10.15, notarization is required by default for all software.

https://developer.apple.com/documentation/security/notarizin...


As a Mac user why should I care about this developer who's never released Mac software and probably wouldn't have anyway?

When someone like the Omni Group or Panic writes a similar screed, then it will be newsworthy. This? Hardly...


As an active Apple dev with five commercial apps, I agree with the sentiment in the article. I'm ready to leave Apple the moment something better comes along.

Since Steve died, every decision at Apple has been anti-dev and anti-user.

The price/performance of the hardware is awful. Apple is the current tech leader in planned obsolescence. Old hardware is iCloud locked by default with no option to contact the registered owner to tell them that they forgot to unlock it. Now they want to extend that practice to laptops. The developer documentation is universally awful. The version changes in Swift are so drastic and frequent that you can't find solutions for the current version because the Internet is polluted with information about previous versions. They are glacially slow to fix bugs. They have removed basic networking functionality covered in the RFCs. They instantly kill backgrounded apps in a mad race for style over function.

I. Could. Go. On.

TL;dr I hate Apple with a burning passion and only inertia keeps me in the Apple dev world.


why is dropping 32-bit support even a consideration?


People will mention new libraries requiring 64-bit, or reducing the installed size of macOS, but I think both of those explanations are (mostly) BS.

Microsoft has an unhealthy obsession with backwards compatibility due to their enterprise market base. Apple has the opposite problem, where they consistently remove legacy features (in many cases, features that are not even "legacy" in nature but required for normal usage scenarios) without giving users a chance to catch up. Sometimes this is warranted, as it is with removing optical drives; other times it is not -- such as dropping 32-bit support when neither Windows nor Linux has any intention of doing so in the near future.

This philosophy has been accelerating in absurd fashion lately, whether it's dropping the headphone jack from the iPad Pro, the "butterfly keyboard," or leaving the MacBook Pro with no USB-A ports or SD card slot -- two extremely well-adopted technologies that are still ubiquitous nearly half a decade after the release of the all-USB-C MacBook Pro.


Maybe Cogmind depends on libraries which don’t support 64 bit? I agree though, it’s a bit weird. Maybe it’s harder for game-related libraries to switch, compared to the ones I’m used to using in my apps.


Does this mean they are using unmaintained gaming libraries? Or is it just turtles all the way down, with people not wanting to take responsibility for switching to 64bit because some random thing they depend on didn’t upgrade, so they aren’t to blame?

And this chain of side stepping of an obvious issue went on for well over a decade?

I know it’s human behaviour to delay things until you absolutely have to do something, like we all learned in school with less important homework. But in software there are always dueling pressures to modernize while supporting backwards compatibility.

10-year timeframes are sufficient here, and beyond that, if you want to use old stuff, it’s on the user or the publisher to use some virtualization solution to provide long-term support.


Old game, potentially (you can't continue to sell it without updating it).


Cogmind dates to 2015, it's hardly 'old'.


And people like to make light of the "Y2K fiasco" - here's how Y2K happened...


It seems better to just say it isn't worth the effort because you're a small indie dev who doesn't have time to handle a bunch of heterogeneous platforms. None of those things seem that bad if you had more than one person contributing.


Aging thread by now, but had to chime in with some clarifications as an indie dev dealing with similar issues, to respond to those saying stuff like 'you're a bad dev if your app isn't 64-bit by now'. I am a company of 1 humbly maintaining a cross platform app that's been around long enough to predate the 64-bit era. It's about 70%/30% Windows/Mac sales. When the majority of machines out there became 64-bit years ago, I happily optimized the components of my app that stood to benefit from 64-bit, such as intensive media processing tasks and other bottlenecks. But since a lot of computing tasks just don't need what 64-bit offers, parts of my app that were working well enough using 32-bit components stayed that way, and users never knew the difference or cared because over the years I strategically targeted the areas to rewrite as 64-bit without having to reinvent the wheel and devote resources to rewriting everything.

Since the 64-bit x86 instruction set was implemented to sit on top of the legacy 32-bit one, it is literally impossible for x86 hardware to cease supporting 32-bit into the future. So we have Apple here making the choice to cease software support of this backward compatibility feature already baked into the cpu, to save a few bucks or whatever. There is no performance benefit to abandoning 32-bit. Your Mac won't be faster on Catalina because 32-bit support is gone, despite what Apple is trying to imply in their marketing, because the cpus they use are still built from the ground up to support 32-bit. A negligible-by-today's-standards amount of RAM and disk space used by the system will be conserved, but that's really it. So Apple, the wealthiest software company ever, has decided to stop funding software support of a feature already present in the hardware they are selling you at ungodly markups, thus screwing over both customers out of legacy app support, and indie devs with costly extra refactoring work that in many cases offers no perceivable benefits to end users. This is why it is infuriating to be an indie dev supporting Mac right now.

As for my specific case, my software, under the hood a combination of several 32-bit and 64-bit processes, still runs happily on Windows, and most of my customers are on Windows, particularly large organizations that purchase site licenses and sponsor new features or customization projects. But in order for it to be 100% 64-bit, so much of it would have to change due to old dependencies as to require rewriting a huge chunk of it from scratch. This is my technical debt to bear. When it became apparent a few years ago that Mac would phase out 32-bit support, I began work on a full 64-bit rewrite, but I've been sidetracked as customization projects and consulting gigs from Windows-only customers continued to pour in, and with limited resources as a small company I had to prioritize those projects, which added tangible features to my software right now, at the expense of making progress on the 64-bit rewrite. I continue to work on the rewrite so that I can keep supporting the Mac platform, but it sure feels like a whole lot of work to port features to 64-bit when it won't bring any noticeable improvement to end users! In the meantime I feel terrible that Mac users who upgrade won't be able to use my software until I get around to completing the 64-bit version. But it just never made sense for a company of my scale to devote resources to racing to complete a 64-bit rewrite for the sake of those 30% of sales. These are the kinds of choices that a small business must make, and thus is the position I find myself in as an indie Mac developer.

I hope this provides an understandable real world answer to the question of "how can any app under active development in 2019 still be relying on 32-bit?".



