
This would be a lot more interesting if f.lux was actually distributing source code. Instead, they were just distributing a binary that you sign with your own dev cert.

Side-loading pre-built binaries like that is a huge risk for users and never had a chance of being tolerated by Apple. Such abuse puts the new free-tier Xcode dev program in jeopardy.
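
For what it's worth, the signature itself is easy to check; the problem is what a valid signature does and doesn't tell you. A minimal Swift sketch (macOS Security framework; the app path is a made-up example) of asking the OS whether a bundle's signature is intact:

    import Foundation
    import Security

    // Ask macOS whether a sideloaded bundle's code signature is intact.
    // The path is hypothetical. A valid signature only proves who signed
    // the binary, not that the code inside is benign.
    var code: SecStaticCode?
    let url = URL(fileURLWithPath: "/Applications/Sideloaded.app") as CFURL
    if SecStaticCodeCreateWithPath(url, [], &code) == errSecSuccess, let app = code {
        let status = SecStaticCodeCheckValidity(app, [], nil)
        print(status == errSecSuccess ? "signature valid" : "signature broken or missing")
    }

That gap between "validly signed" and "actually trustworthy" is exactly the risk with pre-built binaries that end users sign themselves.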




> Side-loading pre-built binaries like that is a huge risk for users

No, it is not. You can do that on every single computing device out there - except those from Apple. There is some risk if you are stupid about it, but not a huge one. A warning screen would be enough.

> never had a chance of being tolerated by Apple

It's my device, not Apple's. (Well, I would never actually buy an Apple device because of this, but those who do buy them should have the right to install whatever they like.)


You can do it on non-iOS Apple devices, too.


> It's my device, not Apple's.

Except that when a device gets hijacked by malware, the user always blames the manufacturer of the OS. I guess Apple, knowing this, just has strict policies about it. Apple has a right to decide how to build and maintain their OS; they are the ones who created it, not the device owners. If device owners don't like something about it, they have the option not to purchase it. Apple does not have the option of refusing to sell a phone to some dumb user who will install viruses and then complain about Apple's lack of security.


> Except that when a device gets hijacked by malware, the user always blames the manufacturer of the OS

In other words, a bad swimmer blames the maker of his swimming trunks for his bad swimming skills. I understand.


It would be ridiculous for a bad swimmer to blame the creator of his or her swimwear.

On the other hand, a bad swimmer might be pretty upset with the creators of his or her boat, if hypothetically the goal is to stay out of the water.

It's just possible that it depends on what you expect from the device when you buy it.


Except that for many users, they would GREATLY PREFER to have the RIGHT to acknowledge the risks entailed, and use the device how THEY WANT TO USE IT. They purchased a physical device, and they deserve to be able to use it how they want. This is just another example of why the DMCA needs to be dismantled.


I've been spending some time trying to understand this today, and I don't understand why such users don't just go buy a device they can root and take full control that way. Why blame a vendor that doesn't ship what you want? Why not just... buy what you do want?

Separately: The DMCA needs to be dismantled, absolutely.


There's no good alternative, except maybe the MiPhone in China. Android is pretty horrible and saddled with all kinds of Google integration.


Well, I agree with you there (in general I mean, I don't know the miPhone). I would love to see a truly open source handheld computer. I've thought about picking up a Firefox OS phone. I just don't think it's Apple's job to create it. That's not what they do.


> Except that for many users, they would GREATLY PREFER to have the RIGHT to acknowledge the risks entailed, and use the device how THEY WANT TO USE IT.

Let's face it: These people (rightly) buy Android and Nexus devices.


I am not sure I follow how you arrive at the conclusion that "many users" would prefer this. What platform are those users on? It appears that many users (certainly a financially measurable majority) have indicated their preferences already.


No, a bad swimmer will blame a coach who did not supervise his swimming and let him drown. That's the logic of non-tech-skilled users. Remember how everybody was blaming Apple when the celebrities' photos leaked; they did not blame the celebrities for using the same easy password for all their online accounts...


Some of those people think nothing of hiring physical security costing five or even six figures every year. They didn't also consider hiring someone to spend an hour a month administering their personal devices? I notice that I am confused...


Let me have a switch and require me to do some trivial but scary-sounding procedure to enable the functionality so that idiots don't run across it - flash the firmware or something. Void the warranty when I do so to discourage the cautious, with appropriate warnings to turn back before proceeding.

If you protect the naive from the consequence of their actions forever, then you end up with a world where those who are not naive cannot do anything because of all the 'safety' that they have to fight through first.


The naive outnumber the skilled by so very many... perhaps the skilled need different devices and different operating systems?


This used to be Apple's key offering. They sold systems which were suitable for the skilled and the naive alike. It was wonderful because it meant that the naive could smoothly transition to being skilled, and the skilled could have all the comforts and ease of something suitable for the naive.

Sadly, Apple seems to have given up on this and is now catering exclusively to the lowest common denominator. There's plenty of power left in their stuff, but it's all left over from better days, and is slowly slipping away.


It was Apple's offering only during OS X, and only because of its BSD underpinnings. If Apple had chosen to build OS X on a homegrown kernel, Apple wouldn't have been appealing to the skilled.


I disagree that it was only during OS X. Classic Mac OS, and going back further the Apple IIs, were great for power users.

I also don't understand your counterfactual. Apple didn't build OS X from scratch, they took NeXT's OS and ported it to Mac hardware and tweaked it. The result was something both powerful and easy to use. Making UNIX easy to use is not a trivial accomplishment! Yes, if Apple had done something completely different for their next-generation OS then maybe they would have ended up with something worse. But they didn't.


Not sure why you're being downvoted; the only thing that convinced me to use a Mac (for a while) was the BSD base on some nice hardware. Without it, it would be like trying to develop on iOS: slow, locked down and flashy.


I wasn't sure how to answer this yesterday. I never experienced Apple's "better days" as I've been developing in Windows for the last 20 years. Apple's current days, well, they seem pretty good for my needs, but I wasn't paying attention during the times you're talking about.

I suppose when I need real control I would look to, say, OpenBSD instead, which I prefer to use remotely from a very pleasant and very predictable - but not super-powerful - Apple machine. Our needs, I'm sure, are not the same.


I like having an intuitive, easy-to-use WIMP system for my common tasks. I don't want to read my e-mail in a terminal window, or use some X11 chat app with a programmer-designed UI.

I also like being able to get a terminal window to fiddle with the guts when I want it. Some tasks are better suited for that interface, and some powerful tools are only available there.

During those better days, Apple fully embraced the UNIX nature of their OS, without compromising the usability of their GUI layer. They got Mac OS X certified by The Open Group. They started opensource.apple.com and distributed enough of the OS's guts that you could actually get a full Darwin OS up and running entirely from source, even if the result wasn't super useful. They gave out a full suite of developer tools free of charge that you could use to build powerful apps for the system, and were in fact the exact same tools that Apple used.

Then iOS came. No more open source, except the absolute bare minimum they're required to distribute for open source licenses. (Their open source archive for iOS 9.0 contains a whole six packages, of which five are various parts of WebKit.) No more visibility into anything. Everything runs in a sandbox that you can't bypass by any means except finding a security vulnerability. The developer tools are still free, but if you want to actually use them to ship anything, or even run anything on real hardware locally, you have to pay and agree to their terms. (The "run anything on real hardware locally" bit has changed with Xcode 7, which is what f.lux was briefly taking advantage of, but this is new as of just a couple of months ago.) Apple suddenly wants to play gatekeeper to everything; if you want to ship apps to your customers, you have to first let Apple review and approve it, and they'll reject you if you don't follow their rules.

The Mac retains much of what was great about it before, but it's slowly drifting the same way.


As far as free developer tools, I again don't know what is different from before. I hear you, but haven't experienced the nirvana you describe: I've been on Windows, where development tools have very often been free to play with and require payment to use them for real work.

The way I look at it is that Apple's gatekeeper role is appropriately minimal on OS X and appropriately strict on iOS, and for my part I trust the twain will never meet because of exactly that "general purpose computer" difference, but you have added some thought-provoking perspective.


Before, the process for shipping an app looked like:

1. Obtain Xcode, either as a free download, or from your OS install media. (Apple just shipped Xcode along with the OS for a while.)

2. Develop your app.

3. Distribute your app directly to customers.

With iOS, the process for shipping an app looks like:

1. Obtain Xcode as a free download.

2. Start developing app.

3. Realize you need to test on real hardware at some point. Before Xcode 7, testing on real hardware, even your own, cost $99/year. With Xcode 7 you can do it for free, as long as you get an account and have Xcode fetch the certificates for you.

4. Submit your app to Apple for distribution on the store. If you did step 3 without paying, then you need to pay your $99/year here.

5. Wait a week or so for Apple to review your app. With luck, they'll approve it. If they find something they don't like, they'll reject it and you get to go back and repeat this process until you appease them.

With the Mac today, you have a few choices. You can go through the App Store, where the process looks much like the iOS process, except you don't have to jump through any hoops to test on your own hardware. You can go through Developer ID, where the process looks like the old Mac way, except you pay $99/year to have a signing certificate. Or you can keep doing things the old way, at no cost, but most of your users will get scary, scary warnings when they try to run your software.

The thing is that it's not about the tools being paid, but the distribution being paid and restricted. I can completely understand paying for developer tools. Until they went crazy, I was happy to pay JetBrains for some of their tools, for example. Your Windows tools required payment to use them for real work, and that's fine. But if you didn't want to, you could have used mingw or something like that, and done everything for free and without restrictions. This option is simply not available on iOS, and is being slowly ratcheted down on the Mac.


To my mind, iOS security is where it needs to be for that device. I respect your opinion, but we're not going to agree on that. I've considered your argument and I don't see anything that changes the fundamental "so don't buy an iOS device, they do not have a monopoly and there are other fish in the sea". For my part, I'll keep buying them because I follow a similar philosophy that security changes are preeminent and should be allowed to break and constrain other software.

Thing is, if I believed the Mac was going to be ratcheted down to iOS levels of paternalistic control, I would be upset too. My conclusion is it'll never happen. I don't think they'll ever break their general purpose computer. If they did, you'd see a massive exodus of former Mac users looking for a new UNIX desktop. Maybe a big enough exodus to finally make The Year of the UNIX desktop happen. :)


That is the common response to my complaints.

The problem is this: what part of the process I described has anything to do with security? As far as I can tell, nothing. Apple's review process is pretty superficial and is entirely geared towards stuff like making sure nobody ships a browser that doesn't use WebKit, or an app that posts information about drone strikes. Getting something malicious past the gatekeepers is completely trivial. It's building the malicious stuff in the first place that's the hard part.


If you believe that arbitrary browser code running arbitrary website code doesn't weaken security, then I don't know what to say except I disagree.


Weaken it more than having everybody use the same proprietary build of WebKit? No, I don't think it does.


Maybe. It's not clear to me how many people would go to the bother of unlocking their devices and then complain to Apple when something went wrong if there was some trivial roadblock.

I'd also, on an ideological note, be inclined to be cautious about separating the consumers and the producers any more than they already are. One of the big promises of computers is that you can automate so much - granted a lot of people are cut off from that power - and I'd want to be going in the direction of making that power more accessible to people rather than less. If I'd had to, for example, buy a specialised programming computer with a programming OS when I was young... I'd probably not be a programmer today. That sort of thing seems like it would be very expensive.

But you may very well turn out to be right - maybe most people can't be allowed to have nice things. :/ Depressing, if true, but not entirely implausible.


Unless you are trying to argue that iOS is replacing desktop operating systems, I don't see why the presence of a locked-down device is a threat to a general purpose computer. I find general purpose computers are not expensive at all and I don't particularly expect that to change.


The switch could simply be downloading the development tools. Then you know that you're in development mode, and that things could break.


[deleted]


Outlawed? Just because one vendor sells a non-general-purpose computer, you feel like general-purpose computers have been outlawed? I don't understand.


Unfortunately, the world we live in suggests that yes, you can do what you want with the device. However, the OS is a licensed piece of software and can only be used within its license terms.

It sucks, but desktop OS vendors technically claim a similar right. Anyway, I have f.lux on my phone now, so I'm happy.


> Side-loading pre-built binaries like that is a huge risk for users

That's quite an odd thing to say, isn't it? You're basically endorsing the idea that you shouldn't be allowed to run binary code of your choice on your own computer.

It's interesting to imagine the outrage if this policy was enforced by literally any other company on literally any other device or platform.

> and never had a chance of being tolerated by Apple

That's probably true.

> Such abuse

Can we perhaps find a term other than "abuse"? It seems quite inappropriate in this context. Spamming is abuse. Running non-Apple-approved code on your own personal device is basically the opposite of abuse.


> That's quite an odd thing to say, isn't it? You're basically endorsing the idea that you shouldn't be allowed to run binary code of your choice on your own computer.

Quite a few people have been saying for a very long time that distributing proprietary software without users being able to see the source is a Bad Thing. It's not your freedom they object to; they feel they are actually championing your freedom.

You or I may pragmatically decide to embrace opaque proprietary software, but certainly there is a reasonable school of thought opposed to the notion. It's much like arguing that we should be free to buy food that doesn't list the ingredients on the side, and drugs that aren't tested.

There are reasonable arguments to be made on both sides of that issue.


You are actually conflating two separate issues:

1. Whether users should be able to run any software (binary or not, proprietary or not) which they happen to have access to, on hardware that they own

and

2. Whether it is right for people to distribute, to users, software which is useful for a purpose, and therefore attractive, but where the users cannot change or even inspect the software itself (and cannot hire anyone else to do these things for them).

The issue at hand was the first one, and you answered by talking about the second one.

I believe the usual Free Software Movement arguments go like this:

For the first question, it is obvious that users should have a right to use the things they own however they wish – otherwise it can hardly be called ownership. For the manufacturer to give such restricted hardware in exchange for money, but still retain this degree of control over the hardware after the transaction, should therefore not be allowed to be referred to as “selling”, since the recipients/possessors of such hardware cannot usefully be considered the full “owners”.

For the second question, while it is obvious that it is unethical to tempt users to give up their freedom to change and inspect the software, or have it changed or inspected (since this checking and tinkering is one of the things which drives a good society), it is certainly not obvious that it should be out-and-out illegal for anyone to offer such unchangeable and uninspectable software to users.


I'm deliberately entangling them, because outside of an abstract conversation, the behaviour of humans and the resulting economics of the marketplace are entangled, as I'm sure you agree. Freedom for humans creates incentives for manufacturers to exploit that freedom.

If you give humans the right to buy anything they want, sooner or later someone will make a health drink containing Radium.

https://en.wikipedia.org/wiki/Radithor

At that point, you either go full libertarian and say, "we're all adults here, caveat emptor", or you regulate the marketplace and make it illegal for people to do business in Radium drinks.

I'm not saying anyone should be prohibited from buying/leasing/licensing proprietary, opaque software. But I do think there are reasonable arguments for such a prohibition.

So what I would say back to you is "Do not conflate saying there are reasonable arguments for X with saying X should be the case."


r/conflating/confusing


No, I did mean “conflating”.

Lazare was talking about the first issue, and braythwayt responded as if the answer to the second issue was the answer to the first one, implicitly conflating the two issues.


> distributing proprietary software without users being able to see the source is a Bad Thing.

That's a completely separate issue. Apple cares not a jot whether you can see the source; they care that someone was distributing iOS apps outside the garden.


> That's quite an odd thing to say, isn't it?

Isn't that how many viruses and pieces of malware get around? The user is tricked into installing something, and then it goes bad?

Isn't that how most every exploit and virus seen on Android work?

Is that how the problem apps found in the jailbreak app stores for the iPhone work?

Isn't this exact restriction why the worst we've seen on iOS (despite MASSIVE deployment numbers) is social engineering attacks?

I know many people hate the restrictions, but the App Store on iOS has a pretty amazing track record for security (in regard to problem code). Basically the worst we've seen is apps abusing system APIs to get more information than they should and those have been closed relatively fast. No worms, no viruses.


> Isn't that how most every exploit and virus seen on Android work?

Actually, no. Stagefright was about exploiting holes in the parsing of malformed media files (no installation required). Tons of other vulnerabilities were of the form "thing that shouldn't be accessible via JavaScript in the browser/WebView is".

Not that there isn't malware that spreads via installation, but Google's "Verify Apps" service (for sideloaded apps) has been quite effective:

http://googleonlinesecurity.blogspot.com/2015/04/android-sec...


You're not wrong, but don't those same arguments apply to every other device and platform?

Obviously if you prevented people from running non-Microsoft approved binary code on Windows, we'd see a staggering reduction in the amount of malware activity. Is that something you'd endorse?


I consider desktops different than phones. My phone is an appliance for me. I would be pissed if they tried to do it to OS X.

I wrote that as a counter to the 'Windows / OS X does it without issue' line of argument. There are issues. They may be relatively minor (OS X) or very bad (Windows XP), but there is no free lunch.


The drama! You're describing the Windows application model of the past 30 years.

Seriously, this thread is making Stallman look like a reasonable majority candidate for president. It seems all it took for people to forget what caused PCs to blossom is a little bit of gold coating.


>You're describing the Windows application model of the past 30 years.

Considering how many viruses, lost work, malware and hair-tearing that model has induced, it makes sense to go beyond it, doesn't it?


The day I lose the ability to load my own binaries on to devices I own is the day I stop using them.

If you don't have root, you don't really own the device.


Why do you need to own devices? I'm perfectly happy with Apple being my phone's sysadmin, the way my company's IT department is the sysadmin of my workstation (or, more relevantly, the way Google is the sysadmin for ChromeOS devices.)

Modern devices are basically converging toward being enhanced VT100 terminals connected to some multitenant mainframe somewhere (a.k.a. a "cloud.") Whether that's Apple's cloud, Google's cloud, Microsoft's cloud, Canonical's cloud, etc. You could get the same effect (if a little slower) by just having the device be a dumb framebuffer connected to a VM running in said cloud.


The comparison with ChromeOS is flawed because you can go into Dev mode[1] on ChromeOS and do whatever the heck you want in a Linux userland. There's even a project called Crouton[2] that allows you to install traditional Linux apps side-by-side on your Chromebook. You can also build ChromiumOS[3], the open-source version of ChromeOS, and make it do whatever the heck you want.

I don't know about you but I want my sysadmin to work for me so that when I tell it to shut up and get out of my way, it does exactly that.

[1] https://www.chromium.org/chromium-os/poking-around-your-chro... [2] https://github.com/dnschneid/crouton [3] https://www.chromium.org/chromium-os/developer-guide


> I don't know about you but I want my sysadmin to work for me so that when I tell it to shut up and get out of my way, it does exactly that.

I do actually disagree there! And this is perhaps the fundamental disagreement we have. I want my sysadmin to be a capital-E Engineer and choose their ethics over my desires. Don't let me (the capitalist) tweak the bridge I'm paying for into one that falls down; and similarly, don't let me tweak the computer I'm paying for into one that gets malware, joins a botnet and DDoSes people.

You pay your Engineers, basically, to provide you the service of "knowing what's best for you"; to be your domain-specific nanny, making sure you don't do something you'll regret out of ignorance.


>You pay your Engineers, basically, to provide you the service of "knowing what's best for you"; to be your domain-specific nanny, making sure you don't do something you'll regret out of ignorance.

And what if it's THEIR ignorance (e.g. of market opportunities) that prevents them from doing what you asked them to?

It's not like all sysadmin issues can be judged by pure technical reasons, without business needs taken into account.

You want engineers/sysadmins who discuss with you, warn you when you propose something they think is bad, but ultimately work FOR you and do what you tell them to. They should never override you to make sure you "don't do something you'll regret out of ignorance". It's your company, after all.


I'm ~100ms away from most cloud services on a good day, and I've seen ~4000ms in country areas via 3G. Far too many companies are assuming large bandwidth/low latency connections. I'd like my SD card back, Google.

If I've paid the full retail price for a device (often more, I'm in Australia), I expect to own, not rent a device.


> I've seen ~4000ms in country areas via 3G

Hell, my municipal (bargain-basement) phone provider here in Canada gives me ~4000ms latency at the best of times. On the other hand, my city is saturated in "free for users of my ISP" wi-fi hotspots that my phone can automatically jump onto.

It seems like the latter are going to be the true solution to low-latency mobile connectivity for most of the more "spread out" countries that can't afford to saturate the country in cell towers.


Does your motherboard's BIOS allow flashing custom firmware? How about the firmware for the flash controller on a USB stick? If that's your standard, virtually nobody has owned a device in the past 15 years.


I was more talking about kernels and userland binaries - if I couldn't disable UEFI SecureBoot or load my own keys I wouldn't use it.

But yes, if you want to completely trust your hardware you're probably going to be using an old Thinkpad X200 with coreboot - shame about that Intel Microcode though, eh?


Mine does -- I have an X200 I freed myself using a beaglebone.

But you are right: no-one has properly owned their own devices since the 80s because no-one can verify them using primary sources.


No, really no. Those were mild inconveniences, dwarfed by the benefits that freedom offered.


Windows had security problems due to tons of design decisions that went sour.

And current Windows is quite safe. A few secure coding lessons will do wonders for a company :)


No.


>You're describing the Windows application model of the past 30 years.

Windows makes you sign binaries with your own certificate before running them to bypass the signed code protection?


They do with PowerShell scripts (although there's a way to turn it off).


Amen. As a person whose experience as an engineer and admin came almost entirely from Free Software, the fact that so many people who should know better defend Apple's model is pretty pathetic.


The problem is that the world is now networked extensively and much more of our 'life' is online. Creating malware, and the financial benefits of doing so, are much better understood. State actors have invested billions (if not hundreds of billions by now) in cyber attacks. I think it makes perfect sense for Apple to police anyone trying to provide unsigned binaries. They protect the security of their iOS devices in a way that no other ecosystem owner does.


Their devices? I thought people paid for those things.


Ah, that makes a lot of sense. I thought they were distributing the source for you to compile yourself.


> I thought they were distributing the source for you to compile yourself.

This is what they should have done. Instead, they chose to distribute a binary of their proprietary (patent pending [0]), closed source [1] product.

Not only am I not surprised that Apple shut them down the very next day, but I'm also having trouble feeling any sympathy for their cause.

[0] https://justgetflux.com/ - bottom of page

[1] https://justgetflux.com/news/pages/eula/


Developers distributed binaries before app stores. They could publish a hash for their binary, on an HTTPS web page.
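
To make that concrete, here is a rough sketch in Swift (CryptoKit; the file name and the published digest are placeholders, not real values): the developer publishes the SHA-256 of the binary on an HTTPS page, and the user compares it against what they actually downloaded.

    import CryptoKit
    import Foundation

    // Compare a downloaded binary's SHA-256 against the digest the
    // developer published over HTTPS. File name and digest are made up.
    let publishedHex = "expected-sha256-hex-from-the-https-page"
    let binary = try! Data(contentsOf: URL(fileURLWithPath: "flux.ipa"))
    let localHex = SHA256.hash(data: binary)
        .map { String(format: "%02x", $0) }
        .joined()
    print(localHex == publishedHex ? "hash matches" : "hash mismatch, do not install")

Of course this only authenticates the bits, not their behavior: it tells you that you got the same binary as everyone else, nothing more.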


True, but I can understand Apple's position.

Commercial software is unlikely to be a trojan. You have much higher value as a paying customer than as a victim.

Open source software is similarly unlikely to be a trojan, especially if compiled from source.

Closed-source freeware is the perfect attack vector. Opaque and widely disseminated. Who knows what's going on inside? Freeware is responsible for a gigantic amount of malware.


If the source of the freeware is authenticated by HTTPS, users can make their own informed decisions, e.g. to enter a risk pool with millions of other users whose health has been improved, courtesy of a known developer with no track record of shipping malware.

There are no zero-risk activities, only risk signals. Some people choose to trust a corporation with lawyers and endless pages of EULA, some people choose to trust one authenticated developer with a proven track record of service to the community. At the moment, we are still free to make our own informed choices.


I am talking about closed-source freeware only. f.lux is closed source. Nobody can verify whether it does what it says. Apple clearly does not want to set a precedent.

Open source is fine. If f.lux published the source code, then we could compile it ourselves, and Apple would be happy (they recently made the iOS Dev Program free of charge to allow sideloading in this manner.)


Millions of people have used f.lux on other platforms. Hundreds of hours of real-world testing per user, multiplied by millions of users, leads to expectations about the behavior of f.lux on iOS. This is also known as reputation, credibility and brand recognition. For those who need what f.lux offers, this empirical data means more than Apple's walled-garden theories and non-liability EULAs.


There is a gigantic amount of closed-source freeware available on the App Store. Literally hundreds of thousands of apps. Roughly zero of which have had any security scrutiny. Apple's review process only checks user-visible behavior. (Edit: and a quick check for private API use, which is not user-visible, but nor is it relevant to security.)

If iOS's app sandbox is inadequate, then those are all threats too, but Apple doesn't seem to mind them. If the sandbox is adequate, then apps like f.lux are safe even if intended to be malicious.

Apple's position is only really understandable if they don't understand their own technology. Which considering who is probably making these decisions could very well be true.


> This is what they should have done. Instead, they chose to distribute a binary of their proprietary (with patent claim [0]), closed source product.

Apple does the same with their own products.


"the same" what?

Sending binaries you have to sign yourself? I doubt that somewhat; they'll be signed by Apple.

For FOSS they use: http://www.opensource.apple.com


Well, they would sign the binaries themselves if that worked... but you need to sign it yourself to get an iOS device to let you install it.


How is that relevant?


> patent claim

Not seeing anything about patents at that link.


Wikipedia says [1]: "License: Proprietary, with free download. (with patent claim)"

[1] https://en.wikipedia.org/wiki/F.lux


O...k? The linked page doesn't mention patents, Wikipedia doesn't identify what patent, and Wikipedia isn't exactly authoritative anyway.

And also, what in the world does "Proprietary, with free download. (with patent claim)" mean??


Proprietary means not open-standard/open-source.

I remember this coming up on tech news sites a few years ago when I discovered f.lux... this article also makes mention of the patent-pending nature of the software: http://readwrite.com/2013/09/23/flux-a-hack-for-your-devices...


Go to justgetflux.com and CTRL+F for the word "patent".


That makes zero sense. Millions of people have installed f.lux binaries on more sane platforms without problems.


>Such abuse puts the new free-tier Xcode dev program in jeopardy.

Or it would force Apple to adapt, similar to how they responded to the black market for developer program spots (used to access iOS betas) by creating a public beta program.


Or they just made a public beta program of their own free will -- and didn't adapt at all.


What is the huge risk for users in sideloading pre-built binaries?

If you're worried about malware, sideloaded apps still run in the same sandbox that app store apps run in. Apple's review process does not meaningfully impact security, so the sandbox is all that stands between you and malware either way.


At a minimum, Apple's review process includes a scan to ensure your app submission doesn't include private API usage.

Like private APIs that can screw with color output, outside of the app. And so on. The security comes from

1. Defining a set of APIs that you're allowed to use

2. Ensuring that you only use those APIs

Sideloading apps skips that important security check.


Step 2 doesn't happen. The private API scan is superficial and basically only catches accidental use (for example, maybe somebody discovered the API in a StackOverflow post and didn't realize it was private). It doesn't catch use that people deliberately want to get through. Getting through the check is as easy as some string obfuscation and a dynamic symbol lookup.
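
To illustrate (a sketch only; the symbol name here is invented, not a real private API), in Swift the whole evasion is a few lines:

    import Darwin

    // Build the symbol name from pieces at runtime, so a static string
    // scan of the binary never sees it whole; dlsym then resolves it
    // dynamically. "_privateSetGamma" is a hypothetical name.
    let name = ["_private", "SetGamma"].joined()
    typealias Fn = @convention(c) () -> Void
    if let sym = dlsym(dlopen(nil, RTLD_NOW), name) {
        unsafeBitCast(sym, to: Fn.self)()
    }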

If a given private API is a security problem (and I agree that altering the screen's gamma settings system-wide may well qualify) then that private API needs to be actually secured, by taking it out of process and gating it on a sandbox entitlement.
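
For the entitlement half, a rough sketch of the service side (the entitlement key is invented, and for simplicity this process checks its own entitlement; a real out-of-process service would build the SecTask from the caller's audit token):

    import Security

    // Gate the privileged operation on an entitlement, not on whether
    // the caller happened to know a symbol name. "com.example.set-gamma"
    // is a made-up key for illustration (macOS SecTask API).
    if let task = SecTaskCreateFromSelf(nil) {
        let entitled = SecTaskCopyValueForEntitlement(
            task, "com.example.set-gamma" as CFString, nil) as? Bool ?? false
        print(entitled ? "gamma change allowed" : "request refused")
    }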

The private API check is done to allow Apple the flexibility to change those APIs without potentially breaking a ton of apps and pissing off their users. It does nothing whatsoever for security, unless your threat model is programmers capable of writing malware but incapable of obfuscating function calls.


What a sad, pathetic day when you have to consider loading your own code on a general computing device a risk and a privilege.


It's not your own code, it's a binary. With your Apple ID, you can always run your own code.


This is not a programming issue, it is a health issue.

I understand the challenges in the way the policies are set up, but again, this is a health issue that needs to be addressed one way or another.



