Hacker News

I think you may be right, but if there is a conflict, it's narrower than you say.

> You agree not to, or to enable others to, copy (except as expressly permitted under this Agreement), decompile, reverse engineer, disassemble, attempt to derive the source code of, modify, decrypt, or create derivative works of the Apple Software, Apple-issued certificates or any services provided by the Apple Software or otherwise provided hereunder,

> "Apple Software" means the iOS SDK, the iOS, the Provisioning Profiles and any other software that Apple provides to You under the Program, including any Updates thereto that may be provided to You by Apple.

Those terms don't apply to your own app. Of course, this is the agreement developers have to sign; let's look at what users have to sign, the iTunes Terms and Conditions:

http://www.apple.com/legal/internet-services/itunes/us/terms...

The relevant clauses are:

> You agree not to violate, circumvent, reverse-engineer, decompile, disassemble, or otherwise tamper with any of the security technology related to such Usage Rules for any reason—or to attempt or assist another person to do so.

Again, it only applies to "security technology", not to the app itself.

There is also a default license for apps themselves that has similar language, but it has an exception if "that App Store Product is covered by a valid end user license agreement between you and the Application Provider of that App Store Product".

However, all apps are distributed encrypted and any attempt to decrypt them would presumably be considered circumventing "security technology". And the LGPL requires that the terms allow for:

> modification of the work for the customer's own use and reverse engineering for debugging such modifications.

The latter part is potentially broader than reverse engineering the library itself, and is probably the biggest problem. Also, in the former, "the work" refers to the entire linked binary, which per the above is distributed encrypted. But I (IANAL) think there may be a valid argument that if the app submitter separately distributes unencrypted copies of the binary, these terms are satisfied. The "security technology" in question does not actually include any code linked into the binary - the only transformation applied to the work is encryption - so the unencrypted and encrypted versions could be considered two different representations of the same work, rather than the latter being a derivative work of the former. Thus it would still be possible to modify and reverse engineer the work by referring to the unencrypted version.

But that still doesn't permit reverse engineering iOS itself to debug such modifications, so that's still a problem. Maybe if the library were sufficiently isolated from the system that reverse engineering the OS could not reasonably be considered necessary...

You are incorrect about needing to give users the right to deploy modified versions. An App Store submitter can satisfy either 6a or 6b: 6a requires distributing object or source code sufficient to relink the app, but says nothing about installing the relinked version; 6b merely requires that the app "will operate properly with a modified version of the library, if the user installs one", which is different from providing a means for the user to install one. (Anyway, that may often be accomplished by jailbreaking.)

edit: Not that any of this matters much. Chances are Apple will be conservative and remove an app from the store if there is any ambiguity as to valid licensing - if someone asks. Meanwhile, right now nobody is asking, and some LGPLv2 apps and libraries are in the store. The exact terms only matter if it gets to court, and because of that conservatism, it won't.



