If the US government gives itself the right to install backdoors / exploit vulnerable software (as opposed to notifying companies about vulnerabilities) then I feel pretty uncomfortable about ex-government hackers just becoming freelance mercenaries using knowledge they may have gleaned from those ops once they move onto their next gig.
I can't think of a great solution to this problem.
> I can't think of a great solution to this problem.
There's really only one "final solution" to the problem in the purely technical realm: make provable security (in the theorem-proving sense) a non-negotiable requirement for all digital logic (both hardware and software) running on networked devices. I don't know if there's even a workable definition that would rigorously describe the goal of such an effort.
... But I believe that if provable security were important enough to everyone (just like "winning the war" in the 1940s or "getting to the moon" in the 1960s), we might possibly achieve it -- at least below the OS syscall level in a few major OSs and in several important userland libraries.
However, that ignores the human element of security, which can never be completely solved by technical means. People will always be vulnerable to social engineering, for example.
I think your solution needs to extend to the hardware components on the board.
High-security MCUs go to great lengths to defeat side-channel attacks on the package (some really neat stuff too, like failing permanently if the die is exposed by shaving).
There are secure bus initiatives but they don't extend to the BOM (bill of materials) for all the components.
On top of that, GUI techniques for obscuring physical input (keyboards, UI touches) are needed.
Given Apple's posturing and patch-release cadence, I think they're on the side of privacy. Android too. We're on the right track; I wonder if tech will eventually win the arms race for exploits like this? (The rubber-hose exploit will always work...)
If something can be created to be provably secure, then it can be an argument for government legislating a back door.
"You said it's provably secure. Now you can give us provably secure access too, without hurting your customers' privacy or security, because they're protected by the 4th Amendment."
I don't think this can be solved by technology; if you get right down to it, it comes down to the politics of freedom. And it looks like you're going to have to have that fight anyway.
The goal of provably secure computing would merely be to (hopefully) extend the mathematical certainties of cryptography to computers and software. The politics of cryptography wouldn't change; they would only be broadened. Intentional back doors would still be possible, and the ramifications of building them would be just as dire.
So the best provable security could do would be to eliminate security holes like buffer overflow/etc. Trust issues (and even side-channel attacks) would still be present as always.
Well I did say "in the theorem-proving sense", meaning that the code undergoes formal verification. There are programming languages for which each function is a theorem that is proved at compile time. That's what I meant.
There are some low-level libraries that have already been partially converted to theorem-proved functions for the sake of security.
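To make "each function is a theorem" concrete, here's a minimal toy sketch in Lean 4 (the function name and values are just illustrative): an out-of-bounds index becomes a type error the compiler rejects, rather than a buffer overflow discovered at runtime.

```lean
-- `List.get` takes a `Fin xs.length`: an index bundled with a
-- proof that it is in bounds. An out-of-bounds read can't even
-- be *expressed*, so this whole class of bug can't compile.
def third (xs : List Nat) (h : 2 < xs.length) : Nat :=
  xs.get ⟨2, h⟩

-- The caller must supply the bounds proof; `by decide` computes it.
#eval third [10, 20, 30, 40] (by decide)  -- 30
```

(HACL*, a formally verified crypto library whose code ships in Firefox, is one real-world example of the theorem-proved low-level libraries mentioned above.)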
> I can't think of a great solution to this problem.
The government agencies in question hold the "NOBUS" belief: that "NObody But US" will ever have access to the keys to the secrets.
This article is just one of many good examples showing that it doesn't work.
What could work are systems that are secure without any exceptions. That's hard to achieve when powerful influences (most often directly or indirectly tax-funded, even if not explicitly government organizations) do all they can to prevent it. The goal of nobody having access to a really secure system is then easier to achieve than it appears.
"In September 2013, The New York Times reported that internal NSA memos leaked by Edward Snowden indicated that the NSA had worked during the standardization process to eventually become the sole editor of the Dual_EC_DRBG standard,[7] and concluded that the Dual_EC_DRBG standard did indeed contain a backdoor for the NSA.[8] As response, NIST stated that "NIST would not deliberately weaken a cryptographic standard."[9] According to the New York Times story, the NSA spends $250 million per year to insert backdoors in software and hardware as part of the Bullrun program.[10]"
You can make it illegal for ex-NSA employees to use their knowledge of exploits learned while on the NSA payroll. It may well already be the case for all I know.
I hope with all my heart this is treated as the treason it is and not a "plausibly deniable" part of this recent policy of sucking up to brutal Arabian dictatorships regardless of atrocity.
Sometimes you have to disincentivize behavior with prison time and things like that and then hope people don't do it. Trying to prevent some crimes ahead of time is a recipe for dystopia.
In this case, "trying to prevent some crimes [of government employees leaking the golden key]" possibly means "don't make a golden key that lets governments freely hack everyone", which is generally being regarded in this thread as the non-dystopian result.
Sounds like maybe they should get a warrant and gain legitimate access on an individual basis rather than being allowed to hack everything. You don't need to hack me if I let you in; it should be just as illegal as it is for them to poke around in my house without a warrant.
The US is Dr. Frankenstein, except they didn't learn their lesson from the first monster they unwittingly released into the world and continue to pump them out.
And that's a novel idea when they are on the campaign trail... until they start getting daily national security briefings and learn about the attempted attacks supposedly foiled by good SIGINT. No one wants to be the president who turns that firehose off and "causes the next 9/11". I believe that is what happened with Obama.
Or how he shut down the drone gaming unit, and his constant efforts towards peace in the Middle East by refusing to destabilize countries, etc. Great man.
> But then again many physicists were also convinced Nazi officers.
If the Germans had won the war, we'd probably celebrate those officers :/ All the torture and killing would be spun as a "necessary evil" (if it even came to light), and further investigations would be blocked by the government. How we perceive the past is... complicated.