Wouldn't it be better if we adopt the position that unbreakable crypto is a basic human right? Every time we hear about government demands for backdoors, we techies, cryptographers, thought leaders, and journalists respond that backdoors are a security risk and can't be implemented safely. I feel this invites endless counter-arguments and new proposals from the authoritarian parts of government. We've been playing this game forever -- as far back as the Clipper Chip in the early 1990s -- and it never seems to end.
Maybe we should take the high road: the time has come to treat unbreakable crypto as a basic human right, the same as free speech, freedom of thought, or freedom of religion. In this digital age, many of those other rights can't be guaranteed without strong crypto. Whenever any proposal to backdoor, weaken, disable, or otherwise prevent encrypted communication comes up, our reply should be, "no, you can't do that because crypto is a human right, the same as free speech, and is necessary to protect other fundamental rights".
I get where you're coming from, but I think that argument needs some work. Currently, none of those rights you mention that are enumerated in the Bill of Rights are guaranteed through absolute means; they are preserved via checks & balances among branches of the government. And even if they were more absolute, there would still be asterisks as things stand today:
- You have freedom of speech/thought unless it was the motivation behind a criminal action, in which case you're additionally punished for a hate crime
- You have freedom to keep & bear arms except not in some states if your Sheriff doesn't like you
- You have the right to privacy unless there is probable cause to believe you're involved in criminal activity, in which case the court can issue a search warrant
- etc. etc.
Even if you tried to argue that "crypto is a basic right," that wouldn't change anyone's mind who is arguing that there should be an asterisk if you're under criminal investigation.
> Currently, none of those rights you mention that are enumerated in the Bill of Rights are guaranteed through absolute means; they are preserved via checks & balances among branches of the government.
The idea that private use of encryption is any different than that is the recent propaganda.
If the police have a warrant then they can place a bug in your house or trojan your device, which gives them access to everything starting on the day they execute the warrant.
Which is the same as it is for anything else. A warrant isn't a time machine. You can't get one today and use it to read files the accused shredded five years ago. Nor does the government have the right to make everyone submit copies of all their paper files to them to put in a locked government room for hypothetical future access with a warrant. But that is what they want to do with encryption.
What encryption actually prevents is warrantless mass surveillance of content, which is exactly the sort of thing all the other rights categorically protect people from the government doing.
>If the police have a warrant then they can place a bug in your house or trojan your device
If they trojan your device, they can certainly view all of your communications and files from the past unless you've explicitly securely deleted them beforehand.
Would it be fair to paraphrase what you're saying like this: the U.S. Bill of Rights is not 100% absolute; therefore, people/courts/whatever are not going to accept a right to unbreakable crypto that's even stronger than those in the Bill of Rights?
I don't think that crypto needs to be a stronger right than the Bill of Rights (confining myself to U.S. laws for the purpose of this discussion). What I'm saying is that whenever government proposals come up about restricting or backdooring crypto, we techies (and other thought leaders) should take the strong position that this is a right and then--maybe--have discussions about warrants, etc., afterward. What we've been doing for over 20 years is arguing that crypto is necessary for good security and backdooring is bad for security. That's all true, but it's a technology argument. It would be much better if we start off with the premise that unbreakable crypto is a right, that we're allowed to use it, that we don't need permission.
You're presuming critics of these rights have long ago won any moral and legal battle over the righteousness of these rights in absolute terms.
However, this is not true.
To give an example, even as we enter an age of extreme political correctness and government authority--interpreted as intended to protect people from mean words--you can still find in mainstream media people arguing for the opposite.
In other words, unbreakable crypto may be a worthy goal to aim for, even if we know, just as with other human rights, that there will be slip-ups and setbacks.
I wouldn't say that the right to avoid cruel and unusual punishment should be thrown out just because the military set up Guantanamo and other black sites around the world. It means we need to fight even harder, not give up on our ideals.
Unbreakable crypto is a natural phenomenon, like gravity, or the hardness of steel. Like gravity, you can access unbreakable crypto anywhere. You can't give the police the power to defy it. Making laws against it is like saying you can't put up a fence or grow a hedge higher than a policeman can jump over.
Rights are, in part, a recognition of the limits of the law. It makes sense to recognize the ability to hide, and potentially destroy, information in a way the law cannot reach -- the law can no more access it than policemen and prosecutors can levitate themselves.
As a believer in Jesus, I agree. Within the larger atheistic framework of the world, who or what are these authorities of which you speak? There are no moral absolutes in such a world. Heck, look at the Bill of Rights. It's predicated on there being a god that has made it manifest in creation that humans are a) special, and b) have intrinsic rights that the Bill of Rights simply enumerates.
The privacy & crypto policy fights are a replay of the election integrity fight. We democracy-defending noobs (in the USA) walked right into the trap of trying to rebut the vendor's appeal to authority and the torrent of bullshit they spewed.
Big Tobacco easily found scientists eager to support their mendacity, for a price. Of course the FBI, NSA, etc. have experts for hire peddling their freedom destroying agenda.
In contrast, the German activists successfully argued that private voting and public counting is a basic human right.
In addition to being The Correct Answer (morally, ethically, legally, logically, etc.), such an affirmative position also has the benefit of being easier to communicate.
Not sure what the better argument is -- just saying that first -- but "specialist" mentioned "higher moral authorities" than the Bill of Rights. Since the US uses these higher authorities to sell patriotism, I get why someone might think it's a good idea to fold cryptography in with that. Yet I can't imagine how that would be done. In any case, it is important how this battle is won - its justification will shape our future battles. If someone can be called anti-American or "not a patriot" if they bring up legit critique (Chomsky) or if they are an atheist or a Muslim---then it's a sign that observing basic human rights is exactly NOT what a government (the US especially) wants to do, no matter HOW they "sell" authoritarianism to the public. So far, in order to be elected president here, it seems one needs to at least pretend to be Christian. We need to move to arguments that are implicit in our engagement with new ideas. It seems that the basic right to cryptography must be philosophically designed into systems as a feature of their existence, unapologetically. For example: blockchain. Let the idea of cryptography be nationless -- a feature of globalization. In that way, you tie it to something those in power hold dear and it becomes difficult for them to hold one view without admitting the other. That said, we have been experiencing a troubling complete breakdown of all reason and consistency in these philosophies and govt. policies. Example: the increase in arrests of black people in possession of small amounts of marijuana, especially in cities where possession has been decriminalized. (The Intercept, yesterday)
We shouldn't be defining what rights we have and leave the other "rights" to the government. It should be the other way around.
The government should not be able to take freedom from the people at large. Peaceful use of something shouldn't be abridged just to further the goals of the government. It is the government which should serve the people, not the other way around.
The Bill of Rights (the first 10 amendments to the Constitution) does not define the rights we have as citizens, but rather imposes restrictions on the Federal government. Even the 10th says as much:
"The powers not delegated to the United States by the Constitution, nor prohibited by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or the people."
But the Federal government will argue what "unreasonable searches" means (IV) or the citizens will argue what "a well regulated Militia" means (II).
The post I was responding to wanted it enumerated as a human right. As you say, our rights are reserved to us and we should be enumerating the powers of the government.
I wish people would see this for what it is. The head of the FBI shouldn't be lobbying for laws. He should be working within the boundaries of the law. He shouldn't be lobbying to take something away from many because a few may abuse it and it would make his job more difficult.
> Wouldn't it be better if we adopt the position that unbreakable crypto is a basic human right?
Yes, but you're asking a set of people who have not given us much evidence they're capable or willing to implement that human right. It takes a strong will and a stiff spine to stand up to powerful interests and I'm not convinced the people I see reported on in mainstream so-called journalism have that political will. Hiding behind technological inferiority seems to take precedence over standing for our rights on principle.
Freedom of speech, which you mentioned, also takes a beating these days whenever someone completely and accurately conveys a message some people don't agree with in a timely manner. We can't have freedom of speech because that might involve giving up conveniences, they argue.
Nobody besides a few elite security people even knows how legit crypto products really are. It's all a rumor mill for us lay programmers. Crypto is cool and all but human right? Not sure what that has to do with anything. Crypto is a tool for the good guys and the bad guys, it's not some absolutist concept.
> but human right? Not sure what that has to do with anything.
If the government is pressuring organizations into backdooring their crypto, wouldn't it be relevant whether using effective crypto is considered a human right? Not sure why you think it's so out in left field.
This isn't facetious. We do not yet know which of Impagliazzo's Five Worlds [0] we inhabit. Currently, the best guess is that we live in Cryptomania, but we haven't proven it. Since we would have to live in Cryptomania for your proposal to carry any reasonable weight, we should probably postpone any talk of adding to the list of human rights.
On the other hand, the ability to multiply numbers, while perhaps not a right, is something that the government should probably not bother trying to criminalize.
I don't see why that's an issue. People have a basic human right to life even though it's theoretically impossible to ensure that outcome. The OP's argument is about reframing the conversation to avoid pointless debate with people who might have the right desire to help people, but don't understand why their fantasies about how to achieve it aren't reasonable. Reframing the debate makes it easy to focus on proper measures.
One-time pads are unbreakable, so unbreakable cryptography exists. Impagliazzo's Five Worlds are only about public key cryptography. If public key cryptography didn't exist, we'd still be arguing about private key cryptography.
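For readers who haven't seen it spelled out, here's a minimal sketch of why the one-time pad is information-theoretically unbreakable (a toy illustration in Python; the function names and message are made up):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # The pad must be truly random, as long as the message, and never reused.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
# Without the key, every plaintext of the same length is equally consistent
# with ct, so there is no computational assumption to attack -- which is why
# the Five Worlds question (about complexity assumptions) doesn't apply here.
```

The catch, of course, is key distribution: you need a pre-shared pad as long as all the traffic you'll ever send, which is exactly the problem public-key cryptography was invented to avoid.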
I think you're reacting to the wrong issue -- a minor and perhaps unwise choice of words: substitute "strong" for "unbreakable" and the point remains -- aren't we better off by standing up for what ought to be a human right -- strong encryption -- instead of knuckling under to implementing backdoored encryption that governments have long wanted cryptographers to implement?
>Wouldn't it be better if we adopt the position that unbreakable crypto is a basic human right?
Is the ability to pay to have anyone in the world you don't like murdered, without it getting back to you, a basic human right?
If so, and people start getting murdered for hire and law enforcement has no way to stop it, would you be okay with opening, in certain cases, a previously undisclosed back door -- for example, as part of tracing payments on the dark net?
What if cryptography were normally secure and uses of the backdoor only happened like this:
>The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Would you be okay with the back door existing if it were used under such a court-ordered scenario? Why did I just quote this text, in your opinion? Why does this text not end with a period after the word violated? Then it would read:
>The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.
Don't you agree that as a "basic human right" this version appears much stronger than the first version I quoted?
So, would you prefer if the second, abbreviated version replaced the first version?
You've still got the word "unreasonable" to contend with.
This is, I think, the place where the metaphor shear between technical limits and legal limits actually happens.
Legally, we treat reasonableness as the salient divider. To a first approximation, everyone agrees that some searches and seizures are reasonable and some are not, and we've developed a well-defined social technology (warrants, judges, courts, etc.) to distinguish the two cases.
But technically the dividing line is possession of the backdoor. Third parties without a backdoor cannot perform a search, no matter how reasonable it may be, and third parties with a backdoor can perform even the most unreasonable search. There is no mechanical way to introduce a hole for reasonable cases but not unreasonable ones, no matter how well we formalize the decision making process.
Software does kinda work like a form of law, but one without recourse to human judgment. There's no judge, no jury, no prosecutorial discretion. It appears that, without a bit of hands on experience with technology, legally minded folks have a hard time really internalizing how constraining that fact is on what's actually possible.
If there is a way of bypassing technology then it should be assumed it will be bypassed, most particularly by those whom that technology's presence would be used to defend against.
Golden keys will be stolen or otherwise recovered (eventually).
Only by making the cost of that effort more than any possible reward will security be maintained. Just the same as physically defending a physical location.
"Would you be okay with the back door existing if it were used under such a court-ordered scenario?"
No, because it would not stay that way. The problem is, that backdoor would get out, and all of the bad guys would know how to use it, and be able to use it. Now I have no protection anymore.
Most of the time, when we're arguing against back doors, it's not because we're worried that the government is going to collect evidence to use against us in court. It's because we don't want our payment transfers to be intercepted by bad guys. We don't want our computers hacked.
"The problem, according to Wray, is that law enforcement is stymied by phone encryption, which is now widespread."
Not American, but isn't criminality in the U.S. like at about lowest point in history? Is there some actual evidence that the lack of ability to do mass surveillance is somehow making criminal investigations less effective?
There's an intimidation factor behind the thought that the government can read any communication you make, if you give them cause, even if you aren't doing anything wrong.
>Nixon targeting blacks, hippies, and the anti-war left
According to Dan Baum, after Ehrlichman died, years after Ehrlichman supposedly told him. Furthermore, even if Ehrlichman did tell him this, Ehrlichman was convicted of perjury, conspiracy, and obstruction of justice.
Hearsay, from a convicted liar, retold by a biased individual with no apparent proof other than his own word.
You already destroyed your credibility with the Ehrlichman point. If you try to mislead people, you lose trust.
I don't need to read the aclu document. I'm not arguing with you. I'm pointing out that the moment you reference Ehrlichman in any way, you will lose all your credibility again. You do damage to your own causes when you try to mislead people like this.
It wasn't an attempt to mislead. You're reading way too much intent into this. I conceded your point, there's no need to linger and dig it in when it doesn't advance the conversation.
> Not American, but isn't criminality in the U.S. like at about lowest point in history?
It's generally been declining since the trailing edge of the Baby Boomers got out of the prime criminality age range.
But the demographically (and possibly also lead-exposure) driven “crime wave” was a boon to political support for law enforcement and authoritarian politicians, so both have learned that selling the idea of rampant crime is a good way to get what they want politically.
> It's generally been declining since the trailing edge of the Baby Boomers got out of the prime criminality age range.
Not sure if you are making a claim, but I don't really think this is generational other than a coincidence in technological timing - the time you are talking about is the 90's-2000s, which is when improved surveillance techniques, information sharing databases, and other deterrents became more possible and widely used. At that time most of the people on the ground would be Gen-Xers (think 21 Jump Street, Colors, New Jack City, Boyz n the Hood, etc., US-wise).
Also, timing-wise, the Cold War ended and so along with it went much of the foreign funny money on all sides for subversive activities, and police spending went up big time in most areas.
> Not sure if you are making a claim, but I don't really think this is generational other than a coincidence in technological timing - the time you are talking about is the 90's-2000s, which is when improved surveillance techniques, information sharing databases, and other deterrents became more possible and widely used.
As you say, most of that became available in the (mid-to-late) 1990s and into the 2000s; the crime peak happened in the early 1990s (violent crime specifically in 1991).
There's something of a demographic case to be made (not so much generational as the fact that the Baby Boom was a boom, and a big demographic bulge going through that age range has an impact), there's a plausible, and arguably stronger, lead exposure case to be made (the decline, and the preceding rise, closely tracks the drop off, and preceding rise, in use of leaded gasoline.)
The variety of intended policy mitigations adopted well after the peak certainly didn't cause the drop, and probably had no-to-minimal impact.
Especially since most comparable nations displayed the same drop in crime over the same period, without the out-of-control enforcement increases we suffered in the USA.
I would hesitate to attribute the decline to Big Brother when there is already a widely accepted hypothesis explaining both the increase and the decline.
tl;dr: Lead exposure has been shown to stunt intellectual development and increase aggression. With the 50s came the personal automobile, powered by leaded gasoline. Crime rose as that generation grew to adulthood. In the 70s use of lead in gasoline declined due to the clean air act. As that generation grew to adulthood crime rates fell. In the 90s the use of lead was banned. That generation is reaching adulthood now.
Lead pollution peaked in 1973, violent crime peaked in 1993.
So, we're back to what, the mid 1990s? Before anyone had a phone that stored any information at all? They should be grateful for the few decades of easy work. Now it's back to being real detectives.
Not exactly. We used to have to keep many more paper records, which are really hard to manually encrypt/decrypt. Those were easily available to law enforcement.
The fact that paper records make many things more difficult meant that many fewer records were kept. For example, no one had a paper record of your exact location (or rather that of a flattish oblong chunk of plastic and metal that was always in your pocket), all day long every day of the year. No one had a record of which article in the newspaper you read first, how long you spent reading it, which article you read next, etc. No one had an audio recording of everything uttered in earshot of a stupid-looking speaker that sits in the kitchen of every bored housewife in the nation.
All of those records exist now, and LEOs should have access to precisely none of them.
I'm pretty sure people leak a lot more data into the "cloud", through their SMS messages, emails, apps, etc., than they used to keep in their desk drawer. On average I presume that there is more for law enforcement to go on these days than before technology got so pervasive.
And at least if they search your desk you know about it. The idea that people can go behind your back and look at your info is disconcerting.
Pretty much this. Law enforcement is trying to get their monetary share from crime (or "crime" depending upon how you look at it), and anything they can do to make that easier they want.
> Not American, but isn't criminality in the U.S. like at about lowest point in history?
Sort of. We're approaching 1960s level of crime, but it was lower in the early 60s and 50s. I don't know if we tracked it before then. Crime rates, including homicide rates, are about half of what they were 20 years ago, but about 20% higher than they were in the mid 50s.
> Is there some actual evidence that the lack of ability to do mass surveillance is somehow making criminal investigations less effective?
No. Police investigations still tend to use things called search warrants and interrogation of witnesses and suspects to acquire incriminating evidence. Mass surveillance is very expensive to do effectively.
I have family involved in law enforcement and they are able to request access to stingrays from state or federal police, but their use is fairly limited and still requires a search warrant in many cases. I don't believe it's gone to SCotUS yet, but many appellate courts have ruled that you've got to have one.
But there is reason to believe pre-1980 statistics had a huge problem of under-reporting more or less everywhere, especially in countries as big (and relatively desolate, at the time) as the US. Even just collating reliable statistics is a challenge today, let alone in the '50s.
Nowadays we almost have the opposite problem, with cellphones and vague laws resulting in increased reporting of entire classes of crimes. So having a declining trend despite these developments is either a huge win or proof that statistics can be tweaked at will (as The Wire taught us, and we all know The Wire is all true... /s)
I agree the older crime statistics are a little suspect, but it's all we really have to go on. And, yeah, the US was a much bigger and much newer nation back then. The Apache Wars and other associated wars against the native Americans didn't wrap up until 1924! Crime statistics just weren't that important.
> as The Wire taught us, and we all know The Wire is all true... /s
I could never get in to The Wire.
The very first scene of the very first episode is a courtroom where a woman changes her testimony on the stand, and the prosecutor doesn't: a) show the woman her sworn deposition, b) request an immediate continuance (surprise testimony is one of the things that warrants one), and c) threaten and then carry out charges of perjury against the woman for refusing to testify according to her statement. Instead, the prosecutor keeps asking questions over and over, driving into the jury's mind the woman's changed testimony. I'm not a lawyer, but I feel like I know something of courtroom procedure. This was just so unbelievable that it made me angry that the writers expected me to believe it. It would be like someone hacking a System/360 Model 30 from their iPhone in 10 seconds.
I understand it was for dramatic purposes to lead to the other witness's murder, but it broke my suspension of disbelief so badly I had a hard time watching the rest of the episode. I don't remember if I ever watched the second episode.
Sorry to hear, but please do yourself a favour and give it a second chance! As my sarcasm indicates, way too many people think It's All Exactly Like That (which of course it isn't, as you just demonstrated), but it's a really good series that makes some very interesting points on the fabric of modern society.
> Mass surveillance is very expensive to do effectively.
Not if done right, according to William Binney, who worked on designing the systems that eventually (d)evolved into what Snowden leaked information about.
That was re: (actual) national security - but the same mechanisms are at play: if you can record what you need to achieve a stated goal for $250M, why do that when you can spend N billion on a less efficient system?
Compared to manual policing, automated surveillance is a really cheap way to run a police state.
Now, if the goal is to minimise crime, as opposed to maximising the catching of criminals (which really should be a means to an end, not a goal in itself) - better welfare, mental health services, and after-school activities are probably the way to go...
What other kinds of crime are there due to which we're willing to give up our right to privacy? Several, perhaps, but remember that this conversation is happening while we're discussing a way in which too much law enforcement has significant downsides to a free and happy society. We wouldn't let the government monitor everyone constantly in order to catch shoplifters.
I suspect that the FBI antipathy to encryption, if it's not fundamentally irrational, is actually about white collar crimes: trade secrets theft, money laundering, insider trading, tax evasion... Violent crime leaves abundant physical traces. White collar crimes may sometimes be un-prosecutable or even undetectable without access to electronic records.
They know that "outlaw secure encryption to stop inside traders" is going to be an even tougher sell than "outlaw secure encryption to stop murderers," so they try to make an illogical argument involving scary criminals instead of a stronger argument with not-scary criminals.
My first guess, when I posed the original question, was organized crime. People who do organized crime would be the ones to benefit the most from encrypted phones. But anecdotally, it seems to me that organized crime is down in the U.S. too. I might be wrong, that's why I asked for evidence.
The white collar crime, yeah, some forms of it might benefit from encrypted communication. But again, where is the evidence? Is the tax evasion (non-institutionalized) skyrocketing?
The whole claim is just very suspicious. I think there is so much open data today that even with encrypted phones it's much harder to avoid leaving a trace than it was in the past.
I too suspect that the FBI's antipathy to encryption is disproportionate to how much it actually hinders law enforcement. But if there were some real reason for the FBI to be this hostile to encryption, I don't think it can be about violent crimes. It has to be more subtle crimes that don't leave physical evidence.
This would be more plausible if federal prosecutors ever prosecuted "white collar" crime in general, rather than only in particular situations to defend the commercial interests of big donors. To see this, look at how many insider trading investigations are targeted at randoms who occasionally learn some privileged information, compared to those targeted at senior executives who bathe in it all day long. The ratio of bitcoin convicts to great recession convicts is still infinite.
Well, people tend to still report other crimes, like property crime, and police still investigate them. Sure, reporting rates aren't perfect but you can still draw meaningful conclusions from trends. I'm not going to do the leg work for you, but almost all types of crime are declining in the US and have been for decades.
There's no data I've seen that suggest anything other than a broad decades long drop in crime.
If “attempting to access a computer system without authorisation” counts, a former acquaintance reported over 24k such attempts per day. Against just his private server.
While I count all attacks from a single process to be “one attack” and “one crime”, others may not.
Then there’s “unlawful pornography”, which varies wildly by jurisdiction — when a friend pointed me at fetlife, one of my thoughts was that the entire site is illegal to view in the UK unless you disable images (I’m no lawyer though).
There are probably a lot of other things describable as “conspiracy to X”, but that’s a guess on my part.
(I think there may be a lot of victimless crimes too, but that’s a different topic).
There's white collar crime. There's several non-violent variations on theft. Loan sharking isn't violent per se (though usually comes with an unhealthy side order of violent crime).
What evidence do you have to suggest the existence of this happening at any scale?
The problem with large scale conspiracies is that they require a lot of people, and eventually someone slips up. This sort of crime is the bread and butter of the SEC, and they aren't making any claims about it increasing.
I have never met anyone who understands what cryptography is, and supports the FBI. As far as I can tell, the only reason they have any support at all is because they confuse people and muddy the waters.
Edit: What I meant by "supports the FBI" is "supports the FBI on this issue". If you understand encryption and support the FBI in a general sense, but think they are wrong on this issue, then you aren't a counterpoint.
> I have never met anyone who understands what cryptography is, and supports the FBI.
I understand cryptography and I broadly support the activities of the FBI. You have now “met” such a person.
It’s fully possible to simultaneously support opposing concepts like end-to-end encryption for users and the organizational imperatives of the FBI. A nuanced perspective might consider that citizens and the FBI have separate prerogatives and competing incentives with respect to private encryption, and this is perfectly fine.
A consequence of an organization optimized to reduce crime is its natural zeal for greater control over things that grant criminals freedom. Secure cryptography represents thermodynamically incontrovertible freedom of communication, which is intrinsically antagonistic to control. In the abstract, any organization striving to (in effect) reduce the rights of criminals will produce friction with the rights of citizens in general. But that doesn’t make the organization’s goals incoherent or evil, it means there must be a system of checks and balances. The ideal goal is one of compromise - we want to reduce net criminality, but we want to increase privacy.
We have that compromise because open, end-to-end encryption (and private companies willing to implement it securely) is available to us. But the existence of an arms race between parties does not indicate that one party’s overarching goals shouldn’t be supported. One party desires greater control, the other desires greater freedom. Both can coexist despite their competing incentives.
> In the abstract, any organization striving to (in effect) reduce the rights of criminals will produce friction with the rights of citizens in general.
Criminals are still citizens with rights, or else Due Process has no meaning. Striving to reduce the rights of any citizen is not a defensible goal for a US governmental organization. Rights should apply to all, or they're not really rights. The government steps in when rights are violated.
> But that doesn’t make the organization’s goals incoherent or evil, it means there must be a system of checks and balances.
It does make the organization's goals anti-constitutional.
> The ideal goal is one of compromise - we want to reduce net criminality, but we want to increase privacy.
No. No all the way. We don't and can't "reduce net criminality" directly. That drifts into implications of policing pre-crime, for which everybody in law enforcement wants surveillance as a magic bullet. Beyond "just" constitutional concepts, I think a lot of people would classify pre-crime surveillance, engagement, and enforcement as actual full blown Evil.
The penal system of a free and civilized country should police actual wrongdoings; and should foster positive educational, cultural, and environmental shifts away from crime. Not destroy freedoms in the name of "protecting" them (when in reality their goal is increasing internal & external political metrics), as the ghastly hypocritical current systems do.
This isn't some new balance to be discovered. It is a clear limitation of powers to prevent violations of rights that already represents a balance.
When they're asking for some ability to perform searches and collect evidence, that's not necessarily counter to the US constitution. Quite the opposite in fact.
> Amendment IV
>
> The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
You seem to be arguing against a strawman created by a limitation of our current technology. We don't have a way to give them limited access. Right now, we can give them everything, or we can give them nothing. But they're not asking for everything. So we should find a way to allow truly (technologically) limited access, and to make sure that those limits are tight.
I'm specifically arguing against the generalizations the parent was making in particular. "Striving to reduce rights" of US citizens is generally anti-constitutional as a concept, especially for non-lawmaking organizations.
The 4th amendment doesn't really affect any of the original crypto argument, because the search cannot be performed. Cryptography prevents the "this is the way that's acceptable" act from even starting. Note that the amendment simply says that a warrant is allowable in a bounded implementation; it doesn't demand that warrants must be executable.
For instance, if a warrant is issued to search a non-existent building it's just as moot. It doesn't require the building to be created if it's not there.
I feel that I must point out that while most criminals remain citizens, this is not necessarily the case. In fact criminals (not suspects but convicted criminals) are not even guaranteed freedom from slavery as the thirteenth amendment specifically excepts criminals from its protections. There are actually a number of rights which are diminished by a criminal conviction.
I'm not saying it's right but it's the way it is right now.
This line of reasoning is disturbing to me. I don't want to live in a world where there's effectively a traffic camera pointed at the contents of my phone. Bandying apologetics regarding how limited it's used and how badly it's needed won't change my mind.
Would you consent to having a microphone embedded in your driver's license that the FBI could "only turn on if they really needed it"?
When the people with power get to define the meaning of the constraints on their power, their power is without constraints. It disgusts me when companies and individuals are press ganged into being enforcement arms of the government. In my opinion, it takes a certain blindness to history or a level of sociopathy to be fine with this.
> I don't want to live in a world where there's effectively a traffic camera pointed at the contents of my phone. [...]
Would you consent to having a microphone embedded in your driver's license that the FBI could "only turn on if they really needed it"?
Not the GP, but I didn't read anything in dsacco's post that implies any of these. I understood it as saying that it's possible to understand the technology, and even have a strong pro-encryption and pro-privacy stance, without believing that the FBI as an institution is fundamentally corrupt or motivated by malice. There is an inherent tension between the FBI's mission, citizens' rights, and the nature of encryption, and it looks like that tension might never be fully resolved.
> When the people with power get to define the meaning of the constraints on their power, their power is without constraints.
I fully agree, and I'm as convinced as you are of people's inability to restrain themselves once they have unchecked power. "Just trust us" always ends in disaster.
> It disgusts me when companies and individuals are press ganged into being enforcement arms of the government. In my opinion, it takes a certain blindness to history or a level of sociopathy to be fine with this.
This is the part I have trouble with; as I see it, there are more ways to characterize one's position on this issue than simply 'for or against', 'principled vs. sellout', or 'clear-eyed vs. blind'.
> Not the GP, but I didn't read anything in dsacco's post that implies any of these. I understood it as saying that it's possible to understand the technology, and even have a strong pro-encryption and pro-privacy stance, without believing that the FBI as an institution is fundamentally corrupt or motivated by malice. There is an inherent tension between the FBI's mission, citizens' rights, and the nature of encryption, and it looks like that tension might never be fully resolved.
My opposition to the FBI's stance on this is independent of whether or not the FBI is corrupt or motivated by malice. The FBI, being made up of mortals, is an ever changing organization. Giving a philosopher king autocratic powers is giving that power to the tyrant that comes later. The tension between those with power and those surrendering power is older than the FBI. I agree it will never be resolved.
> This is the part I have trouble with; as I see it, there are more ways to characterize one's position on this issue than simply 'for or against', 'principled vs. sellout', or 'clear-eyed vs. blind'.
It's a bit incendiary, I'll give you that. I feel pretty strongly about this and let some of that leak out.
That's an interesting position. I wonder if the opposite is also true: that the more principled and morally righteous the organization, the less it attracts such people and the more it attracts more morally upstanding and intelligent people who like the challenge.
Your hypothetical may be moving on the wrong axis. I think a more appropriate counterexample is whether or not positions to which no one surrenders any power attract tyrants. Does a tyrant wish to be a kindergarten teacher?
Not looking for a counterexample, just wondering if the human psychology involved has a dual that can be exploited to ensure high quality, incorruptible investigative agencies.
> I understood it as saying that it's possible to understand the technology, and even have a strong pro-encryption and pro-privacy stance, without believing that the FBI as an institution is fundamentally corrupt or motivated by malice. There is an inherent tension between the FBI's mission, citizens' rights, and the nature of encryption, and it looks like that tension might never be fully resolved.
Yes, thank you. This explains my thinking more succinctly and clearly than I did :)
You're misunderstanding the comment you're replying to. It concludes that "there must be a system of checks and balances". Are you saying that there shouldn't be?
I think in this case a slippery slope counterargument is not obviously wrong. According to the UN, a person's correspondence is sacred and each and every one has a universal right not to have the privacy of their correspondence breached.
Why are we now debating the privacy of encrypted correspondence from a baseline of no privacy for unencrypted correspondence?
What says we will not continue to accept more and more, as long as it is only a small step from status quo?
> What says we will not continue to accept more and more, as long as it is only a small step from status quo?
The Overton Window [1] says this is exactly what will happen.
There's a great video [2] about how this applies to Trump (which is where I first heard about it) and gives a good example of what "shifting the Overton Window" looks like.
A person's papers and correspondence is similarly sacred under the US Constitution.
The problem is that digital paper isn't paper, and while you can be charged with a felony for bypassing a computer's access controls, the law doesn't respect those controls at all. Your electronic documents have to be in your home or in your direct control to be protected.
According to the UN, a person's correspondence is sacred and each and every one has a universal right not to have the privacy of their correspondence breached.
The Fourth Amendment says the same thing. It has been under relentless attack from the Supreme Court, but one senses a change in the weather...
Well said. I'll credit you with elevating this discussion out of the reductionism it started with.
That said, I think if we are to approach the question of the FBI analytically/intellectually we might want to start by asking a few questions.
1. How should we measure the efficacy of the FBI? - Net lives saved / dollar spent? Is it a conflict of interest if we let the FBI self-report lives saved?
2. How do we account for the risks of the agency "going rogue" and not following the rule of law? For example, how can we quantify the risk to civil rights caused by the FBI-King suicide letter [1]?
3. When the FBI presents an argument that encryption is a problem, how can we measure whether it's a trade-off we [the people] want to make [safety vs privacy] unless we're presented with a proposal that articulates what credible threats may be prevented (how many lives saved, program cost)?
4. If the majority of citizens do not want to forgo their privacy for additional safety, should the FBI be beholden to respect the will of The People who hire it (via taxes)?
All human beings should be allowed to record their thoughts and communicate without those thoughts being surveilled. What is so different in 2018 that the ability to think and communicate in secret is somehow a threat to law enforcement? Law enforcement has always had to work hard to crack a case. Now that computers exist, they want to have a comprehensive record of everything that's ever been uttered, retroactively play back anything that the authority deems inappropriate, and make arrests on that basis. It is fascism in a can. There could be no clearer indication that political protections are inadequate to safeguard this than the current president, who has run amok without legal consequence (to date).
I cannot understand how any thoughtful person can support anything more invasive than hard police work and a warrant... any legal case can be cracked with human tools, because no criminal operation has 100% operational security. The need to spy on the thoughts of the innocent is an insane price for any extra crime-fighting ability we get on top of this.
Why does the FBI need to decrypt the phone of a spree shooter?
What could they possibly learn, after the fact?
What could they possibly know that isn't more easily ascertained by other means?
Everything about every person is known in real time. If the NSA, FBI, whomever could not identify and thwart undesired behavior with existing tools (follow the money, network analysis, profiling, sentiment analysis, follow the bullets, etc., etc.)... Well, they'd have to be unforgivably incompetent.
[I believe they do all these obvious things. Which is why we've seen so few terrorist attacks. (If only they cared as much about spree shooters.)]
Further, what enemy of the state is using COTS communication systems? You'd have to be an incredibly stupid drug dealer, human trafficker, or money launderer indeed to conduct business via iPhones.
So if the panopticon doesn't and can't help defeat the bad guys, what's the point? Does the government really need to eavesdrop on the selfie nudes my teenagers are sending their friends?
If supermarket chains can predict pregnancy based on shopping history, maybe one day mass murderers may be predicted by their behaviours as recorded in the contents of their phone?
The protagonists have a limited opportunity to go back in time and attempt to thwart the crime, prevent a catastrophe. Paradoxes! Unintended consequences! Regrets!
I think (some of) what the FBI is asking for lately is actually not that unreasonable -- assuming we could do it right. Chris Wray, like Comey before him, is claiming that they only need a way to get into a few selected devices in "exceptional" circumstances for criminal investigations. Whenever the FBI/DOJ guys argue this point, they also make sure to say nice things about the value of encryption for protecting the public.
Do they really believe the nice things they say? I have no idea, but for now I'm willing to take them at their word for the sake of the debate.
Do they really want only a limited ability to access encrypted data? Again, I don't know.
But suppose we could construct a truly limited mechanism that would give them the ability to recover only a small amount of encrypted data. Then we could call their bluff. If they still want more, they'd have to make the case to the public that they should have the all-seeing power that our community claims they're trying to get.
"And so the public policy debate around encryption has been framed as a binary choice between two absolutist positions: either we allow law enforcement no access at all to encrypted data, or we must effectively give them complete, unrestricted access to all our communications"
Most people support giving some level of data access to law enforcement, but the problem is that there's no way to restrict it to just law enforcement. As soon as we give one person or group access to the data, then we've effectively given everyone access to that data.
Perhaps there was more in the presentation, but this is the only idea I see in the slides:
"Idea: Make brute force key recovery possible but very expensive"
Again, that assumes governments that actually follow the rule of law, but any criminal is going to use stolen credit cards to rent servers in the cloud (or a hacked bot farm) and crack that encryption at no cost to themselves.
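For anyone wondering what "possible but very expensive" key recovery might mean concretely, here is a toy in the spirit of partial-key-escrow proposals. It is purely my own illustration, not something from the slides; the parameters and the hash-based check are made up for the demo:

```python
import secrets
from hashlib import sha256
from itertools import product

WITHHELD_BITS = 16   # toy-sized so the demo runs instantly; a real scheme would
                     # aim for work that costs serious money to redo

def make_escrow_record(key: bytes) -> tuple[bytes, bytes]:
    # Publish all but the last WITHHELD_BITS bits of the session key, plus a
    # hash so a brute-forcer can recognize the right guess.
    known_part = key[:-WITHHELD_BITS // 8]
    check = sha256(key).digest()
    return known_part, check

def brute_force_recover(known_part: bytes, check: bytes) -> bytes:
    # Recovery is possible, but costs ~2**WITHHELD_BITS hash evaluations.
    for guess in product(range(256), repeat=WITHHELD_BITS // 8):
        candidate = known_part + bytes(guess)
        if sha256(candidate).digest() == check:
            return candidate
    raise ValueError("key not found")

key = secrets.token_bytes(16)
known, check = make_escrow_record(key)
assert brute_force_recover(known, check) == key
```

Which also makes the objections concrete: anyone who sees the escrow record can run the same brute force, and the cost of doing so only falls as hardware improves.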
Your talk was excellent, kudos to you for being willing to present a controversial idea to an audience that is instinctively hostile to any intentional weakening of crypto. It was certainly a hot topic at lunch.
My primary concern and others’ was that strong crypto is already out there and can’t ever be taken away. I understand that you’re choosing to hand-wave over this for the sake of bringing a novel idea to the table, but I’d say it’s a fundamental flaw that can only be addressed by not only outlawing strong crypto in every country, but also finding a way to prevent criminals from implementing and using algorithms that have been known for a while now. That seems, to say the least, quite hard.
We're trying to get a productive discussion started, so hopefully we can reduce the risk of ever getting stuck with something truly horrible like mandated backdoors or key escrow.
Re: strong crypto can't be taken away - That's very true. I guess it depends on what threat model Wray is hoping to go after.
At least until a few years ago, the terrorists weren't even using encryption -- they were posting publicly about their upcoming attacks on Facebook! There was this awkward period of a couple of years where, after every attack, the authorities would claim there was encryption preventing them from stopping the attack. Then some days later, we came to find out that the attacker was already known to them and had been communicating in the clear!
So I think if Signal, WhatsApp, Telegram, ..., all moved to some very-very-expensive-but-breakable model, that would cover most of the guys doing knife and gun attacks, truck bombs, etc. in the real world.
> So I think if Signal, WhatsApp, Telegram, ..., all moved to some very-very-expensive-but-breakable model, that would cover most of the guys doing knife and gun attacks, truck bombs, etc. in the real world.
Doubtful. There are enough privacy-obsessed computer scientists that open source, secure alternatives would soon crop up. Then you're criminalizing perfectly legal behaviour (programming) just because it indirectly might help some criminals.
Most people are going to use whatever is easiest and/or most popular. The same forces that make it hard to get all your friends onto Signal would also make it hard for violent people to use something new.
> Most people are going to use whatever is easiest and/or most popular
Because if these measures were taken, it would become common knowledge, and criminals would just switch to what's safer for their coordination. It may cast suspicion on anyone using these services, but that's not really enough.
> Do they really want only a limited ability to access encrypted data? Again, I don't know.
These arguments were all hashed out in the 1990s CALEA discussions. Assertions were made to "Trust Us": only limited use of the automated phone-tapping capability was envisioned.
A few years later the CALEA technical requirements for tapping capacity were published.
If I recall correctly (these are old bits), the capacity for the number of concurrent taps the FBI required for Manhattan was approximately the number of simultaneous off-hook phone lines expected for that size population. In other words, "capture it all".
Yeah it's never going to work if the approach is "trust us" or any kind of "limited use" that requires restraint from the humans supposedly being limited.
We need strong technical limitations, on par with the strength of the crypto we use today.
As with most arguments of this nature it sounds OK-ish until it gets concrete. There is no way to scale the cost of cracking encryption to future computing capability. It's like saying "A terabyte! Consumers will never be able to store a terabyte!"
Widespread cracking of 10 year old encryption would be a security disaster, which is why cryptographers think in terms of astronomical numbers, not earthly scales.
Lastly, why, given their track record, are you willing to take US law enforcement at their word? Do they deserve that consideration? Like any potentially deadly tool, law enforcement has to be approached with caution.
> But suppose we could construct a truly limited mechanism that would give them the ability to recover only a small amount of encrypted data.
Do you believe that this is actually possible? How? I ask because this being doable is the crux of your argument, and every proposal I have seen falls laughably short.
> I have never met anyone who understands what cryptography is, and supports the FBI.
Your reply:
> I think (some of) what the FBI is asking for lately is actually not that unreasonable -- assuming we could do it right. ... suppose we could construct a truly limited mechanism that would give them the ability to recover only a small amount of encrypted data.
Personally, I'm getting a bit tired of hearing "lol nerd harder! It's impossible!" from people who can't even spell IND-CPA. (Not saying that you are one of those. Just that there are a lot of them around.)
If it's really impossible, then somebody needs to show up with a proof of impossibility.
If it's a matter of a backdoor/special key the FBI must keep secret, it will leak, period. Then what?
If it's a matter of having the computing resources to crack a key, then people's computers are too easily compromised to act as bots to solve any such problem. Our computing infrastructure is simply too vulnerable. To avoid this scenario, you'd have to secure every operating system in the world in a way that they could not be compromised, even in principle. That's as close to a proof as should be needed, unless there's a compelling counterargument to suspect some assumption is false. I have yet to hear such an argument.
1. "I'm not aware of it having happened, therefore it has not and will not happen". I won't even bother listing all the fallacies in that position.
2. Dual EC isn't protecting anything of value, and hasn't in about 15 years. If your reply is meant to be pedantic about my phrasing, notice that the context is about encryption that is actually in use.
It's not protecting anything of value, you mean, besides every VPN session generated by the most important vendor of hardware VPNs in the industry prior to 2016?
I don't know about you, but I'm willing to give the computer science professor who gave the presentation at a security conference the benefit of the doubt about understanding encryption.
Moreover he has no response to the question of how to scale the idea of "expensive to break" into even the near future. And then he is aggrieved and asks for "proof of impossibility"; never mind the more fundamental problem with that ask, it's not something his suggestion even comes close to needing. It's trivially insecure. He has done nothing to suggest that the assumption that "we could do it right" holds water. And who is "we"? Even "we" is a tendentious notion.
> how to scale the idea of "expensive to break" into even the near future
This part isn't actually that hard. Suppose your computation costs $1M now, and Moore's Law continues doubling computation per watt every 18 months for the foreseeable future.
Then in 15 years, the same computation will cost 1/1024 as much as it does today. So you're paying about $1000 per message for 15-year-old data.
In 30 years, you'll be paying about $1 per message for 30-year-old data.
Is that good enough? It totally depends. For most messages, it's almost certainly fine, because the value of the message decays over time. For other messages that hold their value, you need to set the initial cost higher. The formula above shows how you could do that to achieve a certain cost a given number of years in the future.
As to the rest of your comment - this is not how we do science (or engineering). If we cried and gave up every time we encountered a hard problem, we'd still be living in caves, hitting each other with rocks.
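Here is the same back-of-the-envelope calculation as code, using the assumption above that the cost of a fixed computation halves every 18 months (an assumption the replies below dispute):

```python
def future_cost(initial_cost_usd: float, years: float,
                halving_period_years: float = 1.5) -> float:
    # Cost of redoing the same brute-force computation `years` from now,
    # if cost-per-computation halves every `halving_period_years`.
    return initial_cost_usd / 2 ** (years / halving_period_years)

def initial_cost_for(target_cost_usd: float, years: float,
                     halving_period_years: float = 1.5) -> float:
    # What the attack must cost today so it still costs `target_cost_usd`
    # after `years` of hardware improvement.
    return target_cost_usd * 2 ** (years / halving_period_years)

print(future_cost(1_000_000, 15))       # ~977: about $1000 after 10 halvings
print(future_cost(1_000_000, 30))       # ~0.95: about $1 after 20 halvings
print(initial_cost_for(1_000_000, 30))  # ~1.05e12: price today for $1M in 30 years
```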
You can't predict the future of computation. Look at the cost curve of Bitcoin mining. You may say you ("you" not "we") will learn from that. You are suggesting it's ok to trust your assessment of encryption strength. Show me a multi-decades track record of your predictions holding up when cryptographers designing secure systems don't have such a track record. Your idea amounts to Russian roulette where the gun gets more bullets over time.
Yes and no. Can we predict it exactly? Of course not. But we can estimate the rough order of magnitude with high confidence. It's not hard to find charts showing transistor density over time, and Moore's Law holds up pretty well since the 1970s.
If anything, the two big changes on the horizon are (1) quantum computers and (2) the end of Moore's Law. (1) is really hard to predict, and it will make all our current techniques vulnerable anyway. When (2) finally happens, it will only make future predictions easier, not harder.
> Look at the cost curve of Bitcoin mining.
Sure. Mining was expensive, so people put in a lot of work to make it more efficient. Now it's more efficient, but the easy optimizations are gone. ASIC miners are something like 20000x more efficient than recent CPUs. Do you really think there's another 10000x hiding in there? 1000x? 100x?
For comparison, AsicBoost was (is?) regarded as this huge deal, and it only yields about a 37% improvement in mining efficiency.
> Show me a multi-decades track record ... when cryptographers designing secure systems don't have such a track record.
I know this is not quite what you asked for, but you don't have to look very far to find an encryption primitive that has "stood the test of time" cryptanalytically. It's called DES, and it's older than most people posting here. The only reason we don't use it now is because it uses tiny little keys that can be efficiently brute-forced.
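To put a rough number on "tiny little keys" (the trial rate below is a made-up illustrative figure, not a measurement):

    # Rough keyspace arithmetic: DES has a 56-bit key, so exhausting it takes at
    # most 2**56 trials. The rate is a hypothetical assumption for illustration.

    DES_KEYS = 2 ** 56
    AES128_KEYS = 2 ** 128

    assumed_rate = 10 ** 12                       # hypothetical: 1e12 DES trials per second
    seconds = DES_KEYS / assumed_rate
    print(f"2^56 keys at 1e12 trials/s: ~{seconds / 86400:.1f} days")        # ~0.8 days

    # The same exhaustive attack against a 128-bit keyspace is larger by a factor of 2^72:
    print(f"AES-128 keyspace is {AES128_KEYS // DES_KEYS:e} times bigger")   # ~4.7e21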
As you mentioned, that is not what I asked, and your answer doesn't address the inherent risk of treading on the edge of breakability in encryption, when even the people who write secure key-length guidelines must keep rethinking what counts as secure.
We previously allowed them access to private records with a system of due process involving judges granting warrants on a case by case basis. Instead of following the intent of the law, or even the letter of the law, they recorded everything using either robo-signed warrants or retroactive warrants when they actually needed to make legal use of the information. It was a framework of reasonable judicial oversight and what we got was totalitarian surveillance.
The problem is that law enforcement, spooks, and the judicial system are all on the same side and see themselves that way. You can want a system of due process, but you won't get it.
Exceptional circumstances are a problem no matter what. Suppose they really were serious about limiting access to exceptional cases:
Why should the police only be able to search a phone for a terrorism investigation? If I or someone who I cared about was murdered, raped, burgled, etc, I would resent being a second class victim, unable to get justice.
Your approach is interesting. Some things on my phone are my personal papers and effects. Other things are just metadata or other incidental information. There's room for grey.
Who decides what's "metadata"? Even with TLS, you can still work out what site someone's visiting, because the hostname is exposed by SNI (and if SNI isn't in use, then it's a single IP per hostname, so you can just check it).
The hostname tells you when someone's visiting a rape counselling forum, an STI website, or a whistleblowing site. It tells you what their political perspective is, what their media landscape looks like.
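As a minimal illustration of the non-SNI half of that point: a passive observer who only sees destination IPs can often map them straight back to hostnames with stock tooling (the hostname below is just an example; the SNI half needs no lookup at all, because the hostname travels in cleartext in the TLS ClientHello):

    import socket

    def what_the_observer_learns(hostname: str) -> None:
        ip = socket.gethostbyname(hostname)       # what the wire actually shows
        try:
            rdns = socket.gethostbyaddr(ip)[0]    # reverse lookup any observer can do
        except socket.herror:
            rdns = "(no PTR record)"
        print(f"{hostname} -> {ip} -> reverse DNS: {rdns}")

    what_the_observer_learns("example.org")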
Who would be allowed? Every FBI agent? Just the chief? Even if you could restrict it - thousands of government related figures in the FBI, NSA, DHS, TSA and elsewhere would want the key.
That entirely misses the point. The whole idea behind the existence of a "key" means that the key can either be mishandled, stolen, or independently discovered, rendering cryptography inherently insecure.
Well, the one idea we have now is to limit it to whoever can authorize spending thousands or millions of dollars in electricity to solve a computational puzzle that recovers the key.
In practice, I'd like to see it require judicial oversight to release the funds.
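A minimal sketch of that "computational puzzle" idea, purely to show where the cost knob sits (the parameter is tiny so the demo finishes quickly; this is an illustration, not a vetted escrow design):

    # Publish the key with its low N bits withheld, plus a hash of the full key.
    # Anyone willing to burn ~2**N trials can recover it; N is the dollar/electricity knob.
    import hashlib, secrets

    MISSING_BITS = 20   # demo-sized; a "millions of dollars" setting would be far larger

    def make_puzzle(key: bytes):
        n = int.from_bytes(key, "big")
        partial = (n >> MISSING_BITS) << MISSING_BITS     # key with low bits zeroed
        check = hashlib.sha256(key).digest()
        return partial.to_bytes(len(key), "big"), check

    def solve_puzzle(partial: bytes, check: bytes, key_len: int) -> bytes:
        base = int.from_bytes(partial, "big")
        for guess in range(1 << MISSING_BITS):            # the "electricity" goes here
            candidate = (base | guess).to_bytes(key_len, "big")
            if hashlib.sha256(candidate).digest() == check:
                return candidate
        raise ValueError("no solution found")

    key = secrets.token_bytes(16)
    partial, check = make_puzzle(key)
    assert solve_puzzle(partial, check, len(key)) == key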
This comment broke the HN guidelines badly. Please don't push discussions into the internet swamp here. If someone is misinformed, you can correct them in a way that teaches the rest of us something, too. You can also just post nothing. But please don't post uncivilly or unsubstantively.
Parent seems to be trying to encourage some real discussion on this topic. For those of us not well versed in crypto, could you elaborate on why this is problematic rather than being dismissive?
Why is this not possible? How could we improve the situation? What are the risks?
You don't have to resort to complicated cryptography: anything the FBI can decrypt, anyone who knows what the FBI knows can also decrypt. It is prima facie impossible to have a decryption algorithm that only works when accompanied by a valid warrant.
So that's it: there's actually no way to allow the FBI to decrypt an iPhone without also assuming hackers eventually will (perhaps after a delay, once a leak or weakness is discovered). You only get to choose "hackable" or "not hackable"; there's no "hackable only by the honorable American government".
Also, it's already very hard to make things "not hackable" even without deliberately introducing a way to hack them.
Pretty much every electronic secret the American government has tried to keep has been stolen: F-35 plans, diplomatic cables, the TSA's special key, SF-86 forms for clandestine employees... everything. It is unreasonable to expect the magic phone unlocker not to suffer a similar fate.
When one has read WaPo and watched the "Sunday conventional wisdom shows" for enough years, one gains amazing powers of self-delusion. For instance, one might be convinced that the litany of leaks you mention, and the thousands of leaks you didn't mention, were all intentional at some level. Maybe it was counterintelligence? Since you mention F-35, you might consider the fact that Lockheed made hundreds of millions of dollars "redesigning" the plane after the Chinese got all the original designs. If you wait to ask until late enough at night, some of these mooks will tell you that Snowden was some sort of 5-D chess maneuver! After one's brain has deteriorated to such an extent, why wouldn't one believe in FBI crypto magic?
Yes 'JumpCrisscross and I are in fairly complete agreement on this issue. The counter-argument I offered has always amused me when I've seen it. I reproduced it in the hope that it would amuse you as well; sorry humor is difficult.
It seems like the discussion would need to be about which information law enforcement wants access to that the public would also find acceptable to be definitely hackable by third parties, since third parties will inevitably figure out how to bypass the encryption. Even if it's just a corrupt FBI agent selling the key on the black market, all it takes is one time.
There might be some subset of things that fit this. Maybe a call record? But would the public be cool with the Russian Mafia definitely being able to get their GPS location and text message contents? Saved passwords? All your emails from the last decade? That's how I see the tradeoff and risk. There is little that the FBI would be interested in that criminals wouldn't be able to make a buck off of.
Suppose some system like this is actually possible and not just fiction. Then, the FBI still doesn't need it, because the legal system already has a way to address this problem. Warrants.
This is the way things have worked since the United States was formed and is written into the Constitution. Do you actually think the justice system should not have the ability to use warrants? Then you need to change the Constitution.
As I said: What if corrupt, evil, law enforcement working with corrupt, evil, judges are able to issue warrants and read whatever they want, whenever they want?
I think the problem I see is in there somewhere. Did I state it poorly? How could I state it better?
The point is they already can. You're proposing to change the status quo in every functioning government on the planet. Since these people are ultimately controlled by the voters, they are ultimately acting to fulfill the will of the voters.
No, they can't, at least that's what they're telling us. Some devices, they haven't been able to get into. That is the status quo. Being able to crack properly done, effectively unbreakable encryption is not the status quo, it's a fantasy.
And voting? You are assuming democracy. Also you're forgetting about the tyranny of the majority. If you aren't even going to read what I'm saying, let's just stop here.
You act like encryption is the default state. It isn't. An officer with a warrant has access to your written records in your home and stored with third parties. Why is access to encrypted data magically different and somehow magically overreach where all previous access wasn't?
>You act like encryption is the default state. It isn't.
It wasn't the default state. Past tense. You're taking a very 1980's view. The world has moved on. Sure, there are some types of data that are not yet protected, but that's a deficiency which should be corrected with time.
>Why is access to encrypted data magically different
Nothing magic about it, but it's different for a couple of reasons.
First, because a smartphone is more like an extension of your brain than it is like a file cabinet.
Second, because in the old paper world you imagine (or pretend?) we still live in, the entities interested in accessing people's private data didn't have the internet, so they had, by necessity, to go to one house at a time to get written records. Now they do have the internet, and they can hoover up everything. And they do. "They" meaning not necessarily good guys. Sometimes there are rogue cops, or haven't you heard? Not to mention black hat hackers.
So, because of this extraordinary change in how data can be accessed in bulk, and the personal nature of devices, private data needs better protection. Not magic. Just different times, with different snooping tools, different data, and different devices. So much has changed, you might as well ask what hasn't changed.
Why does encryption magically make a phone an extension of your brain when before device encryption, it wasn't? What do black hats and rogue cops have to do with warrants? Your arguments are all over the place.
Agreeing with other comments, that seems like a problem with warrants in general, not with some supposed technology. If cops can get a warrant to raid anyone's house then of course they'd be able to get a warrant to raid anyone's phone in this situation. The solution to that issue would be to fix how warrants are granted to make them less prone to abuse. But that seems like an entirely different debate.
You may foolishly think that all law enforcement is perfect... it's hard to state what I think of this level of naiveté and stay within HN rules of polite commenting.
Hi! I have a reasonably solid understanding of cryptography; part of what I do professionally is break cryptosystems. I do not think that the condition where billion dollar companies ship products that deny the people the right to "every man's evidence" is a stable equilibrium. Something will have to give. Since I'm not an anarcho-capitalist, it seems unlikely to me that what's going to give is the state.
That doesn't mean I think the FBI is on the right side of every (or even most) crypto debates. But I understand where they're coming from and foresee debates in the future where they'll be on the right side.
What if it weren't "billion dollar companies" selling these phones, but hobbyists building them out of commodity parts? Where do you draw the line? I can build my own automatic rifle, which has no serial number and appears on no list, using parts purchased legally and a drill press. That being the case, how can "the state" police the function of personal general-purpose computing devices and not tie itself in knots?
You can build yourself such a rifle, or maybe even ten of them, but you can't build and sell a billion.
The points of control are pretty easy to find in a context where the Internet is run by a dozen large companies all headquartered in the United States.
But we're talking about relatively simple algorithms here. Not even iOS prevents you from, say, running crypto written in Javascript (as ill-advised as that is).
They don't want to control AES. I think they want the defaults used by commercial devices to be subject to court order. They can deal with the idea that suspects can arrange to protect their communications; they've been dealing with that for centuries. What they have a hard time with is when all communications everywhere are irrevocably protected without trying.
This seems to admit what many suspect: the purpose of this exercise is not to catch criminals, who as you indicate will have good crypto even when good crypto is criminal. The purpose is to surveil normal citizens, all the time.
I don't think such an admission necessarily follows. It has always been possible for determined criminals to invest enough time and caution to make their communications prohibitively expensive to intercept. The vast majority do not do so. Everyone (criminals included!) has to make decisions about the tradeoffs in time, effort, and convenience they're willing to make for security. Changing the default level of security enabled out of the box on commercial devices changes that calculus. It's possible to simultaneously believe that the FBI should not surveil normal citizens all the time, that criminals do have the ability to secure their communications to an arbitrary degree if they make the effort, and that the default level of security your iPhone provides should be less than "requires nation-state level resources in order to comply with a lawful search warrant."
Let us stipulate that dumb (unsophisticated, foolish, intemperate, whatever) criminals exist. Is it now your argument that LEOs especially need help catching these dumb ones, who would actually be coordinating their criminal activities via cheapest-commercially-available non-burner smartphones with original ROMs? Are they just forgetting entirely the smart criminals who have hired someone to root their equipment? Know your limits, I suppose.
That's nonsense, of course. They'll continue to catch the dumb ones the way they always have: when the criminal screws up and a cop whose shift is still young just happens to be around. They'll continue to catch the smart criminals only very rarely, when someone with enough clout gets pissed off enough. They'll continue to "catch" innocents all day every day, only now they'll have more circumstantial "evidence" of the "this guy googled TVs last month; he was obviously trying to decide which TVs to steal" variety. This is why the Founders wrote the Fourth Amendment.
Sort of like automobile manufacturers, who with all their airbags and crumple zones have defaulted drivers who don't ever think about safety to extremely strong safety. It's really disheartening, to read security professionals hoping to deny security to regular people. I guess I'm somewhat mollified that you've stopped pretending this is about catching criminals.
Oh yeah, what are their names? If these really really really bad people who can only be caught by ignoring 4A actually existed, one or more of them would have been prosecuted in the last decade. This is movie-plot threat analysis. Despite the ironclad logic to the contrary upthread, you cling to the idea that this is about catching criminals. Because you trust what the enforcement-industrial complex tells you about crime and the security of personal computing devices. Why is that? I can't imagine you'd believe what they tell you about crime and ideal incarceration rates...
No, horrifying crimes are not movie plots, sorry. Some of the crimes we're talking about are issues virtually everyone on HN seems to care about (for instance: securities and banking fraud) and are already extremely difficult to prosecute without the kinds of evidence we're discussing.
I won't presume to speak for "everyone on HN", but my personal concern about "securities and banking fraud" is the capriciousness and corruption exhibited by our current prosecution of those. Multiple people are in prison because they sold BTC to someone who sold BTC to someone who used BTC to buy drugs. Meanwhile HSBC, Wachovia, etc. have been caught red-handed laundering billions in drug money, and no one has seen the inside of a jail. Lots of other big banks screwed up badly enough on securitization of subprime loans that their paid-for Treasury Secretary had to bail them out with public funds, and who do the prosecutors go after? Some tiny little Chinese immigrant bank who actually caught and reported a loan officer who tried to write some fraudulent loans. Giving LEOs more power to snoop into every part of citizens' private lives isn't going to make this sorry situation any better.
The phrase "horrifying crimes" makes me think of something violent. Evidence in violent crimes tends to be physical rather than mobile-phone-based.
Of course, you've also already admitted [0] that none of this is about catching criminals.
> Since I'm not an anarcho-capitalist, it seems unlikely to me that what's going to give is the state.
The US already had crypto restrictions for years. Other nations didn't. It meant the US couldn't compete in technical markets that make use of encryption, which today is everything. Even so, crypto was still readily available. I wouldn't bet on the state.
That would be a more compelling argument if US-defined crypto --- for better and often much worse --- didn't practically define commercial cryptography.
Because there are no restrictions. Once those restrictions are put in place, expect the brain drain. After all, these are a lot of careers we're talking about. Assuming we're talking about crypto restrictions of course.
I think you missed the point: the US defined crypto despite the fact that export restrictions were in place. PGP spread across the world, and the laws against it didn't do a darn thing.
I'm not sure I see the relevance. Export restrictions are silly (I shipped a commercial product in the 1990s for which we needed a "real" Canadian version and a broken US version because of them). I don't support export restrictions, and they're not an important component of what the FBI wants. But see below.
I think what the FBI wants is for the default mode of encryption in commercial phones to be subject to court orders. I don't think that's viable either, though I understand why they want it.
You can sum my take on this up as "something's got to give at some point, and it probably won't be the state; also, don't be smug about this, because it's complicated".
The idea that nobody who understands encryption sympathizes with the FBI is silly. That might be moving the goalposts from the original argument --- nobody understands crypto and supports the FBI --- but I'm not sure people have a clear idea of what the FBI is asking for (vs. what the FBI's ambit claim is).
Later:
I misread the previous comment; I get the argument. I don't think the failure of export controls is a good indication of the capabilities of the state; in a sense, they served their purpose well, by ensuring that commercial products were sold in a default configuration that offered trivially breakable crypto.
> I misread the previous comment; I get the argument. I don't think the failure of export controls is a good indication of the capabilities of the state; in a sense, they served their purpose well, by ensuring that commercial products were sold in a default configuration that offered trivially breakable crypto.
Did they? I seem to recall that it was easy for international users to illegally download the "secure" 128-bit version of Netscape, and that most people did. I could be misremembering, though.
Can you elaborate as to what kind of position that would be? I don't see, for example, why an attempt to mandate Clipper chip style key escrow would be a sensible policy.
Is it "not sensible" or is it just something you wouldn't like? If iMessage, WhatsApp, and SMS used escrowed encryption, they'd be in some way fundamentally insecure --- although still safer to use to send your bank account to your accountant than many "secure" encrypted messengers. You might respond by installing Signal (I sure would). They might not care; as long as the default enables investigations, they can probably make most of their cases.
By "mandating escrowed encryption" I mean something like the export controls of the '90s. That is, banning encryption in general without key escrow, or at least something like China's Cybersecurity Law of 2017, which allows the government to demand any product add decryption capabilities, effectively the same thing.
> If iMessage, WhatsApp, and SMS used escrowed encryption, they'd be in some way fundamentally insecure…You might respond by installing Signal (I sure would).
I'm not sure how lawmakers could meaningfully target iMessage and WhatsApp and not Signal with legislation.
Edit: Oh, I see you mentioned default apps shipped with devices. Seems like a nightmare to word that law so that it targets iMessage and WhatsApp, but not, say, HTTPS. (Besides, does WhatsApp even ship with iOS?) I guess it could work in theory, but it seems like a bad idea to me.
Perhaps the notion that governments and their professional law enforcers are entitled to pry under certain conditions that tyrants can redefine at will, will have to give.
Not just talking about the FBI here. The issue is larger than just one country. There are plenty of tyrants out there who should not be given the tools the FBI wants.
I'm not in a bubble. I have met many people who support the FBI's efforts on this issue. Just none of them understood encryption.
Hayden is absolutely a reasonable person, but:
1. He agrees with me on this issue.
2. He makes a pretty strong argument that people who disagree with him (and by extension me) are being unreasonable.
Completely apart from the possibility, more like a fact, that the back door will be breached by non-law-enforcement black hats, there's another serious matter here.
Anyone who thinks that a solution to this might be possible should be prepared to give a good answer to the following question:
How does your system protect the rights of people subject to these searches in cases where law enforcement is evil?
Hint: it doesn't. And that's not a mere trivial inconvenience. It's a fatal flaw that can corrupt economies, bring down governments, wreak havoc on lives, destroy cultures, corrode societies, and crush freedom.
There's a reason the saying "the price of freedom is eternal vigilance" (commonly attributed to Thomas Jefferson) has endured. I don't want to live in a place like today's Venezuela, Syria, China, or North Korea... we should value our rights and our freedoms.
> How does your system protect the rights of people subject to these searches in cases where law enforcement is evil?
And especially: when law enforcement becomes evil or goes rogue. For example, in 1936 a census in the Netherlands had established a list of all residents, including their religious affiliation. When the Nazis marched in a few years later, they had a perfectly validated source of Jews to deport (per https://de.wikipedia.org/wiki/Judenkartei#Niederlande).
While I do not believe that data about religion will be used as a round-them-up-and-kill-them list ever again, there are other things which we allow various forms of law enforcements or secret services to access or to do which may be OK under our current governments but may very well be used against us in the future. There's a reason why undocumented people have been afraid to register or interact with the government - what Trump and his cronies did basically confirmed all their decade-long suspicions.
> This seems to go against centuries of crusade and genocide, but we can always try our best :)
I'm German. I seriously hope for this world that no one will ever again dare to repeat the atrocities that my ancestors have committed. But then, I am afraid, the Holocaust and the two World Wars have drifted so far out of living history that some may at least be tempted to try the latter one again.
> I seriously hope for this world that no one will ever again dare to repeat the atrocities that my ancestors have committed
If you look a bit outside a euro-centric view of things, you can see this happening today in the Middle East. Just because it isn’t white folks killing Jews doesn’t mean it doesn’t count.
'Eternal vigilance' in the face of a significant portion of the population hoping to forcibly enact sharia law on the rest likely takes the form of an ideologically moderate strongman...
An inconvenient fact that USians like to avoid discussing in order to continue simple-mindedly branding people as 'evil'. But if we just fund more rebels, they will certainly magically embrace 1700s-style Western free-market capitalism, because the West is great!
If silicon valley can just nerd a little harder and make it happen, then surely the FBI can just try a little harder and eliminate all crime, and then the problem goes away.
It's too bad the author didn't mention Wyden until the 5th paragraph. The author puts "Senator" in the title, when "Senator Wyden" would be just one more word.
Also, there was zero mention of what state the senator is from. Whatever happened to the standard "(D-OR)" after first mention of a senator in an article?
I suspect more than a few of us in ostrich country are considering joining you. It gets old, living where most voters can't be bothered to learn the issues before voting. Or after.
Portland is becoming something of a tech hub, and you can work remotely elsewhere. I live in Bend, which is far more 'purple' than Portland is, but the weather is much more to my liking (more sun, less rain).
Come check it out, I'd be happy to buy you a beer.
"Wray’s speech undoubtedly spurred frustration in Silicon Valley"
It's an absolutely ridiculous notion that everyone in the Valley is pro-privacy. It's the exact opposite. Many of the security higher-ups in the Valley are complicit and working very closely with law enforcement to figure out a solution.
These companies are trying to figure out how to balance risk: the risk of providing the government a secret way to open up someone's phone while keeping it hidden from everyone else, against the PR fallout of knowingly providing backdoors to supposedly secure personal electronic devices.
It's got nothing to do with some high-and-mighty goal that every person is entitled to privacy (of course the Constitution would disagree, but that's irrelevant).
This is a hilarious facade. If they need access to the phone data, they will get it.
On top of all that, the San Bernardino case illustrated that there is a market for private security companies to circumvent most security protections anyway.
The US won't be able to force these rules on foreign companies selling outside the US. e.g. Huawei, Samsung, and Sony are not going to backdoor their phones for customers in Asia / Europe. So, corporate / governmental customers in those countries are going to shy away from buying American phones... I can't see how this is a good thing for American companies.
But they will be able to ban sales of phones that do not comply. For God's sake, they banned tiny circular magnets. Certainly they can ban particular models of phones.
You're trying to use logic and reason as they apply to the government. The dirty secret is: they don't. Ever.
Buckyballs! I still have some, bought when they were legal in the U.S. They're pretty strong magnets and stick together and you can make shapes with them.
Only problem -- if a little kid or animal swallows at least two of them, the magnets can pinch intestines together, get stuck there, and kill.
You can still buy rare-earth magnets of exceptional strength and small size. It's just that they're sold through scientific and industrial channels rather than in the toy aisle at retail stores or marketed as toys online.
Just after I read an article that a US Senator demonstrates that he understands the risks inherent in backdooring crypto, I read this: https://www.theregister.co.uk/2018/01/25/uk_prime_minister_e... "Here we go again... UK Prime Minister urges nerds to come up with magic crypto backdoors"
I was rather hoping our PM had quietly dropped that nonsense a while back but it seems not. sigh
The simplest answer is to ask Congress to put aside a budget for a Congressional encrypted communications channel with key escrow or some other great backdoor idea. Then, the executive can do an RFP, get it implemented and then we just wait until presumed-private-congressional communications are revealed.
To be fair, while it looks ridiculous that so much effort is spent trying to break into phones in criminal cases, given how much of our lives are conducted through them they are sources ripe with intelligence-- i.e. we can hope to find out who a terrorist received support from, where a school shooter obtained their guns, etc.
Sometimes it doesn't pan out, but such is the nature of criminal investigations.
Why would a terrorist destroy their personal devices yet leave their work phones intact? What kind of criminal would use a work device to commit any sort of crime? I'm sure the work iPhone was managed by a MDM. The US spent millions and even tried to force a company to write a backdoor. They tried to set a precedent, they knew there was nothing on the phone. It wasn't about the phone it was about using fear to get American citizens to sign over more rights.
They didn't spend a million dollars to get data off one phone. They spent a million dollars to try and set a precedent that would let them unlock and inspect any phone they wanted.
>Why can’t they break encryption in a good way while they’re at it?
Thoughts like these always remind me of this[0]:
>On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
[0] I know that wasn't a direct quote from Mr. Wray, but it is in keeping with his thoughts.
I keep hearing these arguments, and after hundreds of hours of pub talks, the absolute best idea me and my friends have come up with (it's still a bad idea) is government-issued crypto certificates.
The idea goes that about the same time someone is issued a social security number they are also issued a cryptographic key pair. The cool thing is that your key pair can establish identity and in case your private key is compromised a new one could be issued easily. The government would have to maintain copies of everyone's private keys.
The only way this would actually work in practice is if a number of new rights were established and a number of existing rights were updated. I don't want to enumerate what we came up with in discussions, but they are numerous.
The only real benefit to all of this is that encryption would have a backdoor which is established before any encrypted communication actually takes place so the protocols could be made as secure as possible. This also means that the government would have to establish a monopoly on something that we can do with a single openssl command.
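A minimal sketch of the identity half of that idea (proving identity by signature with a registered key pair), again using pyca/cryptography with made-up names; the last line is where it turns into the bad idea acknowledged above:

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Issued alongside a national ID; the citizen keeps the private key...
    citizen_key = Ed25519PrivateKey.generate()
    citizen_pub = citizen_key.public_key()

    # ...and can prove identity to anyone holding the registered public key.
    challenge = b"prove you are citizen #12345 at 2018-01-26T12:00Z"
    signature = citizen_key.sign(challenge)
    try:
        citizen_pub.verify(signature, challenge)
        print("identity established")
    except InvalidSignature:
        print("forged")

    # The escrow requirement from the proposal: the registry also retains the private
    # key, so it (or anyone who breaches it) can impersonate or decrypt at will.
    government_registry = {"citizen #12345": citizen_key}   # the disaster waiting to happen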
1. That's a disaster waiting to happen. Lots of actors are targeting governments, not just hackers and criminals (other governments and people from inside the gov itself).
2. That doesn't stop terrorists from using secure encryption.
What you are proposing gives government unlimited access to innocent citizens data while it does nothing about access to malicious parties.
Sorry but this proposal makes no sense! At least the backdoor is a wolf in disguise for government intervention in citizens everyday life.
I agree that it's a disaster waiting to happen, so there's no need to apologize; I mention that it's a bad idea in the first paragraph.
To be honest, I don't really know why I even posted it besides the fact that it's something me and my friends have been discussing on and off for about a year now because it keeps coming up in the news.
The "bad guy" problem is an inherent problem and while me and my friends have come up with a few novel work-arounds none of them solves the problem completely.
Another issue is that multi-layer encryption could disrupt the government's ability to execute a warrant. If this BAD IDEA were implemented then multi-layer encryption would need to be made illegal and that starts us down a very dangerous path.
The only thing I think is a good idea is using a crypto key pair as a replacement for the SSN, but only if it's not really used for encryption.
The only solution is to make crypto accessible. If it is, there is nothing the gov can do about it. You can’t simply put half your population in prison.
It's true that you can't put half of your population in prison, but you can bring the prison to the population; just look at North Korea and a few other authoritarian nations around the world.
We need a way to let law enforcement do their job but also balance that with citizens' rights. Backdooring encryption is not the answer, but perhaps there can be some other answer. The alternative is mass surveillance like we have today.
Something along the lines of a wiretap (this would technically be a state-sponsored MITM attack) for electronic communication which would only be possible with a warrant, but then we would probably end up with a secret court (a la FISA) issuing these warrants.
The only point I would like to make is that we need to be open to discussing possible solutions otherwise we might not like what the whistleblowers tell us.
A widespread public backdoor that affects all devices so law enforcement and anyone with the backdoor key can get in? Yeah, they are all over the place. I suspect a lot of stuff the NSA pumps out already has excellent backdoors.
I think the only real way a government could encourage its citizenry to use CLEARNET (unencrypted comms) for everything would be to offer some sort of benefit if everything you did were easily observable by law enforcement. Again, the backdoor idea flaunts arrogance in the face of the right to privacy and the protection against unreasonable search and seizure, and I think the Senators have a lot of emotional issues to work through before they start commanding the tech sector to do anything.
Nobody supports a backdoor: backdoors in software are not good, they end up falling into the wrong hands, and they have historically never prevented a single catastrophe -- I'd be happy if law enforcement revealed a time when they cracked encryption to foil some sort of juicy plot, but alas, I have yet to read a single instance of the sort.
If you add a backdoor to our software systems, you're effectively adding a spy camera to every device that uses that chip, and you are not in control of who has the key, despite your best efforts. "Don't worry, only I know the combination to my suitcase, it's quite alright if it exchanges 70,000 hands"... said nobody ever. Law enforcement and legislators need to target this realm with a completely different approach; otherwise we leave the back door to the castle completely unguarded and defenseless.
Ron Wyden (D-OR) is the senator who asked the FBI to name any cryptographers who support backdoors. He's by far the most tech-savvy and encryption-savvy member of the House or Senate.
> I like [Ron Wyden] but there are in fact more “tech savvy” Congress critters.
Ron Wyden is all over the issues I care about, to the point where on the rare occasions we disagree I’m willing to believe he’s right because he does policy for a living and I’m just a guy with strong opinions (which is precisely why representational democracies were invented).
The other four? I never hear them championing critical tech policy issues. Yes Ron is in the Senate and they are in the House, but Ron was in my feeds constantly long before he ran for the Senate (and I don’t even live in his district). Savvy doesn’t just mean a degree in my book. Savvy means able to cut through the noise and get to what matters.
And since we are clearly in agreement over the importance of this stuff and just quibbling around the margins, I’ll mention adding Suzan DelBene (D) WA to your list of reps to follow if she’s not there already. I’m not sure what her degree is in but she’s a former Microsoft VP who is incredibly smart and deeply knowledgeable about tech issues (though I haven’t seen that emerge yet as a top legislative focus area for her the way it has for Ron Wyden).
> During his January 9th speech at the International Conference on Cyber Security in New York, Wray called the prevalence of encryption an “urgent public safety issue” and said it had prevented law enforcement from accessing some 7,800 devices in the last fiscal year.
Cryptography with backdoors means humans with knowledge of said backdoors in both industry and government. Even if it were theoretically possible, I have no faith in humans to protect this kind of information.
I'm sure the Chinese and Russian Governments could provide a list of willing companies that would be happy to offer those services to american companies and our government.