Banning E2EE is stupid (github.com/davidchisnall)
207 points by dp-hackernews 12 months ago | 137 comments



Arguing in good faith against lawmakers who are perfectly aware of the gaping holes in their own arguments is too common in this field IMO.

If we want to protect the right to encryption and (pseudo)anonymity to the degree we have it now, we have to appeal to emotion. Just like how "child advocacy" groups tried to paint you as a monster if you think blanket scanning of all the data on our computing devices goes a bit too far.


The challenge there is that "protect the children" and "terrorists" weigh more heavily on a moral level than "protect my privacy".


Bad actors can use this to actively harm children by intercepting their communication with their friends and family and putting them at risk. No need to even change arguments.


This is too technical; you need to be broader and more fear-mongering:

Bad actors will use this to find children to harm.


"Burning down the village in order to save it. "

[1] https://en.wikipedia.org/wiki/Battle_of_B%E1%BA%BFn_Tre


The best argument I have is "we're giving tools to the fascists that will come into power in the future." It's more likely than not that every country will have a "wrong side of history" government coming to power, sooner rather than later.


We've engineered societies that no longer value freedom as they consider it a nebulous concept that is less important than some in-group momentary righteousness. Even free speech is back on the table in the current zeitgeist, which is insane.


If fascists come to power, you are kind of screwed anyway.

Rubber hose cryptanalysis is pretty effective.


Rubber hose cryptanalysis doesn't scale all that well. Sure, you can build a traditional surveillance apparatus to find and squash dissidence, but an unencrypted internet makes this a lot cheaper and more effective. At the very least you can force your opposition to give up the advantages of digital communication, and catch those that don't (and change public online discourse through chilling effects).


It does not weigh heavier, and I wish people would stop repeating this.

Our governments wield extreme power and we have a moral imperative to keep checks and balances in place so that power is not abused.

A pedophile with a laptop and a camera is terrible. A Senator without the 4th amendment is worse.

A racist with a social media account is terrible. A President without the 1st amendment is worse.


> It does not weigh heavier, and I wish people would stop repeating this.

They keep saying it because to most people they absolutely do weigh heavier. Very, very few average people are going to agree with what you just said.


Maybe not with all the finer details considered. But we're talking about dirty games, like misleading, vague headlines about how children are in danger that never mention how "solving" it directly leads to a nightmare for privacy. It's the shock factor, not the logic. And a lot of people definitely do not care to read the fine print. I'll wager most. In fact, I'll wager the overwhelming majority. That's what's scary.

Anecdotally, my close friends struggle to read the single short paragraphs I send them about important events. It's bad enough that they skip over a couple of numbers and then say "I'm not doing all the math for that", when the text literally states the answer directly. The mere sight of a couple of numbers immediately puts them into "gloss-over" mode.

I really have no faith in Gen Z at large. I largely blame TikTok, but at the same time, something like it was inevitable. Capitalizing on a feedback loop of shortening attention spans is going to spell doom for our society.


I keep getting told that: that very few people agree with what I just said.

Every conversation I’ve had IRL on these points, after the initial negative reaction to what I just said, progresses into mutual understanding that an organized government is the ultimate threat when left unchecked - and that these rights are core to the system that protects us from that power structure.

It protects us from the very same people who are telling us to “think of the children.” When Feinstein and co. proposed “child abuse” legislation to “protect the children” - my point is that children need to be protected from her and the power structure she represents first and foremost.

Endlessly repeating that you are “thinking of the children” when you trade their future safety away digs this toxic thought deeper into the cultural zeitgeist.

When I opposed the president’s surveillance program, I was thinking of the children and how it was going to shape their lives in the US.


Why not both?

E2EE protects the children from being spied on by pedophiles.


Which population group is most likely to be in possession of pictures of teenagers doing dumb things?

E2EE protects children from being arrested by the government for being dumb children.

[Edit: Added "pictures of"]


"Protect my children's privacy"?


Protect children's privacy!


... from terrorists such as a government!


From child molesters in government! I would be surprised if you could find even one country where there wasn't at least one case of a high ranking official indulging in related crimes.


Just rephrase it as "Keep the government out of my affairs". It's not just "protect my privacy" though, it's also "protect my ability to organize against the government", which in the US heavily overlaps with the second amendment and the rhetoric about it.

In a more globally appealing version, you can spin that into various forms of "This is what the Communists did to prevent people from rising against them" (replace Communists with Nazis, if desired). Or reference V for Vendetta or 1984 or other cult classics.


Authorities have a million weapons and resources against individuals already.


The real challenge is that none of the ones screaming "protect my privacy" can offer a solution to "protect the children" as well.


The real challenge is getting the people screaming "protect the children" to provide evidence that "the children" actually need a form of protection that meaningfully requires reducing everyone's privacy.


We continue to put responsibility on the government to fix our problems, when many of them nowadays are societal and cultural. You can't slap on a law to fix immorality and lack of wisdom. All you do is restrict the moral and wise. I hope we return to holding parents responsible for raising their kids properly. The world they face now is a terrifying and disgusting colossus, and facing it cannot be done without moral and wise future generations.


We've been asking the pedophiles to stop, but that hasn't been working so far.


I don't see the relevance to my point here.


> The real challenge is that none of the ones screaming "protect my privacy" can offer a solution to "protect the children" as well.

Sure we do! Use parental controls to prevent your children from using any e2e messaging services.


Parents taking responsibility for what their kids do online, and taking their protection into their own hands? Blasphemy. That's what we have the government for.


What makes (E2E)E so special that it protects adults, spies, dissidents, police, soldiers, politicians, people under police protection ... but doesn't protect children?

It protects children. Removing protection doesn't protect children.


To the downvoters and commenters who do not get that "the children" is just a straw man:

The issue at hand can be described very, very easily: law enforcement agencies' mission is going to get harder to fulfil if they lose the ability to wiretap suspects. The solution is, unfortunately, not that easy!

Now, HNers can cry their lungs out that mathematics forbids a perfectly safe backdoor and whatnot, but they are missing the point: they are simply ignoring the question and lacking any respect for the people posing it in the first place.

The issue exists, and it makes sense. Furthermore, wiretapping has always been accepted, at least as a necessary evil, by democracies all over the world. At the same time, absolute remote privacy has never existed, much less been guaranteed. E2EE is a big game changer, and you expect people with governing responsibility to just ignore it?


At the risk of being very directly contrarian: is it?

Is it actually getting harder for law enforcement? Do we have evidence of E2EE harming cases, hurting conviction rates, or similar? At this point in time law enforcement has been routinely dealing with and lawfully overcoming encryption for several decades. Encryption is no longer a novelty law enforcement has no idea how to deal with.

I understand that law enforcement officers may sincerely believe that encryption makes their jobs more difficult and thus the public less secure. Since they are public officials, I am happy to accept their claims as truth. I just need to see supporting evidence. At that point we can begin to weigh public safety against privacy. Until then, I can only consider a demonstrable loss of privacy against a purely hypothetical impairment of law enforcement functionality.

Speaking only and solely for myself, I in no way expect people with governing responsibility to ignore end to end encryption. I do expect them to ask questions when people with a demonstrated capacity to function in the presence of E2EE claim it is making their jobs too difficult.


> At the risk of being very directly contrarian

Please, be! Discussion is welcome.

What I despise is immature people dismissing important societal issues with a wave of the hand, offering nothing but an empty "because privacy".


Basically, I need law enforcement to prove they're not full of shit. Law enforcement and intelligence systems have a long and ugly history of abusing easy access to private information. They also have a long history of working around technical privacy protections at need, in pasts both distant and very recent.

Until then, I think we should give their claims all the weight of all the evidence they have provided. Or failed to provide.


I just think it's really funny but also terrifying that it's the cool thing to hate on the CIA now, knowing all the exposed operations against their own fellow countrymen, and yet people still think they should be the ones with backdoor access into our private lives.


So, I worked for intelligence agencies for much of my early career, and my wife still does, and I get the impression Hacker News fundamentally misunderstands their mission and the way they operate. This sort of effort involving wiretaps and backdoored encryption with state access to service providers is a law enforcement capability, not an intelligence agency capability. Intelligence agencies do not operate within the law, and they largely don't care about things like CSAM or internal political dissent; that's the concern of the FBI and parts of DHS. Intelligence agencies are at least somewhat constrained in what they do domestically by cleared legislative oversight committees, but internationally, they do whatever they want. They don't need warrants. They may often have, but certainly don't need, the cooperation of governments or law enforcement where they operate. Their tools aren't legal tools. They may use device-level compromise, remote sensing technologies the commercial community and general public don't know exist, and human spies, and these can be used against foreign service providers or against specific high-value targets directly.

Whatever the law says law enforcement within your own country can or can't do has no bearing at all on what foreign intelligence agencies can do.

All of which is to say, if you're thinking of things like MKUltra, nobody ever publicly argued we should give that kind of capability to the CIA; it was already illegal at the time, and they did it anyway. Trying to make telecom service providers legally obligated to provide an encryption workaround for law enforcement is a completely different thing.


One thing to bear in mind is that any technical lawful intercept capability is also a technical intercept capability for a hostile intelligence agency. Note that "hostile" and "foreign" are not always the same thing.

With this in mind, I think making the tools resistant to abuse is a highly desirable design goal, up there with making the detection of abuse very easy.


Alright fine, CIA was my <insert three letter agency here> choice. NSA is probably more appropriate.


>wire-tapping has always been accepted

The wiretapping of yesterday is not the mass surveillance of today. John from the police task force snooping on your landline and taking notes is not even in the same ballpark as mass data collection and categorization of everything, everywhere, all the time.

I also accept the former as a necessary evil, but it's a completely different beast from my latter example.

That being said, you raise good points. I don't know what would be the ideal solution to these relatively new worries of governments. I feel like this bump in internet privacy is counteracted by the surveillance network that has been built in the real world, so we could call it even in some ways.


Wiretapping was also used by all police states to preserve dictators, by eliminating competition and punishing citizens for wrongthink. As citizens of a country behind the Iron Curtain, my parents were always afraid to talk on the phone, because the secret police was always listening, either directly or via informants.

You're going to say that in democracy, the state uses warrants issued by a judge for wiretapping and there's due process. How's that working out in the US, given Snowden's revelations and the Patriot Act?

Leftists wrongly think that the state represents the will of the people. It doesn't. Democracy is preserved primarily by (1) a culture of liberalism and (2) the state having limited powers. But as could be seen in WW2, democracy is fleeting, actually.

Being from Eastern Europe, I've got to say that seeing privileged Westerners willingly give up freedoms that other people died for has been such a huge disappointment.

---

As for crime prevention and investigation, you can still follow individuals around, install bugs in their home, compromise their personal devices. But it's never enough, is it?


> As for crime prevention and investigation <...>, compromise their personal devices.

How do you compromise someone's device that's running iOS 17 and using an encrypted messaging app?

> But it's never enough, is it?

The reality is that people are using E2E en masse, and it's not possible to compromise modern devices the same way it was possible to compromise a phone line in the 90s. I can open Telegram and buy any number of substances/devices/items, paying with Cash App. This just wasn't feasible even 10-15 years ago in the way it is now. So what do we do? (I don't have an answer, fwiw.)


If you want to compare abilities: in the 90s you also didn't have internet-connected cameras on every corner and in public transportation, able to do face recognition even with masks on; you couldn't use Wi-Fi to see through people's walls, or monitor the IPs a home connects to, etc.

As for iOS, there are always vulnerabilities that state-level actors can exploit. Not long ago simply sending a message to someone via iMessage could have compromised an iPhone. Which is why iPhones now have a "lockdown mode". Criminals are either dumb enough to expose themselves via insecure devices, or competent enough to use an extra layer of encryption, which can't be prevented. Even if you outlaw general-purpose computing, people can just send encoded messages with custom algorithms, WW2-style.

If you target someone specifically, and have access to their location, there's no way they can protect themselves while also keeping a communication channel open. Let's be honest, this isn't about targeting someone specifically, but about fishing for wrongthink.

> So what do we do?

As a citizen, not my problem. Law enforcement should do their jobs without encroaching on my privacy. Or my child's privacy for that matter.

If scientists invent a way to read people's thoughts, what next? Mass surveillance for people's brains directly because there's no other way?

If the question is mass surveillance, NO is always the answer as far as I'm concerned.


At least in the US you cannot outlaw encryption. It is covered under 1A.


Thank you for expressing your opinion, I appreciate it.

Actually, I don't have a strong opinion because I recognize the issue to be very complex.


Are we sure it's complex? Or is it portrayed as complex for political reasons?

So far we've seen a lot of assertions that encryption is a major hazard to public safety, mostly from groups with a long history of abusing access to wiretaps. We've seen precious little evidence to support this. The other group claims that encryption enhances privacy and pushes back against abuse. There's plenty of evidence to support this.

More than once in my life, I've seen people attempt to advance an unsupported position by claiming it's very complex. Sometimes this is true and they just need time to support their points. Sometimes it's completely untrue, and the idea is to leave uninformed onlookers with the impression of complexity that needs to be negotiated.

How sure are we that this issue is genuinely very complex, versus the real possibility of false complexity in an effort to confuse you and me?


Protect the children from what? There are lots of solutions for concrete problems.


So, since there is not a good solution[1] to "protect the children", let's screw up everyone's privacy (and security)... seems a reasonable trade-off to me.

[1] Actually I don't think that is true, but let's suppose so.


So the children are going to be both less protected and... expected to be educated enough not to produce photos that might count as CSAM, all in the name of protecting them from themselves.

[1] Let me guess: the "good solution" you claimed exists has already been described by you.


With the current batch of proposals, the child (teenager usually) making home-made porn¹ would get flagged and, in the worst-case scenario, get marked as a CSAM producer or end up on some indelible list of sexual predators, depending on the jurisdiction. Can you imagine what will happen next? For what? Doing what their hormones made them do without harming anyone?

Educate them, absolutely. Monitor their every move online because they might do something stupid? That's an option I would like to limit to their parents/guardians, if at all, not any state or bigtech corporation, and certainly not any automated flagging system.

1: Not CSAM; let's limit this line of thought to consensually made porn.


Afaik, scanning approaches check images against known images. It's not an image classifier, it's a perceptual hash.

Of course, possessing or distributing images of underage naked people can get you on a sex offender list, even if that naked person is you. That's a different issue.
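
For anyone unfamiliar with the distinction, here is a minimal sketch of the idea behind a perceptual hash: a toy average-hash in plain Python, assuming the image has already been reduced to an 8x8 grayscale thumbnail. Real systems such as PhotoDNA are far more robust, and the pixel data below is invented purely for illustration.

    # Toy average-hash, for illustration only. The principle: reduce the image
    # to a tiny fingerprint, then compare fingerprints against a database of
    # known fingerprints using Hamming distance.

    def average_hash(pixels):
        """pixels: 64 grayscale values (an 8x8 thumbnail), each 0-255."""
        avg = sum(pixels) / len(pixels)
        bits = [1 if p > avg else 0 for p in pixels]  # one bit per pixel
        return int("".join(map(str, bits)), 2)

    def hamming_distance(h1, h2):
        """Number of differing bits; a small distance means 'looks similar'."""
        return bin(h1 ^ h2).count("1")

    # Hypothetical thumbnails: the second is the first with mild brightness
    # noise, so the hashes stay close even though the raw bytes differ.
    original = [i * 4 for i in range(64)]
    recompressed = [min(255, p + 3) for p in original]

    d = hamming_distance(average_hash(original), average_hash(recompressed))
    print("Hamming distance:", d)  # small => treated as a potential match

The takeaway is that matching happens against hashes of already-known images; the scheme neither classifies new images nor "understands" their content.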


It's a black-box, and unverifiable by design.

With a false flag sufficing to ruin your life, I'd rather take no chances for a nebulously defined benefit. In the end, all this does is mark the participating apps as unsafe for people who do, incredibly, feel safe now posting such images through a service like WhatsApp or iMessage (do these people even exist?). They'll use something else, and the people you do catch are just idiots and teenagers.

And that false flag event is hardly far-fetched given the extremely broad definition CSAM has. Imagine a parent photographs their 13-year-old daughter proudly posing in her new bikini on the beach and sends it to grandma, who naively posts it on Facebook, where someone downloads it for their personal sexual gratification, then posts it on some image hosting site for bragging rights, where it gets hoovered up into the great CSAM database because that site was filled with photos of scantily clad children. Now the partner of the above parent asks them to send those photos they took of their daughter on the beach for the holiday photo album they're making, and boom, they are on a list.

Unlikely? With unaccountable processes and black-box databases things like this will happen, and that is excluding the likelihood of bugs happening (which is inevitable).


> Of course, possessing or distributing images of underage naked people can get you on a sex offender list, even if that naked person is you.

I have a very hard time with this. These laws which are ostensibly there to protect the children are actually used to persecute these same children.

Any proposal "to protect the children" that doesn't address this is full of shit.


Remember that teenagers also count as children. They occasionally do -ah- dumb things. Do you really want to rip back the veil of privacy on teenagers?


Protecting the children is easy. Just have parental controls which can be installed by the parents.

The real problem they want to solve is most likely organized crime and terrorism, plus, in some countries, political dissent.


A solution isn't required.


Yes. You can actually flip the script on "think of the children" arguments, because banning E2EE fundamentally makes children less safe. However, I suspect it's worse than this: these lawmakers don't care about children, they care about power and establishing control over how citizens communicate. There is no argument that will deter them; we can only hope to convince the public, or else be damned to illegal activity as hackers.


Governments love passing bills that give the government more power. Very rarely do governments pass bills to remove that power later.

This is a power grab and should be treated as such. E2EE makes police work more inconvenient but it should be inconvenient. If you want to optimize for police work convenience, you get a police state.


Some argue you have to appeal to lawmakers to make a law that prevents breaking encryption. Perhaps the avoidance of political engagement by tech-inclined folks is contributing to, or at least allowing, some of this to progress a lot more easily.

A key part of this issue is learning about the frequency with which this type of legislation is brought up over and over again, and how it might be new to some but not everyone.

Perhaps bureaucracies are counting on people relatively new to encryption erosion not having context that they are far from alone.


For every successful bad faith actor there are 10 useful idiots parroting their claims. This perhaps addresses them.


If we want to protect these rights they need to be guaranteed somewhere where daily politics can’t touch it, like a constitution. Otherwise it’s a battle that will only end when the rights are gone.


This assumes that said lawmakers are as intelligent or knowledgeable as they would need to be in order to purely be seeking power.

I do think there's some merit to your second point, however.


I disagree with this take. This implies that pathos is a more durable political mechanism than logos, or that in the short term pathos is more impactful.


It’s not the first time I hear this kind of statement, and I still don’t know how I feel about it. Maybe it’s true, realistically speaking. But it’s also an admission that we can’t be better than this, and that very much tastes like defeat, like reading post-Trump Scott Adams. (Not to be confused with the idea that it “validates” emotional manipulation or whatever; that one’s externally motivated, while this one is internal.)

It also, of course, contains the kind of implicit contempt for the public that has led to widespread disdain of politicians in the first place, because while many people may be easily swept along by appeals to emotion, they still usually have a good feeling for when the speaker doesn’t respect them.

I don’t know if anyone’s really trying to figure out what it would take to make these tactics not work on a large scale, rather than how to use or work around them.

See also: “Guided by the beauty of our weapons”[1] by Scott Alexander. Compare: Paul Graham’s statement[2] that “Java's designers were consciously designing a product for people not as smart as them.”

[1] https://slatestarcodex.com/2017/03/24/guided-by-the-beauty-o...

[2] http://www.paulgraham.com/javacover.html


A great example of why the "assume good intentions" meme is actually pretty broken and leaves those who follow it at a disadvantage. Just as "assume good intentions" fails when interacting with sociopaths, it also fails when interacting with lawmakers and authorities in general.


Commander William T. Riker: In all trust, there is the possibility of betrayal. I'm not sure you were... prepared for that.

Lt. Commander Data: Were you prepared, sir?

Commander William T. Riker: I don't think anybody ever is.

Lt. Commander Data: Hm... Then it is better not to trust?

Commander William T. Riker: Without trust, there's no friendship, no closeness. None of the emotional bonds that make us who we are.

Lt. Commander Data: And yet you put yourself at risk.

Commander William T. Riker: Every single time.

Lt. Commander Data: Perhaps I am fortunate, sir, to be spared the emotional consequences.

Commander William T. Riker: Perhaps.

https://youtu.be/JCvqMk4P3zQ


TNG is the most comprehensive analysis of ethics I've ever witnessed. No media or publication even comes close to delivering such a sense of understanding the timeless difficulties of the human experience.


And yet, it portrays a "would like to have" world that couldn't be more distant from what we actually have. It is literally (science) fiction. Using a fictitious 24th-century chat between a womanizer and a fancy robot to explain to a 21st-century real-world human that they should be more trusting is rather, shall I say, odd?


You're missing the point. The setting of TNG is radically different from ours for the reason you described, and yet their problems of trust are the same. Using science fiction to deliver messages pertinent to today is typical and, arguably, the goal of science fiction. It isn't odd at all.


I don't think I am missing the point. As you mentioned, the setting of TNG is radically different from today. And I submit that in such a different setting, it is orders of magnitude easier to trust, because (most) people from the 24th century are part of that setting and are vastly easier to trust than people from today. So assuming a person from today just has to believe in the virtual ethics of TNG is crazy and naive.

I know that sci-fi is used to reflect on the present. That doesn't mean it is always helpful.


It's not about taking these "virtual ethics" at face value. It's about watching a touching moment between characters and recognising that elements of that situation can be related to. If you are unable to do this when consuming fiction, then I really don't know what to say.


I am blind. I am glad that you are able to call this trivially constructed scene "touching", in a sense I am happy that life for you seems to be that simple. Enjoy.


With lawmakers it’s tricky.

The individual people involved often have good intentions.

But the political machine is a different beast. Seeing like a state, and all that.


Most individuals doing a job have income as their primary incentive. Much of what they claim besides that is a socially necessary mask. Nobody wants to be "just greedy", so they appeal to emotion and claim they have a higher cause. But how do you propose to measure if that higher cause actually exists?


Although E2EE is great I'm not sure I follow the logic of this argument.

Yes, it's easy to implement E2EE on top of a plain communication channel, but why is that relevant? It's also easy to make a machete from plate steel from your local hardware store. It's easy to create chlorine gas by mixing cleaning supplies. You can even make a perfectly functional rifle from the junk in your garage. And yet, you're not allowed to. (Substitute other examples if you live in a freedom-loving country.)

We should oppose an E2EE ban (or backdoor mandates) because E2EE is good technology that protects regular citizens from being spied on. Not because it's technologically easy to circumvent a ban.
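
For context on just how low that bar is, here is a minimal sketch using PyNaCl, the Python bindings for libsodium (reportedly the library the linked article's own example uses). The names and message are made up; this is an illustration of layering encryption over an existing plaintext channel, not anything from the article.

    # Minimal sketch of layering E2EE over any plaintext channel using PyNaCl
    # (Python bindings for libsodium). The base64 blob can be pasted into any
    # existing chat service; only the matching private key can decrypt it.
    import base64
    from nacl.public import PrivateKey, Box

    # Each party generates a keypair and shares only the public half.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts to Bob (Curve25519 + XSalsa20-Poly1305 under the hood).
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")
    wire_text = base64.b64encode(ciphertext).decode()  # paste this into the chat

    # Bob decrypts on his side of the channel.
    box = Box(bob_key, alice_key.public_key)
    plaintext = box.decrypt(base64.b64decode(wire_text))
    print(plaintext)  # b'meet at noon'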


This was something people largely missed at one point around Australia’s Assistance and Access Bill five years ago: the Prime Minister was widely derided for saying (as reported) “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia”. Government is all about forbidding things that are physically possible. Maths says you can do these things. Governments can say you’re not allowed to do these things.

(A&A Bill was the rather-dubiously-handled piece of legislation that, in part, makes first-party end-to-end encryption not an acceptable defence for declining to fulfil legal warrants.)


I understand what you mean, the problem is that when you unpack what the result of the government saying "you're not allowed to do these things" is, you're left with a plan that takes away law-abiding citizens privacy without actually accomplishing anything.

(I'm assuming here that the end goal is to circumvent proving criminal actions by giving the government a way to prosecute people simply for hiding communication.)

I respect the government saying some things are not allowed, but when it comes to E2EE, even if there's a law saying you can't build your own E2EE, there's actually no way for the government to prove that a communications channel was using illegal E2EE.

The person accused of using illegal E2EE can just say they were simply typing gibberish into the program and weren't actually trying to communicate. Yeah that's obviously bullshit, but without the key you can't prove it, which is exactly the situation they are in now anyway...


The real danger comes when using your own encryption becomes illegal; that is, anyone layering their own encryption on top of a weakened channel will be an immediate target, because obviously they have something to hide. Dystopian, but there you have it.


I was thinking the same thing. Even though you can encrypt on top of existing messaging services, if it becomes the norm that they can watch all traffic, anyone who uses such an extra layer will be immediately flagged and put under more surveillance. So for the surveillance state it would be much easier to deal with people who try to live outside the system/norm and simply criminalize privacy of any kind. I don't agree with it, but you can't say it's stupid or that they have no incentive to pass this.


And criminals will just send selfies of themselves, with their encrypted message hiding in least-significant-bit noise. Good luck proving there's a hidden message.
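
For the curious, a toy sketch of that least-significant-bit trick: plain Python over a raw bytearray standing in for decoded pixel data. A real tool would read and write an actual lossless image format (JPEG recompression would destroy the low bits), and in practice the payload would itself be ciphertext rather than plaintext.

    # Toy least-significant-bit steganography: hide a payload in the low bit
    # of each byte of "pixel" data.

    def embed(pixels, payload):
        out = bytearray(pixels)
        bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
        if len(bits) > len(out):
            raise ValueError("cover data too small for payload")
        for i, bit in enumerate(bits):
            out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
        return out

    def extract(pixels, length):
        bits = [p & 1 for p in pixels[:length * 8]]
        return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length))

    cover = bytearray(range(256)) * 4   # stand-in for decoded image pixels
    secret = b"hi"                      # in practice this would be ciphertext
    stego = embed(cover, secret)
    print(extract(stego, len(secret)))  # b'hi'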


I think you’re answering your own question. Sure, we know it’s bad for “us” (the general public), but the author is also casting doubt on its effectiveness against “them” (criminal networks) and setting us up for an easy own goal.


I agree. Although I did throw in futility in my arguments I sent to my MP, my main argument was one of principle: people should be allowed to keep secrets.

Obviously said arguments were not read by said MP and merely dropped into a bucket with all the other adorable little concerns from the public.

The argument posted here is even less likely to convince anyone that matters. It's mostly preaching to the choir.


Exactly.

The fact that it is still relatively easy to do X even after X has been banned does not imply that banning X is stupid. The argument made in the article is fundamentally flawed.


Sadly, stupid politicians are a problem in every democracy. The only way to resolve it is to incentivise more intelligent/knowledgeable people to go into politics, but how?

The best motivation for going into politics is probably patriotism, wanting to change things for the better etc. But then people we would want in politics, unless they are already wealthy, would have to take a substantial pay cut if they "went into politics".

I can mostly speak for my own country (Poland), where a typical salary of a member of Parliament is in the region of €2.5k per month, plus up to another €2k to run an office (but they spend it on employing someone to run the office). The Prime Minister of a country of 40 million people gets €3k per month. That's the entry-level salary in IT in the bigger cities here.

So only people in the following categories go into politics: the "really passionate", the already wealthy, those interested in political science who can't do anything else better paid anyway (such a person usually has a degree in philosophy or history), those who love having power over others, and those hoping for a chance to earn money through corruption (illegal or semi-legal).

It is no surprise that amongst such a group there is a lack of knowledge and no incentive to understand topics like encryption.

That's why I'm very strongly for increasing the salaries in politics (it's a very unpopular opinion here in PL). It is not a solution to all problems, but it is definitely a requirement IMO.


>Sadly, stupid politicians are a problem in every democracy.

Are they actually stupid or do they just play dumb to better serve the interest of their main backers (or supposed audience)?


I've thought the same, but in the end it doesn't give us enough of the people that want to make positive change. People with good ideas and motivations overwhelmingly succeed in the private sector. In democracy, they get held back and waste their time. Whether they consciously think it or not, democracy is a big reason potential good politicians don't participate in democracy.

So like I said, increasing salaries doesn't do enough to shift this balance. Increasing it enough to shift the balance will shift it in the wrong direction: people who are only in it for the money will flock to political positions. And then we have the same ignorant fools again, but at a higher taxpayer expense.

All I see on HN is people trying to come up with "material" frameworks for fixing our problems. It's all bandaid fixes. Actually no, it's bandaids on a slashed artery. Morality and wisdom are in rapid decay. The only true way to fix that at the source is to have strong families again, but that's unpopular now. They've incentivized dissolving families, disincentivized starting healthy families, ruined our attention spans, destroyed proper sexuality (one of our biggest motivators for anything), convinced us that violence is never appropriate, and they've brainwashed us so well that we beg for more of all of it. And as a result, we're likely doomed to fail.


I think people who view entering politics as just another job, and only ever look at it through the lens of salary, are never suited to be politicians in the first place. I can guarantee you they would never be able to become one even if they tried. The real hurdle is that elections are very taxing and highly gated (if you are at all serious about winning). You almost always need to associate with a political party first, but most political parties field candidates by seniority. No matter how qualified you may well be, in the end it all boils down to how much you can convince the people of your constituency, all of whom are irrational and looking out for their own personal gains rather than the country's future as a whole. The majority is not interested in hearing your very technical argument on how you would solve their problems, and guess what, a majority is what you need to win the election.


The only argument I know of that might work on lawmakers is that mandating backdoors to end-to-end encryption puts your own high-tech industrial secrets at high risk from nation-state adversaries. It puts a huge target on that backdoor with very high payoff if you manage to compromise it, provides for very high rewards for any insider who can be bribed into leaking information enabling the use of the backdoor, or huge risk for the family of any insider who could be coerced into leaking such information.


Yes, very much yes.

Your comment has now been enshrined in my encryption web page. [1] Thank you.

[1]: https://everyoneneedsencryption.gavinhoward.com/#encryption-...


Ironically, this rhetoric might be what pushes officials to use proper self-hosted end-to-end communication channels.


Of course governments can legislate to ban companies from implementing E2E, such that they cannot read intercepted messages. People who then have a need or wish to still send E2E encrypted messages will find ways to do so using existing methods as the OP eloquently points out.

But then the governments will legislate against citizens usage of the technology. This is the logical next step in the over-reach.


When everybody uses E2EE it's a neutral technology. When people homebrew encryption over an insecure channel it's automatically suspect. Encrypted traffic will stick out like a sore thumb and the courts will force users of E2EE to hand over their keys.
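
One crude illustration of why naive homebrew ciphertext is easy to flag: its byte-level entropy sits near the 8-bit maximum, unlike ordinary chat text. This is only a sketch with made-up sample data; real traffic classifiers are far more sophisticated, and steganography (discussed in the replies below) is precisely an attempt to defeat this kind of check.

    # Crude sketch of why raw ciphertext "sticks out": its byte entropy is
    # near the 8 bits/byte maximum, while natural-language chat is much lower.
    import math
    import os
    from collections import Counter

    def shannon_entropy(data):
        counts = Counter(data)
        total = len(data)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    chat = b"hey, are we still on for lunch tomorrow at the usual place?" * 20
    ciphertext_like = os.urandom(len(chat))  # stand-in for an encrypted blob

    print("plain chat :", round(shannon_entropy(chat), 2))             # roughly 4 bits/byte
    print("ciphertext :", round(shannon_entropy(ciphertext_like), 2))  # close to 8 bits/byte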


> Encrypted traffic will stick out

Not really, there are steganographic LLMs that can disguise encrypted traffic as a regular chat between two people.


Have those been adversarially tested, though? I highly doubt such texts wouldn't stand out. The conversation flow would look like absolute gibberish to a normal person, so an adversarial LLM could be created to sniff out even such tactics.

I'm reminded of Peripheral where the police AIs would have so much info on every person that any irregularity in movement or behaviour of any kind would instantly flag you and put you under investigation.


Well, everything evolves. They will eventually be adversarially tested. Wise societies will realize that it's better to spend resources to prevent physical violence than to control everyone's thought. It seems it broadly goes that way so far.


So they'll send images with encrypted information hidden in colour channels. Are we gonna ban sending images?


> But then the governments will legislate against citizens usage of the technology.

No, given some exceptions (think dictators trying), that's not going to happen, right? At that point it's not a technological issue. You're talking about the right to be cryptic in everyday speech.

Edit: or, in a different approach, they'd have to forbid (private) use of (some) algorithms. Again, that sounds like you'll have bigger issues then, no?


There is a growing minority of people who think this is the way the world is going. Trends in legislation in this area within Western democracies seem to indicate that privacy is being consistently attacked in the name of protecting the wider public.


In almost any context where banning E2E encryption would have useful law-enforcement results against unsophisticated criminals, there is always a legal way to install a wire tap via an active attack. For example, a judge can force Signal to perform a MITM attack, and an unsophisticated target will simply ignore the key change warning. An alternate approach would be to use the auto-update features of the terminal and force the app distributors to sign and push a trojan to that specific target. You might have objections against such law enforcement techniques, but I think they have their place with proper judicial oversight.

Conversely, if the target is aware of such techniques, it's very, very unlikely they would use a channel they know is monitored.

So the trade-off here is: break the privacy of everyone, but allow law enforcement to do passive attacks without judicial oversight. It's a no-brainer that both things are bad and they do not improve our safety one bit. Who really finds such powers useful? Spooks and entrenched power structures, of course.


> An alternate approach would be to use the auto-update features of the terminal and force the app distributors to sign and push a trojan to that specific target.

This is a compelling reason to build e2e messengers from source. Google's move to signing APKs with their own keys helps facilitate the above.


Realistically speaking, defending against an active attack from your OS distributor targeting you specifically is an exceptionally tall order. It ultimately comes down to obtaining a known-safe compiler, building something like a Gentoo stage 1 installation, then manually auditing the source of each and every security patch you apply, knowing that any minor update might actually hide some kind of remotely exploitable backdoor.

You would need NSA-level of resources to pull this off, I doubt the apk signing by Google changes things significantly for Android users.


True, but don't you think a massive number of folks would devote time to an open source alternative just to stick it to governments if they tried to pass something like banning unlocked bootloaders? GrapheneOS exists with a far smaller potential user base.


The end goal of this can only be to outlaw all secrets (for all citizens), if you think it through. First they say messengers may not use E2EE, next they outlaw circumvention (i.e. using GPG encryption inside the messenger, side channels), and after that comes steganography. Don't be fooled: encryption is almost impossible to hide if you want to do it casually.

What is going to remain?

“If privacy is outlawed, only outlaws will have privacy.” ― Philip Zimmermann

It's simply about totalitarian control at this point.


I agree with the author's statement, like probably a lot of people who already understand how e2ee works under the hood.

I'm not sure if writing code on GitHub to be run in a terminal is going to change policy makers' minds.


This article is entirely correct.

It misses the point entirely though. Lawmakers don't care about actually catching criminals, they want to be seen to be catching criminals.

In this case, nobody is made safer. Normal folks are made actively _less safe_ and criminals are pushed further underground.


"you can do crime, just don't do it where i can see you"

-- politicians, probably


This is exactly what happens in the Mail app on macOS when two people exchange mails and both sign them. Once someone's public key is known, mails to them are automatically encrypted from then on, as long as the certificate is valid. It's exceedingly convenient, and I loathe the certificate industry for making the acquisition of a certificate such a painful and expensive process; otherwise we could easily have encrypted communication over email.


People thought banning alcohol, and other drugs, would work.

People thought banning pornography would work.

Some people still think banning personal weapons will work.

I think we don't learn from failures here.


Banning personal weapons has been shown to work empirically. More examples in that vein:

People thought banning driving without a seat belt would work.

People thought banning driving through red lights would work.


While this may not change the minds of any lawmakers, it actually is a decent piece of example code IMO for using some features of libsodium!


I don't think there will be an explicit "ban on e2ee". That I would almost fear less than what I think is on the table now: a targeted way to add cert backdoors just to the top 10-or-so products and browsers. Generally worded of course, but only enforced on the big players. So that 99.9% of users and messages are compromised but no one really has to worry about the rest. It's a good enough improvement and no one cares whether the people they actually talk about catching are using secure services like they have been all along.



I am absolutely not in favour of banning E2EE, but most people aren't technically adept, and even those who are get lazy, and opsec is hard, so in fact such laws probably would catch quite a few people.

In tech we like to think that if a solution doesn't cover 100% of cases then we shouldn't implement it, but in law, it's all about relative impacts -- in this case, the increase in conviction rate vs the loss of liberties.


It would be trivial to build a bot that sends a stream of normal-looking messages with secrets steganographed into them.

Or, if you need to send more data, send a picture, which offers the ability to hide even more encrypted content in it.

I wish good luck to any government agency that tries to analyse the bits of selfies exchanged by people to figure out whether they are actually being used to coordinate nefarious activities.


We have progressive income tax, why not progressive anti-encryption law? E.g. the more money and power you have the less you get to use encryption.


Yes, this was my thought too: since this is pushed by politicians, they should roll out the end of encryption to their phones first, and if in 10 years we uncover no new sex scandals or other big scandals about them, it means that having no encryption is perfectly fine.

If they want encryption but with a backdoor, just send the master key to a bunch of random citizens for monitoring.

In my case, I'm French, so they won't last a week like this. Don't give felon politicians a place to hide.


"That said, if your heuristic is 'words used in strange ways' then you will get false positives from all teenagers."

This made my day.


Do not do this; write to lawmakers pointing out how child protection charities are fantasists.


Note: one of the heavy warning voices in the CIA before 9/11 has pointed out, in all the books he wrote after leaving the CIA, that media bans and other bans only made it harder to know what is going on in the terrorist world and to track terrorists down for prosecution.

Banning encryption is rather like trying to ban synthetic drugs: the law has to specify something specific, and once that happens, the demand encourages people to find ways around the ban.


Absolutely, this is what I said when chat control was first proposed; it will only serve to make criminals out of ordinary people.

And even worse, I foresee not only a black market of app distribution without backdoors, but also a lot of young people will prefer these apps because young people are going to be young people online. And if they know what it means to communicate safely I'm sure they will prefer it over anything else.


People will just start building from source to ensure their apps aren't backdoored. This will happen all the way down to the lowest level (e.g., mass migration to GrapheneOS).

Maybe not enough due to proprietary baseband blobs, or maybe govs will try to outlaw unlocked bootloaders, but that is likely where things would go.


The author appears to think these bills are actually targeted to "bad guys".


Just allow some countries to ban end-to-end encryption, and then watch them lose financial services.


And web browsers, web sites, a fair proportion of most operating systems, software updates, remote work VPNs, etc.

End to end encryption is baked into the overwhelming majority of everything that travels over the internet these days. If it were to be banned, there wouldn't really be anything left.


That will be the case eventually, and not everyone can write code and make their own e2ee, so I think there has to be some sort of open source solution that allows people to implement some sort of encryption without any coding.


Yes. I made a toy version of this: an extension that adds e2ee to chat websites by hooking into the send button and the "add message to local list" function.


Return of the Clipper chip Part V.


Encryption is only used by criminals. I have nothing to hide so I don’t need it


You have nothing to hide? Tell me your password to your email and bank account then.


I get encryption for these types of things but if the police need to see things to protect citizens they should be able to.


> "Encryption is only used by criminals."

Utterly and completely wrong on multiple levels. Almost everyone (in "developed" nations) uses encryption (directly or indirectly). Banking today would be largely impossible without it (just to name one really simple example among many).

> "I have nothing to hide so I don’t need it"

“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.” ― Edward Snowden

The ACLU has a ton of articles about why this sort of "I have nothing to hide" thinking isn't really well thought out. Here's just one: [1] The web in general has many more such articles. Spend a few minutes with your favorite search engine.

[1]: https://www.aclu.org/news/national-security/you-may-have-not...


The banking and whatever is fine, but encryption should be backdoored so the police can see the data if they need to. If you have nothing to hide, this is okay.


Encryption should be backdoored so every hacker-wannabe and their dog can sell private information on the dark web and hold data for ransom even more than they already do, you say? I decline, thank you very much.

As to "If you have nothing to hide", everyone has something that's nobody's damn business to know, and weakening encryption is not a solution. It's just a can of worms nobody really wants opened, even though many people ignorant of the consequences think they do.


Mass surveillance by a benevolent AI is the best outcome for humanity. That or correcting the evil out of man by genetic engineering.


> "Mass surveillance by a benevolent AI is the best outcome for humanity."

Except that "benevolent AI" doesn't exist, and likely won't exist before the evil humans abuse existing AI to destroy humanity.


The Day the Internet Stood Still


That's a poor apology for implementing a social rating score. Which is inevitable since you consider you have strictly NOTHING to hide.


How about the Ansible Vaults in my Git repositories?


Master bait



