This article is completely wrong, and almost sabotages the fight against the UK Online Safety Bill, given it claims a victory that simply doesn't exist, and so lures everyone into a false sense of complacency. The UK govt must be ecstatic that they have changed nothing and yet the tech industry seems to believe that they've won something.
All the govt said was "we'll only force scanning when it's 'technically feasible' to do so" - i.e. when someone believes the CSAM scanning quality is high enough beyond some given threshold. It's still scanning though, and still fundamentally undermines encryption; it's just potentially delaying the implementation a bit... having enabled it in law. The thing we should be fighting is enabling it in law.
dang: it's almost worth flagging this thread as being based on entirely incorrect data. For whatever reason, the desire to make progress on this issue means that folks have jumped the gun and are prematurely celebrating a win which is not a win, and thus undermining the whole campaign to protect encryption.
Basically all of the secure messaging platforms have indicated that they will pull out of the UK rather than weaken the security model, if or when they are instructed to. That hasn’t changed either. What this may be politically is a way of the Government saying they won’t actually do it, while pretending that nothing has changed to make it look like it’s not a U-turn and pretending they’re not backing down. Hence their eagerness to pretend nothing is different.
As long as the messaging platforms don’t change their commitment then celebrating a small acknowledgement from the Government that the bill is basically unworkable is not a huge issue.
> What this may be politically is a way of the Government saying they won’t actually do it
But this is the problem: the Government is NOT saying they won't actually do it. They're saying "we'll do it (when we consider it technically feasible)". The ability to force 3rd party scanning software is STILL going into law, and that's the catastrophic bit!
It's terrifying that this story has been perceived as a victory, and therefore we can take the pressure off. The OSB is STILL on track to go into law giving the government the right to tell Signal or WhatsApp or Element etc that they either need to do compulsory scanning or they are breaking the law.
I am literally receiving emails from Element customers and suppliers which begin:
> Saw that the government backed down from their plans last night and immediately thought of Element – you must be pleased!
...which just leaves me speechless. I'm almost wondering if there's a deliberate misinformation campaign here to prematurely claim victory in order to derail the attempt to protect encryption in the bill.
> "We haven't changed the bill at all," she told Times Radio.
> "If there was a situation where the mitigations that the social media providers are taking are not enough, and if after further work with the regulator they still can't demonstrate that they can meet the requirements within the bill, then the conversation about technology around encryption takes place," she said.
With the way the law is going the UK could demand that Tech providers provide backdoors into end-to-end encryption.
The providers can refuse.
The UK can then demand that such apps are not available in the UK.
HOWEVER ... the providers can build WASM equivalents that run in the phones browser.
These can be available elsewhere in the world, and there is no way to stop UK residents from installing them.
If there is no other way to have end-to-end encrypted messaging, some provider WILL offer this ... and they'll make it pretty slick.
You can try to prosecute each user (not much chance of success).
Legislation that fights well implemented secrecy will always eventually lose, as the government becomes just one more hostile actor, which the tech is already set up to protect against.
If the government pushes too hard, all that happens is that encrypted messaging moves out of app stores into the open internet ... and then, not only can they not see the content, they can barely see who is using it.
> Legislation that fights well implemented secrecy will always eventually lose, as the government becomes just one more hostile actor, which the tech is already set up to protect against. If the government pushes too hard, all that happens is that encrypted messaging moves out of app stores into the open internet ... and then, not only can they not see the content, they can barely see who is using it.
This is insightful. The mistake the UK government is making lies in its naked aggression. It has wedged itself between Big Tech and the people, hostile to both, and it's a no-win situation. It literally wants to get in the middle, and that seems a sign of great fear of losing power in the digital age.

Fundamentally our government lacks the humility to engage in meaningful, evidence-based debate with all parties, which would be extremely difficult but necessary.
Of course encrypted messaging is already out on the open internet. It's just used for social messaging by a relative minority. The reason governments love (read: awkwardly tolerate) big platforms is that they concentrate use, where they hope it can be "kept an eye on".

For me this is "Police and thieves in the street, fighting the nation with their guns and ammunition". It's a three-way fight in which the most important group - the people - are unarmed, indeed entirely excluded.

What we tried to do on https://cybershow.uk is to present some accessible banter that helps the main stakeholders - children and vulnerable people - get a better fix on the issues, and have a voice.
It's a little unclear, but my reading of this is that the power to do it will still be in the law, requiring at most secondary legislation to put into effect (perhaps not even that) if they think they ever have enough leverage over messaging providers, or are willing to spend the political capital. Not a great place to be in really, but better than it actually being deployed.
I’d bet my life we start to see a massive influx of bad press aimed at messaging providers, focusing on how criminals are using their services, over the next few years.
When the general sentiment of the average Dave is ‘encryption === bad’ this BS will rear its head again.
Seems to have been the standard play for governments of this country for decades now.
The place has a CEO, has written directly to Tim Cook, has been featured in Wired, puts up massive scaremongering examples of bad people, and makes vague claims of being a large organisation ("collective effort of concerned child safety experts and advocates"; a prior version of their page claimed "researchers, experts, and advocates").

All this, and they're solely focused on a small corner case of one business, and somehow their Google presence just isn't there.
This whole thing with the heat initiative is straight up bizarre. Their entire dialog is just so over the top.
I'll briefly summarize the back and forth heat had with apple, condensing it makes the weirdness really shine through:
Heat Initiative: "Listen up! We DEMAND that the private data of iCloud users be made accessible for CSAM scans. You are to remove it all. The guilty must be punished."
Apple: "woah, slow down. We already decided against policing iCloud like that, it'd get way too Orwellian way too fast. We can use on-device Communication Safety systems to make our environment safer while respecting privacy. With features such as detection of potential nudity, we can start cutting off CSAM images and video at the source and work to prevent abuse in the first place."
Heat Initiative: "apple, you make stupid money off anything you touch, you employ geniuses, you have responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and video. Also, we really, really demand scanning iCloud."
Apple: "did you just ignore what I just said, or is there some deeper issue like not knowing words good? This obsession you have with scanning iCloud makes me suspect you have ulterior motives."
Heat Initiative: "It's not that I ignored you, I just don't really care about this shit and can't keep up this farce. Either you let the government access iCloud or we'll say you're running 'instagram for kiddy rapists.' we will destroy you."
Apple: "yeah, this conversation is over."
It's either terrorism or child pornography. In my country (Romania) they used to call protesters football hooligans whenever a protest happened on a bigger scale, just so they could put us all in the same boat and have an excuse to use armed forces to disperse everyone, on the basis that the event was not peaceful.
Most of the people I talk to are brainwashed anyway and will happily accept it - just give it time. The discussions over privacy always end up with something like "I have nothing to hide anyway".
Some anonymous entity has been putting up posters in my neighborhood decrying Apple for "allowing" CSAM to be stored on iCloud. Very creepy propaganda campaign.
Those posters are almost certainly plastered up by someone who genuinely believes in the cause. They have been manipulated by corrupt media, but it is a tactical error to assume that a political party is directly paying for such poster campaigns.
It's also guerilla marketing 101, and they wouldn't be the first company, interest group or marketing agency that used those tactics to promote their products, interests, etc.
This isn't the same ad as described in the article, but is a government funded Saatchi one from around the same time:
https://www.noplacetohide.org.uk/
> The campaign is funded by the UK Government and has been developed by a steering group of child safety organisations with support from M&C Saatchi.
I agree. And they're intelligent (pun intended), so they'll select demographics from those of us who opposed them and illustrate extremist scapegoats most outrageous to us.
I think this is too often not understood or forgotten.
Their problem is that they lost a lot of influencing power. Only old people are still watching the TV or trusting the old media. And although they do vote more, their days are numbered.
News cycle perpetuates bitcoin == bad so everyone you know just repeats "scam" and points to criminals.
Meanwhile the largest institutions and richest people are investing heavily due to its revolutionary nature. Just look at all the ETFs coming out (Blackrock, Fidelity, etc).
Once they are fully set up, you'll see the news cycle change sentiment. Rinse, repeat with any technology.
Not the patent, but... incredibly environmentally destructive by design. Utterly privacy-violating, since you're publishing every transaction you make for the world to see on a public ledger, so now your neighbour (and government) get to know about your badger fetish. Non-reversible transactions are an invitation to fraud and theft, and make routine errors like overpayments and payments to the wrong address potentially disastrous. Not to mention that the "problems" it claims to fix - like instant, fee-free transactions and easy international transfers - have been fixed for many years in countries with modern banking systems (i.e. not the USA). Etc. etc. etc.
Yeah that was my reading as well. The legislation isn't being changed. The statement even says "We know you can develop [the methods to access], and we still have the authority to order it."
The only relevant part from the OP is the govt acknowledging that 2 + 2 = 4. But it fails to acknowledge that if they want to get 5, they can still order the equation to be 3 + 2.
Through most of history government always has the power, but the question is whether it has the legitimacy. In this case it has the legitimacy, but lacks the power. This is an unusual turn.

We need online safety for kids. The aims of this bill should obtain widespread support from everyone. But instead of carefully researching and implementing difficult ideas, framing it properly and obtaining permission from the people - a remit to empower us to embrace online safety on our own terms - it's taken a strictly 20th Century "Mother knows best, think of the children" approach and made this a battle with Big Tech.

It is laughably "Yes, Prime Minister" in its clumsiness. We have anachronistic throwbacks in charge.
People might be more receptive if the UK government had shown any real intention of going after pedos before this. But the number of scandals and coverups indicates they don't. And this is little more than an excuse to make it easier to spy on their subjects.
It also has a recent history of racial discrimination targeting minorities as a result of false accusations of child abuse, so it's worth making sure there isn't an overcorrection, particularly in a society where people sometimes still lump random individuals together based on perceived ethnic origin. See https://www.theguardian.com/uk-news/2023/jan/04/how-eleanor-... for one example.
You've sleight-of-handed out "government" for "the UK" and linked two stories which don't really involve the UK government.
Furthermore if you read your own links, you will see that the "recent history" of the Rotherham offending is that there is an enormous police investigation costing tens of millions of pounds and a large number of people have been convicted, and the "recent history" of the fallout from Jimmy Saville is that another extremely well-funded enquiry was conducted (IICSA).
The general idea that the UK is particularly accomodating to paedophiles, or that an unusual number of powerful people in the UK are paedophiles, is not supported by evidence.
This meme mainly comes from a serial liar, Carl Beech, whose lies were credulously reported by people who should have known better.
Local police are part of the government as far as the laws in question are concerned; it's them who will be using or abusing the surveillance powers granted by the law. And the Rotherham scandal took two decades to be addressed, due not least to fierce political interference.
Just yesterday it became clear the foreign office will not release files relating to Prince Andrew until 2065 - long after he's dead. Seems like a pretty obvious cover up.
It seems like "a pretty obvious coverup" to you because you are conspiratorially minded. It's actually just the FCDO explaining what the law has always been.
Should the law be changed? Yes. Is this evidence of a "cover-up"? No.
This question is really too broad to answer. What sort of thing are you looking for? For a general overview of the whole system, this strategy is the best place to start.
Thanks. I guess I meant either new primary legislation that improves something tangible, but more importantly, provided more resource.
As far as I can see, it's still funded from local authority budgets.
I worked briefly in children's homes a long time ago. Kids were placed in assessment centers where they were supposed to be evaluated and sent on to an appropriate home. But there were no places for them, and they stayed in this temporary place for years.
For example, the offence of "Sexual communication with a child" aka "grooming" was created in 2017 (by Serious Crime Act). Enforcing this law is one of the main causes of the arguments over the OSB.
We need road safety. But no genius in government has created a law that says it's illegal to sell cars that CAN have accidents. Generally the government goes after bad drivers instead of car companies.
I really don’t understand why Big Tech should magically stop every crime on earth: child abuse, racism, harassment, etc…
I think the supreme Zuck is a dick, but pulling out of the UK market was the right move.
Of course, but it's not terribly easy for the average person to put sophisticated filters into multiple content pipelines on every child's device (imagine having 4 or 5 kids of different ages and needs).

So a solution I think we brainstormed on the show was mandating open interoperable APIs that allow easy insertion of (presumably commercial or open source) plugins into the system, within the user's end-to-end digital estate, under the control of the user (parent) and completely rejecting the MITM and endpoint compromise via back-doors that the government naively proposed.

In many ways that would take a much bigger stick to Big Tech. It also transitions the definition of "online harms" to those defined by the guardian/parent rather than problematically allowing the State to define harms and control the selectors.

What that says to me is that the government are dishonest about the real aims of the bill. And further, as a consequence, it crushes my belief that the government even truly care about child safety except as a vehicle to greater tyranny.
> So a solution I think we brainstormed on the show was mandating open interoperable APIs that allow easy insertion of (presumably commercial or open source) plugins into the system, within the user's end-to-end digital estate, under the control of the user (parent) and completely rejecting the MITM and endpoint compromise via back-doors that the government naively proposed.
What you're proposing would likely enable the creation of some fairly invasive stalkerware. Don't forget that 1) just because a feature says it's for use by a parent on their child's device doesn't mean that it can only be used in that context, nor that 2) not all parents have their children's best interests in mind.
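For concreteness, here is a minimal sketch of what the quoted "open filter API" idea might look like. Everything here is invented for illustration - there is no such standard, and the class and function names are hypothetical:

```python
# Hypothetical sketch of a guardian-controlled filter API (all names
# invented): the platform exposes a local hook point on the child's
# device, and the parent - not the State, not a remote party - chooses
# which filters plug into it. E2E encryption itself is untouched;
# filters run post-decryption, on-device.
from typing import Callable, List, Optional

Filter = Callable[[str], bool]  # returns True if the content should be blocked

class MessagePipeline:
    """Toy on-device content pipeline with pluggable filters."""

    def __init__(self) -> None:
        self._filters: List[Filter] = []

    def install(self, f: Filter) -> None:
        # Installation would be gated by the device owner/guardian.
        self._filters.append(f)

    def deliver(self, message: str) -> Optional[str]:
        if any(f(message) for f in self._filters):
            return None  # blocked by a locally chosen policy
        return message

# A guardian installs an off-the-shelf (or open source) filter:
pipeline = MessagePipeline()
pipeline.install(lambda msg: "example-banned-word" in msg.lower())

print(pipeline.deliver("hello"))                     # 'hello'
print(pipeline.deliver("Example-banned-word here"))  # None
```

The stalkerware concern above still applies, and hinges on what a filter is allowed to report back: in this sketch the filter returns only a local block/allow verdict and never phones home, which is the property any real standard would have to enforce.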
Or too busy? In plenty of families both parents have to work hard to make ends meet.
Not helped by the fact that children are growing up in a completely different environment to the one their parents remember. Familiarizing myself with TikTok or whatever the kids are into these days would fill me with dread. And the way platforms work means my experience of them would differ dramatically from a child's anyway.
>> Familiarizing myself with TikTok or whatever the kids are into these days would fill me with dread.
You don't need to. Don't give them devices until they're 16, and then implement the built-in parental controls that come on every smartphone. When they are 18 and can buy their own, they can do what they want. I'm guessing your actual issue isn't familiarizing yourself with TikTok etc., but is instead facing conflict. I'm not saying it would be easy, but pretending the above isn't a workable solution is self-deception.
No, I'm just trying to empathise with busy parents rather than write off parents who struggle with this stuff as lazy.
I know some parents give their children devices for safety reasons. For others, I guess they don't want their children to be socially isolated if all the other 14 year olds are chatting on messenger.
My only point was that it can be hard, and that not everyone who fails at it is lazy. It sounds like we are mostly in agreement.
Oh yeah, I get that. My point was more that there is a simple solution that requires almost no time investment (i.e. a good option for busy parents). It might not be for everyone, but it's a solution that is too often dismissed too easily.
Wish I could upvote this harder lol. Parents really don't have enough time/energy to do this better than people who are trained in software/legislation. Especially if they have more than 2 kids/are single parents.
Sounds like we need a cultural shift away from governments involving themselves in parenting to governments ensuring parents have adequate time to parent.
I'd say I trust the competence of government in this arena more than that of parents, on average. If someone is working on policy day in and day out for tech-related things, they will have a better lay of the land than the average parent. However, you bring up a good point about ethics. A parent will definitely be acting far more in the best interest of their child compared to the government. An interesting quandary - good point.
The other test is what can the “other” party do with any power you give the government. I define “other” as the party with viewpoints you are opposed to.
Modern devices are chock full of features that will work hard for parents while they aren't even thinking about it. If they can't be bothered to turn those features on, they shouldn't be providing the devices in the first place.
This is just as lazy a take as the UK government though.
Many societal problems would be trivial if you could get perfect compliance from the population. You can't, so if you're interested in solving problems you need to be willing to grapple with the world as it is. So far as I can tell online safety has not meaningfully improved since the late 90s - "enforce safety at home" has been the advice for all of that time and it has never worked.
I don't have a solution but "blame the parents" seems to be a very clear non-answer without some plan for how to make creating safety at home more easily actionable.
I started using the Internet in 1998, and the only advice my parents gave me was "Don't reveal anything about yourself online." (Thankfully I heeded their warning and still to this day I'm very guarded about disclosing any form of PII.)
Whilst admittedly a lot has changed in the last 25 years, I'd say only half the parents will actually try to keep their kids safe online.
The other half will sit around watching crappy reality TV shows, getting angry at their five-year-old children finding porn on their own personal smartphones* because they don't want to look up how to prevent it, and instead abdicate that responsibility to the gov't...
...who in turn use that as an excuse to censor the Internet.
*not entirely sure why a five-year-old needs a smartphone, but anyway.
They said it's technically unfeasible right now. A backdoor key is really not feasible for E2E encryption. So that would mean it would only become technically feasible when they ask companies to send over all encrypted packets and break the encryption themselves.
Maybe that's why they want to keep a provision for it in the law, but develop the technology to break (current) public-key encryption schemes themselves?
But then they'll always be chasing, as the world moves to post-quantum encryption and they won't be able to break it anymore. So it'll always remain technically unfeasible.
It's likely that, from a political standpoint, it was easier to deem the bill technically unfeasible now rather than kill it completely.
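The "backdoor key is not feasible" point can be made concrete with a toy Diffie-Hellman sketch (Python, standard library only - this is demo-sized and absolutely NOT real crypto; real apps use X25519 and the Signal double ratchet). The structural point survives the simplification: in an E2E design there is no third "master" key sitting anywhere for a provider to hand over.

```python
# Toy sketch (NOT real crypto): the shared secret is derived
# independently on each endpoint and never transits the relay.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime - demo-sized only, far too small for real use
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, peer_pub):
    s = pow(peer_pub, priv, P)  # Diffie-Hellman: same value on both ends
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def xor_stream(key, data):  # toy stream cipher, for illustration only
    return bytes(
        b ^ hashlib.sha256(key + i.to_bytes(8, "big")).digest()[0]
        for i, b in enumerate(data)
    )

# Each phone generates its keys on-device; only the public halves transit.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob  # both ends independently derive the same key

ciphertext = xor_stream(k_alice, b"meet at noon")
# The relay only ever saw a_pub, b_pub and the ciphertext - none of
# which yields the plaintext without breaking the underlying maths.
print(xor_stream(k_bob, ciphertext))  # b'meet at noon'
```

Which is why "technically feasible" scanning ends up meaning client-side scanning before encryption, not decryption in transit.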
Setting my personal opinion on this law proposal aside, I think that UK legislation lost its teeth with Brexit. It's just loud barking for the sole purpose of getting CEOs to the table.
Imagine their influence if they had stayed in the EU, and if France had joined them (which it usually does when it comes to more governmental oversight of the executive branches of the government).
What scares me a little now is that there was a loss of balance, which is important for any democracy to make progress. And if Big Tech's reaction is always "well then we just pull out of your market(s)" then it's gonna be an empty threat after the third time.
I don't know what the reactions to these events will be like, but most likely we'll see an increase in propaganda press statements on how bad secure messaging is, trying to push the narrative in a different direction.
Nah, this nonsense popped up at EU level also and got slapped down by the Parliament and Council suggesting that enough politicos understand why this is important, for now at least.
I don't really like the headline, it makes it sound negative.
Big Tech tends to have negative connotations, nowadays. So, here the FT is trying to say that a democratically elected government is living in fear of private firms.
While it may be true that our government are now living in fear of not just Big Tech but all types of Big whatever, the fight was way beyond just big tech. Sure Big Tech helped but it still is a badly written and badly thought out think-about-the-children type law that was being fought by everyone not just big tech.
I didn't bother to read the article. Headlines are important. There are other things to rage about with Big Tech; this is not the one.
You're making it sound as if the headline was poorly written, perhaps by accident or by a poor writer.
I can assure you that isn't the case. Whoever wrote that headline is a copywriting genius. The headline conveys almost the exact opposite of what really happened, without being factually wrong.
lol, the UK gov has more negative connotations than Big Tech.
Also not quite democratic when the UK electorate last voted for a government in 2019 but we have had 3 prime ministers since, all with vastly different strategies, where the last 2 were chosen by anyone willing to pay for a membership of the Tory party - including fake identities created by journalists who registered from France.
If you had some context, you'd know Big Tech are actually fighting to keep encryption alive and are the goodies in this story.
Context is important, so is reading. But thanks for your insight in the article you didn't read.
I think you misunderstood OP. Their point is the article headline sounds like democracy losing out to evil Big Tech; they, like you, don't see it that way.
In the UK parties are free to choose their leader by any means they like. They have a lot less power than a President - which is part of the reason the previous two were deposed.
Do you read it regularly? It is centre-aligned, but as a reader I wouldn't say it was Conservative-aligned. It certainly is a mile off the alignment of the "Tory press" that typically sets the news agenda.
I'm a subscriber and I'd say it is small-c conservative (which I'm not really, I just appreciate that it's relatively open about its biases and enjoy the quality of writing), but it isn't aligned with the Conservative Party, which has become almost entirely unmoored from reality.
Usually I would tend to agree that the articles are well written and well researched, however after reading this steaming pile of surface level dross on a subject I have technical knowledge of, I might be less trusting of their editorial slant and quality of journalism in future.
The Conservative Party, and parties on the far right across the world, have left conservatism behind - something the left and centre parties are having to take up.
I think it's increasingly true of several countries, not just the UK. Any State with strong Freedom of Information legislation unsurprisingly creates incentives for certain political operatives to avoid exposure by using unofficial channels further out of reach of FoI - private WhatsApp groups etc. I don't see this as any different from the instances of private email mischief that have occurred in a lot of States over the last decade (avoiding use of official email accounts for contentious discussions).
I think that if you hold a position in the government, all of your communication (including private email, chats, etc) should be subject to public scrutiny. A government official should be defined not as someone who is holding power, but as an elected representative of the people.
Of course not, I'm just a random person on the Internet ;)
The problem here is that a basic loophole is being abused to defeat the spirit of the law, thus making the operation of the government less transparent. The consequences should ramp up in proportion with the level of the abuse. You're using private WhatsApp to conduct state business? Your WhatsApp chat logs should be subject to FoI. If you want to keep family chat private, then move it to Telegram, and DON'T use Telegram to conduct state business, because otherwise your Telegram chat logs are next up to be scrutinized. I don't think the "spirit" of this idea is going too far?
> military secrets
There already are established procedures and laws for dealing with state secrets, including strategic information such as submarine designs, locations, nuke launch codes, etc. Whatever FoI/transparency laws are in place, they need to respect the need for protecting state secrets - that should be pretty obvious.
All that would happen much of the time is that people stop writing anything of consequence down, or seek secret channels like the WhatsApp example. NHS employees in the UK as another example admit privately all the time to not recording things in meetings to avoid FoI exposure.
I agree with you in principle, but you are fighting human nature in practice - generally no one wants to look bad in public. I think FoI legislation is important, but in practice the results have often been mixed, and the approach probably needs to be nuanced.
This isn't even limited to the UK. In the US, we have presidents, vice presidents, and presidential candidates all under investigation for mishandling records by running their own e-mail servers. The most charitable explanation being that they don't want to have to deal with separate personal and work devices; the less charitable explanation being "fuck records laws".
I was a federal contractor for NASA and let me tell you about government email: quota of 250MB of storage (2010-2012).
I spent every morning moving emails with attachments to local folders on my Mac (which had no backups) to keep from going over quota. If I went on a 2-week vacation, emails to me would start to bounce.
After that fiasco I understood why Colin Powell told Hillary Clinton to just run her own email server (for non-classified communication).
It's not completely untrue; there was a whole hoo-hah over getting Boris Johnson's WhatsApp messages. They use it to get around the requirement that official communications be logged and available for later scrutiny, much like a bank has to retain communications in case of an audit.
You also need to understand the context that in the UK, and most of the west other than the US, WhatsApp is just the default communication channel for messaging. If you text someone, it's just assumed it's via WhatsApp.
No. I have lived in the UK for 35 years and I didn't even know Google Messages existed until you mentioned it. WhatsApp is the default and I would only use SMS in the rare case that someone did not use WhatsApp.
There is no automatic integration on Android phones between SMS and another network the way iMessage is automatic.
> I have lived in the UK for 35 years and I didn't even know Google Messages existed until you mentioned it.
Exactly, even though it is the default messaging app on Android and receives their SMS texts for them, most users don't realise they are using it or that when they "send a text" to another Android user it is actually sending the message over data.
Anyway, my understanding is that a text is an SMS text, and a WhatsApp is a WhatsApp message. I know that Google Messages hides itself by sending SMS texts when the number isn't known to Messages, and I guess WhatsApp does this too. I also live in the UK and occupy a space within multiple different communities which insist on different messaging apps. Maybe I just don't have the option to be so vague.
I think what the parent meant was that if someone wants you to "text" them, they actually mean to send a message on WhatsApp and nothing to do with whatever texting on a given platform is.
Also, how are you data mining end-to-end encrypted messages? How can you get data out of messages you can't even see the contents of?
Well I assume that the end-to-end encryption is honoured, and the message is not parsed in any way at either end. I also assume that the end-to-end reference is between your phone and the phone of the intended recipient, and not between your phone and the server routing the message. I don't have access to the source so I don't have any way to prove that, but with the objections by Facebook/Whatsapp/Meta to breaking the encryption this seems to be the case.
No, the data-mining with these apps is in your list of contacts. Whatsapp takes these and builds you into a network. Its not what you say, it is who you are talking to and when. That is valuable.
When you sign up, you upload your contacts to WhatsApp. If you try to prevent it, it will insist and not work until you do. You can try to clear down your contacts and sign up without any. However, it will still take your number and look it up in the contacts of all those who have your number. It now has your name and any details your contacts have chosen to keep about you.
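What that contact-graph mining amounts to can be sketched in a few lines (the numbers, names, and the `build_graph` helper are all hypothetical; this is a toy model, not WhatsApp's actual pipeline):

```python
# Toy sketch of contact-graph construction from uploaded address books.
# Shows how a number that never signed up still ends up in the network,
# complete with the names other people saved it under.

def build_graph(uploads):
    """uploads: {uploader_number: {contact_number: saved_name}}"""
    graph = {}   # number -> set of numbers whose address books contain it
    names = {}   # number -> set of names others saved it under
    for uploader, contacts in uploads.items():
        for number, saved_name in contacts.items():
            graph.setdefault(number, set()).add(uploader)
            names.setdefault(number, set()).add(saved_name)
    return graph, names

uploads = {
    "+447700900001": {"+447700900999": "Alice Smith"},
    "+447700900002": {"+447700900999": "Alice (work)"},
}

graph, names = build_graph(uploads)
# "+447700900999" never registered, yet the service now knows two users
# who hold the number, and the names they keep for its owner.
assert len(graph["+447700900999"]) == 2
assert "Alice Smith" in names["+447700900999"]
```

None of this requires reading a single message body, which is the point: the graph is metadata, and it survives end-to-end encryption intact.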
Also worth a note that Boris Johnson (former prime minister for those far from the UK) himself said he was happy to share all those WhatsApp messages on his phone with the investigation team, and it was a government department that stepped in to refuse access to those messages, repeatedly. They considered the WhatsApp messages too sensitive for an official investigation to read.
I think the replies and downvoter may have missed my intended point, which is that WhatsApp seems to be simultaneously treated as not an official channel and yet also treated as containing officially sensitive messages.
Whether Boris was telling the truth or not was irrelevant to my intended point.
The article you've linked says it didn't go far (buggy, vulnerable, difficult to use, test deployment in 2020). Also, Google Play store's reviews say the same.
It's only natural that when a public servant wants to make sure their message actually reaches the recipient, they would rather use a solution that works 100% of the time. That's not malice, it's common sense.
This question confuses me. Are we really in a year where people don't know you can securely encrypt email within an organization? This is a solved problem, and it's been solved for decades.
It's confusing because you nailed "Plain-text emails" but completely ignored signed, encrypted emails for some reason.
I think the question is not whether it's possible to set up in theory.
The question is whether it's possible to actually deploy and maintain the solution in reality for millions of concurrent users across hundreds of thousands of disjoint organisations in thousands of municipalities all over the country. It would require a constant supply (due to organisational churn) of highly qualified tech people willing to work for years on a low five-figure salary in a rigid, ineffective governmental structure, surrounded only by non-tech-savvy people. These tech people should also be saints, so that they don't embezzle half the budget while trying to set things up. Also, the top visionary pushing for this multi-year project should not be subject to election cycles.
I honestly think that this level of concentration of long-term effort might be impossible in a country with more than 10 million people or so.
How is it different from WhatsApp (apart from being worse feature-wise and not many people using it)? As far as I remember, WhatsApp uses Signal's encryption, so technically they're the same level of protection?
Not technically feasible is akin to abandonment in government circles.
To revive this, they would have to find an expert to attest that it is technically feasible to have security with a backdoor that government can access, but at the same time is impossible for malicious entities to access.
Ergo, this is technically dead, which is the best form of dead.
> To revive this, they would have to find an expert to attest that it is technically feasible to have security with a backdoor that government can access, but at the same time is impossible for malicious entities to access.
> Ergo, this is technically dead, which is the best form of dead.
Except it's not. There exist such cryptographic trapdoor constructions that are perfectly secure, if the government backdoor key is kept safe.
The problem is keeping the government backdoor key safe. But that's not a literal impossible technical problem. It's much more a social problem.
Don't get me wrong, I really, really wish what you said was true and we could kill this garbage forever by nature of technical argument. But it isn't, so we must keep fighting against it for the real reason: we simply don't want this.
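For what it's worth, the construction being described is mechanically simple. A sketch with textbook RSA and toy primes (insecure parameters, purely illustrative): the session key is wrapped once for the recipient and once for the escrow holder, so the whole scheme stands or falls on keeping the escrow private key safe.

```python
# Toy key escrow with textbook RSA (tiny hardcoded primes - illustration
# only, never use parameters like these). The symmetric session key is
# wrapped twice: once for the recipient, once for the escrow authority.

def rsa_keypair(p, q, e=17):
    n, phi = p * q, (p - 1) * (q - 1)
    return (n, e), (n, pow(e, -1, phi))  # (public, private)

recip_pub, recip_priv = rsa_keypair(61, 53)
escrow_pub, escrow_priv = rsa_keypair(89, 97)

session_key = 42  # stands in for a symmetric message key

wrapped_for_recipient = pow(session_key, recip_pub[1], recip_pub[0])
wrapped_for_escrow = pow(session_key, escrow_pub[1], escrow_pub[0])

# Either private key alone recovers the session key:
assert pow(wrapped_for_recipient, recip_priv[1], recip_priv[0]) == session_key
assert pow(wrapped_for_escrow, escrow_priv[1], escrow_priv[0]) == session_key
```

The math holds; as the comment says, the hard part is entirely the social one of `escrow_priv` never leaking.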
well, by definition if the key is to be used, and to be used more than once, it cannot be kept safe. The key has to go through multiple hands on its way from the senior government official responsible for its safekeeping to the peon assigned to unlock a specific phone at a specific point in time. It could be copied at any one of those points. No amount of technology or cryptography can solve the master key problem. The social problem is the technical problem, they aren't distinct.
So you just make the three companies keep the keys then. People are out here like "a secure backdoor to encryption is impossible" and then don't even blink for the keys for root CAs which is the basis for the world's online security. Or the AWS managed S3 encryption keys.
There's a lot of hopium in this thread from people who, I think, want it to be more impossible in practice than it really is.
It was never even suggested that the government would have encryption keys. The government do not have access to SSL traffic, but companies are responsible for CSAM uploaded over SSL.
If a software signing key is compromised it can be revoked and a few weeks later the risk is only to people who don't keep their OS up to date. Further, exploited compromises are detectable, especially if exploited at scale.
If the backdoor crypto key is compromised, sure, they can revoke it (assuming they manage to design a competent system), but all the sensitive information up to that point is now available to whoever possesses the backdoor key. Unlike the software signing case, exploitation of the compromise is likely undetectable unless the attacker reveals their knowledge somehow.
The same is true of SSL traffic to a bank though, isn't it? If a crime group is intercepting encrypted traffic and saving it, and the keys are later stolen, they can decrypt that data.
But opponents of the OSB claim it will make communication with your bank less secure - how?
> Microsoft, Google, Apple etc are keeping the keys that allow you to push updates secret, aren't they?
From yesterday:
> the China-Based threat actor, Storm-0558, used an acquired Microsoft account (MSA) consumer key to forge tokens to access OWA and Outlook.com. Upon identifying that the threat actor had acquired the consumer key, Microsoft performed a comprehensive technical investigation into the acquisition of the Microsoft account consumer signing key, including how it was used to access enterprise email.
> There exist such cryptographic trapdoor constructions that are perfectly secure, if the government backdoor key is kept safe.
A big problem with this statement is the term “the government”. If you give the private key to the UK - the US, India and China will want a copy as well.
The UK might want to spy on foreign nationals only in the country and only for CSAM, but that doesn’t mean other nations won’t use it for more traditional espionage.
The ostensible reasoning is "think of the children" horseshit, but history proves such a powerful capability will be abused for unrestrained spying.
Key escrow for the entire US and world was floated with the Clipper chip (1993-1996). That was strangled in its crib because trusting thousands of people at NSA or GCHQ to just not stalk people is sheer fantasy, just as the Snowden leaks revealed.
iMessage stores the e2ee key in iCloud by default, which effectively makes all of a user's communications decryptable by governments and Apple at any time.
If you offer a centralized service with actual privacy but without zero-knowledge p2p constructions, it falls victim to the Lavabit problem. If you want security and plausible anonymity across your own devices, not metadata, then use a fork of Signal such as Session. (Signal is irreparably broken by being tied to a phone number, which is a universal tracking device. The only people who use Signal are drug dealers and software engineers who don't know any better.)
Why is Signal "irreparably" broken? What makes the phone number issue "irreparable"? As I understand it usernames and phone number privacy are in the pipeline.
I'm a software engineer who does know; I'm aware that Signal is currently tied to phone numbers, and I'd love for it not to be, but I still use it, because it's E2EE and easy for non-technical people to use.
When there's something that's easy to use like Signal that uses decentralized cryptographic identifiers and onion routes all traffic, I'll start trying to get people to use that. I'd be happy to hear any recommendations.
If you have a mobile phone number, the domestic intelligence agency knows exactly where you are at all times, and any LEO (without a warrant) can also find you. In addition, there have been numerous CCC presentations showing how carriers globally (and, separately, in the US) are guilty of promiscuous metadata trafficking ($$) and insecure SS7 setups. As a consequence, for low $, you can go to any one of several shady websites and find the last location of almost any phone number (a globally unique personal ID). There are additional exploitable vulnerabilities, varying with the exact combination of {handset x carrier x country}, to impersonate a target, tap their line, reveal their exact location, and redirect their phone number through a third-party handset or even a PBX. These are more expensive, and some capabilities are reserved for a few selective intelligence uses.
Session (Signal fork) doesn't use phone numbers. It's pretty well-designed overall and uses an onion routing approach. It's already a superset of Signal except it doesn't use phone numbers. https://getsession.org
PS: Using regular TOR on home broadband or cloud servers is relatively risky and inefficient. Sybil attacks on it are common. And to network operators and security agencies it gives an easy "flow tag" of your uplink and exit node data traffic as automatically suspicious.
And don't be naive. The UK absolutely wants this so that it can surveil both its citizens and everyone else (including other governments) around the world.
Yeah, I find it somewhat amusing that sharing a private key with the government is called technically infeasible. I guess you could get philosophical about whether the key is still private in that case.
Anyway, I am gladly surprised they seem to be backing off.
Make no mistake this is going to come back around again. Fundamentally the state security apparatus will always want ways to peer into messages and break encryption despite the damages that will inevitably cause when their control mechanisms slip. It's been explained a dozen times to a dozen different administrations and they always come back asking for access.
So excuse what may be my profound ignorance, but aren't there 2 unencrypted points in every communication that they could intercept?
Very roughly, I assume every Whatsapp message follows something along the lines of:
1. Unencrypted input
2. Encryption
3. Encrypted transmission
4. Decryption
5. Unencrypted stream to display handler
Technically - what's to stop them from compelling Apple and Google into putting a software keyboard logger in between 1 & 2 and another output logger between 4 & 5?
Edit: I'm not saying this backdoor would be secure btw. Of course it wouldn't. But that seems to me a separate issue than "breaking encryption"
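The five steps above can be sketched with a toy cipher (stdlib only; this hash-based keystream is NOT real cryptography, it just marks where plaintext exists). A compelled logger at step 1 or step 5 captures everything without touching the encryption at all:

```python
import hashlib

# Toy E2EE pipeline sketch. The keystream is SHA-256(key || counter),
# which is illustrative only. The point: plaintext exists before
# encryption (step 1) and after decryption (step 5), which is exactly
# where a compelled endpoint logger would sit.

def keystream(key: bytes, length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

shared_key = b"toy-session-key"
captured = []  # hypothetical compelled logger between steps 1 and 2

plaintext = b"meet at noon"                 # 1. unencrypted input
captured.append(plaintext)                  #    <- logger sees it here
ciphertext = xor_cipher(plaintext, shared_key)  # 2-3. encrypt, transmit
received = xor_cipher(ciphertext, shared_key)   # 4. decrypt
assert received == plaintext                    # 5. unencrypted display
assert captured[0] == plaintext  # captured without "breaking" encryption
```

Which is the commenter's point: endpoint compulsion sidesteps the cipher entirely, so whether it counts as "breaking encryption" is mostly a question of framing.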
> A planned statement to the House of Lords on Wednesday afternoon will mark an eleventh-hour effort by ministers to end a stand-off with tech companies, including WhatsApp, that have threatened to pull their services from the UK
Dear FT, WhatsApp is not a company; the owner of the WhatsApp service is Meta Platforms, Inc., also the owner of Facebook and Instagram. (It is misleading to citizens that companies can hide behind the names of their acquisitions.)
Can you elaborate? The article is paywalled. In what sense did they “back down” if not by backing down from the legislation that would violate privacy?
This is framed wrong: it isn't the UK against "Big Tech", it's the UK government against privacy and, to some extent, basic logic. The UK government demanded something that's impossible to deliver, and had it been able to force companies to deliver the next closest thing (the police being able to snoop on everyone's messages, which is its core demand), it would have ruined the privacy rights of its own citizens.
Their purposes have been served. Values have been signalled. Implementation was never going to be possible, which made it all the better a choice, as it means you don't have to actually do anything except blame tech companies when it doesn't happen. Job done.
This is a right-wing Tory government. The same people who initiated the Brexit mess mainly because they wanted to opt out of human rights (sorry I mean "reclaim their sovereignty").
What's weird about that is how it has leapfrogged law in 'real-life'. Minors aren't prohibited from talking to adults face-to-face, and you don't need to show proof of age before starting a conversation with someone! This new bill would turn the internet from an environment where nothing is age-restricted (in practice, if not in law) to one where everything is^.
^ Again, in practice if not in law: since no service provider could fully identify and restrict all 'adult' material in real-time, they will be effectively unable to serve any interactive content to minors. Only if the penalties for non-compliance were low enough would the largest of companies take that risk.
You see this time and time again, some initiative to "just introduce some backdoors, what could go wrong", and then it takes some time for people who understand what it actually means to convince them that it is in fact a really bad idea and it would be a giant disaster.
Why ban e2ee when you could just pass a law giving LEOs the right to passively turn on any mic or camera, or look through photos and messages on any smartphone at any time? I mean, how can they keep people safe without that access? Think of the children!
So it seems from the news that it was industry that forced this, but do we know how effective our campaigning and emails to MPs were? Or just some un-noteworthy political cog wheel action?
How could we find out? Do the reasons get leaked unofficially usually?
"The source of the bill itself, the UK Conservative Party, has a significant number of its own critics calling it 'fundamentally misdesigned'. David Davis said its well-intentioned attempts may constitute 'the biggest accidental curtailment of free speech in modern history.'"
(* sadly my other sincere comment has been buried by people who
apparently can't read past the first line)
> do we know how effective our campaigning and emails to MPs were?
Campaigning to your MP is and always has been a waste of time.
In addition, the "safer" their seat, the more of a waste of space the MP is, because they know their constituents would vote for a pig if the right coloured rosette were pinned to it.
Most of the time they don't bother replying, and then if they do reply, you get a two-page party political broadcast, followed by a generic paragraph about "how they understand your concern blah blah blah" but never addressing the point at hand.
I know someone who writes very frequently to her MP with very reasonable criticisms of government activities. The MP always takes the time to respond, despite not actually changing her position in Parliament on any topic. Poor women to have to endure that - both of them!
So they idiotically put into the law something like "wait for technology to be developed that allows snooping without compromising security"? Such lawmakers shouldn't be allowed anywhere near making laws.
unbelievable how pols never seem to get it. the purpose of making new laws is to further protect the public - not damage the public to make LEO jobs easier.
sure, it sucks that they can't sit in their office and wiretap anyone, or might seize a phone that they can't crack. all it means is that they have to get off their asses and investigate at the point of crime, possibly in person.
to society, the great thing about that is that direct investigation is not scalable, which means that collateral damage will be less common among the innocent and bystanders.
I really wish this was a thing. No encryption, or weak encryption, or a backdoor with government keys that get leaked (they always leak). Then use those to really spy, and watch a firestorm of scandals as everyone's dirty laundry gets washed in public. I really want that. Then, maybe then, the idiots in parliaments around the world will stop this idiocy once and for all.
Imagine nuclear secrets being accessible to every grandma on the planet; that would be laughable. Or the politicians' payments to their who...i mean deluxe escorts.
I’d like to see social media companies taking the opportunity to cut off all services to this country. Use them as an example so other nations don’t get any wild ideas.
brilliant idea. let those governments know who really rules the world. we already have a bunch of dictatorships censoring the internet on purpose, so let's do their job for countries whose democratic processes can't do it for them.
I just wanna tell everyone here who is happy about how this outcome was achieved: never complain about corporations running your democracy. You can't complain only when the outcome is undesirable. You surrendered the "by the people" part of democracy because it suited your views.
[The law] will only require companies to scan their networks when a technology is developed that is capable of doing so […] experts believe it could be years before any such technology is developed
What do they mean? Image recognition via homomorphic encryption? “Years” feels like an understatement!
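For a rough sense of what homomorphic encryption buys you, here is a toy Paillier example (tiny hardcoded primes, utterly insecure; illustration only). Adding two encrypted numbers without decrypting them is easy; running image recognition over encrypted photos at messenger scale is the part that is, at best, years away:

```python
import math
import secrets

# Toy Paillier cryptosystem (tiny primes - insecure, illustration only).
# Additively homomorphic: multiplying ciphertexts adds the plaintexts.

p, q = 61, 53
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # Carmichael function of n
mu = pow(lam, -1, n)           # modular inverse used in decryption

def enc(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1  # random blinding factor, coprime to n
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    return (pow(c, lam, n2) - 1) // n * mu % n

# Compute 12 + 30 while both operands stay encrypted:
total = (enc(12) * enc(30)) % n2
assert dec(total) == 42
```

One addition on two small integers is already several big-number exponentiations; extrapolating from there to a neural network over megapixel images gives some intuition for why "years" might be an understatement.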
I've found people usually don't understand encryption well enough; once you explain that it's pretty much all or nothing, and the ramifications, they usually come around to the common-sense view of "we indeed do need encryption".
The real question is why did they want this? Is the UK suffering some giant crime wave or are the powers that be just really intent on making sure people are using Bad Think in their private chats?
This bill (the Online Safety Bill) has a long and politically complicated history. It was originally motivated by the Cameron government's fairly limited desire to mandate that public WiFi had porn filters in place and then seems to have grown over many years to include a huge number of pet projects and power grabs from various career bureaucrats.
I don't think politicians set out to do this, but it's been around in some form or other in Whitehall for so long that there's no real responsibility anywhere, and it was low-priority enough that no one ever thought to properly kill it.
No, they just see that the citizens will have access to greater privacy and security than before (to protect everyone's private information from criminal hackers, foreign states, etc.) and fear losing access to surveillance.
That this will lead to a significant amount of crime is simply assumed, and not from any basis in reality. They also misuse statistics to try and make issues look larger than they are, which does a disservice when these are actually really serious issues. Like with CSAM, they always talk about so many millions of reports, without mentioning that reports are only suspected content, and the vast majority turn out not to actually be illegal (something like 70-80% of reports are discarded by law enforcement in the first pass - but it's understandable that platforms have to be extremely conservative and report anything that even looks a little like it could possibly be CSAM). Then there are massive amounts of duplication and things like that. But it is a serious issue, so it does my head in that they misuse the 'big, impressive reports number' when they know only a fraction lead to investigations, because that just gives critics a basis to show that they're massively exaggerating which could lead people to minimise the issues.
Of course, at the end of the day, you can't surveil your way to safety and these problems are best solved at the source. But that would require things like fighting poverty (oh no, welfare spending!) and deploying large numbers of social workers to help support people in marginalised communities.
As we progress with climate change and climate disaster, it’s clear that eco terrorism is going to be increasing. This has been especially highlighted in UK.
I put it in quotes because honestly it’s just fighting for survival at this point, but the ones in charge have decided to add the word terror to make it scarier.
I had to read through your comment a few times to try to understand it. I think what you're saying is that ecological campaigning groups (particularly, I would guess, those such as XR without a formal hierarchy) are perceived by the current British government to be a threat to the establishment, and so it wants to prohibit these groups' use of encrypted group chats.
Personally, I don't think this is a particularly legitimate concern (and by extension, not one which I would assume the government has), since the whole point of revolutions is that they don't need to be planned; they are a groundswell of popular opinion turned violent. Even if encrypted group chats were necessary for ecological campaigning groups, I don't believe they are popular enough among the British public, or violent enough, to make even a minor dent in the British establishment.
And that was kind of Extinction Rebellion's whole point: money talks, let's hit the money. And sure enough, they gained attention. However, if economic crimes were equivalent to terrorism, there'd be a lot more fraudsters in high-security prisons.
The bill covers a wide range of topics including cyber bullying, online scams and fraud, hate speech, content moderation, terrorism.
The basic premise is that online platforms like Facebook and X will have more legal responsibility and requirements to follow particular UK law and guidance on content that their platforms host.
This is very popular with the press and enjoys widespread public support in polling. All major political parties support it.
The encryption and scanning aspects of it are just one part of the bill.
Because there's a bunch of incompetent people being paid to do bullshit jobs like coming up with this crap. It doesn't matter what it is, someone will make it their thing to make it happen because even having a negative impact is having an impact.
"abandons" seems overstated; "The UK government has conceded it will not use controversial powers" does not mean it doesn't claim to have those powers based on the legislation.
I wonder where this ends? I feel like nearly once a year some country in the Western world tries to ban encryption. Can we just make it a right to encrypt communications and be done with this endless debate?
Take a step back from communicating secrets. It should be a fundamental human right to perform math. That's what they're trying to ban. Some equations are apparently too powerful to be performed by lowly humans.
Math isn't the same as what you do with math, though. Math is the science of patterns. By that reasoning you could take anything that is a crime today and call it your way of expressing your relationship with math.
Just to be clear, I'm all for encryption and our right to use it, but I feel that framing this as a defence of math is a couple of bridges too far.
>It should be a fundamental human right to perform math.
Why?
What if society could be destroyed by performing math? Should it still be a fundamental right? What if the entire universe could be destroyed by doing math?
This reminds me of a couple episodes of The Twilight Zone and The Outer Limits. In one ("Need to Know"), there's some secret; when person A tells person B the secret, person B goes insane (and then tries to tell others the secret). Should a verbally-communicated secret be illegal? In the other, a disgruntled college student figures out how to build a small fusion bomb, and uses it for terrorism and threatens to detonate much larger versions of the bomb. At the end, he's killed and takes his secret with him, but the implication is that the principles aren't really that difficult, and sooner or later some other angry person will figure out how to make such bombs and humanity will be doomed.
Did anyone ever actually cut their police budget? I was under the impression that the problems in SF were a combination of the police having their feelings hurt (and therefore refusing to do their jobs), and prosecutors refusing to actually prosecute anyone.
legislate all you want, but enshrining it as a right won't magically fix that your government is antagonistic to your privacy. the US has had its Fourth Amendment for 200+ years, but that's never stopped it from wiretapping its own citizens, despite that being a pretty unambiguous violation of the text.
if you want to “be done with the endless debate” then perhaps use (and contribute to) protocols/infrastructure which aren’t so easily governed.
It’s not science or logic that convinced the UK government, one of the most determined enemies of the Internet. It’s that Meta showed it will follow up on threats when it removed all links to news sites in Canada.
UK politicians love WhatsApp’s end to end encryption as it allows them to evade public records laws (as well as snooping by UK spooks who have not hesitated to wiretap British Prime Ministers like Harold Wilson). Boris Johnson’s extensive use of WhatsApp specifically was in the news during the Covid inquiry.
It’s perfectly OK for the proles to lose encryption, but not for politicians.
> The UK government has conceded it will not use controversial powers in the online safety bill to scan messaging apps for harmful content until it is “technically feasible” to do so (...)
That would be waiting for a quantum computer and quietly hoping that a) nobody develops a strong enough post-quantum scheme and b) there is still civilization after RSA and ECC are broken? Correct me if I'm wrong.
Oh no. That "technically feasible" translates to "when the government will be able to pass the practical parts of this legislation without too many people asking too many questions".
"Strong enough post-quantum schemes" already exist, and every single mainstream communications platform will update to become quantum-proof overnight if/when quantum computers approach that level of capability. Quantum computers cracking encryption is really not a concern on anyone's mind, at least no more than, say, modern processors cracking SHA-1 etc.
There were a lot of pqcrypto candidates, and several of them were indeed thoroughly broken, falling prey to the fearsome cryptanalyst's laptop left running over a weekend.
NIST standardized Kyber and Dilithium, and for now at least, they seem to be holding up. I'd still want to do hybrid (ECC+PQ) asymmetric crypto for the time being, but we're (slowly) starting to gain a modicum of confidence in the new standards, enough for deployment
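The hybrid approach usually just means deriving the session key from both shared secrets, so an attacker has to break both schemes to recover it. A sketch with an HKDF-style derivation from the stdlib (the secret values and labels here are hypothetical placeholders):

```python
import hashlib
import hmac

# Sketch of hybrid (classical + post-quantum) key derivation: the session
# key depends on BOTH shared secrets, so breaking only the ECC exchange
# (e.g. with a quantum computer) or only the PQ exchange gains nothing.

def hkdf(secret: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    prk = hmac.new(salt, secret, hashlib.sha256).digest()  # extract step
    okm, block, i = b"", b"", 1
    while len(okm) < length:                               # expand step
        block = hmac.new(prk, block + info + bytes([i]), hashlib.sha256).digest()
        okm += block
        i += 1
    return okm[:length]

ecc_shared = b"\x01" * 32  # hypothetical X25519 shared secret
pq_shared = b"\x02" * 32   # hypothetical ML-KEM (Kyber) shared secret

session_key = hkdf(ecc_shared + pq_shared, b"hybrid-v1", b"session")
assert len(session_key) == 32
# Changing either input changes the derived key entirely:
assert session_key != hkdf(b"\x03" * 32 + pq_shared, b"hybrid-v1", b"session")
```

This is roughly the shape real hybrid deployments take: concatenate the two shared secrets and feed them through a KDF, rather than inventing new cryptography.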
You can basically just make the numbers bigger. Quantum computers aren't magic, and are still limited in what and how they can process within normal informational theories.
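For symmetric ciphers that intuition is roughly right: Grover's algorithm only gives a quadratic speedup, halving the effective security level, so doubling the key size restores it (Shor's algorithm breaking RSA/ECC outright is the separate, harder problem):

```python
# Effective symmetric security under Grover's algorithm: a quantum
# brute-force search needs on the order of 2^(k/2) operations for a
# k-bit key, so the effective security level is roughly halved.

def grover_security_bits(key_bits: int) -> int:
    return key_bits // 2

assert grover_security_bits(128) == 64    # AES-128: becomes marginal
assert grover_security_bits(256) == 128   # AES-256: still comfortable
```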
It's already perfectly feasible to do. Meta/Apple etc. can just deploy a client that decrypts the message, scans it, re-encrypts (with a different key) and sends it to their storage where they can store it forever and decrypt if needed.
This way they could even have different clients in different regions still being compatible.
It's just that it would suck and would not be secure any more.
This isn't the first time I've seen someone on HN act like one-time pads are the solution to all of the problems of cryptography.
It's like people read that OTPs are the only encryption method that has been proven to be completely unbreakable (when used correctly) and stop reading there, completely missing all the things OTPs don't solve (e.g., guaranteeing authenticity), not to mention their massive glaring limitation: how do you transfer the encryption key?
Is quantum computing relevant to symmetric encryption like OTP? GP was talking about asymmetric encryption. My limited understanding is that quantum computing is a threat to asymmetric encryption.
There's also the question of, if you can distribute a key which is at least the same size as your message over a secure channel - why not just distribute your message over that channel in the first place?
Because with QKD you can distribute a random key knowing that there were no observers but you cannot distribute a message with the same guarantees. Specifically, any given bit exchanged might be observed, but that is detectable so the bit can be discarded.
I read some years ago about a non quantum technique to achieve the same based on (I think) noise in a coupled electronic system. I wonder if that has been tested further.
One-time pads are obviously not a serious widespread cryptography proposal.
But the question of, "Why not just send the message instead of the pad" is pretty straightforward: when you have the opportunity to safely deliver the pad, you don't know what the message will be. When you do know what the message will be, you don't have the opportunity to safely deliver the pad.
The difference between a one-time pad and a stream cipher is provable, absolute secrecy versus really good secrecy. If you don't care about that, there is zero point to a one-time pad.
Also, it isn't just a "chunk": for a one-time pad, the key has to be the same length as the messages. Which is fine for short messages, but a lot harder for lots of data.
If you can exchange lots of data, you're better off using it as keys for a stream cipher.
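A minimal one-time pad makes both constraints visible: the pad must be at least message-length, and reusing it ("two-time pad") leaks the XOR of the two plaintexts to an eavesdropper:

```python
import secrets

# Minimal one-time pad: encryption and decryption are the same XOR.

def otp(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data)  # the pad must cover the whole message
    return bytes(a ^ b for a, b in zip(data, pad))

msg1 = b"attack at dawn"
msg2 = b"defend at dusk"
pad = secrets.token_bytes(len(msg1))  # true-random, message-length key

c1 = otp(msg1, pad)
assert otp(c1, pad) == msg1           # XOR-ing again recovers the message

# The classic failure mode: reuse the pad and it cancels out, leaking
# the XOR of the two plaintexts without any key material at all.
c2 = otp(msg2, pad)
leaked = bytes(a ^ b for a, b in zip(c1, c2))
assert leaked == bytes(a ^ b for a, b in zip(msg1, msg2))
```

Hence the thread's point: if you can securely move pad material at message scale, you may as well seed a stream cipher (or just move the messages).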
Doing some armchair navel gazing cryptanalysis, but isn't that only true if you assume the OTP has access to true randomness? What if the attacker breaks your CSPRNG? Or what if the universe is deterministic and therefore a true RNG is impossible?
Similarly relaxing in my armchair, a deterministic universe is compatible with a CSPRNG as long as the information required to recover its internal state is too diffuse to recover, or is outside the light cone of your adversary.
Eg, rolling a die is deterministic, and I imagine an algorithm exists that could recover the value of a die throw from a recording of the sound of it rolling and its initial position. But once that sound has turned into heat, and that heat has conducted itself about the walls and into the air, I don't think it's possible to recover the sound.
I'm not sure physics really does say that. Physicists seem to believe that information is never lost - but that doesn't mean the information can be retrieved. If it's in a fragile state, then the act of measuring it might change it. Eg an electron has both a position and a momentum, but that doesn't mean you can measure its velocity.
When you burn a document, all the matter might be transferred into the smoke, but you've rendered it into a stream of particles small enough to be affected by Brownian motion. Reversing the process (figuring out the initial position of each soot particle) involves knowing the position and momentum of the air molecules impacting the soot particles. In principle, you could take the current position and momentum of those particles and extrapolate backwards - but you can't actually measure that, not even in theory.
But quantum computing can put the ciphertext in a quantum superposition between solved and unsolved state. Only problem to remain will be simple matter of determining what the plaintext is to be.
Unpopular puffin opinion ahead, for the sake of argument only. Pls respond with arguments vs just downvoting.
Throughout history, encryption has always been a government privilege. Only very recently, and then only in the West, practically speaking, is encryption that is safe from the eyes of government available. The liberalization of encryption has been concomitant with a naive "end of history" belief that once everyone was connected as part of a global network, we'd see an inevitable global rise in liberal forces that would make governments unable to control speech. See Bill Clinton's remarks on "nailing Jello to the wall", Clinton whose administration did liberalize encryption quite a bit.
That "end of history" mindset was wrong. We are back in a quasi cold war, with an active proxy war on top of it and a few more smoldering. And so my prediction is that we will see a reversion to the historical mean, and cryptography will again become a government-granted privilege with conditions attached. (Yes, yes, I know public implementations of every algorithm exist, but China showed the blueprint of the machine that nails Jello to the wall anyway.)
>Throughout history, encryption has always been a government privilege.
no, throughout history it has always been the privilege of those with the ability to encrypt, which was often governments, big businesses, or in some cases very clever individuals.
with encryption becoming simpler, it follows that the cleverness required of individuals to achieve it has decreased.
Bad actors won't adhere to these restrictions anyway. Unless one's view of government is a very cynical one (they only care about controlling the general population rather than actually fighting crime), tightly controlling cryptography is just no longer the effective tool it was in the 90s.
Is this the same argument as "bad people will always flout rules, might as well give every person a machine gun so the good guys can have an even fight"? Looking at China, I think that it is very possible to make it at least very uncomfortable, so uncomfortable as to be an effective deterrent.
> And so my prediction is that we will see a reversal to the mean of history, and cryptography become again a govt granted privilege with conditions attached.
Electronic financial transactions (banking, e-commerce, ...) and commercial communications are two main reasons we won't see this. There would be too much impact on the economy, and the government already has access to this data, so it would be all cons and no pros.
You won't catch bad actors; they will move to illegal services and protocols. I think the voice of reason just prevailed and they realized that it's a bad idea. The only next step I can see is a requirement that data persistence be done inside the country.
Financial transactions are already typically transparent to the government (they can just ask the registered banks for data). As for commercial communications, it's already the case in many jurisdictions that every service provider operating in that country must provide their private keys to a key escrow. That does not cause a visible collapse of Internet-based services in these countries.
I feel it can be said, without "conspiracy" or paranoia, that there's
a widespread will to bury all activity around this bill.
Government doesn't want it debated or scrutinised. Tech companies want
it to go away. The media doesn't understand it and cannot communicate
the issues. People are scared or too pre-polarised to take a position.
It's been kicked into the long grass by 4 prime ministers. Even
mentioning here that it is complex and worth examining both sides gets
one down-voted to hell (judging by my other comment).
That "other comment" of yours did nothing of the sort. You merely stated "it's something we strongly support" without further elaboration or explanation.
Seems like they keep dancing up to this in democratic countries and then back off. Presumably there is someone who understands the universe somewhere along the way who gives them a compelling warning.
It's very easy to protect kids online: simply don't allow them online. Banning children from the internet violates fewer people's rights (the number of children) than violating everyone's right to privacy (the total population: adults + children).
The podcast makes an unsubstantiated and unexamined assumption: kids must be online. A cursory glance reveals that they in fact do not need to be.
Ya know, I actually upvoted your very-downvoted comment above, but if you’re going to hand out homework, make sure it’s not 1.25 hours long. If you have a point to make, make it; don’t outsource it to long-winded podcasters.
> How could such a gulf emerge between good intent and practice?
Road to hell paved with good intentions - always has been.
To be honest, I don't think it could have gone any differently. It's an eminently hard thing to achieve: we want everyone to be free on the internet, but we also want "bad guys" not to be, and you can't really disjoint the two sets of people.
The technical aspects of this don't matter one way or the other if you believe that it is a fundamental human right to have secrets and communicate them. That's the root question here.
If you say, "well yes but only for legal purposes", then the jig is up. End-to-end encryption must die. You can fiddle here and there with what the warrants and the wiretaps and all that look like, but you have decided the fundamental question. From then on, between civil libertarians and law enforcement the issue becomes a skirmish. The courts will hash it out and subsequent legislation will ebb and flow the limits on both sides. But you have decided the fundamental question. Humans are not allowed to actually have secrets. Period. That is what happens the second you make this conditional in any way.
Of course there are better and worse versions of this that can exist but ultimately I think it should be opposed as a state violation of human rights.