I feel we've been collectively losing the battle to keep our conversations private. The anti-encryption laws are likely vocally opposed only by a minority, while the majority believes they had no privacy to begin with and governments can read your messages at a whim. And perhaps that's true to some extent. But since most people believe the battle is lost, more so that this has always been the status quo, any battle on the subject is lost before it has a chance to begin. We have capitulated on privacy because it's a vague concept and we don't equate it with freedom. Or perhaps our sense of being free is so ingrained in modern societies that we see no risk of losing it unless something drastic and immediate takes it away, when in fact the very system designed to protect our freedoms (led by people who look like us, think like us and enjoy these freedoms as much as we do) is malfunctioning and slowly eroding rights that previous generations enjoyed. We're not collectively trying to harm our freedoms, and yet here we are.
And the shortsightedness on the part of lawmakers is baffling. Nobody takes responsibility for vision; we just go along with implementing solutions without considering broader impact or history. If the government has all your correspondence and the government falls into the wrong hands, you're toast, assuming you do not align with the leadership. We're writing that possibility off, but someone gets to brag that they've written legislation to stop the bad guys -- and maybe they did, but the cost was our collective freedom.
It's crazy that we allow this at all. The government can't observe clandestine conversations that occur in person; does that mean they can mandate that we carry a government recording device? That people don't equally balk at requests to encumber encryption is baffling.
I like this vein of argument: assume that it is good that the government be able to snoop on your text messages. Why not in-person conversations as well? OK, let's make it so everything you say gets recorded. Eesh.
> Assume that it is good that the government be able to snoop on your text messages. Why not in-person conversations as well?
Playing devil's advocate: in most US states, the FBI can in fact plant someone with a wire to record a private conversation. They don't have the resources to record all in-person conversations, but maybe they would if they could.
Isn’t a warrant required in advance to place the recording device? Many unencrypted messages will be recorded automatically and be retrievable indefinitely.
I don't believe encryption should have intentional backdoors, but more because of the technical impossibility of doing this without introducing security risk. Your analogy doesn't really work, because what governments are arguing they should be able to do is break encryption and snoop only when given a warrant. They can snoop on in-person conversations when they get a warrant and the general public doesn't oppose that, even though they would not want to carry 24/7 recording devices that snooped on every conversation even in the absence of probable cause to be suspected of a criminal conspiracy.
Couldn't they still use their original strategies for spying if they had a warrant? Just get the phone and install a secret microphone inside, or inside a house or a car. This can be done with cameras, too, though the text on a phone might be hard to read.
Yeah... the crux is should electronic conversations be afforded the same rights, and by extension are they assumed rights for all communication? Where to draw the line? It's definitely murky.
It shouldn't be murky at all. For all of the history of humanity, privacy has been the default state. I could talk to people face to face (what used to be the only way) and it was by default private. I could keep written notes and records and they were by default private.
There were special-case mechanisms to violate that privacy (e.g. search warrants, targeted spying) but by their nature they target specific people and, at least most of the time, go through a process with some checks and balances. I don't have any objection to this type of spying. If someone is suspected of a serious crime, it's reasonable for society to have a way to approve planting some surveillance bugs on them and them only.
It's only now that nearly all communication is over third-party systems that governments suddenly feel it's ok to spy on everyone, all the time. It's clearly not ok; nothing murky about that.
> special-case mechanisms to violate that privacy (e.g. search warrants, targeted spying)… go through a process with some checks and balances. I don't have any objection to this type of spying
I think this is why it’s such a hot issue though. E2E Encryption is a “get out of all surveillance free card.” Even a mildly trained criminal or terrorist can easily guarantee that his communications will never be intercepted. This has never existed before.
I share the pro-encryption camp’s acknowledgment that you can’t un-invent encryption, so you’ll only be catching the most dim-witted criminals by nerfing the mainstream messengers. Anybody can use the “openssl” CLI to make unbreakable encryption no matter what the laws say.
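To make that concrete, here is a minimal sketch of what the comment is pointing at, using only the stock openssl CLI (the message, filenames, and passphrase are made-up examples). Ciphertext produced like this before it ever touches the network is untouched by any messenger backdoor:

```shell
# Write a message and encrypt it with AES-256-CBC, deriving the key
# from a shared passphrase via PBKDF2.
echo 'meet at the usual place' > message.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in message.txt -out message.enc -pass pass:correct-horse-battery

# The recipient, knowing the same passphrase, decrypts it.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in message.enc -out recovered.txt -pass pass:correct-horse-battery

cat recovered.txt   # prints: meet at the usual place
```

The weak point here is sharing the passphrase out of band (real tools solve that with public-key exchange), but the core point stands: the math is public and a couple of shell commands long, so nerfing mainstream apps only pushes careful criminals toward exactly this.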
But I also acknowledge how frustrating it is that a truly bad person can simply bypass all the “just” exceptions to privacy, like a search warrant, if they’re even a little savvy.
TL;DR nerfing iMessage (etc) ain’t it, but I can see how non-evil people in law enforcement wish something could be done about the root problem, which is somewhat new.
> E2E Encryption is a “get out of all surveillance free card.”
That's the misdirection the pro-surveillance agencies use but it's not true at all.
People still ultimately exist in meatspace (as it used to be called). If you get a warrant based on legitimate suspicion to follow someone, you can assign detectives to follow them, plant bugs in their home, wire up collaborators and all the endless techniques that were used before the internet was around. People are still people and they walk around in the real world, they can be spied upon.
Sure, it's not as easy as sitting back in the DA office and spying on everyone all day long with zero effort, but it should not be. When the power is given to remove fundamental rights to privacy from someone, it needs to be based on a legitimate process and it must take effort. If it is zero-effort it will be relentlessly abused.
> all the endless techniques that were used before the internet was around
Ok that’s another fallacy though. Before, authorities could get a warrant to wiretap phones and to check the outsides of your envelopes (a “mail cover.”) Only an idiot criminal would ever make a phone call or mail a letter today for anything even remotely related to crime because e2e encrypted calls and chat apps are a thing now, removing lots of risk (nobody can do anything short of getting in the room with one party and peeking over his shoulder, which is not just hard for “lazy” cops, it would be hard for elite international spies). That’s a whole new thing, and a big deal if your job is say, stopping and apprehending human trafficking rings or organized crime.
Again though I’m not saying “and that’s why we should have a back door for cops.” I’m not and we shouldn’t. Just that people who say things have changed in a very impactful way are not being hyperbolic or dishonest.
> Before, authorities could get a warrant to wiretap phones
So basically, before the invention of phones, no criminal was ever caught?
Clearly that's not true. It is entirely possible to follow and spy on people in person and that's how it was always done before the 20th century.
Just because new technologies exist does not justify using them to infinitely augment the surveillance powers of the state.
> a big deal if your job is say, stopping and apprehending human trafficking rings or organized crime
You don't succeed at human trafficking or organized crime if you never leave the house and spend all your day just chatting on signal. You have to actually go out into the real world and do the crimes. As soon as you step out of the house you can be followed and monitored and caught in the act.
These checks and balances are an illusion. They do not work. Courts are overwhelmed with work and will sign anything, setting aside even minimal consideration of the impact and risk to the person targeted.
An experienced, higher-ranking public official will get permission for everything. There is neither a check nor any balance at all. This is true for every country that pretends surveillance happens with care.
"The third-party doctrine is a United States legal doctrine that holds that people who voluntarily give information to third parties—such as banks, phone companies, internet service providers (ISPs), and e-mail servers—have "no reasonable expectation of privacy" in that information. A lack of privacy protection allows the United States government to obtain information from third parties without a legal warrant and without otherwise complying with the Fourth Amendment prohibition against search and seizure without probable cause and a judicial search warrant."
... but if you really want true privacy:
"If you want to keep a secret, you must also hide it from yourself"
-- George Orwell, 1984
By analogy, what if I have multiple pairs of tin cans, where each pair of tin cans is connected by a string, and sell those pairs of cans to people so that they can have private conversations at a distance from one another.
If I provide privately owned physical infrastructure for protecting those strings and facilitating the routing of the strings am I obligated to make that available to the government for eavesdropping?
it isn't murky, it's obvious that it should require a warrant for the government to tap into any conversation that people have a reasonable desire to keep private. The police should be there to serve us and not the other way around.
let's only allow out those people that absolutely must leave their homes; everyone else should stay inside under government microphone and camera. Imagine how much safer society would be! The government could save so many people like that and keep crime down!
They feel they have nothing to hide. I find the “nothing to hide” argument baffling, as when they say this, I immediately ask them to tell me about their last sexual encounter, in graphic detail. After all, they’ve nothing to hide.
For some reason, they never do, and they usually get rather upset with me.
Many people believe this is happening with your phone. It's a recording device after all, and usually carried by most people; if not by you, then someone nearby likely has one. All these conversations can be transcribed automatically, and the vast amounts of text can be analyzed by AI for whatever purpose they want. The infrastructure is already available.
It was predicted that encryption would be attacked with a veneer of being against child pornography or terrorism.
Problem is, most people aren't politically involved and just don't think about the implications of a state being able to fish through your messages. For tech-savvy users this will likely not be true, but it certainly will be for the masses.
I don't think it's that crazy if you trust the government to use these capabilities in the intended ways, that is to catch serious crimes. The US is a very specific country, because here people have very low trust in government and public institutions, which is why public good endeavors are less developed here, while corporations do whatever they want, as long as it seems legal. Many people would argue that's a bad thing for public good...
> while corporations do whatever they want, as long as it seems legal
They don't even care how it seems. They'll do anything illegal if they will make more money by doing it and being caught than they'd make not doing it at all. Lucky for them they routinely get tiny slaps on the wrist for things that would get any other "person" executed or put away for life. Once you have enough money, crime usually pays pretty well.
>I feel we've been collectively losing the battle to keep our conversations private.
A big part of the issue is that the nature of the conversations has changed. Mail and Telephones were never at any point perfectly private. The idea of having complete privacy in such conversations is actually rather new.
The difference is that those communication mediums now represent nearly all communication, rather than a small fraction of it, and that the effort to meaningfully break that privacy has dropped significantly over what it would have required to surveil millions of people in the 1950s. It doesn't require an East-German-esque security state anymore.
Great! Long distance communication has finally caught up with fundamental rights!
Had those rights been respected all along instead of exploited by perverted, power-obsessed authorities because of how easy it was, it wouldn’t be such a shock to lose the ability again. At least in the US, where a right to privacy is a constitutional guarantee, I would hope that Apple and others would defiantly continue to offer encrypted services despite government threats. It would seem like the Human Rights Act guarantees the same right, though I don’t know if it has any higher precedence than any other act of parliament.
> Mail and Telephones were never at any point perfectly private
In principle they were not private, but in practice they were, because in most places the police had to realize there was a conversation of interest, get a warrant, and commit scarce resources.
Now that the authorities are able to use machines to monitor traffic patterns for almost all communication, the cost of interception is much lower.
The FBI has a history of illegal wiretapping as old as the organization itself. To keep it relevant to one of this weekend's big film openings, you can read all about how illegal recordings fed the hearings that stripped Oppenheimer of his security clearance. And the tradition extends to many other prominent figures, including Martin Luther King. And today, this kind of surveillance continues through more modern guises [0]. The volume of these 702 searches is dramatically down, which is good, but there is no reason to assume that it will stay that way.
Point being, one shouldn't assume that government agencies will adhere to the standard we might wish them to when choosing means of investigation or surveillance.
Oppenheimer was an obvious target, you and I are much less likely to be listened to when resources are scarce. So in Oppenheimer's time my communications would most likely have been very secure.
But now resources are less scarce and the task is easier, so blanket surveillance and recording is becoming practical.
That's not true. People have stepped aside for tens of thousands of years to have private conversations they didn't want other people to hear. Why does the medium matter, whether it's pressure waves from mouth to ears or electron/light communications? Why does anyone have a right to listen in?
To me, there is one big argument for privacy: you never know what your government will change into in the next few years. This basic argument for encryption is often raised in connection with countries we already consider non-free. But, frankly, I finally learned the true meaning of this message during COVID times. I would never have expected society to deteriorate into this fear/hate-driven, media-induced witch hunt. Since that experience, I basically expect anything from the government, which makes the argument for being able to encrypt communication even stronger for me.
Yes, this is something a lot of people don't think about:
What is acceptable (even legal) today may not be tomorrow, or in X years (10, 15, or more).
If we allow all our private conversations and messages to be permanently archived (and you know they will be, disk space is effectively infinite), who is to say that wouldn't be used against us in the future when laws, or even social standards, have drastically changed?
And we literally have examples, right now, of Facebook handing over private messages between people talking about abortion and then those were used to convict someone.
You can demonstrate this to folks with a very recent change in government: Abortion and LGBT rights.
How much do you want to bet states like Texas or Florida will use government surveillance to prosecute folks seeking medical care?
I give it a month before we hear about someone being charged with some form of conspiracy to leave the state before their poor kids are taken. "Exhibit A will show the defendant's intent to move their children across state lines..."
Absolutely. Another concern I have is the demonstrable incompetence of the UK government. If they have my data, which will only ever paint a partial picture of what I do, who knows what conclusions they may draw. I don't want to spend hours talking to nice policemen because my GPS data shows me regularly in the same park as some known terrorist, just because they've never heard of parkrun.
Yep, this just gives them the tools when they go full fascist. For example the USA had a recent coup attempt, and that same person attempting the coup will be running again for president and there is a non-zero chance that he will win. He has openly stated that he wanted to "reform" the government and install only loyalists, that is the type of regime you don't want to have instant access to all communications.
> I feel we've been collectively losing the battle to keep our conversations private.
The USA is still doing pretty good but the UK and the EU are staunchly anti privacy. They're pretty good on consumer privacy but don't believe that privacy from the government should exist.
This feels pretty much the opposite to me. Sure, the UK is pretty privacy-hostile in practice (CCTV everywhere), but what does the US have in the way of corporate surveillance of people? How much of that data is legal for governments to buy? Maybe the US government isn't spying on citizens directly, but how many Five Eyes partners are sending data to them by proxy (by careful surveillance design)?
I guess my question boils down to: what specifically does the US do right that the UK and Europe do worse?
That said, it has become apparent to me that we need to impose further limitations, because the whitelist/blacklist approach we have thus far is insufficient and the Anti-Federalists were much more correct in the long run.
That only works if you have a government willing to limit government power. Otherwise the letter of the constitution will be preserved while everyone in power does basically whatever they want.
That’s true of every conceivable government. Political office is ultimately a position of power, and so those who have it seek to maintain and expand it. But I think we roughly have the right mix of institutions serving the right functional roles; the powers given to Congress could use a textual update, and a right to privacy should be more clearly spelled out to account for the technological and international-diplomacy changes that have occurred since 1789.
> I think we roughly have the right mix of institutions serving the right functional roles
This is also an opinion commonly shared among the people still living in a given country. Otherwise you either have a dictatorial state preventing people from leaving/reacting (keeping them dirt poor being one option for that), or the people rioting in the streets for months until something breaks.
First, TFA is about Apple, not the US. The US government also attempted to get backdoors from Apple (only to give up and go for standard security vulnerabilities instead of a clean entry point).
Then, cloud data stored on Apple servers is still fair game, and Apple syncs messages to iCloud by default. There's very little incentive for the US government to burn political will on this issue when it won't matter for 99% of people using the devices. The UK government, though, doesn't get that privilege, as the data is on US servers, not UK ones.
The fact that almost every US telecoms provider is selling your location - and there is no restriction on law enforcement buying it - means that things are not exactly rosy in the US either.
An insidious part of this discussion is this idea that these laws do not interfere with encryption. IMHO, it's a dishonest stance to take but when debated, the first thing defenders of this legislation say is that this "does nothing to break encryption" or "Privacy and security are not mutually exclusive — we need both, and we can have both and that is what this amendment delivers."
In this plan, messages are sent to a third party for analysis. Sure, the messages sent to the third party are encrypted, but your privacy is entirely violated.
> The anti-encryption laws are likely vocally opposed only by a minority, while the majority believes they had no privacy to begin with and governments can read your messages at a whim
Traditionally, couldn't they with texts? And with all the major social media players?
Isn't stuff they can't track, like Signal, relatively new and getting outlawed in many places?
It's not just about encryption and privacy. That's missing the forest for the trees.
For a democracy to function, people need to be able to have free and candid discussions about any topic without the fear of being ostracised, persecuted or whatever. Only that way can ideas be exchanged and people get a hunch of what others think about stuff of relevance. Only that way can people partake in sensible democratic decision-making. Framed opinions pushed onto you by one-way media are no substitute. That's dictatorship in disguise.
"Classic" ways of public communication, like town halls, pubs, marketplaces or whatnot, cannot fill that role any longer. But online, places like Twitter, Reddit and some chat services that closed the gap now get killed off, too.
This dystopia cannot be let come to pass.
They could and they did, the problem is they’re now attempting to get the same access to encrypted communications by outlawing privacy, essentially saying they need to get rid of your freedom to speak privately for our greater good.
I used PGP Phone with my dad, which means it was pre-Dec 1999. The crypto battle is kind of orthogonal to general privacy concerns. Stuff like being allowed to sell location data from your phone is not related to building back-doors into encryption. The argument against crypto backdoors is pretty simple: bad guys can get good crypto, and backdoors invariably end up providing access to bad actors, via hacking or secret leaking or corruption.
In terms of human history, easily eavesdroppable communication is relatively new, and mass-eavesdroppable communication even more so. I’d like to believe we’re reverting to the mean - similar privacy to an in-person conversation, but now over long distances.
So many levels of meta-irony in this thread. You're laughing at a non-existent strawman you've been fed by the media.
The people saying Let's Go Brandon are perfectly happy to say Fuck Joe Biden. That's how the whole thing started. A large crowd at a televised NASCAR race started chanting it in unison whilst the winning driver called Brandon was being interviewed. The words they were saying were perfectly audible, but the interviewer tried to claim the crowd were not saying "Fuck Joe Biden" but rather enthusiastically chanting "Let's Go Brandon".
This immediately became a meme because it's such a perfect shorthand for the duplicity and accountability-free manipulation of the US media.
That's why it became a conservative meme in the following months. But of course because the media don't tell anyone where it came from or why, the people who get their information from it now believe that it's a "hilarious self-own" and people saying it "want to insult but don't want to say the exact words"!
Nvm0n1 gave a good explanation of the meme nature of "Let's Go Brandon". Not sure why you are being rude. You clearly live in a small political world and don't understand the meta.
When the majority uses vulgarities, how does not using vulgarity make one a muppet? I’d also consider the Muppets rather avant-garde for their time, so maybe you mean it as a compliment, as in Gonzo, Fozzie, or Beaker (or any other pathetic Muppet).
If you think the _words_ are the vulgar part of modern right wing ideology, I don't even know what to tell you.
That said, I am perhaps guilty of the same offence there - per the BBC America article "10 British Insults Americans Won't Understand" [1], a muppet is "someone who’s stupid, gullible and incapable of independent thought" - unfortunately the pearl clutching tends to extend to the typical four letter word I would use for such people.
Interesting take. I understood conservatives to use it because it is also taking a shot at the so-called MSM, who dubiously created the phrase. I never knew people thought conservatives were being muppets by using it.
The 'MSM' (what a dated phrase) didn't create it, it was just a single reporter at a NASCAR race who either misheard it or bowdlerised it for the live TV audience.
> The anti-encryption laws are likely vocally opposed only by a minority
I'd like you to back up that claim, because from what I've seen about surveillance and terrorism, most people supported it (even the Patriot Act had popular support in the polls). Only people smart enough to know about encryption oppose this. Most people who don't understand tech pretty much assume the government is already looking at messages. Long before Snowden, illegal phone tapping was an open secret people were fine with, so long as the government didn't abuse that access. Even before computers, they had secret rooms where they opened and read people's letters without a warrant. Not one major political candidate that I can recall since 9/11 has mentioned letting the Patriot Act expire, or investigating the NSA and recommending criminal charges, in their campaign; nor does it get brought up in their town halls.
Anti encryption? Not sure most people support that specifically.
But generally speaking, I don’t think you’ve emphasized it strongly enough. People aren’t just supportive of trading privacy and freedom for the promise of safety. They are literally begging for it.
I wonder why UK legislators are not preparing a new regulation mandating that every key to every door and safe have a bypass mechanism for government officials.
Behind every door and every locked place there could be child pornography and illicit materials hidden!! Every house, every hotel safe is a suspect!
Criminal oversight, criminal oversight!
(and if they think their reasoning for backdoors into online chat and conversation is justified by this stupid reasoning of theirs, then it must be valid for all entrance doors of every home and building and every locked space as well! Getting easy access to material without the assistance or knowledge of the people involved.)
I'll play Devil's Advocate (because I enjoy throwing myself into the fray — especially when arguing against a point I actually agree with).
No one is mass-sharing their safe of child-porn worldwide with thousands of other child-porn voyeurs.
The internet and its ubiquitous accessibility, combined with digital image file formats, have changed the landscape for those who would fight these heinous crimes.
It is indeed a new and special case where a locked safe is not.
The most amazing psy op I’ve ever seen is this convincing the public that it’s people who use encryption who are the perverts, and not the people who literally want to creep on other people’s private photos of their kids / spouses / romantic partners.
This is exactly the opposite of how it works in the real world.
You can have multiple 1-on-1 chats, and CP collectors are pretty good at finding each other online. While I'm very much against this UK legislation, people in the tech sector do themselves no favors by minimizing/handwaving the CP problem.
Isn't that why we pay government agents to hang out in the seedy parts of the web? If CP consumers are able to find each other with relative ease, I'm sure agents impersonating them can find them too. Then it's up to the agents to locate the offenders, likely by exploiting an opsec failure along the way.
I see no reason to give the gov eyes on everything when they have a perfectly valid route to investigations.
I'll also play Devil's advocate even though I'm a very vocal advocate for data privacy in real life.
These government agents are people as well, and as people they'll suffer wading through the dark corners of the web to locate a minuscule part of all the CP data available worldwide. These people will inevitably require mental care to deal with the trauma of looking at GiBs and GiBs of small kids being raped. That's not a cost we can handwave away with "we employ government agents": there isn't a limitless supply of people willing to do this job, and doing it doesn't come for free, neither for the individuals doing it nor for society (mental trauma care; staffing high enough to allow rotations and avoid burnout, PTSD, etc.).
I'm not sure I subscribe to that. I've spent a lot of time on the web, and I've seen a lot of gore. So much so that it doesn't faze me now, even after not visiting those sites for a decade. I haven't dealt with CSAM, but I imagine the impact is similar. You get desensitized to it. The difference: LE gets to put an end to it via their investigations. That is rewarding enough for people to stay, I assume.
Plus, even if you did dragnet everything, you'd still need a human in the loop to verify the findings, even if ML classifiers are used to find leads.
Plus, what about people who just encrypt their messages locally before sending them across a backdoored platform? The government can't make encryption illegal; it's just math. Folks will stego it into memes or whatever is hot that gets shared, letting it live in plain sight.
I know LLMs aren’t a magic bullet for everything, but this seems like exactly the kind of problem they could solve well as they become more cost effective. Dramatically shrink the pool of real people required to keep an eye on this, and sub in virtual agents adept at detecting CP patterns.
Security back doors also carry the risk of having the opposite effect. We’ve seen many examples of powerful, nation state developed tools being leaked online.
There may be a rise in scenarios where people (including children) have their personal devices breached and are extorted for the exact thing the security backdoors were created to prevent.
To a point, LLMs can help with this. Additionally, software has played a part in this for many years, but nobody is getting convicted for possessing media that hasn't had its contents validated and graded by a human.
Governments use CP/Abuse to pass legislation, which is immediately used for almost anything other than that. If a government did actually care about child welfare there are many things they could do that would actually help children.
But this law to "protect children" is being passed by a government that is simultaneously cutting services that help children, and also trying to reduce/restrict sex ed so that it is easier to abuse children, so I'm going to say that "protecting children" is not their goal with this law.
As I said elsewhere, any argument for "we need the ability to remotely access the content of a phone" (which is what the law is demanding: silent updates to remove encryption and anything else they "need") equally applies to having a government mandated cameras in every house. That would actually prevent child abuse. It would prevent domestic violence. If such was happening the police could intervene immediately. It would remove he said/she said from courts: you can simply play the video from the home.
By the definition of "no encryption because protect-the-children", mandatory cameras in every room of every house is both acceptable, and also objectively superior to BS anti-encryption laws. You can't abuse or beat your family members in the first place if a camera is watching, but anyone moving CSAM around on the internet isn't going to have a problem getting an actual encrypted channel - all the law does is make your personal communication, banking, finances, etc insecure, criminals are safe. Of course even if someone does use a now insecure communication channel to share their child abuse it's moot: the cameras in their house would have already caught them.
So why screw around with "make everyone (including children) unsafe to 'protect the children'" when there's a much more effective solution that would stop the abuse in the first place?
Also, the UK already passed a bunch of "you don't get privacy" laws in order to "prevent terrorism", and they seem to be used primarily to catch people not picking up dog poop, not paying TV licensing fees, etc., which sure as shit doesn't sound terrorism-related.
In summary: if a law that would otherwise violate fairly basic rights is being pushed with clearly emotive justifications like "child abuse", "terrorism" you should assume you are being played.
The police do not need more power. They do not need to violate everyone's rights. They need to do their jobs and do actual work with what they already have, and demonstrate basic competence before they get any more invasive tools. Recall the Ariana Grande concert bombing in the UK? Multiple friends and family of the bomber had independently and repeatedly reported him to the UK police. 9/11: multiple US government agencies had all the information needed, but were too busy competing with each other. On top of that, there are thousands (tens of thousands, if you look at the US) of rape kits that aren't even processed, and weirdly they keep finding serial killers and rapists when. they. just. do. their. job.
Throwing away more of our privacy, when police already have huge amounts of information that they just can't be bothered to look at, is beyond stupid.
Similarly, given their track record of supporting and aiding child abusers and cutting support for children, any claim a government makes that something is to "protect children" is clearly false.
The very, very large majority of child abuse happens at schools, half by other children, half by staff, especially in phys ed/sports. Think about it: this is so prevalent you will probably remember cases of this yourself. Of course, government has made sure there is absolutely nothing child services or police can do about teacher abuse, and nothing can be done at all about teacher (or school) responsibility when they fail to protect children during school (despite the law saying they are responsible). The only thing child services can do is effectively attack the child (using violence to demand the child be "treated"). And while the police technically can go after the perpetrator, they don't need any proof to go after the child (which "solves" the problem), but they need proof to go after the perpetrator ...
The whole system around children is set up to protect government against children, FIRST. This will not change with surveillance; in fact it will be made a lot worse, and this is a far bigger problem than CP.
So if people claim that all government people are "good people" who are "just trying to help": the problem is the result. The real question is what happens to those children, to the victims; the rest doesn't matter.
To some extent that's actually true. Most teachers that eventually get convicted for ... students aren't pedophiles. They're people with no previous offences, who really did start their job to teach, who at some point lose their self-control when dealing with children, because it's so easy. And because it's so easy, they tend to be responsible for a LOT of children, and therefore make a lot of victims before anything is done, because the system is set up to protect them. The current record is a German director of child services, who sexually abused over 30,000 children. You might ask how you can even do that; that's almost one child per day, for a decade. The sad answer is that he had a large team around him. Teachers, in more typical cases, are accused of abusing between 5 and 10 children.
These people know this. They go through the web merely to punish, and use violence against society to force what they consider decency standards. The problem is, they do not help these children; they usually only damage them (which leads to constant embarrassments: the FBI 2 years ago proudly announced they had done a raid and "saved" 43 children from prostitution. Within 2 weeks, 41 of the 43 had run away, with the very large majority very likely going back to prostitution). They just throw them into the child services system, which destroys these children's lives far more than even actual abuse does. And, of course, mostly they were not abused but protected by their parents, and were far safer at home. And even that's ignoring that this is government: a significant fraction will get deported if they don't run away.
And this is ignoring the "accidental" studies, which follow this pattern: someone wants to make a career and needs, effectively, to find a large number of abused children, to get attention. These can't be actually abused children, because of how humans work: abused children ... abuse others themselves, damage themselves, do drugs, act in criminal ways, ... So nobody in the system wants those; at best they are very problematic children. What these individuals need to do is convince authorities that a large cohort of normal (very young) children are in fact abused and "need help". Mostly, of course, they don't tire themselves, and do this by simply lying, then ripping the child from their family, but for example there's the "on this anatomically correct doll" scandal.
The "plus" side of this is that we have plenty of evidence of the effect of child services ABSENT earlier abuse or problems in the child's life. And the evidence is damning. These children don't study. They suicide 10x more than normal children. They essentially do not, at all, continue studies after high school, and a high percentage doesn't finish high school
I would like to point out that these are studies, but this is also extremely visible to people working in the field. The help government provides doesn't help children. They know this, yet they keep working in this system. This is, obviously, not moral.
My point was that minimizing the CSAM issue isn't that relevant to the terribleness of a proposed law. CSAM will likely always exist, unfortunately. It's too hard to prevent; the punishments seem not to deter.
We already have mountains of awful laws that were passed by appeal to CSAM emotion. I see no reason to pass another one under the banner of CSAM reduction when we all know how it will really be used.
I suspect, as always, the bad guys would find ways around any such measures, leaving normal, law-abiding, tax-paying citizens with diminished rights and freedoms.
They literally are though? Numerous cases of child abusers abusing their children and publishing the footage of it from their homes, not to mention the mass storage and hosting (hosting _may_ be less common now).
People physically abuse their family and children at scale (domestic violence is not rare), a really good way to stop this would be to mandate cameras and microphones in everyone’s houses.
Don’t worry though: to see/hear recordings from inside someone’s house police will need a warrant.
In order to distribute illegal material, you have to go public with it, i.e. contact strangers. One of those strangers is going to be a police agent or a snitch seeking favors from the police.
Sure, you could limit yourself to known contacts, but that wouldn't make you rich, nor would it create a self-sustaining network.
The obvious counter is that raiding a house is a very public and relatively expensive action. There's a natural disincentive to it. OTOH digital surveillance is more akin to being a ghost who can float through a door without anyone ever knowing you're there, and teleport there and back instantly with no physical time or effort.
I think this is the big change people miss when it comes to police powers in the digital age. There's not really been a culture shift from the police nor the citizens about police powers, just an increase in quantity of ability that has turned into a difference in quality. For decades agents of the state have had the theoretical ability to surveil people, and the people were generally OK with it because of the assumption that "well, they won't bother doing it to a random guy like me", which was true, back when it required actual footwork and even pulling up someone's file was a physical task.
Assuming one does accept targeted surveillance in extreme cases, then I'm not sure how to solve this apart from the frustratingly "stupid" and probably unenforceable solution of requiring the police and intelligence agencies to be stuck on 90s tech. Theoretical legal sanctions against doing this appear to have no teeth; the only way to discourage surveillance abuse is to make each instance carry a non-circumventable operational cost.
So to use a concrete example: I actually like private CCTV in public spaces. I do feel safer knowing the police can use that footage if I'm a victim of crime. But I only accept it because most systems are just writing to a local device, meaning there's a cost to the police asking for it. Any scheme to automate access to these records over the internet, no matter how well-meaning and theoretically legally restricted they are, would inevitably be used to make those scenes from The Bourne Ultimatum seem quaint
- they need a judge's order, and because of the previous point you can countersue if reasonable, at least theoretically
- they can't impersonate you all around the globe including in countries outside of the UK just by raiding you
- a raid is time wise also limited
- police being able to raid doesn't prevent you from guarding yourself against random criminals or agents of foreign hostile governments raiding you
- you know when you are raided
- you are still allowed to put steel doors in your home
- I probably missed a bunch of points
I.e. it's not the same, not at all.
A more correct comparison would be to state that police can raid you, so they should be able to hack your device, but only to retrieve information and while being required to leave a message behind (and even that comparison has issues).
Dunno how I feel about that metaphor. To me the current situation isn't just saying "police have the power to kick in doors". It's more like "citizens are banned from building houses with doors that make it hard for police to enter".
Because real life isn't an action movie, and police would rather try to ram the door for a while and hope it budges, instead of instantly going for rappelling from the rooftop or for using a ladder. Especially given how relatively uncommon steel core doors in apartments are in practice.
One guy climbing through a shard-lined window is easy to mow down; the first cop that dies halfway through will block access to the rest. It also disarms the guy with the riot shield.
Opening the door lets a group of them quickly file in at once to swarm the occupants.
If you could invade someone's house as easily and invisibly as digital surveillance allows, physical search warrants would be swiftly ignored by the police too.
Given the possibility of invisible and essentially free screening of private conversations, the analogy is probably more "install cameras in every home that only the government can access". But we promise, they'll be turned off most of the time.
They don't mandate keys to every door because they CAN break almost any door with force. The police can do it with a warrant, and some "special forces" can do it clandestinely.
At least in England and Wales, unreasonable force can be used in self defense (Crime and Courts Act 2013) as long as it is not grossly disproportionate, and people are regularly let off for that, so you're wildly off base.
This includes cases where people have killed unarmed intruders.
What will typically get people in trouble is not self defense, but cases where the victim continues after an attacker is no longer a threat (is subdued or fleeing).
The UK keeps trying to throw their weight around only to find out they aren’t that big. The CMA with Xbox and now this. They aren’t large enough for Apple and Microsoft to make massive changes just to accommodate a relatively small market.
The UK, with Brexit, is starting to find out very quickly how insignificant their nation is and how easy it is to route around the oversized egos of their so-called leaders.
Most governments deserve a good helping of scorn including my own, despite our best efforts to persuade them otherwise. I find most people in the UK quite agreeable. I recommend not taking criticism against your government personally.
While I’m glad the rather embarrassing empire thing is on the wane I really don’t see how that is relevant here.
The UK isn’t trying to tell anybody what they can do in any other country. It’s just laying down the law for what happens within its own borders which it’s entirely within its rights to do.
Equally, why you would think an American company could or should tell the British government what to do is surely itself an example of imperialism, is it not?
It is not that the UK can’t set the rules for their own country; it is just that they need Microsoft and Apple more than Microsoft and Apple need them and so the government has little leverage here.
Further, they aren’t large enough to block the deal, so they will only make things worse for their own citizens: it will just deny citizens access to certain products and services, or cause companies to pull out of the market, leaving businesses and citizens to look for sub-par alternatives.
Apple and Microsoft can’t tell the UK government what to do, but the power dynamic is still what it is, and the reality of what blocking the deal will actually accomplish is still the reality.
Individually no, but in aggregate... A few games here, iMessage, Facetime, Whatsapp maybe, some other internet products, etc
Does it boast something else to make up for the loss of these things? Trading something for nothing always feels kinda bad. And if the trade is "We stood up for the right thing with respect to Cloud Gaming monopolization", your constituents might be kinda pissed after a while.
You know the old adage "bread and circuses". Well, you gotta make sure the circus is running well.
The question is this: is the law good in principle?
Meaning is the law good regardless of what Apple decides to do? Or is it only good if it forces Apple to break encryption?
I think the answer is obvious. It's not a principled law. It's just a threat. They wouldn't actually rather lose access to FaceTime and iMessage than have access to them but encrypted; that would just be crippling themselves.
Hard to tell about the principles of the law itself.
I don't hold any of the British government departments in high regard, so I will assume malicious power-seeking on their part regardless of anything else.
But what about the principle of forced decryption?
Most non-criminal organisations don't know how to secure their communications properly, so while organised criminals that hire decent IT can trivially secure their communications, I don't expect criminals to be generally capable of keeping out of sight of the authorities.
Non-professional criminals will probably all get scooped up easily by a surveillance dragnet.
And the only way past the competent criminals is old-fashioned intel gathering.
But the British government can't actually act on dragnet-scale information anyway:
They've been cutting back on police, court buildings, magistrates, public lawyers, prison buildings, prison staff, and parole boards.
Even if they hadn't, surveys of drug use alone suggest they'd be criminalising a double-digit percentage of the population they can't afford to put through the police let alone the courts. Given what the UK counts as "extreme" porn, I'd guess that's also a double-digit percentage of the population.
Maybe it does boast something else. I am genuinely unsure whether being able (after a judge orders it) to unlock encryption, similar to a front door, is a bad thing. I don't like it, it stinks, I'd hate to experiment with things like this and so I'd vote against it, but is it going to end up worse for the greater good? Can't we still just use proper/steganographic encryption when an authoritarian regime gains power, and until then is it better? I can't say that beyond reasonable doubt. That safer society can be argued to be a boast that makes up for it (I'm not saying it definitely is. I'm unsure. Very unsure).
And for the other, anti-monopolistic market authorities, the "boast" is a level playing field.
The point isn’t that it has judicial safeguards. Everyone thinks judicial safeguards are a good thing. The point is, once you weaken encryption to enable the bypass even through judicial safeguards, you weaken encryption for everybody and bypasses will be made, with absolute certainty, without judicial safeguards. If you enable encryption to be overcome through legitimate means, it will be overcome through illegitimate means. Which means encryption is effectively dead.
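The "weaken it for one, weaken it for all" claim can be made concrete with a toy sketch. This is deliberately NOT real cryptography (a hash-based stream cipher stands in for real encryption, and all names are hypothetical); the point it illustrates is that an escrowed bypass key is a single secret whose leak opens every wrapped conversation, regardless of what judicial process was supposed to guard it:

```python
# Toy sketch (NOT real cryptography): why a "lawful access" escrow key
# is a single point of failure. All names are hypothetical.
import hashlib


def keystream(key: bytes, n: int) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]


def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def encrypt(key: bytes, msg: bytes) -> bytes:
    return xor(msg, keystream(key, len(msg)))


decrypt = encrypt  # XOR stream ciphers are symmetric

session_key = b"shared-by-alice-and-bob"
escrow_key = b"held-by-the-authority"

msg = b"meet at noon"
ct = encrypt(session_key, msg)

# The mandated escrow copy: the session key wrapped under the escrow key.
wrapped = xor(session_key, keystream(escrow_key, len(session_key)))

# The intended recipient decrypts normally.
assert decrypt(session_key, ct) == msg

# But ANYONE who obtains the escrow key -- insider, hacker, hostile
# government -- can unwrap every session key and read every message.
stolen = escrow_key  # one leak...
recovered_key = xor(wrapped, keystream(stolen, len(wrapped)))
assert decrypt(recovered_key, ct) == msg  # ...and all traffic is open
```

The judicial safeguard lives in policy, but the escrow key lives in math: nothing in the ciphertext can tell a warranted unwrap from an unwarranted one.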
The same Apple that lets the CCP operate its entire App Store in China?
It's easy to criticize a government when you know there will be no retaliation. To judge a company's true stance on such issues see how they behave when there's a prospect of real financial loss.
The UK and China are not comparable. China is the world's most populous country (for now). The UK has a population of under 70m. China is a totalitarian regime where even an imperfect freedom is preferable to no freedom at all. The UK is a democratic country which should be upholding the right to privacy.
If Apple weakens crypto for the UK, it affects people in other countries as well. iMessage is not exactly popular in the UK, so it is disproportionately used for transatlantic communication compared to WhatsApp. If Apple complies with the law, they are violating the privacy of users in the US as well.
> The UK and China are not comparable. China is the world's most populous country (for now). The UK has a population of under 70m.
You say they are not comparable, and then compare them in a way that backs up the point you're trying to argue against. It _is_ two-faced of Apple to give China a pass here, undoubtedly in large part due to the massive loss of sales were they not to comply and be removed by the Chinese regime.
> China is a totalitarian regime where even an imperfect freedom is preferable to no freedom at all.
If the Chinese gov't has the keys to unlock "private" conversations, how is this "imperfect freedom" in any way different from WeChat or other state-sanctioned messaging systems?
> The UK is a democratic country which should be upholding the right to privacy.
I mean sure, but Chinese people should have the same rights of privacy as anyone else. It's not suddenly less-bad because their government is totalitarian.
> If Apple weakens crypto for the UK, it affects people in other countries as well. iMessage is not exactly popular in the UK, so it is disproportionately used for transatlantic communication compared to WhatsApp. If Apple complies with the law, they are violating the privacy of users in the US as well.
How is this any different than what Apple currently does when an iMessage user outside China messages someone inside China? My understanding is that it acts precisely as you describe, but I could be wrong. And from what I understand iMessage is likewise unpopular in China, so cross-country communications would potentially be a major source of iMessage activity there as well (I don't see any statistics on this).
>> The UK is a democratic country which should be upholding the right to privacy.
>
>I mean sure, but Chinese people should have the same rights of privacy as anyone else. It's not suddenly less-bad because their government is totalitarian.
The extent to which Chinese people oppose the government being able to read anyone's conversations is quite different vs. the UK. Unfortunately, it's not likely that this or the Chinese government's behaviour will change any time soon. But in the UK it is an active political question that is still undecided.
And as you said, the size of the market -> how much money they get probably has an influence on Apple's decision to operate in China, even with all the caveats. But it's also not the only factor. If it was, Apple wouldn't consider blocking iMessage/etc for UK customers.
What are you trying to say with this message? Are you saying that if the UK government turns bad tomorrow, it is fine for Apple to remove encryption there, but now it is not fine because the UK is not bad?
Why does Apple even care about how bad the government is? Either they care about user privacy over money, or they value money over privacy. It is fine if they value any of those as I frankly believe it's not Apple's job to moral police governments.
> If Apple complies with the law, they are violating the privacy of users in the US as well.
Democracy requires privacy, if you take a few minutes (it shouldn't take too long) to think about it. And not even just a bit of it: voting is required to be anonymous for similar or the same reasons.
Apple is clearly entitled to not provide iMessage in the UK if they feel that the extraterritoriality of a UK law will hurt their interests in other jurisdictions. That’s not anti-democratic, unless you think the UK rules the world.
This kind of black and white thinking is defeatist.
China: already a lost cause (before iPhones and messaging existed, in fact).
The UK: a leading western nation.
Let's try to have western countries not be like China? The UK could be a domino... if it falls the authoritarians in other western countries may be emboldened to follow.
In the end the choice is simple: follow local laws or drop features or devices.
The calculation is going to be different in each country, but there is no hypocrisy here.
The Chinese government messing with Chinese servers affect the Chinese market, which makes a lot of money so there is a strong incentive to remain in it. The UK backdooring FaceTime compromises it for the rest of the world and would actually put Apple in jeopardy in other jurisdictions with stronger privacy and data protection laws, for a comparatively minor market. It’d be more significant if the issue was with the EU or the US (both scenarios can realistically happen in the next few years, unfortunately). All they are saying is that they will comply if the regulations are put in place. Also, the British government is known for making noises along these lines before quietly dropping the whole project when it turns out that it’s actually not that simple. Different countries will lead to different risk assessments.
So yeah, there is no inconsistency, it’s just a matter of how you stay on the clear side of the law.
> It's easy to criticize a government when you know there will be no retaliation. To judge a company's true stance on such issues see how they behave when there's a prospect of real financial loss.
A company is not sentient, it does not have a stance. Its policies have no value except when they are decided and enforced by people. It’s dangerous to talk of corporations in terms of ideology, because these things can change and often cash trumps good intentions. In the end all that matters is how much the company and our interests align. The best way to have a company behave over the long term is if it makes sense for it to do so from a business point of view.
You don't think there's a prospect of financial loss? They literally just said they would pull their products.
Ecosystem features like those are huge contributors to platform retention. And if the cynic in you doesn't believe that, just ask yourself why they burn the money to keep staffing development and maintenance teams for iMessage and FaceTime - they aren't doing it out of kindness right?
> Ecosystem features like those are huge contributors to platform retention
In the UK, Facetime maybe. But iMessage isn't used to the level as it is in US. I can't imagine Apple's UK share would take a significant hit due to no iMessage when everyone already uses cross-platform Whatsapp.
There was an interview with the CEO of Signal and some UK Conservative politician[1]. It's a pretty infuriating interview, as both parties debate two completely separate issues. Meredith Whittaker of Signal rightly tries to debate the issues with breaking encryption and its ramifications, but the politician has that sorted: see, they don't want to break the encryption, they want the apps to pick up the messages BEFORE encryption and side-channel them to some government agency, at least that's my understanding. They want on-device access to the messages, which the apps have; they just also want that information searched, indexed, filtered, and sent to the police.
It's sad that these politicians don't see the problem... Well, either they don't see or understand, or they understand full well and that's the point; they're just doing a poor job of explaining why.
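The "pick up messages before encryption" model described in the interview can be sketched in a few lines. This is a hypothetical illustration (all names invented; real client-side-scanning proposals match perceptual hashes of known material rather than exact strings): the encryption is never broken, yet the content still leaves the device, because the scan runs on plaintext before encryption happens:

```python
# Hypothetical sketch of client-side scanning: E2EE stays "intact",
# but the plaintext is inspected and reported before it is encrypted.
import hashlib

# Stand-in blocklist of hashes of known-bad content (hypothetical).
BLOCKLIST = {hashlib.sha256(b"forbidden-content").hexdigest()}
reports_to_authority = []


def encrypt(msg: bytes) -> bytes:
    # Stand-in for real end-to-end encryption; the point is only that
    # the scan below happens BEFORE this step ever runs.
    return bytes(b ^ 0x5A for b in msg)


def send(msg: bytes) -> bytes:
    # Client-side scan: operates on the PLAINTEXT, on the device.
    if hashlib.sha256(msg).hexdigest() in BLOCKLIST:
        reports_to_authority.append(msg)  # side channel to a third party
    return encrypt(msg)  # the wire still carries "end-to-end" ciphertext


send(b"hello")              # passes the scan, encrypted as usual
send(b"forbidden-content")  # also encrypted -- but already reported
```

The technical point Whittaker was making fits this sketch: whoever controls `BLOCKLIST` controls what gets reported, and nothing in the architecture limits it to CSAM.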
Honestly, Meredith Whittaker came across very poorly. She made one point well, that one can't make a safe backdoor. However, she essentially refused to answer any further topics once the conservative politician said they wouldn't require a backdoor in encrypted messaging services.
Signal, Whatsapp, and iMessage threatening to leave the UK over this bill indicates to me that the bill is unreasonable. However, Meredith Whittaker did a very poor job of demonstrating that during this debate.
The problem is, there's not really a good way to debate "If you have nothing to hide, there shouldn't be a problem." The person pushing for surveillance is going to argue in bad faith that they're the good guys, and why are you stopping them from catching the bad guys? And there's no counter-argument to that, because it's very hard to find specific examples of where authoritarian governments overreach and go hard on innocent civilians. I reckon when that happens, they manage to keep it very quiet, since they are authoritarian.
On top of all of that, who wants to defend child pornographers? I can see why a person wouldn't even try to engage in that back and forth. There's just no way you come out on top, because basically your side of the argument would have to come down to "Everyone has something to hide, and we should recognize and protect that." It just doesn't _sound_ good.
What is "child pornographer" btw? Is it the one who produces? Is it the one who distributes?
Sharing CP != child abuse. Viewing CP != child abuse. Storing CP != child abuse. I want to defend CP viewers. Children can suffer during production (but not always: teenagers can film themselves, for example, i.e. make home videos just like adults). But no one suffers directly during the storage, viewing and distribution of files.
Anyone wishing to control a network of computers must choose either the tube or the box. That is, either you break the tubes and intercept traffic between opaque boxes, or you break into the boxes and leave the tubes opaque. If you assume opaque boxes then by arguing for opaque tubes Meredith Whittaker is arguing for full abdication of control of the network. Politicians don't like that. Some don't like it because in good faith they see their role as that of a protector, a threat detector. Some don't like it because in bad faith they know this is giving up a significant chunk of power. (And in reality every political leader falls on a spectrum between these two).
In practice, I think we'll waver between losing control of the box and gaining it back, as it sinks in that, truly, no backdoor can be reserved for some and not for others. I think the only viable solution is to break the tube: make it illegal to send encrypted messages over the wire. Period. Then we could keep the box inviolable, opaque, and secure. But it can no longer communicate secretly to others. In practice this will mean more "sneaker net" movement of illicit and/or secret data, and probably leaps and bounds in steganography. It would also end the internet as we know it, especially any and all e-commerce. But hey, that's a small price to pay for eliminating CSAM, right?
The interesting thing is that the government can have access to your conversations today. They just need to get a judge's order and search your phone (end-to-end encryption doesn't help if you don't delete messages). Same thing as knocking on the door with a judge's order. But the government doesn't want to deal with judges. They want unlimited access 24/7. That's the key difference, and this is why this is bad for all citizens.
The current conservative government is on its last legs before being banished for at least a decade. It appears they already have their eyes on the lucrative private sector and are willing to vandalise the UK with little thought to public protest.
I wonder if the UK will ever decide to have a written constitution. On one hand, they've been doing fine without one for centuries. On the other, if this law passes Parliament, that's it. There is no "constitutional court" to strike it down. No "president" to refuse to sign it. Etc.
> There is no "constitutional court" to strike it down.
The Supreme Court exists for this very purpose, although it is subject to Parliamentary Sovereignty[0].
> No "president" to "not sign it".
Bills do not become law until given Royal Assent by the King. The King may refuse to give Royal Assent to any bill, although this has not been done since 1708.[1]
The UK does have a system of judicial review and various laws are classed as “constitutional” (such as the Human Rights Act) and have special protection.
The UK courts are currently weak because the government threatens their independence. But the same thing is happening in other countries such as Israel, so it’s not something that a written constitution can prevent. Ultimately the only thing that can guarantee judicial independence is for people to believe in it, and for people to take to the streets. Most British people just don’t care, that is the problem.
The problem with relying on a written constitution is that it can easily be changed.
The USA is in a position where it seems to be perpetually stuck in a 50/50 split of the elected government, so it's not possible for major constitutional reform to be passed by one party.
On the other hand, the current government in the UK has a majority of 80 MPs - if there was a written constitution, it's likely they could simply pass a law to re-write it.
In the US you need a supermajority (of legislators or states) to amend the constitution. You'd have to do something like that. But the UK government system isn't really set up like this; there aren't the checks and balances of the US. (What there is instead is civil servants dragging their feet and slowing down processes they disagree with, which is not democratic.)
He’s from the same Labour-right faction, those as conservative as any Tory, just pretending to do conservatism in a nicer way. Just like Blair, who ended up doing more of Thatcher’s plan to chip away at the NHS by privatising bits of it (through PFIs on hundreds of hospitals) than Thatcher managed. Not to mention the Iraq war…
I can't help but think the other parties would put forward better policies that don't alienate the electorate nor swathes of international business. Some would say the current conservative cabinet are the low-grade remnants of the recent party turmoil, and their policies demonstrate this. Unfortunately for us, they really have nothing to lose over the next year.
We have been able to bug phones, homes, offices, whatever, for years to keep the country safe. If enough people had a major issue with that they’d start a new political party and vote it in.
Personally I quite like it. Evidence suggests there’s quite a lot of people trying to blow us up and stuff.
It’s really astonishing that Joe Biden seems to be generally to the left of the Labour party leadership. By USA standards Biden is a centrist moderate. We had the good sense to eject the Mark Penn “must go right-wing” types from mainstream Democratic circles; why didn’t Labour follow this path too?
It started well before the antisemitism nonsense, MPs in his own party were openly discussing a challenge before the results were even in for the 2015 leadership election.
First they came for protests, now they want to tackle encryption.
I don't think they like the ability of people to share unofficial information and organize themselves in huge numbers.[0]
People were sharing anti-propaganda propaganda stickers on Telegram too.[1] (At 2:40 the bus is covered in them; I also saw a video where an entire police car was covered in them.)
I am reminded of a conversation on Security Now where Steve Gibson recommended that every website with traffic inside the UK display a red banner at the top of the page, to the effect of: "The UK Government is forcibly watching traffic on this website."
Watch how quickly that turns the conversation around.
The argument lawmakers are making here is built entirely upon a straw man. I understand the deep yearning to protect children, but none of this will make criminals stop using the tools that are currently legal. They will continue to use them; making them "illegal" won't stop anyone, because the tools can still be acquired.
I agree WhatsApp VC is rubbish. Google Meet is way better. I don't know about Facetime but it probably is way better too.
But that's not especially relevant - people still mainly use WhatsApp for video calls despite the bad quality because everyone is already using WhatsApp.
As far as "We have never heard of PRISM", that could be true. When you receive a national security letter it isn't like they give you detailed info about the specific operation and why you are receiving it. It's more along the lines of "you're going to do this pursuant to US code section..."
Even internally at Apple it is common to work on projects whose name or details you do not know. I was disclosed on, and worked on, multiple projects that my own team members were not aware of.
Testing voice recognition models in preparation for the initial HomePod is one example, but quite a few software services built for unreleased hardware worked this way.
What's more, a National Security Letter is not necessarily addressed to the company's executives and lawyers. It could be sent to somebody lower in the organization, closer to the targeted data (for instance a sysadmin), with a clause threatening them with prison time if they tell anybody about it, even their boss or the company's lawyers.
They can essentially conscript anybody in the company to work as a spy using (probably bullshit but still intimidating) legal threats to keep them quiet.
Is there any evidence that low-level people have been targeted, or is this just speculation around what could happen?
Yes, "evidence for a secret program" is a bit tricky to produce, but the one I know of - Doe v. Ashcroft - the president of the company was compelled to produce data. I'd be very surprised if this wasn't the universal approach.
This sounds scary, but unlikely. Do you have a source that this tactic is used and effective?
My immediate reaction to such a letter would be to contact the company legal department regardless of whether the letter said not to, simply because I'd assume unless given very good evidence (and originating from a .gov domain isn't good enough) that it was a scam.
Edit: According to the EFF you can talk to an attorney about an NSL.
Letters can be intercepted, letters have tracking, letters of this magnitude will require certification, and at the very least will require a courier. All of that assumes the letter can be delivered, and also assumes that the recipient believes the letter is authentic. If I sent you an authentic-looking NSL, would you just do what you're told? Fat chance...
An email/phone call to a legal department is much less work and provides all the same protections.
I didn't consider that, but it aligns with my experience as well. From a C-Level perspective that is probably desirable as they don't want to get involved with that sort of thing in the first place.
We now know that PRISM was (is?) the NSA internal source designation for data acquired through FISA warrants executed by the FBI.
So actually Apple was being honest. They had not heard of PRISM, because that term was only used inside the NSA. And they were not allowing direct government access to their servers, they were responding to FISA warrants.
While technically true, it's also complete BS. It's in the same vein as torrent websites claiming they don't host copyrighted content: technically true but complete BS.
If we are being pedantic, one can claim that any data can be anything, because there is always some "one-time pad" that transforms innocuous communication into something malicious.
If the law is not precise, then it is on the government to improve it.
I agree. If we stated that torrent sites are hosting it, then what other systems are implicated? E.g., DNS itself provides a pointer to the torrent sites, which in turn have a pointer to the data.
Prism also involved an NSA program that was done without the knowledge of the tech companies. They inserted a splitter on the fiber lines leaving data centers to copy all the traffic.
That could be true? IIRC, PRISM was less about direct access, and more about abusing every potential method of gaining information.
For example, imagine a Login page that said, "Password incorrect," versus "User does not exist." If you have "User does not exist," you could use that to figure out whether a given email address has an account with a service. That could be useful information to PRISM when looking for a target to subpoena or monitor. (This is also why it's now best practice to just say "Login incorrect" or something vague that doesn't say whether the username, or the password, was wrong.)
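A minimal sketch of the practice described above; the user store and function names here are hypothetical, just to contrast the two styles of error message:

```python
# Hypothetical user store for illustration only.
USERS = {"alice@example.com": "s3cret"}

def login_leaky(email: str, password: str) -> str:
    # BAD: distinct messages let an attacker enumerate which
    # email addresses have accounts with the service.
    if email not in USERS:
        return "User does not exist"
    if USERS[email] != password:
        return "Password incorrect"
    return "OK"

def login_safe(email: str, password: str) -> str:
    # BETTER: one vague message for every failure mode, so a
    # failed login never reveals whether the username exists.
    if USERS.get(email) != password:
        return "Login incorrect"
    return "OK"
```

With the leaky version, probing `login_leaky("bob@example.com", "x")` confirms that no such account exists; the safe version gives the same answer for unknown users and wrong passwords.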
It goes further than just the error message. I think the original exploit was based on how quickly Unix would fail a login: a bad user failed faster than a bad password, and that allowed you to enumerate usernames.
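The usual mitigation for that timing leak can be sketched like this (names and the toy hash values are hypothetical): do the same amount of work whether or not the user exists, and compare secrets in constant time.

```python
import hmac

# Hypothetical store: username -> stored credential hash (toy values).
USERS = {"alice": "9f86d081884c7d65"}
DUMMY_HASH = "0" * 16  # compared against when the user is unknown

def check_login(user: str, candidate_hash: str) -> bool:
    # Always perform one comparison, even for unknown users, so the
    # timing of a failure does not depend on whether the user exists.
    stored = USERS.get(user, DUMMY_HASH)
    ok = hmac.compare_digest(stored, candidate_hash)  # constant-time compare
    return ok and user in USERS
```

`hmac.compare_digest` avoids the early-exit behavior of ordinary string comparison, which bails out at the first differing character.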
It was not mentioned in the leaked documents that the Guardian published. Instead, it said that the data came "directly from" the tech companies (who responded to wiretap requests on specific accounts). Greenwald couldn't be bothered to understand what the documents said and hallucinated the phrase "direct access" like a low capacity language model, but we on HN are functionally literate.
Does anybody remember the December when Apple shut down all the developer systems? Its suddenness gave it the feel, to me at the time, of a hack.
I have always associated that period with the discovery of Prism. I would love to hear if anybody knows what I'm talking about.
Apple has perfected their virtue signaling game and all the simps love it.
I don't really mind it either anymore, a lot of people have this hero worship fetish and they can't help painting everything in black and white, even though they're all just different shades of gray.
Apple is definitely brighter on that scale than Google or Meta, but they're all corrupt multinational corporations that will do everything they can get away with to increase their bottom line.
Knowing this, and judging by the tone of your writing: what do YOU do about it? What computer are you working on right now? Are you sure every component in it is legit? I take it you're using some form of Linux, because they're not "corrupt" (well, depending on the distro, even that is a big debate).
In China they are running their own infrastructure on servers owned by party-affiliated businesses. I wonder how strong their encryption is over there?
China blocks many services with strong encryption but it doesn’t block iMessage.
Only FaceTime Audio calls are disabled. Normal FaceTime calls with video work fine on iPhones purchased in mainland China... so you have to video-call people using those phones even if you just want to talk to them and not see them. It's kind of silly, because it's really unclear what actual problem this solves for anyone, including the government. Perhaps the goal is just to create enough friction that no one really uses it.
IIRC it's something to do with needing a license for audio-only telecommunications services. But I also might be entirely wrong about that.
Pretty sure that is related to competition with China Telecom. When I was working in China, it was the same: voice only calls had to be made via our phone and charged according to China telecom rates, while video calls could be made for free.
Seems nobody uses iMessage in China. The stats I can find don't even list it. Maybe they're ok with leaving it as a way for visitors to talk to people at home.
Chinese people do use iMessage, although not necessarily deliberately. SMS is very commonly used in China, and when one iPhone user wants to message another, the default is iMessage instead of SMS. So I had tens of native Chinese contacts, including my computer-illiterate parents, using iMessage back when I was in China and using an iPhone. Also, SMS costs a fee on the most popular mobile subscriptions in China, while iMessage is almost free.
So it is a good question why the CCP didn't block iMessage, even though we know SMS is heavily censored and monitored (see 金盾工程).
So the CCP is just content to leave this encrypted form of communication controlled by an American company while simultaneously locking down every other avenue of electronic communication? Because “no one uses it?” If no one uses it why not block it then?
People do use it, just not Chinese nationals, so blocking it would only inconvenience visitors. Or because Apple has some special kind of leverage. Can only speculate.
Why do the British still think they have power to do dumb stuff like this? Their economy is floundering, their consumer market is small, and their government is a joke.
In a word, populists. This is part of a stance of being 'tough on crime'. Whether the bill succeeds or fails isn't really important; what's important is that they can trumpet how tough on crime they are. If it fails, it's not their fault: it's the opposition, who care more about criminals' rights than your children's safety, etc. etc.
As to why the public themselves go for it, the media landscape in the UK is in a pretty bad way at the moment. An enormous amount of power is still wielded by the traditional press - specifically the power to set the national conversation.
70M is small, but not that small. Larger than any US state, larger than most countries in the EU.
It's also anglophone, which to anglophone companies (Apple still is, to a significant degree), means that its 70M are worth a bit more than the absolute number suggests.
There's a lot of good reasons to hate Apple, but when they're right they're right, and I definitely appreciate that they're willing to stand up for something like this, even if it is in their best interest.
Every generation has its PGP / Clipper Chip / etc battle. Eventually we'll lose. But hopefully we're not ready to give up yet.
I read today that the younger generation grew up with and is so used to surveillance they don't care. It's just a thing. But old codgers like me most definitely care.
iMessage and FaceTime are available in China, but it hardly matters because most communication in China happens over WeChat. If all the important stuff is happening over the heavily monitored app, leaving a few stragglers is not a big deal. Especially because it's not like a criminal gang is going to have the money to arm every member with an iPhone on China's wages (and the ones that do are likely white-collar and detectable elsewhere), and it's also not like Apple is allowed to actively advertise or even discuss themselves as being "more secure than WeChat."
EDIT: Also, ~80%-90% of iPhone users in the country likely have iCloud Backup on. iCloud Backup is not E2E-encrypted in China, and is hosted by a Chinese company instead of Apple. If you want full E2E, you need to use iMessage and hope everyone in the party doesn't have iCloud Backup on... and that's a pretty niche threat now.
Kinda apropos, I’ve used international data roaming in mainland China and was (initially) astonished to find it totally bypassed the GFW and I could run IPsec and SSH unencumbered, none of the unabashed MITM fuckery I saw from using strong crypto over Chinese hotel wifi for more than a couple of seconds. That was a few years ago, I don’t know where their interception regime is at today, but it was a reminder that propaganda begins at home.
LTE and 5G standards allow end-to-end encryption of the roaming traffic. So, it's by design. If your operator chooses to enable e2e (and usually they do), there would be just two options: block your traffic entirely or let it pass as it is.
iCloud keys are managed by a Chinese company, so in practice, yes. iPhone E2E features like iMessage, by definition, do not use keys that anyone else (including the government) possesses.
Though it hardly matters when ~90% of iPhone users likely have iCloud Backup on, in which case their messages will be backed up to the cloud, where the government could get the keys.
Well, to be fair, the alternative is to simply leave China entirely. Considering how much malware and spyware the average Chinese phone has... Better a compromised iPhone than some of those things.
In case you didn't know: Apple turned over private text messages of congressional officials and NYTimes reporters WITHOUT protest or question to the Trump Administration. And before you say otherwise, Google and Microsoft both did not.
They do not care about privacy, it’s all marketing.
My understanding is that Messages and Facetime are still E2E encrypted in China. However, if you back things up in iCloud, China does have the keys to decrypt that. The difference is the UK wants to violate E2E encryption in Messages/FT.
I could be wrong about the current state, but I need to see evidence first.
Someone should bring up lawyer and doctor privilege as an argument. Phones are intimately personal devices that people take to their homes and use for a lot of very private stuff. Phones cannot fulfill their job as a personal device and be a police snitch at the same time just like lawyers cannot advise their clients and inform the police at the same time. There needs to be a personal computer privilege.
But it’s never been that way in the UK. If you were suspected of planning a murder/bombing/atrocity/etc they could bug your phone even 50 years ago. And personally I’m quite glad about that!
Back then phones were as personal as a public phone booth. The way phones are used today resembles cybernetic enhancement more than tool use; that's where the concern for privacy comes from. There's a bell curve to technology adoption and some people have a hard time processing this, but things are generally moving in the direction of increasing cyberization. Government messing with privacy can have two effects: (1) halting cyberization by ruining public trust in the technology once and for all, or (2) planting the seeds of an unimaginably totalitarian society before most people realize where this is going.
Why not a similarly brave stance in China? In China Apple claims it "obeys all local laws", yet in the UK it wants to be courageous? I guess it's a brave stance unless the $$$ are good.
Apple claims that messages sent with their Messages application are end-to-end encrypted, even in China, and that they have made no concessions to the government of China.
But... everyone in China has their iCloud backups stored in China, and Apple holds the keys to those backups. So while Apple cannot read E2E-encrypted Messages content in transit, the phone uploads a backup of all its Messages data encrypted with keys Apple controls, which is effectively readable.
It seems likely China can ask Apple for a backup of a citizen's phone and Apple will comply with that request. Or they have access to these data centers.
That is a good point. It's also possible that Apple has made some kind of concession but found a weasel-y way to do it that lets them feel they can claim no concessions have been made.
Maybe the difference is that Apple feels they may be able to talk the UK out of this idea.
Yeah, pretty interesting point. Seems like either the UK market isn't big enough for them to bother implementing a compliant solution, or they don't care about the market; and they care about people in the UK but not about people in China.
I'll say in advance that I agree with the more cynical point as well, but I think we need to consider that there are equally plausible, more positive potential reasons:
- if there is success in the UK, relevant people at Apple can make a more compelling argument to push back on China and other governments
- perhaps Apple is still pushing hard on China and using the UK as an example that they're willing to drop a long-time strong market over anti-encryption legislation
I'm not saying either is the case, but there are benign or even positive reads on why Apple is more aggressive with the UK.
It's not super interesting. Apple's reliance on China in many ways, and thus hypocrisy, is reasonably well known.
"At least" companies can put up a fight in the UK (or US, see Apple in the San Bernardino case) and they have a reasonable chance they can lobby to block the efforts.
Apple isn't foolish enough to believe that China will fold; it is obvious that if Apple refuses, the iPhone will be banned.
However there is a great chance that if they threaten to pull out of the UK that the law won't pass. Even if it does it is good marketing. Who knows what will actually happen if the law comes into effect. The UK market definitely isn't small, for all we know they may back down. Or maybe their reputation is more valuable.
so the government that arrested people for jokes, knocks on people's doors over mean tweets, and politically ousts people for having the wrong opinions now wants to destroy privacy for the 84th time in 10 years
wew, it's just one big firework waiting to blow. glad i've got the spectator seats
I do like how Apple has backed themselves into a consumer-protecting hole here. I'm sure they would love to have access to that much conversational data themselves, data to which no other company has access.
If Siri ever improves too significantly I’ll get nervous.
I don't think so. There are people in the world who don't want to pry into others' lives. Collecting unneeded data on people is Nixonian.
I think Apple managed to understand and attract those types of customers. I see a generational disconnect between the computer users of the '70s and the users of the '00s.
I don’t think it’s possible for a company as big as Apple to act solely on principle. We’re just lucky the profit motive aligned with consumer privacy for once.
Untainted (without ML-generated content) human communication is becoming more and more scarce. It’s only a matter of time before they take advantage of that.
They'll forbid any app using ML generated content from providing iMessage extensions, or possibly it’ll receive a warning mark in iMessage when sent. Could also work with content pasted from those apps. Some sort of “provenance check”. They’ll pitch it as a consumer privacy thing, I promise.
That'll also be the same update where E2E encryption falls out of the marketing.
There is very much a motivation to access massive amounts of user data, especially when you control the OS that would be the gate keeper to that data, as well as how the “choice” is presented to the end user.
> I don’t think it’s possible for a company as big as Apple to act solely on principle.
I agree but think that Apple is the exception that proves the rule. Its origin story is as good as any Shakespeare.
The second act of Apple started small and focused, and re-built itself while keeping the Wall Street mindset at a distance. I think your sentiment misses the fact that Apple doesn't really need Wall Street. Their stock can do whatever it wants; the lower it goes, the more they can buy back. Why would Apple lower their standards?
On their conference calls they always talk about reaching cash neutrality as their goal. I never really understand what that looks like.
> Untainted (without ML-generated content) human communication is becoming more and more scarce. It’s only a matter of time before they take advantage of that.
I'm not sure I understand this comment. Apple's size relative to some nations' GDP enables them to hire real humans at an amazing scale. This is different from a company that doesn't already have a culture of support built in. They also have the advantage that many countries would partner with them to raise their own GDP.
I think their approach is to grow the AI to help the existing workforce scale. Get more work done using fewer live bodies.
> They'll forbid any app using ML generated content from providing iMessage extensions, or possibly it’ll receive a warning mark in iMessage when sent. Could also work with content pasted from those apps. Some sort of “provenance check”. They’ll pitch it as a consumer privacy thing, I promise.
I hope so; that is why I'm an Apple customer. I also think you are misreading the times: Apple is about to hand over a lot of authority to the customers.
> There is very much a motivation to access massive amounts of user data, especially when you control the OS that would be the gate keeper to that data, as well as how the “choice” is presented to the end user.
Don't project your ambitions on others.
I do believe that Apple's ability to maintain their performance is mystifying and wonder how long it will last.
The difference is that they make money on hardware and the app stores, not on advertising (they have a small ad product but it's probably closer to a rounding error).
Since Apple does not disclose how much money they make from advertising specifically, conflating it into their "services" business, we have to guess a bit:
"In 2021, Apple generated 3.7 billion U.S. dollars with its global advertising business." [1]
"Apple’s services business, which includes advertising and subscription revenues, grew 24% year over year last quarter to a record $19.5 billion"
"It’s likely that a large share of this category’s growth comes from advertising" [2]
"Apple ad business could reach $6B by 2025, with $4.1B from Search Ads" [3]
For reference, 2023 Q2 revenue from "services" amounted to 20.9B USD, while
hardware product sales amounted to 73.9B USD. [4]
We're thus talking about ~4%, not exactly a rounding error.
Conversational data between human beings is exactly the sort of tokens you’d want to train a massive in-house LLM on. Then you could have embeddings for the local stuff on-device, and still call it “private” models.
The magic data dust is only useful if they can use all of it though. I just see some incentive there that I can’t imagine them ignoring in a couple years.
> Apple previously withdrew plans for its own CSAM-scanning feature for iCloud Photos, following pushback from customers and human rights groups. Apple’s solution was more privacy-preserving than what is now proposed by the UK government.
Preface: I don't have an opinion on one side or the other.
Question: what were the arguments levelled against Apple that caused them to withdraw? Also, is CSAM the motivating factor for the UK, or do they want a backdoor for more general national-security surveillance (or whatever you want to call it)?
> Also, is CSAM the motivating factor for UK or do they want backdoor for more general national security surveillance
Baroness Beeban Kidron has been lobbying for stringent anti-CSAM measures in tech for years [0]. She's led a major pressure campaign in the US, Canada, and the UK [1]. Her charity 5Rights and WeProtect have both been able to back Labour and the Tories, so she's able to lobby across the aisle.
It doesn't hurt that the publishing company her parents founded (Pluto Press) has a strong niche in the political space.
The Molly Russell suicide also played a role [2], which she leveraged to highlight the need for restrictive anti-CSAM measures, especially as it became a top tabloid story in the UK.
I, broadly, think the House of Lords is a good thing in principle. Having a board of experts who can trounce things like the Illegal Migration Bill's potential for human rights abuse is a good thing.
FYI, the House of Lords can't reject the same bill more than three times, and can't reject a bill that calls an election, so its undemocraticness isn't as bad as you might assume.
There are many issues with it but I don't think many of them are much worse than those that already exist with the House of Commons. (Lobbying: exists in both. Unelectedness: the first past the post system is pretty bad here too. Ability of unelected people to introduce bills: private members' bills exist too).
All of these are problems I'd like solved but I wouldn't support removing the House of Lords.
I think it was a preemptive attempt by Apple to avoid the current situation. They looked at what was supported by the 4th Amendment and designed a technical system that was the best compromise between no access and the ability to search for known images.
They would rather have maintained the status quo, but thought they had hit on something that would work. Since they didn't need to do anything, pulling the software was an easy fix to the controversy. The governments are the ones pushing this, so let them take the heat.
Apple has a working solution and they can just wait and see what happens and react accordingly.
One common argument was that bad state actors would get non-CSAM, but still "undesirable" (as defined by the state), material into the database and use it to target dissidents, badthink, etc.
(Personally, I think this was an overblown risk, since the plan was for the images to first be reviewed by Apple before being sent to the authorities. Presumably if an Apple employee/contractor saw a photo of, say, Tank Man, they wouldn't pass it on. But it's reasonable to be concerned.)
I agree. That was just an argument using Apple's CSAM implementation as a vector for a bad state actor. The same misuse could be facilitated in any number of other ways.
I have come to the conclusion that part of corporate combat involves apparent grassroots outrage.
If a system without a wide open backdoor is proven effective it becomes more difficult to argue for a bigger backdoor.
Apple's "solution" was the technical equivalent of Microsoft having Windows Defender scan every single file on a user's computer and attach a "hash"/tag to it, then download a database of "forbidden hashes" from an unspecified government authority.
Their initial plan was to compare against the database only when you upload to iCloud (in the comparison, OneDrive) and then notify the authorities, but of course, once the infrastructure for local scanning was in place, extending it to other "scenarios" would be just a small update away (I'm sure the RIAA/MPAA would love to hear about that old mp3 folder you kept moving from PC to PC over the years).
The fear was that Apple would be coerced to add in non-CSAM hashes to the hit list, much as Apple is now compromised fully in China.
Thus any undesirable material - a campaign poster, a meme, a video leak of classified info - would be yanked from customer devices and trigger this automated reporting flow.
Apple already gives the Chinese government access to iCloud data for Chinese citizens. Being bullied into searching for eg anti-Xi material is a small jump.
The other concern is that the hashes themselves are prone to false positives. This is true both of the source hashes and of the matching algorithm. The National Center for Missing & Exploited Children data has been known to match on pictures of tricycles and monkeys (clearly not CSAM). So now we are in a world of pushing automated child-abuse charges against people because of a faulty algorithm.
Of course it's just about surveillance. CSAM is just an excuse: a topic so sensitive that it makes the measures easier to justify, which masks their actual intentions.
Apple's solution was to run every photo through a perceptual hash algorithm with a neural network inside of it and then compare that against a list of known hashes. This is actually not materially different from the system that every image host has used for a while now. The main difference was that they were going to use a bunch of cryptography to ensure that...
- Images could be scanned only after a threshold number of illegal images had been flagged
- Law enforcement agencies could not use the system to flag non-CSAM images
Part of the way they achieve the former - and what makes the system unworkable - is that it's client-side.
To be clear, every service already scans for CSAM. But the naive way of doing this - comparing SHA-2 message digests - can be trivially manipulated into producing false negatives. Just flip a bit in the image and it becomes a new image. Perceptual hash algorithms instead use notions of image similarity to protect against this, but it also introduces the possibility of false positives - i.e. images that look innocuous but match against CSAM and get you flagged. This is not merely a problem to be solved, but a consequence of such systems being differentiable, which is necessary for them to actually work.
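The bit-flip evasion described above is easy to demonstrate with any cryptographic digest; here is a toy stand-in using SHA-256 (the bytes are placeholder "image" data, not a real file):

```python
import hashlib

# Flipping a single bit in a file yields a completely different
# SHA-256 digest (the avalanche effect), so a naive exact-match
# blocklist of digests no longer matches the modified file.
original = bytearray(b"pretend this is an image file")
tampered = bytearray(original)
tampered[0] ^= 0x01  # flip one bit

h1 = hashlib.sha256(bytes(original)).hexdigest()
h2 = hashlib.sha256(bytes(tampered)).hexdigest()

assert h1 != h2  # the exact-match check is defeated by a one-bit change
```

The two digests share no useful similarity, which is exactly why exact hashing produces trivial false negatives and why perceptual hashing is used instead.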
For remote scans, you can at least keep the perceptual hash algorithm secret and avoid leaking information about which images do and don't get flagged. This will frustrate attempts at adversarial training. However, when you put the perceptual hash on everyone's phone, it can be extracted, and people who don't like the idea of running a spying dragnet will deliberately break the hash just to tell you to fuck off. So people were making tools that would modify one image to look like another, which could be used to either hide CSAM or, worse, make cat pictures that get your iCloud flagged.
There's also a bit of moral insult involved with having your phone do things that aren't in your benefit, even if this ostensibly were to keep the feds from demanding explicit backdoors[0]. This system is laughably easy to defeat if you're jailbroken - just turn the daemon off. Hence why Apple made no attempt to integrate it into macOS where users have root access[1].
Law enforcement would never firewall their capabilities to "just CSAM." That's not how the law works. You can't sue your drug dealer for not giving you the weed you paid for, and for similar reasons, law enforcement is never going to only wiretap pedophiles and not drug dealers. There's a few exceptions to this[2], but generally, you should assume that if you're about to give the feds the legal capability to spy, it will be used for every crime down to and potentially including routine traffic stops.
Apple's bulwark against that was to have the system only scan for images that were flagged by multiple independent child protection agencies. However, this isn't a guarantee; the western world routinely collaborates in very not independent ways. There's no guarantee that, say, Europe's pedo-fighters wouldn't have had their arms twisted by their countries' intelligence apparatus, operating in concert with the American CIA who was arm-twisting NCMEC. Apple promised they'd be reviewing all reports before forwarding them to law enforcement, but in this situation there's nothing preventing them from having their arms twisted here, too[3].
[0] For the record, there was no evidence that Apple was using this system to enable iCloud end-to-end encryption. They'd already turned that off because of a UX problem: too many people were permanently locking themselves out of their own data and Apple couldn't break into it for them.
[1] Root access on macOS is complicated by the fact that SIP and boot security exists. They could at least theoretically have made the kernel refuse to terminate the CSAM scanner process at all, but that would be a speed bump at best. Users absolutely can turn off SIP or boot modified kernels on Macs that would remove whatever protection Apple added to the CSAM scanner. And Apple currently has no interest in locking owners out of macOS like they did with iOS.
[2] The NSA actually does try to keep their spying tools to themselves, because they have to fight nation-states, not criminals. Even then, they'll still slip hints under the door and tell the FBI they can only use them if they can make up a story that doesn't expose how the NSA's spying apparatus works.
[3] National Security Letters are one hell of a drug.
Apple could also have their arm twisted to add backdoors, but they've already shown that they'll yell loudly if that happens. Furthermore, the existence of software signatures means that such a backdoor would generate irrefutable paper trails, which is the sort of thing that intelligence agencies hate. It's much easier for them to just quietly extend an existing quasi-backdoor than to create an explicit one that loudly screams "DO NOT USE YOUR PHONE TO DO CRIMES"
Most systems don't scan for images the naive way. NCMEC distributes hashes as PhotoDNA, which is a perceptual, not a cryptographic, hash, and is therefore tolerant of some level of change to the image.
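To make the perceptual-vs-cryptographic distinction concrete, here is a minimal sketch using a toy "average hash" (PhotoDNA itself is proprietary and far more sophisticated; the 4x4 pixel grid and threshold scheme here are purely illustrative):

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if that pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny "image" as grayscale values, and a slightly brightened copy.
image = [[10, 200, 30, 220], [15, 210, 25, 230],
         [12, 205, 35, 215], [18, 190, 28, 225]]
tweaked = [[p + 5 for p in row] for row in image]

# Perceptual hashes stay close (here: identical).
print(hamming(average_hash(image), average_hash(tweaked)))  # 0

# Cryptographic hashes diverge completely on any change.
h1 = hashlib.sha256(bytes(p for row in image for p in row)).hexdigest()
h2 = hashlib.sha256(bytes(p for row in tweaked for p in row)).hexdigest()
print(h1 == h2)  # False
```

A small brightness change leaves the perceptual hash intact, so a match still fires, while the cryptographic hash changes completely on any one-byte difference - which is why naive hash matching would be trivially defeated by re-encoding an image.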
An interesting narrative, but there are alternative perspectives to consider as well. It's not necessarily a matter of whether or not the UK needs Apple to break their encryption. In fact, they may not need to do so at all. Companies like certain Israel-based firms specialize in selling a myriad of cyber tools, often at quite reasonable prices, that can potentially bypass the need for direct encryption breaking.
It often feels like these public back-and-forths could be theatrical performances, designed to shape public perception more than anything else. State-level actors generally operate with a much wider set of tools and resources at their disposal than what is commonly understood. The fact that the UK is a member of the Five Eyes intelligence alliance is also noteworthy here. This alliance allows for expansive information and resource sharing. The reach and depth of these resources can often surpass the need for individual corporate cooperation.
It's essential to look beyond surface narratives and remember the complexity of global surveillance practices and cyber strategies. Yes, a company's commitment to privacy is admirable and crucial, but it's equally important to stay aware of the many other methods that state actors have at their disposal for obtaining information.
This isn’t about targeted attacks, though. This is about a blanket bill to monitor everyone’s communications.
The tools you mention still generally require either physical access to the device, or are in some way targeted. You’ll likely never be able to stop these, but a blanket communication encryption ban both serves an entirely different purpose and can much more easily be stopped.
Potentially going against the grain of HN opinion:
It strikes me as odd that people are now suddenly condemning the UK gov for trying to maintain its ability to keep its people safe, particularly when spying is not an unusual or a new power per se.
The UK government has had the means and the right to spy on its citizens for hundreds of years, via phone taps, espionage, bugs, steaming open letters and all sorts.
It’s served us very well as a country and has also saved many lives - just look at the stats on thwarted terror offences using intercepts as evidence. We are a democracy and if enough people didn’t want the government to hold these powers there exists the means to achieve that via electing a new government.
An American company is now threatening to stop selling some of its products in the UK if the government refuses to let it sell products that weaken their ability to police the nation.
In the context of what’s important to the country perhaps we should choose security over the profits of Apple’s shareholders.
The problem is that for hundreds of years, the vast majority of your communication was in person, or it was not recorded. Now almost every thought you have is permanently recorded. I have no problem with actively suspected criminals having their current communications intercepted. What I have an issue with is having everyone's historical communications stored and open for the taking.
I wonder what happens when all the Conservative Party MPs discover that this sort of legislation means even more of their internal party conversations — currently on WhatsApp but who knows what they'll use next — get leaked?
Will Apple still support SMS? Will users be notified that they’re “suddenly” dropped down to a less secure channel, or will they simply have to pay attention to the text color and actually know what it means?
Lol, fat lies - Apple pumping propaganda to try to maintain an image within the market.
They still operate in Australia. Therefore they can legally be backdoored, and almost certainly have been, but are legally required not to notify their users or make note of it internally. Couple that with some Five Eyes legislation and you've got cross-border, no-warrant monitoring.
This grandstanding by Apple is nothing more than propaganda for the plebs. Go read countries' laws. Apple toes the line like everyone else. You're a sucker if you think otherwise.
One thing I don't get: how does a government think they can ban encryption? Criminals can always encrypt their communication if they want to. How could they even be detected?
I don't think they realise that some of the hacking I've seen on computers happens at the hardware level.
Take this PC, a fairly new Dell, fully patched including the BIOS, and certain websites (Daily Mail, YouTube and Reddit) with adverts deliver something that disables the Dell mouse.
The only option is to reboot the PC, but I have to unplug the mouse from the USB port and plug it back in, otherwise the Dell USB mouse still won't work after a reboot.
So I love their hubris, but I don't think they see some of the hacking that goes on, because it's at the hardware level.
How is it supposed to work, if the UK laws were implemented?
Only phones activated in the UK or with UK set as the activating country have to follow the rule? Only if there's a SIM card? What about iPods or MacBooks? What if you bring a US phone to the UK, does it have to magically detect and suddenly allow backdoors?
If you leave the UK with your UK phone, is it allowed to be encrypted then, or the UK wants the ability no matter where you go?
My understanding is that Ofcom, the existing communications regulator, will be given new powers to regulate messaging application vendors. One of their powers will be to enforce the scanning of messages for CSAM or terrorism-related content, and Ofcom can force vendors to use an "accredited" service to scan for the illegal content.
I suspect the scanning of this content will be handled solely by accredited services, these likely will have close communication with one or more government entities who supply information on what content is or is not illegal. At this point there isn't really end-to-end encryption anymore: every message will be submitted to one (or more) accredited services. Even if it's encrypted, each message now has the government's blessed service as a recipient.
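The "extra recipient" point can be sketched as toy code: in hybrid encryption, a per-message content key is wrapped once per recipient, so a mandated scanner key is simply one more entry in the list. All names here are hypothetical and the XOR "cipher" is illustrative only, not any real protocol:

```python
import secrets

def xor(data, key):
    # Toy stream "cipher" for illustration; real systems use AES/Signal-style crypto.
    return bytes(d ^ k for d, k in zip(data, key))

def encrypt_to_recipients(message, recipient_keys):
    """Hybrid-style sketch: one random content key encrypts the message,
    and the content key is wrapped separately for each recipient.
    Whoever holds any wrapping key can recover the message."""
    content_key = secrets.token_bytes(len(message))
    ciphertext = xor(message, content_key)
    wrapped = {name: xor(content_key, k) for name, k in recipient_keys.items()}
    return ciphertext, wrapped

# Alice messages Bob -- but a mandated "accredited scanner" key is
# silently added to the recipient list.
msg = b"meet at noon"
keys = {
    "bob": secrets.token_bytes(len(msg)),
    "accredited_scanner": secrets.token_bytes(len(msg)),
}
ct, wrapped = encrypt_to_recipients(msg, keys)

# Bob decrypts as usual...
assert xor(ct, xor(wrapped["bob"], keys["bob"])) == msg
# ...but so can the scanner: the conversation is no longer end-to-end.
assert xor(ct, xor(wrapped["accredited_scanner"], keys["accredited_scanner"])) == msg
```

Nothing about the wire format has to change for this; the scanner entry is just one more wrapped key, which is why mandated scanning is often described as breaking end-to-end encryption without visibly breaking the encryption.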
It's important to remember that we're talking about text messages as well as images; a system that compares only hashes will not be sufficient. The bit about terrorism-related information is also worrisome, as what is and isn't terrorism-related is a very political question.
Is there a tech analysis of what exactly the UK is trying to pull? "Ban encryption" sounds pretty vague. If my friend and I negotiate a cipher and write each other encoded Post-it notes - are we going to jail? Or should we provide a copy of the cipher to the nearest policeman?
Tories are intent on policing and micromanaging citizens by violating their private lives and their values to cherry-pick whom to judge, shame, and prosecute as part of a greater vision of self-righteous, "moral" "conservatism" in the spirit of Iran's morality police.
The same company who showed all governments that mass local photo scanning is possible, now wants to counter security legislation?
The damage done by apple with their previous plans for CSAM scanning is huge, and apple will be partially responsible for all the upcoming anti privacy / encryption laws.
On every single message outside the walled garden, Apple should have a note appear in iMessage: "Due to Regulations XYZ 2023, this message may not be secure and may be readable by third parties."
Blocking the MS-ABK deal isn't about being a global superpower, but about MS and ABK doing business in UK. It's the same with EU, Netherlands, Japan and US.
Whatever services MS want from Activision they can purchase on an ongoing basis without blocking others like Sony just to block the competition.
Who wants Call of Duty to go xbox only like Halo?
Microsoft is a platform provider; the content creators that use their platform should be free to publish to any platform. Otherwise Microsoft can bump the prices and say "take it or leave it, but you can't get it anywhere else."
what if instead of breaking encryption and violating privacy rights, we had governments enforce the death penalty for people that rape children? Why are we trying to violate the rest of society through our refusal to take out the trash?
They're complying, in that China has not required them to alter iMessage in China. It's still E2EE. But iCloud backups aren't, and each country's respective government has access to those.
Strange that you got downvoted for stating that people should have power to hold politicians accountable.
Look at the top comment and see how illogical it is:
> I'm happy to pay a premium and be locked into their walled garden for some things if it means supporting a company that has the power to shift policy in favor of human rights (privacy in this case).
Morons like this actually believe taking away human rights and privacy will make them safer. News flash, the danger are people in power that can't be held accountable. Lastly, be skeptical of anything Apple says as they work together with governments to take away your privacy. Psychopaths will say anything to get you to trust them.
This is just Tories fabricating an issue so they can come up with a solution during their campaign. They’ll fix anything but their own corruption, excessive taxation and the dwindling quality of life. Perhaps they’ll increase taxes a bit more to, paradoxically for a conservative party, keep the socialists of the UK happy.
In the meantime, it might seem very counterintuitive, but it is actually evolutionary and GOOD regulation!
It will force custodial monopolies like Meta and Apple to either go away OR provide programmability for users to incorporate their own encryption layers, with the ability to control their own keys.
It just ignores this part of local legislation, like everyone else. The government tries to pressure companies to provide encryption keys, but this is unfeasible as it doesn't have big enough leverage, especially now, when hostile international relations shield foreign companies from legal threats. So the only thing they can do is an internet block, but most users already have some kind of VPN installed to circumvent such obstacles. It also weakens the government's negotiating position, because once you use the nuclear option, there are no more ways to threaten the company.
Huh, interesting. I interpreted OP as "Messages is pointless today, as it already has backdoors". I hadn't even considered your reading of it until you pointed it out.
Stuff like this is why I'm an Apple fanboy. I'm happy to pay a premium and be locked into their walled garden for some things if it means supporting a company that has the power to shift policy in favor of human rights (privacy in this case).
In fairness, Meta has said the same thing re: WhatsApp, so while Apple deserves _some_ credit here for sticking to their principles, they're not exactly leading the pack. Which brings home the point that if _Meta_ isn't onboard with your privacy-obliterating proposal then you know you've really lost the plot...
In even more fairness, Meta literally skipped the EU for Threads instead of increasing privacy. In their case it's just a careful balance of virtue signaling; Meta had no qualms about pushing their spyware to hundreds of millions of people, violating basic human rights in the process.
Your comment would have been stronger if you left out the piece re: the Universal Declaration of Human Rights, because it's an eye roller. Article 12 is:
> No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.
I'm no fan of Meta or their privacy violations, but saying that voluntarily downloading an app (which, by the way, clearly outlines its privacy practices) is equivalent to having my human rights violated just really waters down what "human rights" should mean.
That's all true until an app becomes the only way to interact with certain groups of people. I certainly wouldn't call my usage of Discord voluntary. It doesn't actually matter how often I read their privacy policy and shake my head: either I break off contact with a large part of the internet that's important to me, or I begrudgingly agree "voluntarily" to their terms and conditions.
People miss the forest for the trees here. Democratic society needs a way for people to communicate freely. Those ways are dying out fast.
In modern life, people don't discuss politics in townhalls, pubs, or whatever anymore. Online communication is by now far too siloed and supervised to the point of being unusable. Media are one-way streets, pushing the agenda of billionaires.
But you need to be able to have controversial discussions about "heavy" topics without fear of being ostracised. You need to have a back-channel to give feedback to "higher-ups". And so on.
A society where the individual is degraded to a mere drone (even if only loosely via framing) controlled by the "hive mind", without any recourse to voice constructive criticism is a lot of things, but certainly no democracy. There looms a dystopia on our doorstep.
To be honest, I think you've got at least part of it backwards.
> But you need to be able to have controversial discussions about "heavy" topics without fear of being ostracised.
Uh, why? People should have freedom of speech, but it feels like a lot of people want freedom from the consequences of their speech. If anything, I think a lot of our current social disharmony is a direct result of the extremely recent development of people being able to mouth off online with very few social constraints.
Mike Tyson probably said it best: "Social media made y'all way too comfortable with disrespecting people and not getting punched in the face for it."
Not even that. If I’m not mistaken, Facebook imported contacts from users’ address books and then allowed third-party apps to access that information, so even if you didn’t have a Facebook account, your data was leaked.
Have you ever actually tried to convince a non-technical friend or family member to use a different platform for communicating? Now try that with 10 people at the same time. The only reason Discord ever became successful is the huge vacuum of good UX that existed before it: a fragmentation of Skype, Steam, XFire, Teamspeak, Vent and Mumble, plus Jabber if you were playing certain MMOs. Signal and Keybase aren't even close either. Matrix is a little better, but still far off, and they have the Mastodon problem where explaining federation turns off normal people's brains.
And aside from that, a lot of software communities are on Discord now. People are gating download links behind it, use their threading feature for support and put their knowledge base in channels. And not just small communities and developers. ASUS has made their main communication channel Discord too.
I understand! It's just that I never had to download Discord for anything.
I agree it's difficult. I've had success showing keybase to people because of how barebones simple it is - but it's not a solution for everything of course.
You are very ignorant if you think there is no barrier to getting someone to try an app with a substantially worse UX than what they're currently using, and that that barrier doesn't border on impossible in a group of 10.
I can't even get my wife to install Element - and she works in tech as a senior python / automation engineer. People want neatness on their phone and low total complexity, and installing a messenger for a single person is over the line for most.
You have insults I have numbers. We are not the same…
> You are a very ignorant if you think there is no barrier to getting someone to try an app with a substantially less good UX than what someone's currently using, and that that barrier doesn't border impossible in a group of 10.
Yet many people do have multiple messaging apps.
And Discord the one you think is so popular is not even in the top 10…
This isn't compulsory, it's just personally beneficial. There are plenty of beneficial contexts you need to opt into, and include some cost or compromise.
So what you're really complaining about is that other people will not follow your choices. If your relations have such a hard line that a single app is "the only way to interact with certain groups of people", they are the ones to blame, not the maker of the app.
It’s just that there was a time in Europe (about 80 years ago) when among other cruel things, the honor and reputation of a lot of people were actively destroyed for arbitrary reasons.
>> No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation
If I cannot attack the honor and reputation of a person, and by extension, artificial person like a corporation, what good is freedom of speech? How can I call someone out for being a scammer, or for doing horrible things? How does one seek public redress of a grievance against a company, or an agent of the government? Does arbitrary apply to timing, or the circumstance? Who is the arbiter of arbitrary? This is not a human right. It is a nice tool for a tyrannical government to strip away rights from people. Oh, you've violated Mr. Arresting Officer by accusing him of falsely arresting you. Now you get to deal with the false arrest and a human rights crime. Likewise, Ms. Bought a Mansion with Public Money can no longer be accused because she has a human right to not have to trifle with public accountability.
Arguably it's not the voluntary downloading of an app that's the problem; it's the mandatory government imposition of privacy interference across all apps.
I think that the problem is that Meta has normalized this violation so it doesn't feel like a big deal. It feels watered down even for someone like you who appears to be very aware of the problem.
It's like exploited workers and people living in quasi-slavery, many of them don't understand their rights and "voluntarily" waive them. My parents didn't really understand the implications of "voluntarily" accepting Meta's privacy statement - once I explained it to them, they were in complete horror.
Meta already have Facebook, Instagram and Whatsapp in compliance with EU legislation.
Threads is not an attempt to push spyware on users, that would be incompliant in the EU. (And what is this spyware you believe is in Threads?)
More likely they know exactly how much work it entails to be compliant and they decided to initially skip the EU market to speed up their launch.
I don’t think it’s because of not wanting to preserve privacy I think it’s because it takes a shit ton of time to comply with all the orders the EU has pinned on them.
And in even more fairness, privacy was not the reason Threads didn’t launch in the EU. It is in violation of the digital markers act, not GDPR. The leveraging of market power in one platform to boost another platform is what is at fault, and seeing the meteoric rise of Threads on the back of Instagram’s popularity the stratagem worked.
Is WhatsApp still requiring access to your phone's address book before it allows you to use it? It did the last time I tried it. There is no chance in hell I'm giving it all the phone numbers, emails and names of people who trusted me with their details. At the time I set up a virtual Android environment (Samsung called it Knox) just to set up WhatsApp in a way that it couldn't access my phone's address book. It is extremely unethical on Meta's side to continue with this requirement. So I'm dubious every time they're presented as champions of privacy protection.
I currently have WhatsApp installed without access to my address book (on Android). It doesn't require it, but it makes it harder to use and locks out some features. Among other things, you can't start a new group / direct message, but you can participate / reply if someone else messages you first.
I use an app called "Open In WhatsApp" which lets you paste / write a phone number and it will open a conversation with that number (by using android intents). You can also share contacts from the Phone app to this App, so you don't have to copy/paste manually.
If only there were a strong, centralized app store that could perform app review and prevent applications from degrading user experience beyond the extent actually necessary without that permission.
Good thing the EU just outlawed that model and forced everyone to support "third party app stores" that are thinly-veiled shells for Facebook/Google/etc to bypass that review process.
the "I wanna sideload" crowd are nothing more than allies of convenience for FB and others. Facebook was already experimenting with getting users to manually sideload the full-telemetry build and now they can just say "oops not supported on safari, install the native app".
I recently had to install WhatsApp. No asking for contacts at all. The web app (via a QR code) is OK, and I run that in a VM set up just for WA.
I don't let friends and fam know I have it... I prefer keeping out of dramas, 'news', and memes. I did check that I do not appear as 'x is online' on anyone's phone (but only checked with the person I needed to WA with).
The app isn't bad, tbf. It's just not for me.
I think Meta _needs_ WhatsApp to be their loss-leader (not really, but you know) when it comes to security. They always get to say "oh but WhatsApp is E2E!! We do care about security!"
It's debatable whether or not that was true, and even if you believed it it was up for another debate whether that security was by design or because Apple's vanishingly small share of the PC market made it uneconomical to write malware targeting OSX.
This isn't to say that Microsoft was doing security properly back then either, they weren't.
>Apple didn't give two stones about security until it became a marketing/selling point
I don't much care. Companies don't have ethics, they have fiscal goals. If Apple's fiscal goals happen to line up with my own, so much the better.
I would tend to trust a company who is doing something to selfishly support the bottom line way easier than I would trust a company who claims to be doing it for the common good of all humanity.
Google famously started off trying to not be evil and also be a profitable company simultaneously. It didn't work out great in that one of those goals became slightly more important than the other.
The ironic thing is that WhatsApp very heavily pushes users into Google account backups, which are unencrypted, giving government agencies all the access they want.
I remember some talk about those getting encrypted, don't know if that has happened yet.
They seem to be end-to-end encrypted now. It asked me for a password to encrypt the backup with. It's a recent feature and you must trust a closed-source app. No idea if I can download the backup from Google Drive with a web browser and decrypt it with some common decryption program.
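For what it's worth, the core mechanism behind a password-encrypted backup can be sketched with standard primitives. This is a generic PBKDF2 sketch, not WhatsApp's actual design (which reportedly involves HSM-backed key vaults); the password and iteration count are illustrative:

```python
import hashlib
import secrets

def derive_backup_key(password, salt, length=32):
    """Derive an encryption key from a password via PBKDF2-HMAC-SHA256.
    Only someone who knows the password (and salt) can rebuild the key."""
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, 200_000, dklen=length
    )

salt = secrets.token_bytes(16)
k1 = derive_backup_key("correct horse battery staple", salt)
k2 = derive_backup_key("correct horse battery staple", salt)
k3 = derive_backup_key("wrong password", salt)

print(k1 == k2)  # True: same password + salt -> same key
print(k1 == k3)  # False: the cloud provider, lacking the password, gets nothing
```

The point is that the key never needs to leave the device: Google Drive stores only ciphertext, and decrypting it elsewhere would require reimplementing exactly this derivation plus whatever cipher wraps the backup itself.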
25% of the entire population is a pretty popular application! Not “hardly anyone”. Also, I’d like to know the volume of messages shared; I have WhatsApp, but wouldn’t miss it if it disappeared. iMessage disappearing would be very inconvenient.
I'm surprised it's even 25% in the UK, and I wonder how that's measured. Maybe the 25% includes things like "opening the iMessage app to read SMS messages sent by two-factor authentication services" etc.
I’m not surprised, SMSing people is still common, and sliding from that to iMessage is totally frictionless for iPhone users. With iOS market share being around 50%, I’m surprised (skeptical even!) that it’s not closer to 50%.
But who is that 25%? iPhone users have money. I’m sure there is some significant overlap between iMessage and WhatsApp users (I know I and everyone I know use(s) both). But if you threaten politicians with losing their blue bubbles they’re gonna fall in line.
> But if you threaten politicians with losing their blue bubbles they’re gonna fall in line.
This is not a thing outside the United States, and I doubt that most British politicians are even aware of 'blue bubbles'.
For example, the recent controversy regarding the release of Covid-related government communications was centred on WhatsApp, with no reference to other platforms like iMessage.
I prefer iMessage over the other messengers, but I still haven’t met anyone who cares. A lot of people don’t even know why the bubbles are sometimes green.
I've noticed that too and it is super annoying having family in the UK that will only communicate via Whatsapp. Is this because cell providers outside the US charge for SMS? Even the ones who have iPhones end up using it because it is too much of a pain to use separate messaging apps.
It's because SMS is simply inferior to Whatsapp, Messenger, Instagram, Line, kakaotalk, Snapchat, Signal, etc unless you own an iPhone. All of them are cross platform and aren't intentionally broken for people who don't have or use iMessage. I personally avoid giving out my phone number as much as possible as all the other platforms have much better tools for communication, blocking, media, etc.
Signing into your personal accounts on a work machine is a terrible idea, as it opens up your personal accounts to search if there is a lawsuit (against you or against the company). That's assuming it's not already strictly forbidden by the company rules to prevent you from sending yourself confidential data.
Also outside of programmers, IT, and designers, you rarely get a choice in the company issued laptop. It never makes sense for a company to issue more expensive Apple computers instead of ThinkPads or Dell computers.
> Signing into your personal accounts on a work machine is a terrible idea, as it opens up your personal accounts to search if there is a lawsuit
The machine is mine and I do work on it. Thanks for the concern but I’m not worried about getting sued. I’m sure it’s happened here previously, but it’s not like the US.
> Also outside of programmers, IT, and designers, you rarely get a choice in the company issued laptop. It never makes sense for a company to issue more expensive Apple computers instead of ThinkPads or Dell computers.
I’m not one of those professionals and I can get my job done quicker on a Mac. My employer doesn’t dictate how I get things done and paid for the machine I want. I think this is a good thing.
Any job that requires a crappy trackpad isn't a job worth having.
> My employer doesn’t dictate how I get things done and paid for the machine I want.
That is very much not your machine. If your employer ever gets sued, or runs into some investigation, or some internal dispute escalates enough, all the data on it is now likely available for them for review.
Given large enough company and enough people, you'll get that "preserve documents for discovery" email one day. It happens outside of the US too.
> When you need to travel between workstations

I work docked all day; once in a while I need to be mobile, but it’s much easier to work on one machine vs two.
I've never really understood what people mean by this. For me, iMessage is indistinguishable from SMS, except for the color of the messages, and the fact that I can use iMessage over WiFi.
Using it over WiFi is a big feature on its own for a lot of people who are often in spaces with good WiFi but poor cellular reception. Some carriers support SMS over WiFi the same way they do "WiFi calling", which can alleviate that.
The other major feature I see for other messengers over SMS/MMS is much larger attachment sizes. It can be challenging sending an MMS with attachments >1MB. Meanwhile, I can send a 100MB file/video or an 8MB photo over Signal. WhatsApp allows for 16MB attachments. Sending a quick video in-line with the chat thread in MMS is miserable and gives an incredibly trash quality video while most other chat apps you can stick a decent quality 30 seconds or so video without any issue. Photos sent over MMS are usually junk while one could get a decent 4x6+ print off an 8MB photo.
If I send an SMS message to a group of people, they all see a message from me. They don't know who else got it. And if they reply, they reply only to me. Is your experience different?
WhatsApp (and I think iMessage) allow me to create a group with a name/purpose and send messages to the group and receive replies to the whole group.
(P.S. I went from dumb-phones to Android and have limited exposure to iMessage's feature-set).
I imagine your experience of sending a message to multiple people and it not making a "group message" was because your older dumb phones weren't switching to MMS; it was keeping it as pure SMS. In the SMS world, there is one recipient. In MMS, it's like an email: you can list a lot (100+ in some cases) of recipients and they can all see the list.
Just checked my provider's pricing (in the UK): MMS is capped at 300kB and costs 30p (0.39 USD) per message.
That kind of pricing made WhatsApp an infinitely preferable alternative.
Phones automatically switching to MMS would be catastrophic.
Data (WhatsApp) is essentially free in comparison. £10 (13 USD) per month gets you 20GB: which doesn't care how many messages/recipients/photos you send.
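Using the figures quoted above, the per-megabyte gap is easy to make concrete (a rough back-of-the-envelope calculation; real billing details vary):

```python
# Rough per-megabyte cost comparison using the figures quoted above.
mms_price_gbp = 0.30           # 30p per MMS
mms_cap_mb = 300 / 1024        # 300 kB size cap, in MB
data_price_gbp = 10.0          # £10 per month
data_allowance_mb = 20 * 1024  # 20 GB allowance, in MB

mms_per_mb = mms_price_gbp / mms_cap_mb
data_per_mb = data_price_gbp / data_allowance_mb

print(f"MMS:  £{mms_per_mb:.2f}/MB")
print(f"Data: £{data_per_mb:.5f}/MB")
print(f"MMS is ~{mms_per_mb / data_per_mb:,.0f}x more expensive per MB")
```

By this estimate MMS works out to roughly £1 per megabyte - about three orders of magnitude more than mobile data, which is the whole economic case for WhatsApp in one number.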
Dang, that's some highway robbery pricing right there.
Here in the US it's common to have at least a 1MB MMS max size. Back in the day 300-600KB was often the max size, but that's definitely changed over the years. Maybe it's just splitting it up and re-assembling it behind the scenes, I'm not sure.
I can't speak for all history, but from about 2004 or so in the US MMS and SMS were often bundled and billed the same, especially for networks which had rolled out 3G/EV-DO. Having a plan with 500 messages usually meant 500 combined SMS and MMS message.
> If I send an SMS message to a group of people, they all see a message from me. They don't know who else got it. And if they reply, they reply only to me. Is your experience different?
Interesting. Here in the US, plans that don't include unlimited talk+text are hard to find, even among the budget MVNOs, and that's been the case for at least a decade.
It's easy for me to forget how different my circumstances are from most other people. I'm far more likely to be in a place where I have no WiFi than a place where I have no cell service. And I don't personally know anyone outside the US, so I don't send international messages. I guess I've never paid enough attention to the fidelity of pictures I'm sent or am sending over SMS to notice a quality difference.
It is mostly because international SMS and MMS is not free. People in the UK travel more to other countries, and communicate more with people in other countries.
Not to mention MMS is vastly inferior to WhatsApp and iMessage for sharing media and other information.
The price of built-in text messaging whether SMS or iMessage (or a combination of the two) largely went to zero for most people in the US texting especially other people in the US by the time other apps gained traction. And services like appointment reminders and verification codes pretty much all use SMS in the US--whatever its potential security vulnerabilities.
So I use 2 or 3 other apps for mostly a few international people I know but basically everyone I know just defaults to SMS/iMessage.
To be fair, a slightly better comparison would be MMS, but even then it doesn't compare. It's simply because data is cheap, the app is free, it had the rich media features before iMessage/RCS existed (voice, photos/video, attachments, group chats, VoIP/video calling, etc.), and it's multiplatform (iOS-to-Android usage is near 50/50 or 60/40 depending on the country).
SMS was historically expensive and today is still worse with regard to features. (Also international texts, as others mentioned.) And once you have to choose a specific system, why would you use one that's limited to just part of the market? If you want a messaging app to talk to people, you may as well go with one that's accessible to everyone.
I'm not even sure anyone apart from Signal is really sticking to any principles. This system would be annoying to implement, costly to manage legally, and would introduce some ops overhead and security exposure. There are way more incentives for WA and iMessage to oppose it than just protecting their users. But only one is a cool talking point.
Has Meta threatened to boycott? I know Signal has, and now Apple. However my searches for Meta/WhatsApp seem to show that there’s been contention over their introduction of E2E but they haven’t actually threatened to pull out over it.
That said, I’d be very happy to see that they did threaten a boycott as well.
> Some countries have chosen to block it [WhatsApp]: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.
They said they won't comply. The UK would simply have to block WhatsApp.
For me at least, it is the opposite. It is not that I think Facebook is principled, it is that I don't think Apple is particularly principled. They care about money. Unlike many of their competitors, they are not fundamentally an ad-tech company which means your personal data has much lower value to them. This has caused them to build their brand around privacy because that is something their competitors have difficulty doing due to their business models. Apple's pro-privacy stance is just as much a capitalistic position as a principled one.
Interesting. I was going to say that many people use WhatsApp anyways, so if both of them go down that might actually lead to some pushback against these regulations.
Didn't Facebook (now Meta) buy WhatsApp, and it was originally its own thing? I guess you can say Meta kept WhatsApp operational, but I don't necessarily consider it a privacy win for Meta to just buy up some privacy-centric technologies with their billions.
> didn't Meta just move to make WhatsApp not end to end encrypted?
If so I missed the memo (that isn't unlikely, I'm not saying you are wrong!). Any references/links for that? A quick search doesn't turn up much, but Google isn't what it used to be…
This is pure speculation. WhatsApp is banned in some authoritarian countries such as China and UAE. It probably remains unbanned in others simply because it is so important for business.
What do you mean? WhatsApp rolled out E2E encryption between 2014 and 2016.
It’s by far the largest messaging service that is e2e encrypted by default. I think it’s more than fair to call it ahead of the curve for a service of its size.
But, let's go to the start - WhatsApp started in 2011.
Encryption arrived in varying degrees until there was some pressure to change the amount of privacy messages had, for advertising purposes.
Lots in the news at the time.
2018 "Another point of disagreement was over WhatsApp’s encryption. In 2016, WhatsApp added end-to-end encryption, a security feature that scrambles people’s messages so that outsiders, including WhatsApp’s owners, can’t read them. Facebook executives wanted to make it easier for businesses to use its tools, and WhatsApp executives believed that doing so would require some weakening of its encryption."
2018 "Acton said he tried to push Facebook towards an alternative, less privacy hostile business model for WhatsApp — suggesting a metered-user model such as by charging a tenth of a penny after a certain large number of free messages were used up." https://techcrunch.com/2018/09/26/whatsapp-founder-brian-act...
> a company that has the power to shift policy in favor of human rights
Well you're _technically_ right; they do indeed have that power. However, it sounds like you're suggesting they are using that power _to_ shift policy in favor of human rights (emphasis is of importance).
Whilst I can't prove that their intention has nothing to do with human rights, it's only in the same way that I can't prove the last U.S. invasion of Iraq wasn't about protecting the world from weapons of mass destruction.
It is patently obvious to me that this is a business decision fuelled by the drive to keep profits up, that just happens to coincide happily with a PR-friendly action. It's great that they're doing this, but it's just a coincidence that it aligns with what we want.
> but it's just a coincidence that it aligns with what we want.
I agree with your comment, but I think it is naïve to assume that Apple does PR by happenstance. Effective marketing/messaging is the bedrock of their business strategy.
So yes they probably did this primarily for raw profit/power reasons, but the positive PR is a close secondary.
It doesn't have to deal with compliance issues, which are not only getting more complicated, but also increasing in number.
"Sorry guv, I don't have the keys for that, so I'm not responsible for what went on as I couldn't have known".
Though the biggest money maker is the idea itself. Many believe that Apple is "on their side", and these kinds of plays reinforce that. The moment it's more profitable to sell you out than to "stand up for privacy", they'll do it.
>Stuff like this is why I'm an Apple fanboy. I'm happy to pay a premium and be locked into their walled garden
Such statements rarely end up ageing like wine in the long run. Why would you be so happy about being owned by some big corporation just because it's your favorite big corporation?
Self preservation has taught me not to trust any big corporation, regardless of how good their PR is.
Apple has been around for a pretty long time as far as tech companies go. I have grievances with them, but mostly they do a better than average job on privacy.
>Apple has been around for a pretty long time as far as tech companies go.
So have 100-year-old car companies, and I'm old enough to remember many people saying the same things about their favorite car brand in the past ("I'm only buying brand X because they make quality stuff and they never broke my trust"), and I also know they recently changed their opinion, after a couple of decades.
And changing car brands is way easier than changing smartphone ecosystems, should one of the two players decide to screw you.
My point is that people should not be fanboys of any corporation. History has proven they can always turn against their customers. Even if Apple hasn't done it so far, there's time: company management and culture can, and will, change, and with constant shareholder pressure for infinite growth there are plenty of opportunities for the tides to turn in the future, mark my words.
>I have switched from iOS to Android and vice versa multiple times while still driving the same car.
Did your app purchases from the Google Play Store automatically transfer to the Apple App Store, and did your photos off your Google Drive transfer to your iCloud, or vice-versa?
Read my comment again, I was referring to the ecosystems not being interchangeable, not the phones themselves.
The incompatible ecosystems are the tie-in that keeps people in their respective walled gardens, not the physical hardware brick itself, which is interchangeable. Once people have invested enough money in one ecosystem, they're less likely to switch to the competition, as all their data and purchases are trapped.
Not sure why you got downvoted; it's purposely hard to switch ecosystems. I'm in the middle of switching from Android to iOS, but still have over a decade of data in Gmail, Google Photos, and Google Drive. And I waited until my wife was willing/ready to switch because it's harder to share across ecosystems. Fortunately we don't own many apps (mostly things with external subscriptions like news, streaming media, etc.) so purchasing replacement apps wasn't painful.
> still have over a decade of data in gmail, google photos, and google drive
And apps are available for iOS that let you access all of them. If you install Google Drive on the iPhone, your drive shows up alongside iCloud in the Files app and open/save file dialogs.
Yes that’s what I’m doing for now, but I left android because I didn’t like sharing so much data with google, now I’m still sharing that data with google, and now apple too.
Fanboys don't like you pointing out hard-to-swallow truth pills. They expect comments to validate their lifestyle choice, not contradict it. Any contradiction is a direct offense to them.
> I’m in the middle of switching from android to IOS
At least for you it's easy, because all google apps exist for iOS, but switching away from iOS is absolutely impossible as there is no iCloud or iMessage for Android, so all your data remains hostage with Apple if you try to move.
> Did your app purchases form the Google Play Store automatically transfer to your Apple App store, and did your photos off your Google drive transfer to your iCloud, or vice-versa?
Which apps would those be that you have to buy on iOS and Android separately? Most non-game apps are subscription-based and work across platforms.
My photos are already synced to Google Photos on my iPhone, and if I had an Android device, I would just download the Google Photos app.
Music bought on iTunes has been DRM free since 2008 and Apple Music is available on Android.
Apple, Google, Amazon and a few other platforms and most of the studios participate in MoviesAnywhere where you buy movies from one platform and it shows up as purchased on the other platforms
This is why, everywhere I can, I only use a company's web app and not their installable app. The few apps that I use as apps, I buy for each platform. This is also why I avoid subs for most apps. Not paying a sub on both platforms for a calendar app.
It’s not that hard to pivot ecosystems. Unless you have extremely high-end devices and an extremely low end car, you can almost always move from Apple to another ecosystem for a lot less than changing cars (assuming you’re buying new devices and a new car OR used devices and a used car). You can even make the move gradually over months or years.
China had all the bargaining power, in that non-compliance meant no more iPhones, and Apple going out of business. Apple wasn't ready to move production entirely to India or another country.
It's very strange because, like OP, I'm a huge fanboy. Early on it felt wrong to also burden Cupertino with being a cultural leader, but somewhere around the $1.0T+ market cap, that has evolved.
Vaguely remember a charity used to have a thing where they'd auction off a dinner with Steve, then later Tim. Given time with a leader who has literally revolutionized the supply chain in CN and a CEO who made Apple his own following the most notable CEO ever, going after various gotcha points would feel like a wasted opportunity. But I think I'd have to ask why a company designed in the California of the 1960s and 1970s doesn't have line-in-the-sand policies when dealing with a genocidal government and perpetual backdoors in software and politics, but will take a move like this. Would hope for something more than a canned answer.
Capital. Once you grow to a certain point, politics and ideas are only good for marketing. Companies will happily work with banana republics and dictators if it outweighs the losses from paying your PR team a little more for a few months. Companies are not your friend, as in the end they only care about your money.
You must love Meta then, who refused to compromise to enter China (and boy, they tried). While Apple happily shares data with Chinese government to make billions.
Zuckerberg was courting China hard around 2015-ish, to allow them to operate. He learnt mandarin, spoke at Tsinghua University in Beijing, met Xi, etc.
Of course, we don’t know real reasons why they didn’t open there, but end result is that Apple shares data with China and Meta doesn’t.
Seeing that every non Chinese social media company was banned and that Zuckerberg later tried to get back into China
> In 2018, Facebook attempted to set up a $30 million subsidiary in Hangzhou to incubate startups and give advice to local businesses. Permission to run the startup was quickly withdrawn
Do you really think that Facebook is out of China for any moral reason?
Yeah, and even if it is actionable and "real": they're, I think (correct me if I'm wrong), by law a for-profit company. Meaning none of this is noble; this is just aligned with our interests because they believe that to be the most profitable image for them currently.
Which is all well and good for now, but profit directions can change, and I suspect it's easier to change a direction guided by profit than a direction guided by morals. I.e. I suspect their current "good direction" is less stable than some would suggest.
The company doesn't have a hundred thousand armed officers with the ability to arrest or kill me over anything they claim to have found. Government comes equipped with legal violence; the company comes equipped with targeted ads.
It's true, they merely have enough cash on hand to fund a small army, enough legal power to crush you in court with anything you attempt.
Make no mistake: if Apple needed to get someone killed, they would. They're not some miraculous pinnacles of good in a sea of evil. It's just that they merely need to flex the law at you to destroy your life forever. Apple is no different, they'll hire the Pinkertons and break your fingers should they find the need to. Carnegie Steel did it, Apple's not above it either.
You lack creativity. You have multiple billions available for that. Where do you want your remains to disappear? Hell, even sharks with lasers isn't far-fetched with that amount of money.
But all playfulness aside:
* Pay professional hitmen.
* Pay a hobo enough to go beat you up in your home.
* Pay enough people enough money to kill you pretty much anywhere.
What are they going to do, plead in front of a judge (who sees 90% of his murder cases go unsolved or end in guilty pleas) that Tim Cook gave them a wad of cash, when all they saw was a subordinate of a subordinate of a subordinate? There's nothing fancy about it. Especially with an economy that drives people to drastic measures: a huge wad of cash that you're guaranteed to keep is enough for most people.
You mean, pay an undercover cop who is advertising himself as a professional hitman?
> * Pay a hobo enough to go beat you up in your home.
He'll pay the hobo, and the hobo will run off with the money.
> all they saw was a subordinate of a subordinate of a subordinate
This is even more crazy than the earlier suggestions. You think there is going to be a chain of subordinates at apple who will all go along with a murder conspiracy? Try arranging a surprise party with five people, see how well people can keep a secret.
"My favorite multinational trillion dollar worth conglomerate that has a history of not caring about human rights when it comes to the production of their devices surely would never do that!"
You're naïve. Less than 100 years ago the Pinkertons were still murdering workers and nothing happened to them.
Depends on your crime. If you outed his human rights violations, then he will send you on a vacation to the place with the human rights violations. And somehow you will get in trouble with the mob there, because for some reason they don't like you for threatening their jobs.
If you don't take his vacation offer, he will invite you to a party on his dime. If you don't attend, he will use your location data to make it seem like you were in the wrong place at the wrong time. And depending on the severity and nature of your crime, the outcome will be just lowball enough to make it seem like Apple needs to do something bigger than focus on human rights, like making better devices for your security.
It doesn't even have to be good. It can be a "robbery" where you get professional-grade double-tapped on your front lawn but nothing's taken, or a "suicide" where you hang yourself from a tree whilst simultaneously shotgunning yourself in the chest. The news will just parrot the local cops' PR statement.
Are you saying that the internal accounting controls at Apple are so lax that Tim Cook can pay contractors off the books out of his own travel expenses?
I also imagine the post-arrest interview of the $5000 hitman going like this:
"You're charged with Murder 1, which is a capital offense in this state. Are you ready to die for $5000?"
Yes, misuse and misappropriation of funds is such an uncommon event for fortune 500 companies. Especially the CEO, they have to justify every little expense!
> we know that governments do this sort of thing all the time.
they also have better tools than hiring hitmen
contract killing is a fraction of all the homicides
OP asked how the multi billionaire CEO of a trillion dollars company could kill someone
He could simply hire a hitman; it would cost him peanuts.
If the victim is also using one of the products the company makes, they would also be in possession of all the information a hitman would need to complete his task; no investigation would be necessary.
Costs vary: In Australia, contract offers ranged from $500 to $100,000, with an average of $16,500. One undercover investigator, hired as a hit man more than 60 times in 20 years, lists his largest proposed payoff as $200,000 in jewels (and that was just the down payment) and his stingiest as “seven Atari computer games, three dollar bills, and $2.30 in nickels and dimes.”
Some of the highest-profile hits nowadays happen in Russia, where the rise of wild-west capitalism has led to a boom in contract killings, with victims including politicians, editors and journalists, businessmen, even poets.
How much do you think it would cost to hire a Russian blackop/elite troop fleeing from his country?
There are plenty of unhinged people in this world who do illegal shit for enough money. Do you think some Ex-Wagner mercenary would pass up that offer?
If they choose to, sure, but I don't see what incentive they would have. After all, the post is about a company removing products from a market rather than being subject to a law that would allow the government to force them to do so.
All of these are of their own accord. That's even what OP's article is about: if Apple is legally required to break their encryption in the UK, then they will discontinue their product.
Let's not pretend this is a privacy-motivated decision. Where Apple doesn't have negotiating power (e.g. China), they have been forced to compromise user safety and privacy[0]. We live in a post-XKeyscore world; pretending this doesn't also happen in America or the UK is a bedtime story for helpless capitalists.
Which is a good reason to avoid giving your data to either one in the first place. If I don't have a choice though, I'll take the company over the government any day.
Downvoters: instead of reflexively downvoting, why not try reading? Here is some recent history:
Blood for Bananas: United Fruit's Central American Empire [0]
The lawyer who took on Chevron – and now marks his 600th day under house arrest [1]
(Debatable) Business Plot, aka Wall St Putsch [2]
WTF? I can buy a phone from another company if I disagree with Apple's approach, or use no phone at all. Try doing that with your government sometime and see how it goes for you.
Nobody worries about the government giving customer data to Apple, do they? Why do you suppose that might be?
Choice-supportive bias: the tendency to retroactively ascribe positive attributes to an option one has selected and/or to demote the forgone options. It is part of cognitive science, and is a distinct cognitive bias that occurs once a decision is made. For example, if a person chooses option A instead of option B, they are likely to ignore or downplay the faults of option A while amplifying or ascribing new negative faults to option B. Conversely, they are also likely to notice and amplify the advantages of option A and not notice or de-emphasize those of option B.
https://www.wired.com/story/apple-photo-scanning-csam-commun... This scandal about scanning your end-to-end encrypted photos on one of the ends, kinda defeating the purpose and reducing everyone's privacy. It led directly to the legislation that Apple is coming out against today, which I would call ironic but is very predictable. How dare governments require us to do this thing we already planned to do.
That CSAM detection method was something that apple published a whitepaper about to get feedback. The feedback was strongly negative, so they never implemented it.
I would hope that the grandparent was not referring to this thing that never existed as "the client side scanning ability that is built into the OS."
Actions like this are why I used to trust Google. Fighting government overreach should have been a critical part of technology and data.
But it seems Google was more political in its relationships than idealistic. And now the entire sector is compromised.
Apple will be one day compromised too. In some crevice of some unknown government office is a person working on a ten-year initiative to get access to inaccessible information in the tech sector. AT&T was compromised this way. Microsoft was compromised this way. Google was compromised.
Apple is a participant in the NSA's PRISM mass surveillance program. They've been compromised for a long time. I doubt any decent sized corporation isn't.
I am going to give them the benefit of the doubt in the case of PRISM. It is not clear whether PRISM is the result of forced/voluntary participation, or the result of vulnerabilities found in their systems. Because the diagrams in the PRISM PowerPoint mention that some of the integration is "coming soon", I feel it is definitely not legally coerced; otherwise everyone would participate at once. Also, the integration having varying limits from service to service tells me this was not given to the NSA on a silver platter. Lastly, the outright denial by these companies further tells me that they are unaware.
PRISM is the result of vulnerabilities found and purchased from hackers. And integrated together in a project codenamed PRISM. Each source has different levels of information sharing. And with Dropbox being a less integrated and more primitive resource for search, I know the entire PRISM program was a gum and shoestring project.
It is possible that their involvement in PRISM is involuntary. However, as a member of the "screeching voices of the minority" [0], I will not extend them the benefit of the doubt.
Exactly this. When some government makes a big public proposal to invade privacy, Apple will speak up with bravado. But they won't say a word in all the other cases when a government demands to invade privacy when those demands are made in secret.
The right to repair is a direct extension of "banging the rocks together" - i.e. toolmaking, the main thing that defines us as humans. How can that not be a human right?
> the main thing that defines us as humans. How can that not be a human right?
Plenty of animals use tools [1]. And this isn’t how we define human rights. Bipedalism is quintessentially human, that doesn’t make tunnels a crime against humanity.
I'll push back on this. I think that Right to Modify[0] is a human right. I think that software freedom and user agency (especially where it relates to vendor lock-in) is closely related to human rights, in the sense that it can directly impact people's ability to express those rights.
To say that right to repair in specific isn't a human right because the actual human right is agency/ownership/modification/whatever, is a bit like saying that encryption isn't a human right because the actual right is privacy and encryption is just a way to maintain privacy.
The end result is still that people aren't able to exercise rights. I would argue very strongly that having a degree of autonomy over the devices/objects that you own is an intrinsic right.
And if you look back at the history of Open Source software, you'll find that conversations about Libre/Free software have been regularly grounded in discussions of human rights from the start; from rights to ownership, to modification, to communication.
Sure, I'll agree they're closely aligned but software freedom and "user" agency (I'm guessing as opposed to general human agency?) require software to be relevant and therefore I wouldn't say they're universal human rights.
I sort of see where you're coming from but I would somewhat disagree, I think that software is just an expression of device ownership and modification rights more broadly. Maybe that is kind of what you're getting at, that right to repair is a higher-level concept on top of a more universal human right? It's software-specific in the sense that software makes it harder to use those rights, but I also think I should have the freedom to take apart my toaster or to modify a shovel that I buy. But...
I don't know, I don't want to argue too much, and I don't want to give you a pedantic reply when we seem to be in agreement that privacy, agency, repair are all important and whether you or I would stick them in a specific category probably doesn't change much about that or necessarily mean that we actually disagree on anything practical related to those concepts.
I think it's bad that Apple isn't willing to fight for right to repair or user agency, but it is good that Apple is willing to fight for privacy regardless of its motivations. I don't think Apple is a universal advocate for human rights and I don't like deifying the company and I don't think its positions on privacy excuse its positions on repair or agency. But it's pretty objectively good for Apple to make a statement that it will pull these products out of the UK rather than comply, and it would be silly for anyone to say that the statement is meaningless just because Apple has bad positions on other freedoms. Apple's positions on right to repair do not make it any less good for Apple to have issued that statement about encryption.
From a security point of view, there is no difference between being able to access your data and actually doing it. You may picture this or that company as better than another (and maybe it is, though I strongly doubt it) if you want, but behaving as if that company really cares about your privacy is extremely naive.
I would rather suggest firmly believing that whatever company you chose is spying on you (be it true or not), and adjusting what you can accordingly (settings, apps, usage, etc.).
I used to think like this about Google, back in the "Don't Be Evil" days.
I end up living day by day, knowing that nothing is permanent. Right now, I'm with Apple, acutely aware that someday this will change -- probably when some snake finally manages to slip similar legislation through in the US and it becomes unprofitable for Apple to maintain its privacy stance.
Apple's privacy stance is a marketing thing, after all.
Even with the new iCloud e2ee (which is not on by default), if you enable it, Apple and the FBI/DHS/et al. can still read all your messages (unless everyone you iMessage with also enables it), because iCloud still escrows iMessage sync keys in non-e2ee iCloud Backups, breaking the e2ee in iMessage.
For 99.99%+ of iMessages, they ARE NOT E2EE and can be read at any time by Apple and provided to the state without a warrant. This is by design. HT202303 explains this, because for some reason Apple doesn't like to publicly lie.
This article is brand marketing, nothing more. It worked on you.
Apple has the ability to ship private devices. They don't because they choose not to fight city hall.
Thanks for the reply. So am I misunderstanding their data security overview page?[0]
>For additional privacy and security, 14 data categories — including Health and passwords in iCloud Keychain — are end-to-end encrypted. Apple doesn't have the encryption keys for these categories, and we can't help you recover this data if you lose access to your account. The table below includes a list of data categories that are always protected by end-to-end encryption.
The table includes Messages in iCloud with the caveat that the key is stored in iCloud backups if enabled, but the e2e key is still private, no?
iCloud Backups are enabled by default, and the Messages in iCloud key is in the non-e2ee iCloud Backup.
That means the iMessages being synced are encrypted to an endpoint key which is held by both the endpoint and the middle transit service (iCloud/iCloud Backup). That's end-to-middle-and-end encrypted, i.e. not end to end encrypted.
Even if you turn on iCloud Backup e2ee (it's an option now) then your iMessages to everyone who hasn't (99.9%+ of people) aren't e2ee because the other end of the conversation is backing up their endpoint sync key.
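The escrow problem described above can be made concrete with a toy Python sketch. This is not Apple's actual protocol: the stream cipher is a deliberately simplistic stand-in (SHA-256 keystream XOR), and the `backup` dict is a hypothetical model of a non-e2ee backup. The point it illustrates is only that if either endpoint's message key lands in a backup the provider can read, the provider can read every "end-to-end encrypted" message.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustration only -- not a real cipher."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob share an end-to-end key the transit service never sees.
e2e_key = secrets.token_bytes(32)
ciphertext = keystream_xor(e2e_key, b"meet at noon")

# So far the middle sees only ciphertext: genuine E2EE.
# But one endpoint backs up its sync key to a non-e2ee backup
# held by the provider...
backup = {"messages_sync_key": e2e_key}

# ...so anyone with backup access (provider, or a state actor with
# legal process) can decrypt every message in the conversation.
recovered = keystream_xor(backup["messages_sync_key"], ciphertext)
assert recovered == b"meet at noon"
```

Note that it doesn't matter which end leaks the key: if the other party's device escrows its copy, your own careful settings don't protect the conversation, which is the point being made about Advanced Data Protection above.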
Pretty sure it's just section 75 that Apple / Meta take offense to.
(1) OFCOM may require a provider of a regulated service to pay a fee in respect of a charging year which is a fee-paying year.
(2) Where OFCOM require a provider of a regulated service to pay a fee in respect of a charging year, the fee is to be equal to the amount produced by a computation—
    (a) made by reference to—
        (i) the provider’s qualifying worldwide revenue for the qualifying period relating to that charging year, and
        (ii) any other factors that OFCOM consider appropriate, and
    (b) made in the manner that OFCOM consider appropriate.
Seems that the UK market is not big enough. In China Apple had no problems with ceding control of their local servers to the Chinese government [1]. Not to mention of using forced labor (mostly Uyghurs, against which China is committing acts of genocide [2]). So, Apple and human rights my ass!
None of those investigations showed that parts supplied to Apple came from Xinjiang. Only that one of their suppliers operates a plant there. These suppliers are huge conglomerates with plants all over China. Apple is one of a tiny handful of companies that actually audit their supply chain, but they only audit the chain that actually supplies them.
From now on though any kind of audit in China is going to be almost impossible. They've started arresting researchers who perform due diligence reports for revealing 'state secrets'.
I don't think it has anything to do with the market size, which after the Brexit fiasco and the pandemic has shrunk considerably; it has more to do with the UK labelling itself a democracy while behaving like a dictatorship when it comes to surveillance and other "think of the children" laws.
I have seen some arguments that they don't care about your privacy and it's all right there in the privacy policy if you're willing to read it. A video on the subject [1]
I hope they really do this. All too often companies will say this, then back down when the law is passed. Why? Stockholders.
Has Apple really removed iMessage and FaceTime anywhere else? Are these available with 100% full encryption in China? If so, then they are just blowing wind :(
FaceTime was banned in the UAE up until ~2021 when it suddenly started working again for non-UAE phones[1]. For the longest time iOS/macOS has had feature and region flags at a lower hardware level that could dictate system policy for things like FaceTime as well as the camera shutter sound required in Korea/Japan.
Well, here is something for you to read: if you are a Chinese citizen living in China, these messages can be seen by the government, because iCloud data must be stored in China.
Yeah, ok, try harder than a Reddit thread where supposedly authoritative commenters can’t even spell encrypt.
The idiot pmendes is wrong (if he was correct then no one knowledgeable on the subject would classify that system as E2EE - something more to read about). The encryption keys are not managed in iCloud which you can read yourself:
iCloud backup is an opt in feature - you use it full well knowing it effectively damages the affordance of E2E regardless of locale if ADP is not available.
> The idiot pmendes is wrong (if he was correct then no one knowledgeable on the subject would classify that system as E2EE
What exactly is pmendes wrong about? Are you referring to this comment:
> In iMessage the message contents are in fact end to end encripted. Each device encripts the message using the recipient keys and then sends the message. The problem is that iCloud manages the keys by itself so you have no way of knowing who is the exact owner of the key. Addionaly, on group chats, they don't even need to be a man in the middle. They can just add their key to the list of encription keys for that chat, and receive a copy of each message.
What's wrong there, aside from spelling? The nit I would pick is that in "iCloud manages the keys", I would change "iCloud" to "Apple" or "Apple's IDS".
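The fan-out scheme described in the quoted comment can be sketched with a toy model (XOR stands in for real public-key encryption, and all names here are invented for illustration, not any real Apple API). The structural point is that whoever controls the key directory can silently register an extra "recipient" key and receive a readable copy of every message:

```python
# Toy model (NOT real cryptography) of fan-out encryption:
# the sender encrypts one copy per key the directory advertises,
# so a malicious directory that injects its own key gets a copy.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for per-recipient public-key encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class KeyDirectory:
    """Stand-in for the identity service that maps users to device keys."""
    def __init__(self):
        self.keys = {}  # user -> list of keys
    def register(self, user):
        key = secrets.token_bytes(32)
        self.keys.setdefault(user, []).append(key)
        return key

def send(directory, recipients, plaintext: bytes):
    """Encrypt one copy of the message per advertised key."""
    copies = []
    for user in recipients:
        for key in directory.keys[user]:
            copies.append((user, xor_bytes(plaintext, key)))
    return copies

directory = KeyDirectory()
alice_key = directory.register("alice")
bob_key = directory.register("bob")

# A compromised directory quietly adds its own key under bob's name;
# the sender has no way to tell this key apart from a real device.
snoop_key = directory.register("bob")

copies = send(directory, ["alice", "bob"], b"meet at noon")
# The second copy addressed to "bob" was encrypted to the injected key.
intercepted = [c for user, c in copies if user == "bob"][1]
print(xor_bytes(intercepted, snoop_key))  # b'meet at noon'
```

This is exactly why contact key verification matters: it lets the endpoints detect a key the directory added that neither party registered.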
How, specifically, does iCloud backup damage the affordance of E2E? This doesn't make any sense.
If you have somebody's GPG public key, you can encrypt a file to that public key and then put up the encrypted file on a public FTP site [0], whence they can download and decrypt it. iMessage does nearly exactly that, except that 1) Apple's identity service effectively solved the key distribution problem, 2) the messages are, even though already encrypted, themselves also transmitted by encrypted channels to Apple's servers during transit.
[0] Well, I wouldn't do this if I were at all concerned about the contents because it doesn't allow for forward secrecy.
If you’re going to classify every critical fact as a “nit” then nothing is ever wrong.
IDS vs iCloud is far more than a nit. They’re completely different services. iCloud is run by a 3rd party in China, which is well publicized, whereas IDS is not. So that’s not a minor detail.
iMessage does not depend on iCloud. You don't need an iCloud account. These are unrelated.
Contact key verification is a more recent addition, and again not dependent on iCloud.
> How, specifically, does iCloud backup damage the affordance of E2E?
Just saying, you can sync your data to whatever encrypted or unencrypted service you want if you choose to. This may diminish the value to the end user of E2EE but it is unrelated.
I'm not the one that brought up iCloud first. Take that up with the original commenter.
I was looking around to see what China has access to after seeing this post... However, people mention their messages are only visible if iCloud backups are enabled? Seems risky to trust either way.
People should read actual documentation and expert or academic writing rather than a bunch of terminally online nerds on Reddit.
I’ve seen nothing credible that the E2E for iMessage and FaceTime isn’t as advertised. If you back up to the cloud, you’re electing to disclose your data outside the boundaries of encryption.
Then they should read the leaked government documents from around the world that contradict what Apple says themselves. Failing that, they should at least be able to read the actual code to corroborate its security, but that's not an option either.
> Then they should read the leaked government documents from around the world that contradict what Apple says themselves.
On what specifically?
> they should at least be able to read the actual code to corroborate its security, but that's not an option either.
Why does that matter? You're already trusting Apple hardware. Public access to source code doesn't make security systems safer. I'm not sure what a journalist or even 99% of webshits on this forum would do with trying to audit crypto.
For starters, PRISM and XKeyscore. Both are damning indictments of the state of surveillance a decade ago, and are so damaging that pretty much every FAANG company denies knowledge of their existence. PRISM documented the reach the US government has into domestic companies, and XKeyscore showed just how far those connections could be abused.
The simple fact that these leaked documents exist while Apple denies them is a contradiction. Everything else is speculation, but my brain can imagine a lot happening over the past 9 years.
> Why does that matter?
Accountability purposes.
> You're already trusting Apple hardware.
Ideally I don't do that either. I'm not a fan of closed firmware interfaces and if possible, I'd like to audit the code for those as well.
> Public access to source code doesn't make security systems safer.
The majority of networked servers online today beg to differ. Over time the industry found that it's much safer to use an open and transparent OS than to trust a black box with undefined behavior that may or may not be fixed.
> I'm not sure what a journalist or even 99% of webshits on this forum would do with trying to audit crypto.
This speaks to a lack of either experience or imagination, I can't tell which.
> For starters, PRISM and XKeyscore. Both are damning indictments of the state of surveillance a decade ago, and are so damaging that pretty much every FAANG company denies knowledge of their existence.
Where's this denial by Apple? Or is your argument that because Apple doesn't admit colluding with the NSA, they must be doing it? That is not falsifiable. And what evidence are you pointing to of Apple specifically colluding with the NSA?
> I'd like to audit the code for those as well.
Firmware is not hardware. That would still not address the hardware issue.
> The majority of networked servers online today beg to differ. Over time the industry actually found that it's much safer to use an open and transparent OS than it is to trust a black-box with UB that may-or-may-not be fixed.
Yes, the neckbeard chant since CatB. There are extremely few people not putting their trust in black boxes. Slapping Linux on a box doesn't magically make it transparent - nor does Linux have a security record you want to brag about.
That's from a decade ago. Verb tense matters. Apple denied knowledge of PRISM before the leak; I don't see where they are denying it since. In any event, you may not believe it, and it's not the craziest thing to be suspicious of, but you also unfortunately don't seem to have any counter.
> What a conspicuous coincidence that all of the exonerating evidence.
What exonerating evidence? How does one generally exonerate themselves of knowing something, or of having been privy to it?
And I'm no fan of Apple (I'm also not a fan of baseless conspiracy theories), and the flip side of this is what benefit would it be to the NSA to disclose a program to a multinational company of >100k employees if they didn't have to.
You're trying to make it sound like there is a smoking gun that Apple has been lying about their NSA involvement, moreover insinuating in ways that don't seem to serve the best interests of the NSA - and I might even believe you - but so far you have put up nothing but innuendo.
> Is there a hardware issue?
Ok, so you audit your firmware. Why do you trust the hardware you're going to run this audited firmware on? You have thus far proven my point about general public access to source code.
The bottom line is that for the threat model faced by most developed nation citizens, Apple's privacy value proposition is pretty good. If you're up against a large nation-state that is willing to spend some resources you're fucked - and auditing your firmware isn't going to change that.
The fact that there are companies with the power to shift public policy is the problem, because they use that power for their own interests, which don't align with ours.
They also have the power to redefine what a text message is. They don't care about people like you think they do. They're a company who makes decisions to make money.
They will only continue to do so while it remains profitable, at some point they will give in. They are a capitalist company, profits are their only concern.
The way I put it: Apple's business is selling high-value-added products (i.e. luxury, overpriced, whatever you want to call it), while most of the other big tech companies have you and your data as their main product. I'm fine paying for the luxury and the privacy that comes with it.
Apple, while imperfect, is the only large tech company with anything like a principle when it comes to user privacy. Meta/IG would monetize literally any piece of information about you they can get their hands on.
I'd have thought it'd be more profitable to sell access to you based on criteria rather than outright selling your data. If they sell your data then anyone can buy it and use it whereas only they can sell your attention.
Flags aren't for rule breaking, they just mean enough people found your comment inflammatory or otherwise objectionable enough to hit the "flag" button.
If you have a link to the flagged comment, I can probably help you understand why HN users would have flagged it. This comment is getting downvotes because complaining about downvotes and flags always does.
It doesn’t show as flagged for me. Time heals all wounds, if your comment is not objectionable it will usually be vouched for eventually. Which is one of the reasons it’s a rule here that you don’t complain about voting. Such comments also rarely age like wine :)
You can complain, you'll just get a small karma hit for it. It's inevitable because it's off topic, but karma on HN is more or less meaningless anyway.
It's usually brigading by other Apple fanboys. Their master can do no evil. You could see this a lot during the local device scanning controversy, but any time you try to point out obvious facts (like Apple building the largest ad empire, for example, while virtue signaling about Meta's ads on their platform) you get downvoted and flagged.
Ridiculous stuff like this is why I cringe every time an Apple fanboy appears.
Are you dense? Apple and all the other major companies are cooperating with the FBI/NSA/CIA all the freakin' time.
These stands they make are actually smart play by these global police agencies, so sheep like you feel safe.
(Android is the same thing)
Sure, but what exactly is "privacy" ? That's the debate.
No sane person believes that there aren't really nasty, organized criminals in the world, and no sane person believes that said criminals have a right to absolute privacy, including e.g. (the moral equivalent of) search warrants.
I grew up in a time/place where organized crime was rampant, and if you looked at the wrong person you could be killed. It was a time of racism-by-default, thievery-by-default, a lack of investment in R&D, science or infrastructure, and the opposite of a meritocracy. It was an "invasion" of their privacy that ended that era, and now we can comfortably watch Scorsese movies about it over an internet those people would never have invented or invested in.
> No sane person believes that there aren't really nasty, organized criminals in the world, and no sane person believes that said criminals have a right to absolute privacy, including e.g. (the moral equivalent of) search warrants.
That's just a strawman.
Plenty of sane people believe that the threat of criminals doesn't outweigh the threat to individual liberty posed by the state. Additionally, plenty of sane people wouldn't support a ban on locks and safes because it makes it harder to search criminals.
I'm pretty sure I have my sanity in check, and I don't think there's any reason for the government to require private enterprise to break encryption to ease prosecution. It should be hard or impossible for governments to spy on their citizens.
Not only is privacy a fundamental right within a society but it is fundamentally necessary to the advancement, and thus long term survival, of any society.
> how the advancements in western society since (say) WWII have been neccesarily dependant upon privacy?
Alan Turing. Hell, Tim Cook, a gay man who grew up in Alabama. Neither would have survived childhood with their personalities intact had their governments had the access No. 10 Downing demands.
Privacy preserves society’s margins. The margins, by definition, are where change—growth—is nurtured.
However, your examples don't demonstrate how the many advances in western society depend upon privacy, nor do they prove that society would not have advanced without it.
Alan Turing, for example, was open about his sexuality (an issue that bit him when a robbery took place long after his work at Bletchley).
Even had he been somehow taken out of the picture pre-WWII, it remains true that Marian Rejewski, Jerzy Różycki and Henryk Zygalski would still have cracked the Enigma cipher, built the prototype Bombes and passed that knowledge to the likes of Tommy Flowers and William Thomas Tutte.
Turing proved a result applicable to the third part of Hilbert's second problem, the Entscheidungsproblem.
He was very nearly beaten to it by Kurt Gödel, who provided an answer to the first two parts of Hilbert's second while indicating an approach to the third part.
Turing was beaten to the goal by Alonzo Church, who demonstrated that the halting problem for lambda calculus is not effectively calculable.
Later, in 1939 J. Barkley Rosser highlighted the essential equivalence of "effective method" defined by Gödel, Church, and Turing.
So, uhh, yes - it would be nice to see some notion of proof come into play here.
No discovery depends solely on one person. The example illustrates the mechanics in the last line.
Fringe elements are analogous to a society's surface. They, by definition, interact with novel elements and ideas at a higher rate than the general population.
Internal interactions exchange information. That distributes information (physical and intellectual). It also increases entropy and thus homogeneity. Conductivity (the term of art is legibility [1]) and complexity are at odds with one another.
Ceteris paribus, internal homogeneity shouldn't change the surface. But humans have agency. Homogeneity precedes conformity which drives more homogeneity. Furthermore, less diversity means fewer novel opportunities/interactions between the fringe and the unknown.
Privacy preserves a diversity of fringes which drives social complexity. A society without privacy is simpler, and thus less capable of innovation, than one with it. (There is obviously an upper bound to this phenomenon. A perfectly opaque system is static. But humans, as social creatures, resist isolation more naturally than conformity.)
1. You cannot compare Cook to Turing with a straight face, even in terms of their sexuality situation.
2. Privacy is about as personal of a fight to Cook as shareholder obligations are, and only one of them gets him removed if he neglects it.
3. It's pretty fucking stupid to suggest the UK would do the same thing to Turing today if they had the data. You may recall that there are now laws in both the UK and, yes, Alabama since 2009, that render this a hate crime and deter it properly. It's the lamest possible excuse to support corporate power.
Ultimately Tim Cook will say whatever gets investors horny. If you're dumb enough to factor his sexuality in to your trust quotient, you deserve to be XKeyscored.
As long as humans remain irrational and tribalism exists, privacy is essential to anyone who isn't the most boring, vanilla, cookie-cutter person - anyone with any amount of non-mainstream ideas, beliefs or qualities.
I think, in theory, a society with much less privacy could work, as long as there is information symmetry, not a panopticon controlled by a few. We're not building an open society, though. The asymmetry is the point.
If we ignore the open society angle and assume surveillance would be one sided:
Competition is needed for healthy capitalist economies. Competition doesn't work well with great power imbalances. To the extent entrenched business could leverage the government, as they do, powers of surveillance could be used to stifle competition. That leads to stagnation.
Democracy, even in a limited form, doesn't work when everyone is under surveillance by a powerful minority. I don't think I need to elaborate since there's been plenty of writing on that.
Also, people don't want to make civic contributions to a society that treats them like cows with a criminal bent. There's not much argument against being a hedonic leech in a society that benefits from your labor but resents your autonomy.
I haven't answered your question exactly, but it's a bit difficult when framed that way. I still think it's pretty clear that the massive power imbalance that comes with dragnet surveillance can catalyze corruption, stagnation, and malaise, thus impeding progress.
> I think, in theory, a society with much less privacy could work...
I suspect what many forget, or were perhaps unaware of, is that in practice a society with near-zero privacy existed for the vast bulk of actual human history.
Before the car and the radical change in individual human movement that came about for those parts of the world privileged enough to have a car for every family, the majority of people lived within a small radius, surrounded by people who effectively knew every detail of each other's lives .. if not as it happened, then almost certainly by the next quarter as word spread.
Again, as I mentioned above, I like privacy .. but it remains true that the greatest expansions of modern western economies - mass production, feeding quantitatively more with less land and labour, etc. - all happened without any essential dependence upon privacy to bring these changes about.
It is the upstream grandparent claim - that privacy is fundamental and essential for any advancement in a society - that doesn't pass the smell test, and certainly hasn't yet been well argued for.
I absolutely do agree that it is desirable .. but essential (in a strict sense) for the continuation of human society .. that needs better argument to pass muster.
> Before the car and the radical change in individual human movement that came about [...]
The lack of privacy in a local town is surely quite different in its effect on society than a centralized and wide reaching dragnet surveillance program. That local lack of privacy was limited in reach, not automatic nor persistent, and mostly symmetric... which doesn't present the same potential harms.
Wrong. Society is just a social construct, but your rights to independent thought and agency, which are powered by privacy, are fundamental to the human experience.
No one is taking away anyone's right to independent thought, at least for now.
Parent was talking about privacy, though. I'm not sure our species has a fundamental right to it; how did you arrive at that conclusion?
It's a bit difficult to grasp the reasoning behind humans not having the fundamental right to privacy. It seems like the right to privacy/private communication with other sentient creatures is corollary to the right to independent thought, since humans are fundamentally social creatures with an inbuilt need to communicate.
I'm a bit curious how you arrived at the conclusion that you're "not sure our species has a fundamental right to [privacy]". That seems like the absurd claim requiring actual support here, seeing as how privacy is the norm, not the other way around.
> I'm a bit curious how you arrived at the conclusion that you're "not sure our species has a fundamental right to [privacy]". That seems like the absurd claim requiring actual support here, seeing as how privacy is the norm, not the other way around.
I didn't claim anything, I said I wasn't sure. Our species is much, much older than the concept of privacy. Back when we were still sitting in trees there was no privacy. We decided that was a thing quite a bit later. Now, in the year 2023, I would say many of us obviously see privacy as a fundamental human right, and a privilege to defend.
Having rights is a social construct. Unless you believe in divine order of things. But I suspect there’s nothing in physical reality that requires any rights to be granted to anyone.
The religious far outnumber the not, so I think your "unless" is really the norm. Taking only the various Jesus fanclubs, you have a literal divine right to privacy in confession, and it's sinful to reveal another's secrets. The Adam and Eve creation myth establishes privacy when God created clothes.
The very concept of natural rights is basically divine rights without explicitly mentioning a god.
If having rights is a social construct, as opposed to a universal truth, then you don't have any rights at all. You have temporarily granted permissions, but not fundamental rights.
As such, it's much more likely that they are universally true rather than a result of human consensus.
There are no rights in the first place without a society. The entire concept of "a right" is meaningless if it isn’t in the context of a social contract.
The parent post defined “human rights” as something from holy scriptures. What I can see from being part of highly functional societies is that they require me to give up a big part of my privacy. For instance, I can’t get a salary without a bank account, which I can’t get without submitting a whole lot of personal information to the bank. There are many essential services you can’t access anonymously. At the very least, “privacy” isn’t a binary category. There can be more privacy or less privacy, but not 100%.
One more example: some societies will require you to disclose any relation to politically exposed persons. Which is beneficial to society, but I can’t withhold even if I wanted.
This is a scam by the UK government; they don't care about child abuse, they are going to look into all sorts of communications to leverage people. If they cared about child abuse, they would have shut down the open child prostitution rings running rampant. They openly let them continue because they are afraid of being branded racist, since the gangs running them are minorities.
"We must scan all of your personal devices and personal moments for illegal terrorist, cannibal, pedophile, and unapproved political content to safeguard our children against this vague, unrelenting menace. If we happen to stumble upon other content that could be automatically construed by AI to be criminal evidence, we will refer it to the proper prosecution service automatically. The net result is society will be safe. If you committed no crime, then you have nothing to fear. If you disagree with this, then a crime can always be located." - How I imagine this being pitched
The police didn't investigate child sex crimes because the perpetrators were non-white and the police didn't want to be seen as racist. What about that is incorrect?
Who said there wasn't? I'm not saying only minorities do this, far from it. I'm saying the police openly ignored it because the perpetrators were minorities in this case. This is not some sort of accusation that only minorities abuse kids, only that the government that wants your data to protect kids is questionable at best in this area.
It certainly doesn't imply that, strongly or other. It's referencing a specific trend, but not negating other possibilities at all. Context is important.
What? The police openly admitted after an investigation that this is the reason they let it continue for generations. I'm framing this in reference to the UK government claiming they need the information to prevent child trafficking, while their track record indicates that even with data they ignored it for decades. Please save the hysteria and wild accusations for Twitter.
I'm not against breaking e2ee when the right protections are in place.
1. A court-signed order
2. Public disclosure during prosecution, or within X time of the court order being signed (including when no wrongdoing is found)
3. Some sort of test to ensure it's not misused (imminent threat to life, or a serious crime like murder/child kidnapping)
The idea that a judge can order a cavity search to find information leading to the location of a suspected murdered/dying child, but not order the reading of a message between the suspected murderers' phones, seems ludicrous.
It appears that, at least in parts of the US/UK, a judge can order you to give up information, including your phone password, and if you don't, you become a criminal for that.
I'm just suggesting the question should be around where do we draw the line on privacy vs protection and what safeguards do we put in place.
I.e. on a scale of speeding ticket to terrorist attack.
There are stronger and weaker forms of it, though. Apple holds the keys to the identity (public key) exchange in iMessage. You're still trusting Apple not to MITM your conversations, but at least they wouldn't be able to do so retroactively. Other E2EE messaging apps at least have a way to verify each other's pubkeys, but it's not required and very few people do it.
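The pubkey verification mentioned above boils down to both parties computing a digest over the exchanged public keys and comparing it out-of-band (reading it aloud on a call, scanning a QR code). A minimal sketch - the key material, truncation length, and grouping are invented for illustration, not any real app's format:

```python
# Sketch of out-of-band key verification ("safety numbers"):
# both parties hash the pair of public keys and compare the result
# over a channel the intermediary doesn't control. If the identity
# service substituted a key, the two displayed strings won't match.
import hashlib

def fingerprint(key_a: bytes, key_b: bytes) -> str:
    """Order-independent digest of two public keys, shown as short groups."""
    material = b"".join(sorted([key_a, key_b]))   # sort so both sides agree
    digest = hashlib.sha256(material).hexdigest()[:20]
    return " ".join(digest[i:i + 5] for i in range(0, 20, 5))

alice_pub = b"alice-public-key"    # placeholder key material
bob_pub = b"bob-public-key"

# Both sides compute the same string regardless of argument order,
# so they can each display it and compare verbally.
print(fingerprint(alice_pub, bob_pub) == fingerprint(bob_pub, alice_pub))  # True
```

The point of the sorted concatenation is that Alice hashes (her key, Bob's key) while Bob hashes (his key, Alice's key); without a canonical ordering the two digests would differ even with no attacker present.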
That's cool. The article mentions a more common attack I forgot about, someone managing to add a new device to someone's account and get messages on there.
The device manufacturer could silently push an update to reveal the keys, but the semantics of a label aren't important.
I think the important question is should an authority be able to read messages and if so, under what circumstances?
What if it's to prevent a major terrorist attack? Should you have an absolute right to privacy, so much so that a means that would never be used against you can't exist, even if it means the loss of a lot of life?
As I say, I think the question should be around where we draw the line and what protections we put in place to ensure it's not crossed.