I used to back up all family photos taken by my wife and kids to my NAS, until one day I browsed the gallery looking for something and found a full-frontal nude my daughter (17) had received from her boyfriend.
I said nothing - stopped the archiving and deleted everything except for my own photos.
They deserve their privacy and to do whatever they want on their devices without snooping - if we go away on vacation I now ask them if there are any pics they want to share so I can keep them.
I think you did the right thing. But still, I think it could also be used as a moment for education. For the boyfriend: “You have zero control over where the image ends up when you send it to someone, perhaps even before you send it.” For the daughter: “You are in possession of nude material from a minor. This carries some risk, you should be aware of it.”
But I get the struggle, I also turned off logging in my AdGuard Home instance, at least for my wife’s phone. But it was also a lesson for her when I jokingly told her the website she was on right then… The ISP and Google probably know this too. And I can find it in my router without AdGuard, and that router can be hacked. It’s also a good time for education imho.
Modern tech is so interesting. When I can’t get hold of my wife I sometimes find myself checking the hallway’s motion sensor, seeing it just registered movement; the UniFi network reports to Home Assistant that indeed her phone is connected right now. Ah, gas usage, bathroom lighting and moisture sensors indicate she’s taking a shower (boy, it seems to be a long one, 0.4 m3 of gas!). I’ll just wait 10 min and if she doesn’t pick up I’ll flash some lights or let the Sonos speak: “Pick up your phone”. Yeah, it’s pretty creepy but sometimes I don’t even think about it… But writing it down like this, I could write a “state” sensor for my wife that could, with high probability, tell me what she’s doing and log it.
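(For the curious: a crude version of such a "state" sensor really is only a few lines. Everything below — the inputs, the thresholds, the labels — is invented for illustration, not pulled from any real Home Assistant setup.)

```python
# Hypothetical "what is she probably doing" sensor, combining made-up
# readings into a best-guess activity label. Thresholds are arbitrary.
def guess_activity(phone_on_wifi: bool, hallway_motion: bool,
                   bathroom_humidity: float, gas_m3_last_hour: float) -> str:
    if not phone_on_wifi:
        # Phone not on the home network: assume she's out.
        return "away"
    if bathroom_humidity > 80 and gas_m3_last_hour > 0.2:
        # High humidity plus gas usage: the shower scenario.
        return "showering"
    if hallway_motion:
        return "moving around the house"
    return "home, idle"
```

A real Home Assistant version would be a template sensor fed by actual entity states, but the logic would be exactly this kind of if-chain — which is rather the commenter's point about how easy this surveillance is.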
We had that talk about the Internet with my kids due to some incidents at school that made the news but fortunately did NOT (oops, corrected) involve them - racist remarks on Facebook - oversharing of details - stalking - posting bikini photos on Insta - accepting friend requests only from people they know - privacy settings - cover your laptop camera - don't click links you receive - spam - the usual.
Friends of ours came to me asking how to deal with nude photos their underage daughter had taken and sent to her boyfriend - told them to contact the parents, go to his house immediately, and watch him delete the photos.
Fortunately he was not stupid enough to share them with his mates.
Also told them kids do stuff without thinking through the consequences and not to be too hard on them - think she only lost phone privileges for a week.
I do have an IP camera in the house, but it is angled to watch our parrot when he is on top of his cage and alert us if he decides to take a walk.
"Big brother" is not what's happening here. At no point does Apple inform a parent if their child is sending or receiving nude photos. It merely provides a prompt that gives them the bare minimum information to understand the decision they're about to make.
Microsoft Outlook sends me a prompt when I forget to attach something to an email. Something like "Hey it looks like you forgot to add an attachment, would you like to proceed?" The prompt gives me two choices, cancel or send anyway. Similarly to the nudity check, Microsoft scans the content of your email to understand if you intended to send an email with an attachment or not. Would you consider this to be "big brother"?
This isn't just a feature for children. Adults can turn this feature on to prevent themselves from ever accidentally sending nude pics.
I presume they use that Microsoft technology to check for the presence of too many skin tones in the shape of a human body?
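For what it's worth, the crudest form of that guess — counting skin-toned pixels — fits in a few lines, though real nudity detectors are trained ML models, not hand-written rules. A toy sketch (the RGB thresholds are a classic naive heuristic, not anything Microsoft or Apple actually ships):

```python
def looks_like_skin(r: int, g: int, b: int) -> bool:
    # Naive RGB skin-tone rule; real detectors use trained models.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

def skin_ratio(pixels) -> float:
    # pixels: iterable of (r, g, b) tuples; returns fraction classified as skin.
    pixels = list(pixels)
    if not pixels:
        return 0.0
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels)
```

A rule like this is famously fragile (beach photos, faces, wooden furniture all trigger it), which is exactly why on-device ML classifiers replaced it.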
The problem is if the ML flags something and forwards it to a human (even anonymized) to decide on for quality control purposes - that is why Alexa, Google and Apple smart speakers are banned in my house.
Well, we have an Alexa. It is always unplugged except when someone wants to listen to music using it. Agreed that a Bluetooth speaker would do the same thing, worth thinking about.
In this case, it's exactly the same. Microsoft (and Google) is reading the content of your email, searching for words or phrases that may hint to the fact that a file should be attached. If it believes that you're erroneously sending an email without an attachment, it'll send you a confirmation prompt.
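Mechanically, that kind of check doesn't need anything fancy; a phrase list gets you most of the way. A toy sketch of how such a prompt might be triggered (the phrase list is invented for illustration — Outlook's actual implementation isn't public):

```python
# Hypothetical phrases that hint the sender meant to attach a file.
ATTACHMENT_HINTS = ("attached", "attachment", "enclosed", "i've included")

def should_prompt_for_attachment(body: str, has_attachment: bool) -> bool:
    # Prompt only when the text promises a file but none is attached.
    if has_attachment:
        return False
    text = body.lower()
    return any(hint in text for hint in ATTACHMENT_HINTS)
```

The nudity check is the same pattern with an image classifier swapped in for the phrase list — which is the parent comment's point.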
I never saw it that way... But you are right, how would Outlook know that I want to send an attachment without reading my email in the first place? Still hope Outlook isn't reading the attachments, though...
There was no malice intended by me and I learned a life lesson regarding privacy - I expected it for myself but did not even think about granting the same to my kids.
So I don't log internet activity - no net nannies - I just use Unifi WiFi to make sure bandwidth is distributed fairly to all devices and block spam/ads via Pi-hole.
You made the right choice to not say a word about it. Absolutely nothing good would have come from telling her.
And it's wonderful that you learned a lesson about the privacy of your children. I'm positive there are many parents who would have freaked out and gone further in violating their child's privacy and autonomy.
If they're going to do this at all, "It's your choice but make sure you feel safe" is probably the right tone to hit here. It's all carefully worded to remain non-judgmental while still saying "We've got your back if you don't feel comfortable".
Also, I have to respect that it's an opt-in feature by the parents/guardians, but the feature still allows the kid to view whatever the image was without actually alerting the parents. In a way, that tradeoff is the best-case scenario for this sort of automated scanning — it allows parents to put safeguards in place while still giving kids their privacy and not blocking anything. Curious to see if that becomes leverage to normalise scanning in general.
The quality of some of the comments here is really poor.
If you're an adult, no one is forcing this feature on you. No one is scanning your photos against known CP. No one is receiving or looking at your photos, or flagging you as a freedom-fighting, enemy-of-the-state agitator.
This isn't their Anti-CSAM rollout. It comes with no cost to you and may help warn some kids before making a mistake.
There's no room for subtlety in some discussions, downvote to prove it.
As an adult I want this feature. I’ve received dick pics unsolicited. Sometimes I want to see, sometimes I don’t. I’d like blurring by default for those moments when I’m screen sharing or looking at my phone in public.
Kids receiving unsolicited pornographic content is sexual harassment and nobody regardless of age should be subjected to that. Giving people tools to opt in seems great to me.
This is the beta test for their Anti-CSAM rollout, which, of course, will be impossible to disable.
They already scan all photos uploaded to iCloud serverside (because all such photos are not end-to-end encrypted and are readable by Apple, including all your nudes in your camera roll), so we're not talking about iCloud - they will roll out clientside scanning of all your local data and your OS will report on you to the police without your approval or consent.
Mark my words, this is coming. Apple has said as much, and while they made subsequent statements to try to smooth over the bad PR, they have not said that they are not going to - and they did say that they are.
"This is the beta test for their Anti-CSAM rollout"
"they will roll out clientside scanning of all your local data and your OS will report on you to the police without your approval or consent."
"Mark my words, this is coming."
You present no evidence for any of this. Apple was stung by both expert and public responses. It could come, it could go away. If you know something please share otherwise admit this is speculation, like 90% of these comments.
Sending or receiving explicit images is dangerous for children.
"Think of the children" is often just a cover for ulterior motives, but it is sometimes an adequate or even excellent reason to do something.
This feature from Apple is great. It seems to respect privacy, is fully controlled by the user or their parent, and will increase safety for those who choose to use it.
A common tactic used by child predators is to catfish a child, get the child to send explicit images of themselves, and then blackmail them with those images into doing much worse things. I think this feature will make it less likely for this to happen, and has practically zero downside.
You are welcome to downvote for any reason you like (but you're in a minority).
I followed each of those guidelines, I was on topic, I didn't comment on how people voted on comments (that was more of an informal survey) and most importantly, wasn't boring.
Yeah, I'm actually surprised that there's not a way to turn this on for non-child accounts. I've definitely known people who're in the dating scene who'd like to have a choice before seeing a nude picture from someone who has their number.
For that matter, I could see the appeal of just the blurring feature in general. Checking your texts in public might be a risky proposition even if the photo is totally wanted.
The invisible ink screen effect (blurs the message/photo until you touch it) in iMessage is essential if you're in a long distance relationship or away from your partner and still trying to have some fun!
When you are together with a significant other, sometimes you video call them, and sometimes they're in the bathroom, or the shower, when they answer (what are they supposed to do, not answer a call from their beloved?). And during these calls they tease each other, shift the camera to an unconventional angle, private parts exposed. Or they send naughty pictures to each other, starting small scale, maybe escalating a bit.
All of this is normal, natural, playful, I would even say: innocent fun. Nobody plans for it, maybe not even asking each other to do it; it naturally evolves with the relationship, partners doing it of their own free will. Adults do it, teenagers do it. Again, it's normal!
Whenever this happens through a system that is not fully E2E encrypted, those pictures/videos get stored, analysed, and accessed by employees, or even by 3rd-party contractors who label them for ML training purposes.
My advice, use Signal (since it's always e2e encrypted) to avoid the creepy 3rd party eyes, and carry on with your life.
As for the presented "nudity filter" and its functionality... you can't stop a determined teenager from doing what they aim to do. Assume they're smarter and more tech-savvy than you are, and this will just be a barrier they will find a way to overcome. Better to keep them safe (give them Signal, e2e encrypted) rather than giving them prohibition (it never worked, and it most likely never will).
Unless your significant other is a literal child and marked as such on their iCloud account, this'll never affect you in any way. No data is sent to anyone anywhere.
> Messages now includes tools that warn children and provide helpful resources if they receive or attempt to send photos that may contain nudity.
> Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple doesn’t get access to the photos.
This obviously does not affect me, but that was not the purpose of my post. My message is: if you have something that is normal for adults and you try to forbid it to teenagers, you will lose. They will have their revenge on you, and they will find a way to circumvent it.
For example, if I was still a teenager and had to deal with this, the first thing I would do: search and download a collection of thousands of "Harambee nipples" pictures that trigger this filter, and send them back and forth between me and my mates non-stop to annoy the crap out of my parents and generate noise. Second, I would try to add a "sticker" to one of the pictures until I find a shape, size, and position that bypasses the filter without hiding the nudity. When such a discovery is made, that person will be a hero among teens and it will spread like wildfire.
I don't believe this feature will have the intended effect and I estimate it will take less than 2 hours before it's "broken". My conclusion is that if you want to keep them safe, you have to think vastly outside the "inquisition" box and try education, not prohibition.
"Messages now includes tools that warn children and provide helpful resources if they receive or attempt to send photos that may contain nudity. [...] Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple doesn’t get access to the photos."
If a device I own (even if just nominally) does something I've instructed it to, and nothing more - that's fine by me. Computers doing their job, honestly.
If a device is programmed to report on me to someone and I haven't explicitly asked it for this - that's not fine, no matter the intent.
I don't love it but at least it's not that disastrous on-device "CSAM" scanning they were planning to roll out with this. That must have been one of Apple's biggest blunders in history. Apple announcing they would scan our phones for government banned content. Pants on head crazy.
>>And that the on device option was better for your privacy not worse.
No it wasn't. Previously, pictures weren't scanned directly on my phone; then they were about to be. That was a downgrade of privacy, not an improvement. The fact that it was only going to be applied to photos due to be uploaded to iCloud is irrelevant - you are still using my own device to scan my own photos, and if it fails some completely opaque check that I have no control over, you will report me to law enforcement (who will then do god knows what). Not to mention that during the analysis process my pictures would be shown directly to a human in a centre somewhere who would judge whether they are legal or not. Again, I'm really struggling to see how this is an improvement in privacy.
>>Until you realise Apple has been scanning your iCloud photos for government banned content for years.
Source? Apple has maintained for years that they don't scan pictures already uploaded to iCloud, since they are encrypted and Apple doesn't have access to them.
Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for the illegal images. The company says it disables accounts if Apple finds evidence of child exploitation material, although it does not specify how it discovers it.
Right, but they don't actively scan everything uploaded to iCloud, is my understanding - they will decrypt your iCloud storage when asked, but they don't scan the contents by default. At least that's how I understand it.
>Previously pictures weren't scanned directly on my phone, then they were about to be. That was a downgrade of privacy, not an improvement.
Absolutely not true.
When you scan on the server, like Google does, that information is open to abuse by anyone who issues a warrant, and I wouldn't be willing to bet that the warrant is a hard requirement.
>Google’s Nest Will Provide Data to Police Without a Warrant
Which part isn't true? Pictures aren't scanned directly on the phone anywhere, that's true of both Android and iOS. Apple's implementation would change that, again, making their privacy implementation worse not better.
>>When you scan on the server, like Google does
I don't see how that's relevant. Apple wasn't scanning on their servers(at least that's what they say publicly, I don't doubt that NSA has access to the data anyway).
>>Apple designed their system so that they don't have access to the scan data or the scan results.
That's explicitly not true - your phone would scan the photos, if it failed the scan, then the photo would be uploaded to their verification centre where a human would look at the pictures in person and decide to forward them to law enforcement or not. We can argue whether the "verification centre" is Apple having access to scan data/results, but I feel like that's splitting hairs. It's a downgrade for the privacy that you used to have with Apple, not an improvement.
>>Google's system is worse for privacy.
I have zero idea why you keep bringing Google up. I'm saying that Apple's proposed system is a downgrade of their privacy implementation, not an improvement. What google does or does not do is not relevant to that argument.
> That's explicitly not true - your phone would scan the photos, if it failed the scan, then the photo would be uploaded to their verification centre where a human would look at the pictures in person and decide to forward them to law enforcement or not.
You seem to be discussing the system Apple never implemented.
I am discussing the system that has just been announced.
Yes, in the system just announced, it's an opt in parental control where Apple has no access to the images being scanned and doesn't know the scan results, while Google is busily scanning all the contents of everything in your Google account.
Every major cloud provider scans their hosting for CSAM automatically. It's the cost of doing business. Either you do it or the feds do it for you. The automated way is a lot cheaper and easier.
If Apple could do it on-device reliably, they could encrypt their storage and there would be no "think of the children" -reason to allow government agencies access to the files anymore.
Again, as I said in my other comments - Apple has explicitly said many times that they don't scan your iCloud photos by default, because they are encrypted already. They will hand your encryption keys to law enforcement when presented with a warrant, but by default nothing is scanned.
Apple can scan whatever it likes on their servers. That is worlds away from scanning content on my phone. How on earth can you claim that is better for privacy? It's the exact opposite.
Their proposed system was to scan stuff on your phone as part of the upload-to-iCloud process. I.e. only stuff that was about to be on their servers (where it could be scanned) anyway would get scanned, and you could thus completely opt out by turning off their iCloud Photo Library stuff.
To me this winds up feeling totally fine as a trade-off, particularly if it wound up paving the way for actual end-to-end encryption of the photos, though I see that it bothers some people.
If Apple announced end-to-end, you might have a point. But they didn't, they still haven't, and there are zero indications they will ever offer end-to-end encryption for photos and videos. Given this, this would have been a clear step backwards for privacy, and it was Apple leading the charge this time.
Oh, I totally agree that the end-to-end part is speculation and tea-leaves reading. If it happened it'd take this approach from completely neutral to a net positive. (Again, in my eyes.)
Discord's detector is a bit weird, yeah, but it can be turned on or off by the server administrator. I turned it off because we don't get much abuse and we got some false positives.
You expect to get warnings that your phone will optionally blur your breadboard photos, unless you prefer it doesn't? Do you expect this will be a frequent problem?
This is the least charitable perspective you could adopt. If you looked at the prompts, it's clear that this system is to help initiate a healthy conversation with your parents/trusted entities about how to deal with and understand these types of photos.
1:1 conversation couldn't be more different from NSFW internet searches. A malicious party can use coercive tactics to have the child send them compromising photos.
I am tired of being charitable to these companies.
Children shouldn't drive cars, they shouldn't do drugs and they shouldn't use smartphones.
Would you be fine with a beer company making a "safe" version of beer for kids so they get hooked and once they turn 21 they switch to the real stuff and become alcoholics?
Making these devices more "child friendly" is the wrong solution to the problem. Actually, it's the absolute BEST solution if you are Apple and want to calm the parents down and sell more phones and services at the same time. Unfortunately I'm not Apple, and my objective regarding my kids is not to maximize profits and push complex technology down to the last semi-conscious toddler.
Not sure what reality you’re living in but almost all kids in high school have phones.
This feature is targeted firmly at that demographic, who generally aren’t going to listen to what their parents say on this matter, but who will suffer the worst if their photos are spread around the school.
Being comfortable with sexuality and nudity is distinct from sexual harassment, and I think this feature is fundamentally designed to prevent the latter.
Some nude content is wanted or warranted by the user, and some of it isn't. Apple isn't blocking this content, only optionally hiding it—almost identically to Reddit's NSFW feature for mature adults. The "Ways to Get Help" might overstep the bounds in some situations, but it might also prevent others from getting worse.
Often, children are reluctant to talk about having been exposed to disturbing content. While that may be a fault of parenting, it doesn't make this tool any less useful.
It would be interesting if you could opt into this for all conversations. I know people who get a high enough amount of unsolicited dick pics that they would consider turning it on. Of course the controls and messaging would probably need to be different (for example add "Always allow from this sender" and less emotional support)
I don't own any iDevices but it kinda bothers me that the world seems to close in on itself while at the same time breaking its own back bending over backwards to include _EVERY_ONE_[1].
So it’s an opt-in part of parental controls - if parental controls are involved, we’re already talking about devices that are under the control of someone’s parents.
The analysis is done on device, and no data is sent to Apple, and no data is decryptable by Apple (presumably this also applies to SMS but SMS is unencrypted and routed through carriers).
If the device does decide that a message contains nudity it blurs the image and provides “age appropriate prompts” which sounds like suggestions to contact their parents.
That means they have not given up on the whole direction of on-device image scanning, the idea of clandestinely reporting your device images [1], that they (temporarily) abandoned due to a backlash. There must be an internal censorship struggle within Apple, and it's unclear which group will win. I personally skipped their M1 laptops when the news of [1] came out. Was waiting for M2 14" pro after they abandoned it, but now will probably go with an XPS13
Only the images being sent, from a child's phone, with the optional feature turned on, are scanned and, if flagged, blurred. There is nothing about clandestinely reporting your images to anyone.
Before the whole Anti-CSAM debacle, one legitimate concern was that iPhones are more commonly used for CSAM than other platforms and there was a perception that Apple wasn't doing enough about it.
Their stance on privacy was opening them up to attacks by the 'think of the children' types, weakening both its brand and political capacity to withstand pressure to weaken encryption, create backdoors etc.
This feature is an impressive counter volley in that space. It doesn't compromise security or privacy of normal users, it's optional and should be reasonably effective. But in creating this, Apple insulates itself from the suggestion it is doing nothing to help protect children.
So this feature could actually give Apple a face saving way to discard the creepy Anti-CSAM idea altogether and win it enough clout to keep fighting government pressure generally.
It's a net win for privacy. We need to pick our battles and not reflexively go after Apple with 'this is a dry run for on device scanning/police state' type baseless accusations.
Why always only nudity? Why not everything violence, for example weapons, dead people, whatever? Children should eventually have good sex when they grow up but they should not kill people. Still they get exposed to a lot of violence and comparatively little healthy sexuality. Why can they drop anvils onto each other's heads in cartoons but not fuck each other?
The worst case for both, as far as the effect of seeing the image(s) per se, is very similar: maybe you have an image in your head you wish you could forget. OK, not ideal, but also not the end of the world, and sure, the two are similar in that regard.
The difference is that images of violence aren't as strongly associated with things like sharing of personal images that haunt a person later, with really disgusting and effective forms of bullying, or with efforts to get kids to do something harmful IRL. Could they be? Can I concoct such a scenario? Sure, but these other things are in fact happening a lot (ask any teacher or other person who works with kids ages ~12 and up).
That's why specifically when it comes to Internet messaging services and kids, nudes are a bigger problem than images of violence. I agree with you more broadly, but the context is important here, and is why in this narrow case the two things are meaningfully different, and nudity is the bigger problem.
Why is nudity such a big thing? Sorry, but I don't get it.
Maybe I do not understand this because I am German?
My children (when they were little) came in contact with content that disturbed them a lot. The most troubling of it was video game and movie previews with violent and gruesome content, which got to them wherever the environment was somehow unconstrained. Sometimes they couldn't sleep! Darn - I myself really hate that stuff I have to look at every day - and I am 52!
Maybe nudity is a problem - ok - do something against it! But the other stuff - we don’t care? Really?
This feature isn't really aimed at nudity in general, but rather at specifically sexual targeting of minors. The scenario it's most clearly aimed at is "a creepy older person tries to trade naked pictures of themselves with your 8 year old". Though also "before you share this picture of yourself with your boy/girlfriend, think for a second first".
Can't speak for Germany, but in the US there's been a lot of furor around school-aged people having some pretty severe troubles after nude photos they shared with a partner got shared more widely -- suicides, etc. We could certainly argue that the fact they're facing trouble because of such things is itself a symptom of a problem in American society, but a first-level solution of warning people before they get into the situation seems like something Apple can do, whereas broad cultural changes are outside their power.
So, this article is not about using Messages in a way that could get you safely through Russian filtration camps (in case you haven't managed to evacuate before your city was occupied), where your phone is thoroughly examined and both having some suspicious messages and not having any at all could get you into trouble (meaning torture/death). It's about nude photos.
So, a kid receives a message containing nudity. How does Apple know that the viewer is a child and not an adult? Because the only logical answer I can come up with to this question is that Apple is using the phone's camera to see who's using it at that very moment. Which raises the question: "isn't this an invasion of privacy by Apple?".
Also, yeah, all the people working at Apple are all saints and they would never abuse this. /s
Please read the linked article before making baseless claims:
> The communication safety feature [...] is available to child accounts signed in with their Apple ID and part of a Family Sharing group. This feature is off by default.
In Family Sharing you can designate Apple IDs in your family as child accounts. An Apple ID is mandatory to use iMessage, so that's how Apple "knows".
This feature appears to be off by default, which is good. Helicopter parents will undoubtedly turn it on, but without this feature, perhaps those parents wouldn’t even let the child use the device.
That's so American. Ban every nipple and every inch of other "prohibited" body parts.
At the same time in Europe you go to a nude beach and there are whole families with kids and literally nobody has problems with it. Clothing is optional. No one bothers nude people, no one bothers clothed ones.
In Germany or Sweden you can get your [banned by Apple] out in a public place not designated for nudity per se and this is just normal.
You raise a great point here. I’m born and raised in Europe, still live here now, but I’ve spent a great deal of time in the US, and the differences in how we perceive children might be the biggest gap between the two cultures.
I much prefer the European view of children being raised as young adults, seeing the world for what it is, rather than the American idea of creating a puritan parallel society scrubbed of all the “bad” things.
Looking back on this I’ve found that the European model raises better critical thinkers – partly due to the school system as well – and you end up with a more sustainable parent-child relationship when the children are grown.
In a world where US culture basically shapes the design of the tech products and social media we all use, this scares me.
> In a world where US culture basically shapes the design of the tech products and social media we all use, this scares me.

This is a key point. Today we are all so progressively aware of "cultural hegemony" and wallow in the shame of colonialism. But we don't see its new and more powerful forms.

The prevalence of digital technology means that companies like Apple and Google imposing their values is a real problem. And let's be clear, these are often very parochial values.

It could be said that Europe's digital split with the US is only about things like privacy and data control on the surface. Underneath, more subtle cultural forces are at play.
> The prevalence of digital technology means that companies like Apple and Google imposing their values is a real problem. And let's be clear, these are often very parochial values.
And sadly, beneath all the headlines about AI sentience, this is what Blake Lemoine was actually trying to draw attention to, and that executives consistently dismiss these kinds of concerns [1].
What of the consequences when these corporate values become embedded in AI that plays an ever greater role in our lives?
I certainly prefer the non-Bowdlerized world for children, myself, having grown up as such. But if that approach produces any superiority in thinking, it doesn't appear to be in action, considering that California (not unique in America in its cultural puritanism) outperforms most comparable European countries. It certainly beats Germany at innovation, maternal mortality, income, and wealth, and has comparable gross product and life expectancy.
Considering that Californian children are no less sheltered, I find it hard to believe that this effect on thought is meaningful.
It's really easy to mistake pet cause X for the driver when it's really a proxy correlation for more fundamental things like being poor or having low intelligence, whether due to environmental damage or neglect.
Right, but the US is happier than most European countries by the last measure I saw https://worldhappiness.report/ed/2022/happiness-benevolence-... (comparable to Germany, better than France) and when you take a state like mine (California) it beats all but the Scandinavians, which, sure I grant you they seem to have a good and happy society.
So, if you take Germany, Californians are:
- happier
- more wealthy
- more innovative
- have fewer mums die
Now, either the better critical thinking described isn't manifesting in any life outcomes, or (and perhaps this is a question worth asking) it isn't better after all.
I think another key factor to happiness in California is the weather. It's hard to find a place in Europe that is top-tier economically and has nice weather year-round.
No matter how good life is in other respects, it's tough for some people (myself included) to feel good when the weather is cold, gray, rainy, and dreary for months on end.
Sure, I was asking whether the better critical thinking was worth it if it didn't lead to better life outcomes. It looks like, if there is better critical thinking in Europe, it's not leading to better life outcomes. So that means one of two things:
* better c-t does not lead to better l-o
* the c-t is worse
If you feel that World Cups are a better measure than maternal mortality rate, then have at it. The measures we choose are arbitrary, so we can just lay them all out. I'm happy to accept relevant metrics if you have them to show. Personally, I think a society with better critical thinking overall would be able to, given resources, reduce the number of dying new mothers but if you believe there is no relationship I am happy to just accept metrics you offer.
California and Germany are both rich and relatively successful. The reasons for it have nothing to do with their respective stance on nudity. It makes no sense to compare metrics here, they are useless, because there is no causation. You just have to look at poorer states from both the US and EU with the same ideas about nudity but wildly different economic outcomes to see that.
Their success is based on cars, an engineering tradition, the US dollar, the biggest local market, DARPA, the protestant religion, competing nation states next door and a hundred other things.
What I personally think the handling and censorship of nudity might actually influence is democratic discourse, mechanisms of what shocks, what should be censored, what works for populism, what is seen as freedom and what ultimately gets elected.
Money is the be-all and end-all for some people. It's an incredibly harmful ideology that hurts not only citizens but also the environment and ecosystem.
I think you'd have to further sub-divide California to get the full picture. LA and the Bay Area are worlds unto themselves in terms of culture, and probably do have more in common with Europe than Iowa. The rest of the state is much more typically American.
Having raised a 3 year old in the Bay Area (SF and Albany/Berkeley), I've anecdotally noticed a lot less of the puritan/sheltering attitude you're talking about than in other places in the US. I'd say a moderate version of 'free range parenting' is actually the dominant trend these days around here.
Picked an appropriately sized thing to compare considering the variance in US-state laws and EU-nation laws causes whole-US and whole-EU comparisons to be noisy. Also I live in Cali and know it better than I do the rest of the US.
The 11.7 is the PRMR measure from CA-PMSS, which is far stricter than the definition used in Germany. The corresponding rate using the same definition as in Germany was ~6 / 100k in California.
Easy mistake to make, though, since you have to be aware of definitional differences.
The easy mistake is taking US-skewed units of measure for granted. I simply posted official stats from international organizations that normalize data across countries and use standard methodologies.
Anyway, your numbers are not better than Germany's, but also:
What country has the highest rate of maternal mortality?
The U.S.
Key Findings: The U.S. has the highest maternal mortality rate among developed countries.
Average US mortality rate is 17/100,000
Average European mortality rate is 8/100,000
Remember that Europe has at least a hundred million more people than the US, and it includes countries far less developed than any US state.
Yet the US average is twice as bad as the EU's and 8.2 (EIGHT!) times worse than Belarus (BELARUS!), Poland, or my country, Italy.
You know who's got the same stats as the US?
Russia. Yes. That Russia.
The US maternal mortality rate is the same as Russia's, despite the US spending billions upon billions more on healthcare.
Despite Russia being considered a third-world country from the American POV.
To put things in perspective for you: Turkmenistan's MMR is 10, Iran's is 16, Tajikistan's is 17 (same as the USA's).
Could a certain mentality that considers women's bodies "unholy" be the cause of it?
It seems like you have written quite a few comments engaging with people here in an attempt to demonstrate that, by your own data, a difference in maternal mortality of 1 in 100 000, when comparing a single American state to a single European country, may be evidence that European nudity-tolerant attitudes to child-rearing may not result in improved critical thinking skills in European adults.
If this doesn't seem like a stretch to you, by all means continue.
I am told it comes as quite a shock to the uninitiated. I usually don't think twice about the giant billboards of Bible verses and sermon snippets as I drive to work (Northeast US), but when I do notice them I am struck by a small pang of dystopian dread.
My biggest shock was in Canada (Ontario), you drive around and you see churches and dentists everywhere, people come talk to you on campus to talk about religion, even my roommates would try to talk to me about religion every chance they got. Definitely a culture shock for a European.
Yeah, for some reason I had never seen as many dentists as in Canada, haha. I think in general this is true of North America: people REALLY care about their teeth (teeth whitening, flossing, dentists scamming you left and right).
Making normal things like breasts shameful seems to go hand in hand with the kind of self-loathing bullshit that shapes people into useful corporate or government drones.
Move at the sound of the bell. Punch in punch out. Be ashamed of your body. Loyalty to company/flag is supreme. etc etc
I believe for a large part of the world homosexuality is far less controversial than homophobia, and thus it makes sense that Disney would want to showcase same-sex couples to declare loudly that they are not homophobic.
Whether this is sincere or opportunistic is open to judgment; one could cynically note that the large part of the world that really doesn't like homophobia is a very high-potential market for Disney.
March 4th, 2022: “Disney to Still Fund ‘Don’t Say Gay’ Bill Backers, Will Support Gay Rights with ‘Inspiring Content’”[1]
March 9th, 2022: “Human Rights Campaign Refuses Money from Disney Until Meaningful Action is Taken to Combat Florida’s “Don’t Say Gay or Trans” Bill”[2]
March 11th, 2022: “Disney pauses political donations in Florida, CEO Chapek apologizes for silence over ‘Don’t Say Gay’ bill”[3]
I don’t think it’s the first time, either. Not for them nor other big names[4]. I wouldn’t find it unreasonable to believe they’re being opportunistic.
I'm struggling to understand what exactly is "bad" about homosexuality? Why does a consenting same sex relationship between two adults need "scrubbing" from Disney?
What you wrote certainly calls homosexuality controversial, but then immediately includes it in a list alongside violent behavior and drug use. Rhetorically, that equates it with the “bad” things.
Unless, of course, you’re trying to say that violence and drug use are “controversial.” But that isn’t clearly implied; you have to read much more into it to come up with that interpretation.
I was trying to figure out "good multiplications" and I think what you meant was "good products." This is a lovely mistake to make, since multiplication is taking the product of something. Anyway, nothing useful to add; I was just trying to figure out where that phrasing came from and was delighted.
It's not quoting, it's paraphrasing. The comment was very suggestive in that homosexuality is bad. If that was not what you meant, you might have chosen less suggestive text.
While I understand the criticism, this is not what Apple is targeting. Unsolicited nudity is usually used for harassment. I assume that in Europe, too, sending unsolicited nude photos to adults and kids isn't okay.
Apple's new favorite justification for surveillance is protecting kids. Ban all knives because someone can use a knife to harm a child.
They wouldn't do it if it weren't profitable in some way or another. But had they given other reasons for their actions, there would definitely be more backlash about their snooping.
People generally know that protecting the innocent is a good thing, and Apple exploits this to push its agenda under the guise of child protection. Again.
… you mean Apple provides a feature in its parental controls to allow parents to limit nudity being sent to their kids devices?
Like yes: the explicit purpose of parental controls is to allow parents to place limits on what their kids can do on a device, and in America specifically protect them from the concept of people existing under their clothes.
But I fail to see what Apple is doing here that you find offensive? Did you read the article and description of how the feature is enabled and works or did you just decide “Apple is evil and is choosing what I get to see”?
Of course. "Think of the children" is the apex of hypocrisy.
If the US (and American tech companies) cared at all about children, or human life in general, they would do something about guns.
As a European, seeing videos on YT glorifying guns and gun culture is extremely shocking. I would let my kids run around naked all day, anytime, rather than letting them watch one video showing people firing an AR-15 like it's the most normal thing in the world.
Your European country exists as it is today because brave citizens worked with or near guns to defend your ideals. Gun education and gun safety is not necessarily glorification.
YouTube is a cesspit though and I can totally see some overcaffeinated "content creator" churning out videos that misrepresent and pervert the otherwise sane tenets of firearms education.
But - there's the "think of the children" hypocrisy again. Think of the children, and "do something about" (i.e. ban) guns. Or ban videos about guns?
Kids can look at genitals all day but heaven forbid they watch a video about a gun? Or play video games that feature guns? Where's it end? We've exchanged one instance of hypocrisy for another. No thanks!
Free humans own weapons or at minimum aren’t barred from doing so, and can legally defend themselves from legitimate threats when civil authorities are unable or unwilling to intervene in the moment.
Culture is a different thing to discuss, sure, and glorifying violence is bad - which is what some of those videos do, though not all of them.
Still, we see countries like the Czech Republic where firearms may be carried concealed, nearly anywhere, with licensing on a shall-issue basis: there are relatively reasonable conditions not far from US-based shall-issue permit requirements (for the US states still even requiring a permit), as well, and CZ doesn’t seem to have issues.
Education about gun safety, and just general personal responsibility, used to be a thing in the US but the “education system” here fails us these days.
Can you tell me even one possibility for how Apple will profit off of letting parents choose to not allow children to send or receive nude photos? Besides increasing the chances parents choose Apple devices for their kids, which is really something you could say about any feature.
Apple is one of the few companies - with more money than God, or at least the Vatican - that I’m willing to assume might do something on “principle” rather than “to maximize profits”.
The fact that their principles might happen to be at-least-mostly-revenue-neutral is another discussion, though, I suppose.
Do you realize the irony of saying that Americans are overprotectionists, using Europe as a comparison, when the EU may be the most sprawling, onerous, and protectionist regulatory state to exist on this planet?
From the other perspective: democratically elected representatives banning harmful chemicals used in children's toys vs. private companies reading and censoring your conversations.
I have to assume you're intentionally misrepresenting what is happening here. Why would you do this? Do you not think your point is strong enough to stand on its own without lying about it?
Everything happens on-device, with no use of Apple servers, and this is clearly called out in the release above. And beyond that, this is a feature that is only enabled by the parent, who society already accepts has the power to censor what their child views.
What you said would be like saying that the GNU foundation is reading your conversations when you type something into emacs.
It's reasonable to consider that the majority of well-received (wanted) nude photos may also have been unsolicited.
Pay closer attention to the innocents swept up by generalizations.
Laws which make a just action treated the same as an unjust action are foolish and, themselves, unjust. Anyone who uses "usually" to justify an enforcement which applies always is being disingenuous and careless with rhetoric which revolves around possibility and treats it as hyperbole. There are profoundly important reasons why people are to be treated in a certain, naively trusting manner until proven guilty.
Sending an unsolicited picture of a concerning rash on your kid's butt to his doctor should not be intercepted by Apple.
If this feature were designed to prevent unsolicited nudity, it would be far easier to turn off. It probably shouldn't trigger for people in your contacts at all, and it should have an override button for everybody else. This is not the case, and the feature is even triggered when sending messages with such photos, so this is clearly not its intended purpose.
Start off by reading the linked article. The feature is opt-in, and has to be manually turned on. It only triggers on child accounts where this is enabled. It is designed to protect children from being taken advantage of by stopping both incoming and outgoing nudity.
No idea how one child sending nude pics to another child can be described as taking advantage of anyone. Who's the perpetrator here? And while the feature is opt-in, it's opt-in by the parents, and has no granular, per-message override when the child knows what they're doing. While a general dick-pic content warning is a good idea, there should be a "see anyway" button.
I'm sorry, I'm no happier about this feature than you seem to be, but...
> there should be a "see anyway" button
If you read the article and look at the pictures, "View photo..." is right underneath the blurred photo. It may not be called "see anyway", and it may not be a button per se, but it has the exact same effect.
Even on the receiving end, I could imagine someone sending dick pics or whatever after having already befriended the child, and already being in their contact list. Better would be an “ignore future messages from this person” button, while keeping the warning for the first time.
I think you need to qualify "in Europe", as a European it is certainly not true of any place I ever lived "in Europe".
That aside, it would be nice if we could survive just a single thread about a child protection feature without endless absurdist arguments and strawmen being offered.
I appreciate the reply to this comment indicating nudity is often unsolicited. That's very true. A productive thread regarding a feature like this might include discussion of how to limit its scope, or how it might be safely extended to other kinds of harmful content a child might want to avoid. A pointless thread might be one where the heart of the tech industry continues to ignore the fact many new parents ban all unsupervised Internet and new media usage for their children well into their teens due to problems like this remaining unsolved for decades now, and the industry being utterly incapable of having a sober conversation about it.
UK is definitely on the American side of things in this. Lived here for a long time now and there's definitely a lot of that "oh no a child might see a nipple, how awful". But yeah, in Germany, Czech Republic, Slovakia, Poland, Spain, maybe even France - people would care a lot less. Especially in Czech Republic and Slovakia - you go to a public swimming pool, people just change in the open, children included, no one minds or cares.
(obviously that's just my own travel experience, I don't doubt there are places in all of those countries where that's not true)
The UK is culturally isolated, geographically isolated, and politically isolated. They are about as European as Mexico is North American which is to say it's technically true, but practically false.
Why do children need to be "protected" from nudity? What's so special about nudity?
The reason people try to control children's access to tech and social media isn't because they might see a nipple (lol) but because it's addictive and prevents them from doing anything else.
If we really want to "protect the children" we should start by dismantling the FAANGs.
They don’t need to be protected from nudity in and of itself. However if they are talking to an adult or a stranger randomly messages them, it would be nice if the device blurs pornographic imagery.
If they are looking for porn themselves (e.g. are of the age to care for such things), they can download it themselves via Safari.
But generally speaking there’s no need to trade this stuff on iMessage.
This is precisely what I meant by pointless: outside of 20somethings and certain locales (particularly large cities), alternate lifestyles exist where parents simply may not want this for their children.
I'd suggest arguments for or against any particular culture or lifestyle, or attempting to deny they exist, exceed the scope of discussing the feature itself.
I don't let my kids on the net unsupervised not because of things like this, but because of social media and scammers. Honestly, I'm more worried about them getting a call to fix their computer because it "has viruses" than anything else.
> I think you need to qualify "in Europe", as a European it is certainly not true of any place I ever lived "in Europe".
Nudist beaches and naturists' resorts are everywhere in Europe, even where it's less obvious, like in the former communist bloc.
EDIT: if we are talking about kids' nudity, like changing them in the open, in front of everybody, that's never been an issue in my almost 50 years on this planet as a European living in Europe.
I'm floored this is at the top of the comments. But I guess "I didn't read the article. America bad." is what gets upvotes.
I have a feeling that sending nudes to a child's phone isn't exactly lauded as "natural" in Europe. Of course I'm a prude, unnatural, and I guess violent American though so what do I know.
Not to mention it's an opt-in parental control feature for children's accounts.
In art class in an American high school, it was common to discuss and share digital images of famous nudes. I’m sure this system won’t interfere with that at all, because of course the only nudity people might see or share is bad, right? In case my sarcasm isn’t clear: of course it will interfere with art classes. Imagine if the art under discussion is photography: then students and teachers will be sharing real nudes, which are completely appropriate under the supervision of adults. Pornography, like artistic creativity, is contextual and in the eye of the beholder. Here the eye is a black box of AI. This is horrifying. Likely it will mean this type of education simply won’t happen in these spaces. This is a direct and immediate limitation on free people.
The primary problem with censorship isn’t usually the core motivation. We can all agree that adults shouldn’t send porn to kids, and kids shouldn’t send nudes to anyone, etc etc. Sadly we don’t all agree on what that means - the law is a poor substitute for thinking and understanding context.
Automatic policing will lead to automatic prosecution, we have seen this with Facebook intercepting messages and automatically dispatching relevant authorities. In the case of suicide risks, many agreed this was a public good. Is it a public good when it is done for pot which is illegal federally even though it is legal in California? Do we really want a private company making these choices? Do we really want them in the middle even if it’s “on device” when the device isn’t even open for meaningful inspection?
It is the infrastructure and the corner cases which create a chilling effect on society. Once this is done in one place, it will expand to others.
Do you really think that Saudi won’t push for this on every woman’s telephone? Why limit it only to children? And why limit it only to being opt in?
Do you really not see the future you’re endorsing and that apple is building based on profit motives which are dressed up as “obvious” moral protection schemes?
Remember: the husband in Saudi is the legal guardian/censor of his children but also his fully grown (usually but not always) adult wife.
This can technically expand to textual rather than image censorship as well, we should stop it before that happens. Your device is your own personal space as much as the device is yours or as permitted by law (eg union organization on company time and company computers is sometimes protected).
Providing solutions to do automatic policing on your own device is insanely totalitarian. It isn’t a solution to doing it on company servers which also should not happen. We should not be building this technology and we should regulate the companies to uphold basic civil liberties, especially when they are a de facto monopoly.
I struggle to believe that you read and understood this article before sharing this view. This is an opt-in feature, specifically for use with the existing parental control system, that enables pretty light-touch and overridable prompts if a child receives something that looks like nudity. It's clearly targeted at unsolicited nude pictures being sent to children, and differential cultural views on public nudity have essentially zero relevance to that.
Now instead of an interesting discussion about this feature—why it might be good or bad, the privacy or social implications of it, and so on—we just have a chain of fucking nonsense comments unrelated to it.
So you're OK with your child receiving unexpected messages of erect penises? Because that's what this feature is actually about if you read past the headline. Not "banning" anything but putting users and parents in control.
you can extend this to everything. are you ok with your children receiving gore pics? are you ok with your kids receiving anarchist cookbooks, or communist propaganda? are you ok with your kids receiving random messages from anonymous users? if not, i guess just whitelist the contacts your kid is allowed to interact with. but i myself prefer to talk about the world and explain how things work and what they can expect to encounter growing up, rather than raising your kid in a petri dish. eventually they will see an erect penis :)
I have a 5 y.o. daughter and two nephews: 6 and 11 y.o. I like seeing them grow in real world.
There will always be perverts sending nudes to kids one way or another. Where I grew up (before the internet), we had them arrested near schools.
Pretending the world is all unicorns and rainbows is not the way to raise your children IMO - the hard truth is going to hit them sooner or later. Talk to them, treat them as young adults, and explain what is good and what is not, and why.
Fun story: I'm into crossfit semi-professionally and I've recently been very pleased to hear my 5 y.o. telling her friends that they should not eat that much fries and coke if they want to stay healthy and suggested they try fish and chicken instead.
>Fun story: I'm into crossfit semi-professionally and I've recently been very pleased to hear my 5 y.o. telling her friends that they should not eat that much fries and coke if they want to stay healthy and suggested they try fish and chicken instead.
I actually find it scary. I believe that children need to experience the maximum of things, and as a French person I believe they need to taste all foods, however unhealthy they are. Schools are already pushing a lot of propaganda on children about saving the planet. I can see the next step being to tell them that meat pollutes and they should go vegan to save the planet.
My child is sometimes telling me things like this and I tell him that he is totally healthy and does not need to worry.
that's a good stance to have, and I would argue that the kid tried to concur:
>> eat that much fries and coke
>> try fish and chicken instead
"that much" (!= never) and "try" (!= have to) seem to be key in the dialogue here, although from these excerpts alone we get a highly decontextualised version. nonetheless, to me it sounded like they always indulge in the same food.
> experience the maximum of stuff ... need to taste all the food
Now replace food with drugs; how would you feel? Coz TBH some food out there borders on being drug-like in terms of addictiveness, both when ingested and in how it's advertised.
The way we raise our child is: try food, sure, but be mindful of what you eat. We're certainly not opposed to bad food once in a while; forbidding it would only make it more alluring. Also, being healthy right now doesn't mean bad habits can't creep in, and healthy habits are tough to instill after the fact, once bad habits have taken hold.
For healthy food you just need to train them to eat vegetables, and that is easy if you start young. My kids have no issue with vegetables at all, or with food in general, actually; they are not fussy. But I will not disallow junk food; anyway, they are not so fond of it.
>Now replace food with drugs, how would you feel? coz TBH some food out there borderlines on being drug-like in terms of addictiveness, both when ingested and how they're being advertised.
Well, I also have a big one, nearly 18, and I don't understand him at all. He spends his free time at the gym; he tells me he goes with his friends. At his age I was trying drugs and binge drinking. I'm in total disconnect here...
This strongly depends on the age of the child, and their maturity. I have an eight year old, who currently doesn’t even have a phone. Once he does I will absolutely be whitelisting contacts, approving app installs, and turning on functionality to limit access to content unsuitable for a child.
That’s not to say I won’t also be talking about the world, and things they’re likely to encounter. Those conversations are happening even now; he’s not being raised in a petri dish, but equally I’d quite like to avoid throwing him directly into a vat of toxic waste unprepared if I can avoid it. Ultimately, children are just that, children; they’re not renowned for solid decision-making skills.
to clarify: apple's feature looks reasonable to me. 4 and 7. ... and now i'm questioning my previous comment in my head about how certain i am about what i wrote :)
Between 15 and 23. It's easy to live in a utopian ideal where you say "We'll talk to them about all of this, and they'll be rational and see sense", but having spent the last 3+ years firefighting the impact of their spectacularly stupid behaviour (all 4 kids are above average IQ, but all of them have ASD diagnoses, all of them fall into the 'loud, confident and wrong' category) - a lot of which has involved impact with people whose opinions are along the lines of free speech absolutists mixed with 8chan lunacy - whenever I see people crowing about how terrible all these ideas are, I think they probably haven't spent sleepless nights comforting their child's mother because one of the kids has spent another night in a hospital, institution or police custody because of undue influence placed on them whether it's because of social pressure, or because they've been convinced by a nutter to send nudes and use that against them.
I'm well aware you can't protect them from all these things, but a lot of people who have young kids seem to think that as soon as they get to school it'll all be OK. My experience has been that the games get harder and stakes rise drastically, to the point where I often consider that killing myself would be a logical way out.
It is not OK that children have access to adult phones and adult Internet.
Apple should fix this by creating a kid mode, where the user can only contact people approved by parents and visit sites approved by parents or conforming to kid safety requirements. In this case the children will be perfectly safe.
Why Apple doesn't do it? I guess because it is unprofitable, it requires major investments and doesn't promise any returns.
Kids must be banned from the Internet completely instead of trying to patch different issues here and there.
These features exist! Apple includes parental controls under "Screen Time" and they work very well. You can whitelist contacts and websites, and even adjust them remotely as the administrator of a child's device.
You can do everything you said when you manage your kids' accounts (manage contacts, whitelist internet websites, define appropriate content ratings, and so on).
> You can do everything you said when you manage your kids' accounts (manage contacts, whitelist internet websites, define appropriate content ratings, and so on).
Have you tried? It's a dumpster fire.
Child requests app purchase > I get notification > It wants my AppleID password > I go to password manager to get password, but this dismisses the notification.
I believe it's getting a bit of love this OS cycle, but it's hopeless currently.
This is not enough, because in reality (I guess) most parents don't bother setting this up (or don't know how), and as a result kids stay unprotected. Maybe there should be a law requiring such settings to be set up, and a one-click option to switch to a restricted mode without having to toggle every checkbox.
They do. iDevices can be set up with a managed Apple account, which allows whitelisting contacts and accessible websites, and requiring permission before installing apps.
Nah, an opt-in feature like this is better, because it's actually usable for parents. The all-in, everything-locked kid modes are too much of an annoyance for everyone involved to get used.
Maybe a locked mode for children should be required by law, because many parents are too lazy or not competent enough to set up a phone properly for their kids.
If a parent is worried about something like this, then wouldn't it be smarter to go for a phone that has no image sending/receiving possibility?
Or if they are no longer available, shouldn't there be a parental control of a white-list of contact numbers that can show more than just the text in the message? This automatic filtering seems like a slippery slope.
Yes, I'm totally ok with that (I have three kids). What's going to happen? Will they faint? Die? They will probably laugh.
In Rome there were statues of Dionysus sporting a huge erection everywhere. It wasn't a problem.
In Italy today still, you can find postcards of him in the same position. (Granted, they are statues, not images in the flesh, but they're pretty realistic.)
I would have a problem with an adult sending nude pictures of themselves to my kids. But I would confront them myself, I don't need Apple's help in preventing it from ever happening like it's a horrible risk.
It's nothing. Really one of the least important problems ever.
Probably from what follows: “ It's nothing. Really one of the least important problems ever.”
Just because you feel that way doesn’t mean others do. Everyone parents differently. This is an optional feature that you don’t have to turn on and is off by default.
>Just because you feel that way doesn’t mean others do. Everyone parents differently.
The same applies to the parent poster I quoted, who was then told they must support sexual predators because of their different parenting.
They didn't tell you to how to feel or tell you how to parent. They stated how they feel and how they parent. You don't have to agree, but that doesn't mean they support sexual predators.
>This is an optional feature that you don’t have to turn on and is off by default.
Yes, that's been well established, and I don't think anyone in this comment chain has said otherwise.
Firstly, why specify 'erect penis' as though that's where the line is crossed? Does the question only apply to parents of girls, since they're less likely to have seen a penis before?
But yeah, I wouldn't mind all that much; it would raise a red flag for sure, given that many grooming operations work by starting out as raising children's curiosity, but I do not see any inherent issue with my child seeing a penis (erect or not!).
>At the same time in Europe you go to a nude beach and there are whole families with kids and literally nobody has problems with it. Clothing is optional. No one bothers nude people, no one bothers clothed ones.
At least the ones I've seen in Germany have a nude section and a non-nude section. It's usually not mixed.
You can also show your dick in non-nude sections. You have to be sensible about time and place, but the likelihood of someone complaining is very low if you do it near a lake or the sea. Pretty high when you do it in the city center or in church.
At least it was common in the region I lived, there are more prissy places though. I wouldn't do it anymore when everyone has a smartphone though even if taking pictures will get you in huge trouble too.
Europe is a big place; maybe that beach was that way, but it's definitely not the norm. Naturist beaches are usually separate, or there is a section that is reserved for naturists. In Sweden (where I live), even topless sunbathing is unusual nowadays.
That's weird, because in Spain - where topless sunbathing is nowadays absolutely normal on every beach - it was "imported" by the first northern-European tourists (legend usually says Swedish) during the late Franco dictatorship.
Been to Gran Canaria recently (technically Spain), and every beach had topless women sunbathing (even the big city beach). Don't recall seeing fully nude adults though, as those beaches will be segregated IME.
Whenever the conversation veers this way, there are always people who pipe up to talk about how American it is to have any concerns about nudity, and how that is supposedly a bad thing. Which seems extremely European- or western-centric.
Having issues with nudity is absolutely not "American". Nor are Germany or Sweden moral compasses by which everybody else should strive to live like.
Look around the world, there are many countries and billions of people around Africa, China, South East Asia, the Middle East, that are as "prudish" if not massively more so than USA. Even with a European or western-centric myopia, does it really hold? How about Poland or New Zealand?
In any case, I don't see what the point is. USA is overall not as liberal with nudity as some European countries. So what? Germany has blasphemy laws and laws punishing people who talk about things, is that normal or natural?
Photos are not natural. From that basic fact you can extend any activity surrounding them to be unnatural.
Kids seeing themselves or each other nude is pretty normal. Kids are curious. That being said, luckily none of the awkward curiosities we entertained when I was a kid had any chance of ending up on some kind of iCloud permanent record, so maybe this is for the better.
Who's banning anything? This detection is an option for parents on devices that they already have control over. And all it does is blur the image so you have to tap to reveal it, as well as include a link to support resources if the child didn't want to see the nude photo.
> At the same time in Europe you go to a nude beach and there are whole families with kids and literally nobody has problems with it.
Well ok: if you specifically go to a nude beach, I'd expect you'd see naked people and nobody going there having any issue with it. That's why it's called a "nude beach" right?
But they're not common. In France where I go the most often, out of 8 kilometers of beaches on the district, there's like 100 meters (1/100th of the coastal area) where nudism is allowed. And it's not even on a beach: it's all rocks. The town hall picked that one spot, far away from the beaches, to isolate the nudists. It's tiny. There may be a few famous nudists spots around the country but they're not common.
Still in France: there are only two cases I can think of where a woman showing her nipples in public is considered normal, and those are the monokini on the beach (but not full nudism) and breastfeeding. You don't see women walking topless in the streets.
Monokini on the beaches is allowed everywhere in France as far as I know but it's way less common to actually see women in monokini than it used to be when I was a kid in the eighties.
I'm in Italy on the beach right now and I'm stunned because I haven't seen a single nipple yet. I didn't know they were prudish. On a beach in France I would see nipples every day.
Child abuse is a terrible problem, but big tech and politicians are using it as leverage to erode all of our right to privacy. What's next, let's monitor all text communication to stop perverts texting with your child? It will be hailed as another great step towards fighting paedophilia.
The problem with the "child abuse" angle is that protesting against these changes turns everybody against you, because normal people don't have anything to hide. "Why do you hate the children?"
We are losing all of our right to privacy with a massive applause.
This is the major problem when trying to educate people about anything privacy related... Living in Australia I've watched the government basically grant itself the legal right to do whatever the fuck it likes with computers and communications over the last few years, and its been a horrible outright depressing journey.
Once people internalise the "nothing to hide" argument they begin to reach for the question "what do you have to hide that makes you think this isn't ok" before they are even prompted with the usual arguments about it only being for finding drugs, guns, terrorists, pedophiles, etc
The fact this is on device and off by default is good because that's how this sort of customer feature should be built, like being able to install a DNS filter or other website filtering software. If you're worried, these options should be available to protect your children... but sadly its a small step from "these are available" to "why didn't the government make them turn it on before I bought it for my child and didn't turn it on before giving it to my child"
I'm not getting into the middle of the debate on whether or not this feature is good/bad, but I think this is a pretty easy question to answer.
There is now a feature that scans every photo you receive. There previously was not something scanning every photo you receive. What used to be a conversation between 2 people is now a conversation between 2 people and an Apple blackbox (assuming you turn the feature on).
As with most of these types of tech, the proponents of the "this involves erosion of privacy" aren't necessarily concerned with the exact implementation as described on release, but with how the blackbox works and how the blackbox (and the laws/regulations/obligations around it) will change in the future, and the inability to change the settings of the blackbox, effectively letting Apple become the touchstone for what is 'appropriate', even if it doesn't align with your view of 'appropriate'.
> There is now a feature that scans every photo you receive. There previously was not something scanning every photo you receive.
1) Seen the automatic-text-recognition-on-images features in recent iOS? It's already scanning ~every photo you look at, and not just in iMessage. If you put an image in Photos, it's also doing some object recognition on it (which you can see if you search for e.g. "books" or "chairs" in Photos). Dunno if it does that elsewhere, but the text-recognition thing, yeah, that happens all over the OS in all kinds of apps. Maybe (probably) you can turn that off, but this feature folks are complaining about is opt-in so that shouldn't be relevant for this line of reasoning.
And furthermore:
> Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple doesn’t get access to the photos.
2) If you don't like this because you don't trust this statement to be true, then I have some bad news for you about who writes the entire closed-source operating system that runs on iOS devices and are the only ones who can deploy any code they like to them, basically at any time.
I would say a nude photo is probably worse because it could be indicative of some kind of grooming or other sexual abuse.
This is an opt in feature so it’s up to parents whether they think it’s appropriate or not, but I think it’s weird that people are getting mad at Apple for providing a secure, private way to do this when child abuse online is a real thing that happens and some parents may believe that it’s appropriate to trade off some of their young children’s privacy in exchange for some small degree of protection against that.
No, no, I like the feature. It's a nice tool in the toolset. But I'd like for it to allow blocking other kinds of photos. That they only fixed on nude photos plus their policy of not allowing nude-related apps in their App Store reeks of puritanism. That's what I don't like.
This seems so reasonable. Nothing is banned, just blurred. It's optional and it only applies to kids' phones.
When a nudist put their clothes back on, then that's it. When a photo goes out and spreads beyond the control of the kid, then that's just the beginning.
There are places in the US where it’s more like Europe.
In Oregon full nudity is 100% legal everywhere, with no restrictions. Yes, I have seen many penises I didn't want to see; sometimes from the naked bike rides, sometimes from people in mental health crisis, sometimes from people who are just too high to pull up their pants.
Technically it’s only allowed for “protests” in Portland, but nudity laws just practically aren’t enforced.
Other natural things I get to see without restriction: smoking meth, shitting on transit stop, lying in pool of own fluids, pissing on lamp post, and having psychotic break. Europeans can only dream of such unbridled freedom.
What an absurd, ridiculous, insane strawman argument. Apple is banning nobody and nothing. Apple is providing an opt-in mechanism for screening for potential nudity for parents that may want to give a phone to a child that they haven't had the talk with. Categorically different.
The linked article explains in detail that not only is this not a ban, it's completely opt-in.
It's just an NSFW screen that you can click straight through. For me, the benefit is that it gives children an extra thinking moment before they view something that may not be appropriate, or send something that they may not feel 100% comfortable doing.
I've lived in Europe all my life and I'd 100% enable this for my kid.
And it isn't just nudity or iMessage either. Tim Cook's Apple has been forcing its American cultural and political stances on the rest of the world for quite some time.
That's part of the reason why I wish I could move away from Apple. Unfortunately, neither Microsoft nor Google seems to be competing, and they aren't that much different either.
From studying history, I have the impression the normalization of nudity in public started after WWII in Europe.
Could it be that the precarious conditions (families disrupted, lack of intimate space, moral degradation) to which the populations there were submitted during the war shaped this culture?
If yes, what does it tell about the value of this culture?
Here in Germany the Freikörperkultur was well established in the 19th century. Even more so: for most of human history, bathing nude was the norm. Prior to the 19th century there was no such thing as a Freikörperkultur, because there was no such thing as bathing clothes. The Lebensreform, or "naturism" as English speakers call it, was a reaction to the industrial revolution's disruption of natural human behavior - bathing, food, family, community, and culture - and has little to nothing to do with WW2, except that the Nazis tried to suppress it as part of the Gleichschaltung. As such it obviously rekindled after WW2, but it is not a postwar phenomenon, and it is most certainly not moral degradation. The very claim is deeply insulting. Please take your false testimony about other peoples' cultural history and kindly go away.
Apple's safety record is absolutely horrible.
Sometimes you can get banned or scammed.
If Apple does nothing about it, the whole community will be gone.
It's a serious point.
As a European I agree. I am a bit afraid of this new feature, since I might not agree with Apple on what falls into the category of sensitive content and what I would consider OK. In fact there is a big cultural difference. For me personally, a nipple or even a fully exposed breast is not a problem (we even have that on public TV), whereas I would agree that explicit shots of genitals fall into a category which should be censored.
It's actually a bit more complicated. My state doesn't require ID to vote, but some do. Some require a full photo ID and others don't. Photo ID is optional at the polls in about half of US states:
Where I live you just show up at the polling place, sign on the line next to your name and address, and that's it. Separately, here's how to vote without ID in California:
In the US, your voting district maintains a voter roll, which in turn is updated based on voter registration. Registering to vote means proving that you're eligible to vote, and your name and information is checked against the voter roll when you do vote.
In other words: in many places, you don't need to present an ID because you've already been identified. But it's a big country, and that's not the case everywhere: lots of states and municipalities do require you to present an additional ID, or at least to present your voter registration (and just your name and address).
The voter roll includes your name, your address, DOB, and maybe a few other identifying points. Depends on the state.
Assuming you know all of those things, you could go vote as your neighbor. But there are strong incentives not to: if your neighbor votes either before or after you, the double-count will be noticed and audited. Your neighbor will be able to prove who they are, and you'll have walked into an incredibly easy-to-prove criminal charge. Similarly, for in person voting, you run the risk of being identified when you come in to vote the second time (presumably you aren't going to vote just once, since there's no point in the crime if it counts the same as your ordinary vote).
Adding photo IDs would not meaningfully change the security model here, but would give pollworkers pretext to exclude lawful voters ("you don't look enough like your picture").
Edit: and, to be clear: this all makes sense because studies have consistently shown individual voter fraud to be virtually nonexistent in the US.
You don't need an ID to vote in Australia either. It works fine. What's hard to imagine? You get checked off on a list which includes all enrolled voters for that electorate.
Yeah and how do they verify that you are who you are? You come to a polling place, say you are X, and they just.....trust you? That's the part that's hard to imagine for me.
Yes. You’re ticked off a list, so that you can’t vote twice, and that helps spot anyone who does attempt to impersonate another voter.
It’s not 100% foolproof, but it turns out voter fraud by impersonation is very rare, so it’s good enough.
When you think in terms of “make sure every vote we count was legitimate”, then “not completely foolproof” becomes a solid argument for voter ID.
Instead, if you take a wider view and think in terms of “getting the best quality estimate of the will of the voting population”, the argument against requiring ID (in the US at least) is that it would distort the results of the election far more than a tiny amount of undetected impersonation fraud does.
This will vary by country. In the US, there are barriers to getting ID for some groups (you need to go in person during business hours, pay and wait an unknown amount of time, and this needs to happen weeks ahead of election days; this is a barrier to someone without transport juggling multiple jobs and childcare, for instance.)
Other countries see the trade off differently, or use different fraud prevention approaches. For example, I know India uses indelible ink stains on fingers to prevent multiple voting, and in the UK, there is no ID requirement (yet) but the ballots are serialised and the secrecy of the ballot can be broken to investigate fraud allegations. Neither of these approaches would be culturally acceptable in the US.
Personally I think voter ID makes sense even if only to quash allegation of voter fraud. Voter ID enjoys overwhelming (>80%) bipartisan support from regular Americans. The main obstacle to new voter ID laws is the Democratic Party establishment. They calculate a marginal decrease in electoral margins if new voter ID laws were to be enacted. Of course then they wind up faced with fiascos like January 6, but politicians are nothing if not short-term planners.
Support is strong when you ask a one-dimensional question ("do you, in general, support voter ID?").
The poll linked from the CNN article demonstrates this: there's also overwhelming support for making voting easier, but voter IDs (without free and accessible ID services) will make it harder. And that's the crux of the common Democratic position: voter IDs are perfectly fine if we ensure access to IDs. But that needs to happen before or with any restrictions on voting, to prevent disenfranchisement.
Right—I'm for it if we had universal free national IDs and services to ensure everyone had one. It'd actually solve a ton of other problems, too, so yes, please, do that and then check IDs at the polls all you like.
But that's a topic the Democrats are very divided on, and the Republicans really don't want, so it's never gonna happen. The result is that I'm not against IDing voters in principle but am in practice, given the current political situation and lack of demonstrated urgency or need that might justify the down-sides of doing it absent universal national IDs.
How hard is it to get a driver's license or state ID, which again, people already need for a long list of common tasks? Go to the DMV once every five years. Make an appointment online if your DMV does that, in and out in under an hour. I agree that it should be even easier, but it's not exactly a major obstacle.
> The main obstacle to new voter ID laws is the Democratic Party establishment.
Propose voter ID laws that include provisions to make sure that it is easy and free for eligible voters to obtain the necessary ID and Democrats won't object.
> Personally I think voter ID makes sense even if only to quash allegation of voter fraud.
I guess I just wonder how likely it would be to quash allegations of voter fraud. When we humans distrust a process, we seem to be able to come up with all sorts of wild stories to justify our fear and anger.
You have to register to vote in the US. Your name and signature are already on a list at the polling place. You can bring your voter registration card.
Voter ID laws are a hard sell in the US because of the extent to which they have been used for voter suppression. There is a recent North Carolina (??) example where the court striking down the law were about 1 inch from using the word "racist" in the ruling.
I didn't realize it was so few. My own state doesn't require voter ID. However it looks like the states that require ID tend to be smaller, and many larger states don't require it. 41% of the US population live in states like California with no voter ID laws, with an additional percentage in states where a photo ID (as opposed to a bank statement or utility bill) is optional:
Yes, obviously that's alright. It's nonetheless also alright to wonder about or even question the reason behind those different cultures.
As a European, it's just really, really strange to see how everything involving war, gore, death, and guns seems to be alright for the kids overseas, but don't you dare show a nipple or say a bad word. I respect your culture, and even like lots about it. But sorry, that's weird.
Right, but there seems to be a strange belief that the US in particular shouldn't have its own culture. My objection is to that. That's kinda American exceptionalism at its worst.
Europeans have their own share of strange cultural phenomena, BTW. E.g. apparently they don't feed their guests in Sweden. [1]
I don't think anyone feels like the US shouldn't have its own culture -- definitely not more so than other places. If anything, the US injects its culture everywhere, due to entertainment being a primary export. And most of it is completely fine. People may shake their heads at certain things, much like you guys shake your head at the thought of not feeding your guests. But that's fine.
It's just that (sorry for using such a provocative word) the entire war/gun-fetishism is weird af for most outsiders. The way I see it, it's at least equal parts culture and excessive lobbying/brainwashing. And it shows in things like being fine with exposure to violence from a young age. Contrast that to people freaking out at nudity, and it makes for a silly juxtaposition.
I don't understand how the video relates to anyone thinking you shouldn't have your own culture. If anything that video claims that "leaving no one behind" and "decency and compassion" are what makes America America.
> Children in Europe don't play with toy soldiers or guns?
Occasionally for sure. But it's just on another level in the US. I can't pinpoint it, but I've spent years in America and visited a lot of toy stores :) we simply don't have aisles full of nerf guns and shit over here. You'll find them, but overall the topic is much, much less romanticized.
And in the context of movie ratings, every war/gory movie/videogame will be rated 16/18+ in most parts of Europe. I know, because I used to get some Star Wars shooters for Xbox rated Teen while on holiday in the US a couple years before I would've been able to get them in my home country :)
> If anything that video claims that "leaving no one behind" and "decency and compassion" are what makes America America.
Well, I kinda think those claims are mostly lies. We all know that Americans are not well-known for their compassion.
American culture used to emphasize frankness, family values, freedom, and individual responsibility. Now, its ruling class has decided these cultural values interfere with their interests, so they have decided to pretend they don't exist.
> And in the context of movie ratings, every war/gory movie/videogame will be rated 16/18+ in most parts of Europe.
That's funny. Frankness is not a trait I would have thought is very American. If anything, you guys are a bit too polite at times it seems.
I don't know if it's just the ruling class that has decided this change. I applaud everyone who keeps up "family values", but I will absolutely vote for shutting that shit down as soon as it's a front for "white man, breadwinner, white woman, housewife, two children -- everything else is worth less".
Unfortunately, a whole lot of those who were meant to keep up those values used them to suppress others.
I claim that it's possible to be progressive and still keep your culture intact. Family values can still be emphasized, but why not simply include all families, no matter how quirky they may come?
In any case, I applaud many things in your culture. I've always had a great affinity for the US. No country is perfect, and all will find some aspect about another culture they find strange. What's important is to keep an open mind, and not cling to traditions for tradition's sake.
> Interesting, are those ratings taken seriously?
It depends on the guardians, of course. I know I was not allowed to play games rated 16+ before I actually turned 16. I know friends of mine who played CS when they were 12. We all turned out okay.
It would be alright if one country didn't try to force its culture, prejudice, and superstition on all others because it's economically dominant.
I don't care if the US are obsessed with nipples. I find it ridiculous, but also kind of funny.
But when this obsession prevents people from sharing art on US controlled systems and services, then it's incredibly annoying. (It's not the most pressing problem in the world, sure, but it's grating.)
The average American watches 5 hours of TV each day, and 2/3 of the shows are about physical violence. The entertainment industry is propaganda for the gun industry, because people feel that violence is everywhere and that guns are needed for protection.
Research on the effects of viewing violence found a desensitizing effect, especially for children. People become less sensitive to the pain and suffering of others and more fearful of the world. This effect is much less pronounced for video games, which tend to not dramatize violence.
There is a reason we try to curtail violence in the media in Europe. When you engage with a fantasy many hours a day, it becomes your reality.
> The average American watches 5 hours of TV each day
That's an overestimate, it's probably closer to 3 hours per day [1]. Also, if you exclude the elderly (like +55), it probably becomes significantly less.
> people feel that violence is everywhere
It really is though. Right now, there's a war going on in Europe. Curtailing it wouldn't make it go away.
> Research on the effects of viewing violence found a desensitizing effect, especially for children.
Like the vast majority of social sciences, that area is full of low-quality studies.
Puritanism is a distraction here. What Apple implements is an in-depth analysis of all messages between its users, and of non-users who happen to communicate with iPhone users. The message scanner analyzes not only text but also images. This will be used to build more detailed user profiles for advertisers and spying agencies, domestic and foreign (Apple is on good terms with China).
This isn't written on the page linked. Where are you finding this?
For other people who only read the comments and don't read source material:
"Turn on communication safety to help protect your child from viewing or sharing photos that contain nudity in the Messages app. If Messages detects that a child receives or is attempting to send this type of photo, Messages blurs the photo before it’s viewed on your child’s device and provides guidance and age-appropriate resources to help them make a safe choice, including contacting someone they trust if they choose.
Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple doesn’t get access to the photos."
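The on-device flow quoted above can be sketched as purely local logic: a local model scores each attachment, and anything flagged is blurred before display, with guidance resources attached. Everything in this sketch (the `Attachment` type, the score field, the threshold) is a hypothetical stand-in; Apple's actual model, thresholds, and APIs are not public.

```python
# Hypothetical sketch of an on-device classify-then-blur pipeline.
# Nothing here leaves the "device": the decision is made locally.
from dataclasses import dataclass


@dataclass
class Attachment:
    name: str
    nudity_score: float  # in reality produced by an on-device ML model


BLUR_THRESHOLD = 0.5  # assumed cutoff, not Apple's actual value


def prepare_for_display(att: Attachment) -> dict:
    """Decide locally whether to blur; attach resources only if flagged."""
    flagged = att.nudity_score >= BLUR_THRESHOLD
    return {
        "name": att.name,
        "blurred": flagged,          # blurred until the child taps to reveal
        "show_resources": flagged,   # age-appropriate guidance, per the quote
    }


incoming = [Attachment("cat.jpg", 0.02), Attachment("photo1.jpg", 0.91)]
rendered = [prepare_for_display(a) for a in incoming]
```

The point of the design, as described, is that only the boolean decision affects the UI; the photo itself is never sent anywhere for analysis.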
Ok, so you couldn't be bothered reading the article and instead just made a blind slippery-slope argument against the feature.
It’s an opt in feature, that’s part of the parental controls settings, and it does all processing on device. Apple is not Google or Facebook, and has demonstrated that it has no desire to access your images or data.
Your slippery slope argument is so pointlessly stupid that it's equivalent to arguing that the addition of speed limits is a slippery slope toward not being allowed to drive at all. It's meaningless to the point where it erodes the meaning of the term in contexts where it actually matters.
> Messages uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity. The feature is designed so that Apple doesn’t get access to the photos.
Apple is turning out to be big brother.