Hacker News
Face retouching settings should be off by default (blog.google)
149 points by fortran77 on Oct 4, 2020 | 137 comments



Fun anecdote: When Alipay originally rolled out its "Pay by face" feature in stores, the uptake was far below their expectations. They did some research and realized that Chinese customers were so used to seeing their face with a beauty filter applied that the image of their unbeautified face was disconcerting and they avoided using face verification. Alipay rolled out automatic face beautification and their usage shot up.

https://qz.com/1657653/alipay-to-add-beauty-filters-to-facia...


The article doesn't say anything about usage shooting up after the change. To me it sounds equally plausible that this was a publicity move to get people curious about this "better than a beauty camera" feature, and an opportunity to get the face-pay feature into the news.


I heard about the usage going up from a PM at Alibaba.


This discussion reminds me of a short story, Liking What You See: A Documentary by Ted Chiang (the author of the story the movie Arrival was based on). There might be a pdf floating around on the internet; it's a 20 min read.

Basically, it's about a reversible procedure called "calli" that turns off the neurological response to the symmetry, smoothness, and overall "aesthetics" of a human face. So someone with the procedure can recognize different faces, but they don't feel any better about a pretty face than an ugly one.

The story follows a girl who had calli done as a young child, and then decided to go to college with it turned off. When she's there, there's a push on campus to mandate the procedure, to combat "lookism." The story is written like a documentary film, so it has little snippets of interviews, news articles, etc. about the debate.

The range of emotions we get from looking at other people's faces is intrinsically worth saving. But, people get a huge advantage from being good-looking. It's effectively discrimination to consider people who are good looking more trustworthy, nicer, etc, yet all of us do -- it's hard wired. The world is not really fair to someone with a bad looking face.

These apps aren't calli but they do mediate how we see the world, ourselves, and others to a huge degree. I don't envy any middle school girl who has to navigate this landscape. Anyway, I highly recommend the story, it was such an interesting way of looking at the issue.


"calliagnosia" or "the inability to perceive prettiness". Ted Chiang's works are mind-blowing and I recommend this. It is a great piece on giving different perspectives on this issue from different parties' perspectives .. like journalism embedded in sci-fi.

As for the article itself, I find myself gravitating towards not trusting Google's (or such orgs') motives in this. After all, if such a touch-up feature were activated by default, it would skew Google's face dataset away from real faces and embed a notion of "calli" in their face models.



> there's a push on campus to mandate the procedure, to combat "lookism."

That sounds incredibly dystopian.


Consider how dystopian it is that there are classes of people nigh-instantly dismissed because of their appearance. That's plenty dystopian, no?


What's even more dystopian is mandated, invasive, mind-altering surgery. As far as 'my body, my choice' goes, there's little difference between that and the forced sterilization China used to mandate.


No, that’s the natural way the world works, natural selection.


Just to play devil's advocate:

I mean, yes, this hypothetical procedure does sound dystopian on its face (haha). But then again, imagine if you just judged people by how nice they actually were, not by what their face looked like? Imagine if you were evaluated by the merits of your mind, of you, not the 'flawed' representation of you (your face) given to you by your DNA?

I've been helped enormously in almost every area of my life by the way I look and how I'm able to present myself physically. Much of that is because of my DNA -- something I didn't work for or ask for. It was given to me for free. Likewise, others are deprived through no fault of their own. Of course, that's the way it is. I like being attractive and having attractive friends and romantic partners. But what if it didn't matter? Would my life be richer?


> Imagine if you were evaluated by the merits of your mind, of you, not the 'flawed' representation of you (your face) given to you by your DNA?

Do you believe that accidents of genes (and actions taken by the mother at times) affect brain development? If so, aren't we just moving the problem to a different body part?

(keeping the devil's advocate train going)


It's a good train to keep going, and the end of the line will be the realisation that allowing some people to receive better treatment for any reason, birth, wealth, looks, intelligence, or anything else, is inherently unfair.

I hope that some day in the future we'll look back on the way poor people are treated now with the same horror we have today if we hear a black or gay person was refused healthcare or another service.


That's basically what John Rawls ends up concluding [1]. But the same line of thinking also leads to the conclusion that Free Will doesn't really exist, rather we humans are a product of our environment/genetics (and thus don't inherently deserve better treatment or rewards since all our efforts and subsequent results aren't 'really' our own doing). I wrote a little bit about this here [2]

[1] https://en.wikipedia.org/wiki/Veil_of_ignorance

[2] https://www.hugomontenegro.com/blog/the-absurdity-of-free-wi...


This veil of ignorance is such a simple concept, yet it never occurred to me. It’s actually making me reevaluate my worldview. Thanks for linking it!


I agree with the conclusion, but I'm quite stumped on how to improve that today, since we aren't even aware of our biases in most interactions. Some definition of Universal Income/Access helps, but incentives aren't aligned with the most powerful groups today.

My patchwork strategy nowadays is "Nature is always unfair on the individual level. Leverage your advantages to make life better for those less fortunate." and hope enough people do this to improve the world.


Does "better treatment" include higher wages for skills demonstrated? Or are you talking, like, equal application of law? That what I'd assume.


I'm not sure it is possible to separate those two things as long as our behaviours subtly change because of some quality we perceive in another person.

A baseline for human needs ( food, shelter, access to health and opportunities to develop oneself) can be codified into law but we'll always find a way to express those biases at the higher levels IMO.


> can be codified into law but we'll always find a way...

Which is the best argument for minimal legal and ethical codes. Any system can be hacked, especially those designed by humans. The road to hell is paved with good intentions. So...


Sure, that's not an issue. Those higher wages can be used to purchase trinkets like a bigger car, fancier house, whatever. As long as the important stuff like education, healthcare, transportation, legal advice etc. are equally accessible to all, and more importantly, you can't buy better healthcare, education, or legal support with more money.


> more importantly, you can't buy better healthcare, education, or legal support with more money

That's Harrison Bergeron territory. Sorry, you can't go on a vacation to France because your kids might learn something.


Obviously, like all political ideas, if you take it to the extreme it becomes nonsense.

For example, we are currently living the idea of capitalism taken to the extreme, where one man can own a trillion dollars while others can't afford healthcare or education, and it's clearly becoming more and more nonsensical to allow this. Let's try swinging the needle in the other direction for a while.


The difference here is that a good brain is adding value to society. A good face (arguably) isn't.


I don't think that's quite clear in a capitalist world though. Good faces and whole industries around good faces have led to billions and billions of "wealth" being created through direct careers and indirect revenue streams like modeling, etc. ( The "utility" of those existing is a different animal but goes too far from the main thread)


Not once everyone has it done automatically and society accepts it. It’ll be something like a vaccine against the measles.


Given that it's a work of fiction, it's literally dystopian!


> The range of emotions we get from looking at other people's faces is intrinsically worth saving.

Citation needed? It's almost-certainly net-enjoyable - but, when you go on to acknowledge that the mere fact of those emotions leads to discrimination against some people, how do you justify this statement other than "those emotions are natural"?

To be clear, I am _not_ saying that I would support mandatory calli - forcibly changing someone's neurological responses is an astonishingly dangerous precedent to set; as another commenter points out, that would be terribly dystopian - but I don't think I can agree with your specific statement.

Regardless - this sounds like a fascinating story, I'll have a look - thanks!


No citation needed


> yet all of us do -- it's hard wired.

It's important to note that we're hard wired this way because it was evolutionarily beneficial to "push out" people who look bad/incorrect/not-normal. If you have a visibly mutated gene, it's possible you carry other mutations that make you less suitable reproductively, so avoiding those who look bad became ingrained as a way to get genes that would produce more fit offspring.


Also, I thought that in 2020 it is becoming more generally recognized that people have a right to an "affectional" preference. It should be OK not to be attracted to someone.


I wonder how many others also find that "retouched" photos have a bit of a creepy, almost uncanny-valley quality to them, especially those of people we've seen in reality with our own eyes. I've also heard stories of people buying a recent camera, noticing there was something subtly off about the faces in the pictures it produced, and later finding out there was a "face beauty" option that was on by default.


I spend a good portion of my days retouching portraits in Photoshop. Most of the filters do things I’d never imagine doing to a client’s face, like making eyes bigger. Most of what they do is smooth out the skin to a ridiculous extent.

My way takes 10 minutes vs. an app taking a millisecond, but that’s all in the name of making the image look flawless without looking retouched.


It's definitely creepy. What's weird is that you can repair minor blemishes in Photoshop and it's incredibly hard to tell, but these filters do so much more processing than using the healing brush on 2% of a face.


The argument presented in the article about filters and retouching is mostly self-contradictory: an image from a modern camera is heavily processed anyway. Everything from color balance and lens distortion to edge enhancement and smoothing is corrected in software/firmware. I.e., any modern smartphone camera is firmly in computational photography territory - what comes out is not what the sensor "sees" anyway.

So a question of additional filter being on by default is really a matter of preference, because so many filters are already on anyway.


An interesting conundrum: if some apps automatically apply beauty filters and others don't, over time the ones that do could outcompete those that don't, even if it's to society's collective detriment. "The default camera on the Pixel is so bad, I always look so ugly in it!"


This has happened in the fashion industry where there is size "inflation" as sizes don't correspond to physical measurements.

Size 5 in Brand A is tight so consumers prefer Brand B who has size 5 clothes that fit better.


This kind of thing is why I'm happy that men's clothes often use actual inch measurements rather than abstract 'sizes'.


At least for popular brands of jeans, the "inch" measurements have succumbed to the exact same nonsense: https://www.esquire.com/style/mens-fashion/a8386/pants-size-...


That article was written in 2010, but I can't help agreeing with its conclusion:

"But vanity waist sizing is so entrenched, it couldn't possibly be changed overnight, at least not without a government mandate. The only solution seems to be a gradual, year-by-year shaving of quarter-inch by quarter-inch until, in 2021, men's pants finally correspond with the label numbers — conveniently just in time for the New World Order's switch to mandatory full jumpsuits."

That sounds terrible, but it could still be an improvement over 2020.


For many years, I've noticed that 32 in one brand isn't 32 in another.


A Google Play policy that enforces this in other apps would be the next step.


Retouching is Women's magazines' stock in trade.

They actively seek to make women insecure about their appearance, and collect advertising from outfits that promise to correct it.

One of the promoted tropes is that overspending on a putative remedy is giving yourself a treat, rather than giving the vendor a treat.


> giving yourself a treat, rather than giving the vendor a treat

That applies to... 100% of all marketing. Not just the cosmetics industry.


No. There are plenty of products marketed on the proposition that they provide as much or more value than you pay.

Selling 5 cents' worth of scented lanolin for $20, which no one can tell you used, is not even the sort of "price-signaling" that usually motivates absurd overpayment. That it succeeds indicates deep pathology.


I'm still amazed that they managed to spin self-care into another consumerist rat race.


It probably plays out like that, but I'm sure someone initially just saw it as a way to drum up magazine sales.

A less publicized example of this is what's done in music production. Most people aren't singers, so it's not as obvious, but the perfection you hear in any track produced in the last 20 years sets pretty unrealistic ideals for even a trained voice.

Unlike the magazine example, there isn't a compelling ulterior motive. They just wanted to sell more CDs.


I don't think magazines are at fault here. It's social media where most of these photos will end up and be judged.


Social media is this generation's magazines.


Social Media is worse with its instantaneous feedback factor.


You're giving them way too much credit. There's no stratagem or design to hurt women's self-esteem. They sell what people want to buy.


I think you have that backwards. They sell what they've made people want to buy with their advertising.


People will always want to better themselves versus another, it’s part of the mating process. If it’s not appearance, something else will be used to signal rank.


This is a variant of the "if I don't do it, someone else will (and things will be the same)" hypothesis, which... you might want to question.


I think you are both correct. It is push and pull.


And the men in "Men's Health" magazine aren't retouched? Are men not affected (or does Google not care about men)?

https://www.menshealth.com/fitness/

I am skeptical about this. I'm co-owner of a gym and I don't think the people there--men or women--who care a lot about how they look, are depressed or insecure at all when they see Instagram photos. And many of them spend a lot of time building an Instagram brand.


> I'm co-owner of a gym and I don't think the people there ... are depressed or insecure at all

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

― Upton Sinclair


I have no experience of "men's health" magazines. No doubt they are an attempt to rope men onto the same treadmill.


One problem is that selfies have an inherent uglification effect due to the short distance from the lens. Unretouched selfies actually look _worse_ than what you look like to other people. That's one big reason why mirror selfies are popular: the additional optical distance due to the mirror reduces the geometric distortion. (The other big reason is that bathrooms somehow tend to have excellent lighting.)
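
To put a rough number on that distortion, here's a back-of-the-envelope sketch under a simple pinhole-camera assumption; the 4 cm nose-to-ear depth and the camera distances are made-up illustrative values, not measurements:

    # Apparent size scales as 1/distance, so features closer to the lens
    # (nose) get magnified relative to features farther back (ears).
    def relative_nose_size(camera_distance_cm, nose_offset_cm=4.0):
        return camera_distance_cm / (camera_distance_cm - nose_offset_cm)

    for d in (30, 60, 150):  # arm's-length selfie vs. someone else holding the camera
        print(f"{d:>4} cm: nose appears {relative_nose_size(d):.2f}x its far-away relative size")

    #   30 cm: ~1.15x -> the classic big-nose, flat-ears selfie look
    #  150 cm: ~1.03x -> close to how other people actually see you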


> We set out to better understand the effect filtered selfies might have on people’s wellbeing—especially when filters are on by default. We conducted multiple studies and spoke with child and mental health experts from around the world, and found that when you’re not aware that a camera or photo app has applied a filter, the photos can negatively impact mental wellbeing. These default filters can quietly set a beauty standard that some people compare themselves against.

Hm, I find myself unable to say whether I agree or not. I would say I am 70% confident that these filters can negatively affect people's wellbeing (based on my intuition/bias), and 15% confident that we can get a scientifically valid answer to that question (based on the track record of social psychology and related disciplines).


I'm actually amazed (maybe I shouldn't be) that face retouching became so universal in most (all?) contemporary smartphones with seemingly no ethical considerations.

How could the likes of Samsung switch this feature on by default like it was some minor optimization?

How many customers did you gain because you didn't require them to turn a switch on? Was it worth it?


Why are you amazed? If all professional photos are retouched, then this is the standard that people will hold themselves to, and their pictures seem nasty by comparison.

When you turn on automatic retouching, everyone suddenly feels like they take professional/promotional quality pictures.


What is a "nasty" photo? One that looks like it's not staged to hell?


Honestly, the premise seems exactly backward to me. I remember taking a selfie of myself on two different phones to compare their forward-facing cameras. The image quality on both was roughly the same, but one had subtle skin smoothing and lighting adjustments, the other one didn't. The one with "beauty filters" generally matched what I assumed I looked like to other people. The one without the "beauty filters" made me look much worse than my own mental picture of myself. I'm sure the one without beauty filters was more accurate, but it made me more self-conscious about my appearance and kinda made me realize "shoot, I'm actually uglier than I thought", and that bummed me out. It certainly didn't help my self-esteem or self-confidence! I rather suspect that subtle automatic retouching would actually be good for one's mental health.


Beauty in general is bad for people’s mental health, the only difference is that filters and retouching create beauty at scale. People can still encounter beauty in real life and despair that they will never attain that.

I find that people say retouched or filtered photos are not “real life”, but I’ve definitely seen people in real life who are just as attractive as any person that uses a filter. It can definitely be “real life”, but encountering these people is rare enough that people assume that level of perfection cannot possibly exist. In reality it just doesn’t exist at the frequency in which filtered images would have you believe.


Suggestion: Let's mandate filters that make everyone equally ugly, so nobody feels bad or pressured by beauty standards.


I'd like to submit a bug report; it's not working, my face still looks the same!


Like Harrison Bergeron?


Surely the real problem here is not whether face retouching filters are on or off by default for your own photos. It's when other people use them and you don't know if they have them on or off. That's what quietly sets the new beauty standard, because everyone sees these retouched photos and assumes they are representative of reality. What I'd like to see is something in the image metadata that stores what filters were used so apps could display that info. I can't see that happening though.
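
For what it's worth, the plumbing side of that could be prototyped today with existing EXIF fields. A rough sketch using Pillow, where reusing the 0x0131 "Software" tag and the "FaceRetouch=..." string are just an invented convention, not any real standard:

    # Hypothetical: record applied filters in the EXIF "Software" tag (0x0131)
    # so a viewer app could surface "this photo was retouched" next to the image.
    from PIL import Image

    def tag_filters(src_path, dst_path, filters_applied):
        img = Image.open(src_path)
        exif = img.getexif()
        exif[0x0131] = ";".join(filters_applied)   # e.g. "FaceRetouch=smooth:0.7"
        img.save(dst_path, exif=exif)

    def read_filters(path):
        return Image.open(path).getexif().get(0x0131, "(no filter info recorded)")

    # tag_filters("selfie.jpg", "selfie_tagged.jpg", ["FaceRetouch=smooth:0.7"])
    # print(read_filters("selfie_tagged.jpg"))

The hard part isn't the plumbing, of course, but getting camera vendors to write such a tag honestly and platforms to preserve it (most strip EXIF on upload).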


Why? Google's pixel line of phones uses incredible AI/ML techniques to improve low-light photography. They even suggested in a blog post that they are using their trained models to infer color and detail information which is simply not present in the raw sensor data.

So retouching is totally fine if google does it. But how dare Korean and Chinese manufacturers (who are targeting a certain market) use the same techniques on faces?!?! The hypocrisy is strong with this one.


This seems like Google saving Android's brand. I don't know about anyone else, but the on-by-default face retouch on my old Samsung phone looks horrible. It makes my face look like melted plastic and the image extremely blurry. A Samsung camera could have a million megapixels or whatever and it would still look like crap because of the filter, and it makes it seem like all Android phones have garbage front cameras.


But another important question: is using filters in photos you post online bad for other people's mental health?


I think it’s well established that it is


Is it? Isn't there research showcasing that beautiful things make people feel better?

https://www.sv.uio.no/psi/english/research/news-and-events/n...

Although, there are more concrete results about surrounding environment than people specifically.

Anything more I can read about this?


I guess the real question isn't if beauty makes people feel better. Fentanyl makes people feel great. I think that the better question is, what are the long term effects of unrealistic portrayal of human beauty?



It doesn't have any studies listed. I am more curious about any wide scale studies.

I have seen some stuff on the impact of social media standards and how they affect people, but is there anything separate? A study that removes social media and then tests out filter-applied pictures?

Original post: https://www.sheknows.com/health-and-wellness/articles/113866...


Why do people find these retouched images attractive?

I'm a good looking guy (in shape, did modelling, etc.) and I just can't understand it. It's like makeup to me... wrinkles, freckles, scars, and everything else have never affected my perception of beauty because I tend to like an exotic, atypical appearance. Health and physical fitness have a much larger impact on attractiveness for me and all this stuff just seems like a poor imitation of what it actually is to look healthy.


That's a really good point.

Both are true.

Beauty in the world really matters. Both people and things.

But expectations being set too high for oneself is bad.

Can you have one without the other, and how much is each weighted?

You can also throw in that beauty is a service industry that creates jobs and personal satisfaction, which every UBI talk says is what we will be doing when we can't work.


Right, but we should be able to calculate the impact of various factors. Industries could be using wrong types of filters or social media magnifies the filters in wrong ways.

For example, instagram algorithms specifically pick certain kind of body features and people to throw in your feed.

https://voxeurop.eu/en/the-skin-bias-in-instagram/

Algorithms could also be causing a significant cognitive dissonance by making people feel excluded. If I don't see enough people that shares some resemblance to locals (however beautiful or filtered they are), there is some stress.


But not well accepted by enough of the population.


I'm not sure that's true. I think most people probably have an intuitive sense of this. They just don't care.


I'm furious at HTC and Samsung for ruining some of what would've been very precious photos I took with filters that they have on by default and don't tell you. There are some particular ones that were great moments that they absolutely destroyed, on purpose. Why would I want my phone's camera to deliberately change the shape of me and my friend's faces and replace our skin with porcelain?!


I honestly never saw this coming. It's the craziest dystopia nobody talked about. Cameras aren't even cameras anymore! Are there some phones with camera-filters on, without the option for you to turn them off? Cameraphones need to make a comeback; you know, phones with cameras and not some sci-fi CGI BS.

I don't trust anything coming from a smartphone camera now. Even my own phones.

Maybe related: I've noticed on iPhones and Samsungs that I cannot get a picture to match reality. I feel like it was easier to get a picture with colors that closely matched reality YEARS ago. Why have cameras gone downhill? Or is it my imagination?

When I take a photo of the grass and sky with any modern smartphone it never looks right. I try and try and try to adjust the settings and I can just never get a good photo anymore.

Edit: Example: I couldn't take a picture of the smoke on the West coast recently. The dark grey and orange skies would all come out looking blue or something. It's like the 'camera' is telling me I'm f'ing crazy for taking a picture of smoky skies, and surely I want to pretend the smoke isn't there at all! That's why I'm taking a picture right? So I can pretend that reality isn't real. I have no photos that match the smoke. I have only my memory and other people's photos. WTF.


> Edit: Example: I couldn't take a picture of the smoke on the West coast recently. The dark grey and orange skies would all come out looking blue or something. It's like the 'camera' is telling me I'm f'ing crazy for taking a picture of smoky skies, and surely I want to pretend the smoke isn't there at all! That's why I'm taking a picture right? So I can pretend that reality isn't real. I have no photos that match the smoke. I have only my memory and other people's photos. WTF.

That's automatic white balance adjustment. It's not used to an orange sky, so it can't figure out what color should be considered "white". If you want an example of why this is important, turn off automatic WB and take a photo under fluorescent light followed by a photo lit by candles.

I have no idea about Android phones, but iPhones do some amount of denoise, contrast adjustment, and a few other limited postprocessing steps automatically.
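
For a sense of how low-tech the white balance step itself is, here's a minimal "gray world" auto white balance sketch in Python/NumPy; real phone pipelines are far more elaborate, this is just the textbook heuristic:

    import numpy as np

    def gray_world_wb(img):
        """img: HxWx3 float array in [0, 1]. Assume the scene averages to neutral
        gray and scale each channel so its mean matches the overall mean."""
        means = img.reshape(-1, 3).mean(axis=0)        # per-channel means
        gain = means.mean() / np.maximum(means, 1e-6)  # per-channel gains
        return np.clip(img * gain, 0.0, 1.0)

Point a heuristic like that at a uniformly orange, smoke-filled sky and it "sees" too much red, computes gains that suppress it, and washes the orange right out, which is exactly the failure mode described upthread. Locking the white balance (or shooting RAW) sidesteps it.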


Who is downvoting this correct statement? Grow up.


> Maybe related: I've noticed on iPhones and Samsungs that I cannot get a picture to match reality.

I know I'll come across as pedantic, but you'll have trouble doing that even with DSLRs. Whether you're doing film or digital imagery, you're not capturing reality. With camera sensors, the raw data you capture (RGB brightness for each pixel, for example) needs to have some color/brightness/contrast curve applied to make it look realistic (i.e. linear will not look like reality at all). Each manufacturer picks their own curve, and then also has options (vivid, etc.) - and some of them do pick less realistic options for defaults as customers think they make for better photos.

Not to mention tricks to fix overexposed pixels, etc. There's quite a bit of manipulation done before you get the JPG.

Cell phone cameras are akin to cheaper digital cameras that used to be (and perhaps still are) on the market - where you don't have much control on this. This is not new to smartphones - it's just that smartphones have more processing power to do more.

Long time photographers know this. It's not "capture reality", but "how much deviation from reality am I willing to accept", with 0 not being an option.
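
To make the curve point concrete, here's the mildest correction essentially every camera applies, sketched as a simple gamma-style tone curve in Python; the 2.2 exponent is just the common sRGB-ish approximation, and manufacturers layer their own "look" on top of it:

    import numpy as np

    def apply_gamma(linear, gamma=2.2):
        """Map roughly linear sensor values in [0, 1] to display values in [0, 1]."""
        return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

    # A mid-gray patch reflecting 18% of the light lands at ~0.46 on screen,
    # not 0.18 -- already much brighter than a "faithful" linear rendering.
    print(apply_gamma(np.array([0.18])))   # ~[0.46]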


I think he is referring to machine-learning features adjusting local contrast and color, not simple white balance. In recent phones it happens even when HDR is off. I've had a few photos ruined by it too, with a strong halo around people's faces or a red-looking sun.

You can install an alternative photo app like Halide to take control of every variable, even aperture size, but it's a bit of a hassle when any 'dumb' camera would do the job.


> Edit: Example: I couldn't take a picture of the smoke on the West coast recently. The dark grey and orange skies would all come out looking blue or something. It's like the 'camera' is telling me I'm f'ing crazy for taking a picture of smoky skies, and surely I want to pretend the smoke isn't there at all! That's why I'm taking a picture right? So I can pretend that reality isn't real. I have no photos that match the smoke. I have only my memory and other people's photos. WTF.

That was due to automatic white balance, which is perhaps the most basic and low-tech automatic adjustment on your phone. We're talking deep AI tech from the 90s, if not earlier.

I ran into the same problem on that day, so I googled "iphone manual white balance", got an app that looked decent (Obscura), and took my pictures without issue. Try that next time. (But get another app, Obscura literally takes half a minute to load for some reason.)


You don't have to use a different app, actually. The standard Camera app lets you lock white balance. To capture orange skies, you just long-tap while pointing at a white source (another iPhone or iPad, for example) to lock the white balance. Then turn your camera to what you want to shoot, and it will keep the WB locked.


That's not how cameras work though. The raw output of a phone camera would be horribly off-color. It would NOT capture the image you are expecting. So the phone applies a number of filters to try to correct the flaws inherent in a camera with such small optics. You only notice when the external conditions (smoke-filled skies) don't match the assumptions that went into the filter design.


Aren't photos of extreme conditions some of the most important possible photos though? A camera that can't capture them is badly broken.


Cameras aren't gods. You have to give them a frame of reference (a white point).


I'm aware of that. And yet, somehow we have many photos of the recent wildfires in California, captured by cameras, so setting a correct whitepoint is clearly possible.


It's called getting a real camera with real optics, not the tiny one built into a phone.


I haven't owned an android phone that didn't have a manual white balance function in the stock app.



You'll see these kinds of filters at lots of places in Asia. For example, in Japan, there are many "ki-re-i" photo booths that will automatically clean up your face for your ID pictures.

Even the old Panasonic digital camera I bought 13 years ago has a smooth skin option.


Coming from Europe, where you are usually asked not to even smile in ID photos, it blows my mind that there is a whole legal ID photo retouching industry in Japan. Or are such retouched photos not supposed to be used for official purposes?


In the US, for things like passports, retouching is disallowed. However, if the effects are small, they'll have no way of detecting it. No one's going to scrutinize your face for all the wrinkles you removed.


I haven't needed to use one of them, but if you look at the pictures on this website (https://travel.e-japanese.jp/japan-id-photo), you'll see that it's possible to select passport pictures and also apply skin correction/lightening to the pictures. I don't know if those options are disabled for official photos though.


There isn't. Japanese photo IDs look just like most other countries' IDs (horrible).

There are photo booths for taking "fun" pictures, not ID pictures.


I'd imagine you would have trouble traveling outside of japan with an obviously retouched passport photo.


There's the classic trick that Penn & Teller wrote about in "How to Play in Traffic":

https://www.instructables.com/Prank-a-Cop-with-a-Clown-Nose/


> Then, while he is looking at your drivers license, grab the bigger red clown nose from your visor and put it on!

Nice way to get shot.


The ki-re-i photo booth has an auto clean-up feature, but it's a paid option. I've used the machine without that option.

BTW, the name "ki-re-i" probably comes from "kirei" (meaning beautiful).


So everyone will turn on the "value-free" (yet for some reason worth building) options.


Yeah "value-free" seems pretty laughable. We're "retouching" your face to make it look different, not better? Not credible. By using code words, all they're doing is avoiding blame for the harm they are doing.


It's nice to have a filter. It feels like you're not being so vulnerable when posting your face. Maybe I want to send my Instagram followers a smile without sending a perfect likeness of myself, you know? They don't need to know how much sleep I've been getting or the last time I scrubbed out my pores.


For anyone interested: watch "The Social Dilemma" on Netflix. They show stats on mental health and suicide and how they may be related to the introduction of the smartphone.


What's the age/sex demographic of most of these suicides?


Mostly young women. The documentary is worth a watch.


I was just curious because I keep hearing how younger people are committing suicides but nobody wants to explicitly break out the statistics.

As for the documentary, although friends have recommended I see it, I am up to my neck in propaganda from every which way: news about this, a documentary about that, a so-called expert said so-and-so, et cetera. Just the stupidity on HN is enough for me to call it a day sometimes. I'll sit this one out.


What do you mean "nobody wants to explicitly break out the statistics"? Can't you just Google it?

I searched for "suicide rates in young people" and found lots of info, including https://time.com/5550803/depression-suicide-rates-youth/ which cites and links to a study published in The Journal of Abnormal Psychology

From the article:

>Between 2009 and 2017, rates of depression among kids ages 14 to 17 increased by more than 60%, the study found. The increases were nearly as steep among those ages 12 to 13 (47%) and 18 to 21 (46%), and rates roughly doubled among those ages 20 to 21.

>Among young people, rates of suicidal thoughts, plans and attempts all increased significantly, and in some cases more than doubled, between 2008 and 2017, the study found.

>These findings were based on data collected from more than 600,000 people by the National Survey on Drug Use and Health, an annual nationwide mental-health survey conducted by a branch of the U.S. Department of Health and Human Services.

>What’s causing today’s young people so much anguish? “This is always a tough question to answer, as we can’t prove for sure what the causes are,” Twenge says. “But there was one change that impacted the lives of young people more than older people, and that was the growth of smartphones and digital media like social media, texting and gaming.”

Note 1: The person quoted above is one of the study authors. Note 2: The modern smartphone (iPhone 1) was first released in 2009 (edit — correction: 2007).


> >Among young people, rates of suicidal thoughts, plans and attempts all increased significantly, and in some cases more than doubled, between 2008 and 2017, the study found.

This tells me that there wasn't a significant increase in the rate of actual suicides, or they would have mentioned it.


They did. It's literally in the article title. The supporting data is linked from the article as well.

https://www.cdc.gov/nchs/data/nvsr/nvsr67/nvsr67_04.pdf


I found it interesting because they were interviewing former managers and engineers of google, facebook, twitter etc. They were not saying "This is the cause because X and Y.". Often they were not even able to formulate the "problem" with social media. Because it is not that black and white.


Because they aren't child psychologists and therapists who deal with actual cases?


Well, obviously. Part of the story is that in fact a bunch of white men in their mid-to-late-20s were directing how the lives of billions are affected, just by designing notifications and apps.

Addiction plays a big role here, and the fact that many people in the industry never seemed to care about how addictive software can be.


No need to force racism in. These companies are multiracial in the US and similar companies exists in other nations with other races.


You, Sir, do not understand what racism is.


Why take selfies at all?


Indeed. I don't understand it. The only time I take photos of myself is when I use one in place of a simple mirror (for example, to take a look at the skin on some part of my body I can't physically see normally, like my back).



Thanks. Changed from https://www.androidpolice.com/2020/10/02/google-says-beauty-....

We also changed the title from the clickbait of "Google says 'beauty' filters are bad for your mental health" and the bland corpspeak of "More controls and transparency for your selfies" to what appears to be a representative sentence from the article.


This was highly discussed on ‘unbox therapy’:

https://youtu.be/Q3GGdtn9poo


It sounds like this might have been a bug: https://www.theverge.com/2018/10/23/18011814/iphone-xs-beaut...

> Essentially, Smart HDR was choosing the wrong base frame for HDR processing when you took a selfie. Instead of choosing a frame with a short shutter speed to freeze motion and preserve detail, it would sometimes choose a frame with longer shutter speed. The front camera also does not have optical image stabilization, so it takes blurrier shots at the same shutter speed as the rear, stabilized camera. The result is a loss of detail that looks like smoothing on the front camera.


Conspiracy perspective: beauty filters are affecting their ML training.


If it only smooths over the skin then it would not affect it.


Why is that?


>There are a few studies that show such functionality can have a negative effect on mental health, and as a result, Google is now turning them off by default on its own phones

Google has all kinds of activity tracking enabled by default on Android phones. I wonder what impact that has on mental health.


Unless you're someone who gets really anxious/paranoid about being tracked by a BigCo, I can't imagine it making an outsized impact. The weirdest thing might be how that activity tracking is used to try to sell stuff to you, and sometimes that advertising is pretty compelling as a result. But I'm not sure how that compares to the more active influence of "influencers" and social media across the larger population.


> Unless you're someone who gets really anxious/paranoid about being tracked by a BigCo, I can't imagine it making an outsized impact.

I can't say anything with absolute numbers, but subreddits such as privacy, privacytools, degoogle, minimalism, and similar places have seen massive growth in recent years, and I regularly find people (especially teenagers) who are severely affected by this.

So I wonder how long it will take Google or other tech companies to move to an opt-in approach to tracking.


When you know you are being watched, your behavior changes, for better or worse [1] [2].

But it's true it is a bit different for BigCo. Not only do they watch, they also play an active role by modifying what you see - e.g. the ads and news they display [3].

Does this qualify as "outsized impact" or do you want moar?

[1] https://en.wikipedia.org/wiki/Hawthorne_effect

[2] https://www.harvardmagazine.com/2017/01/the-watchers

[3] https://www.theguardian.com/technology/2014/oct/02/facebook-...


> Descriptions of the well-known and remarkable effect, which was discovered in the context of research conducted at the Hawthorne Western Electric plant, turned out to be fictional.

> Workers experienced a series of lighting changes in which productivity was said to increase with almost any change in the lighting. This turned out not to be true.

> ...a series of changes in work structure were implemented (e.g., changes in rest periods) in a group of five women. But this was a methodologically poor, uncontrolled study that did not permit any firm conclusions to be drawn.[4]

> This interpretation was dubbed "the Hawthorne effect," although the data does not support that view.

I guess that's par for the course.


The original study was indeed subpar, but follow-up research confirms the intuition that you are not the same when you know someone (something?) is watching you, as the "See Also" section demonstrates.


The problem with advertising overlaps with the problem with social media -- harming people's self-image to pressure them into "retail therapy".


I don't believe Google is genuinely interested in our mental health. Could this be because "filtered" photos make our faces harder for their AI to recognize and produce more false positives?



