Ask HN: As a Facebook or Google engineer, do you consider your company evil?
124 points by fdeage on April 19, 2020 | 127 comments
Amid the severe backlash that tech companies currently face (Google and FB amongst others), how do you consider your own company from a moral POV? Do you think the current backlash is justified?

No judgement, just curious.




Googler. Yes, I think the backlash is all deserved. I’ve been here for many years, since back when any criticism was still very niche, and at the beginning it was different. Obviously morally superior to working at a bank, and more meaningful than working on Snapchat for Sourdough Starters or similar dumb startups. Now it’s no better than the banks, and they’re even chasing military contracts. And the work is largely pointless tedium.

I’ve thought of leaving for years. Problem is: I have a wife who doesn’t work, two kids, piles of debt, and live in the most expensive city in the country. We can’t just leave, our whole life is here. And moving to a bank now to make more money (if I even would, the tenure and promotions do pile up) means working harder and more hours. That comes out of spending time with my young kids.

I spent a couple of years having a tough time with this. It genuinely caused a long-term, slow burn existential crisis that only recently started to settle into a stable state. All of life is moral compromise, I think. It sucks and I’m sorry, but it would be too hard to stop and I’m just sort of accepting that now.


I would like to thank all the wonderful people like you who created such amazing products for me to use. I am sorry to hear that you have experienced mental anguish due to this. From my perspective you should feel pride and accomplishment from having contributed to helping millions of people every day.

I get dozens of questions answered each day through your search engine that before would go unanswered. (my last question was: "how does a digital scale work?")

I get to write beautiful documents, spreadsheets, and presentations that I can easily share with colleagues, who, in turn, can easily provide feedback and comments inside the documents themselves.

I tracked the length of my last run with only my phone, using Android and Maps, something that otherwise would have been difficult to do.

My family easily collects photos of our holiday together in a single shared space.

These are just the first things that came to mind that I used Google for in the last week. Whatever evil Google does, you do at least as much good as well.

It's absurd to me that you are feeling guilt because of this. I can't tell you what to feel, but people like you have made my life significantly better and I want to express thankfulness for that.


Thank you for this comment. HN just likes to put Google's contributions down.


I believe that all across the internet there is a paid army of trolls who just want to spew vitriol about a company, paid directly or indirectly by the detractors of said company. Granted, there are a few true believers, but these days I find only negative news and comments about more and more companies.


It's almost like companies are doing more and more negative things to comment about.


It's almost as if there is a world outside HN happy with their services.


Another Googler here, thanks for saving me the typing.

Yes, Google does unethical things. This sucks and I feel uncomfortable supporting it. But quitting would come at too high a personal cost and would not have much effect.


To you and your parent comment, coming from outside of Google:

Anytime you get a large enough group of people, minimum size of one, evil will exist. That’s something we have to learn how to cope with, and sometimes we have to accept that we aren’t in control.

There’s sort of a rebel spirit in the USA where people see themselves as perpetuating evil if they’re not actively fighting it, boycotting this and that, throwing tea into Boston Harbor.

On the flip side, I’d make the argument that one’s personal self-preservation is more important than “making a difference,” and that people have way less of a chance to make a difference than they give themselves credit for. As you said, quitting Google would not have much effect.

All of that said, I don’t see anything uniquely evil about Google compared to any other Fortune 500 or even much smaller mid-sized business. I think I could even make the argument that they are decidedly less evil than many much smaller businesses with approximately 200 employees.

And the upside of Google is tremendous. A lot of commercial products have an incredibly positive impact.

I just don’t think anyone has to feel bad for holding down a steady job, especially since we shouldn’t judge people on their employment prestige in the first place. Having the ability to choose an employer based on one’s moral compass is itself an indicator of privilege, so I wouldn’t want to look down on a cigarette company employee (especially individual contributors) for holding down that job, never mind an employee of a software company that makes a bunch of stuff people largely like and benefit from.


I feel that if everyone applied this worldview, there would be no point in ethics at all, and we would live in an even grimmer world than we already do. I find it very discouraging, since it assumes that we have no choice. But we do.

I met a guy studying CS in Cuba (he grew up there) at a congress and visited him there. The living conditions there are not nice; wealth is something most people there don’t have access to. But this guy was asked by FB/Google to interview for a position, and he proudly told me how he declined and wrote them exactly why he felt he could never morally work there. Regardless of the correctness of his view, it impressed me that he held his principles higher than his potential income. If you just follow financial incentives in your decisions, plus a minimal ethical code that will keep you out of prison, you are contributing to humanity’s demise. Long term, if too many people just follow their incentives without reflecting on their net contribution to society as a whole, this will destroy our society.

Also it’s not like you will be out on the streets if you don’t work for Google/FB. You’ll most likely still be considered rich with whatever you make at other companies in this sector.


I agree, I doubt there can be much difference between Fortune 500 company A and Fortune 500 company B in the long run.

I thought, though, that "being evil" (whatever that means) came from some kind of instinct of survival experienced when facing the threat of bankruptcy, or at least very strong market pressures. And it didn't seem to me that Google had been facing these yet.


> I thought, though, that "being evil" (whatever that means) came from some kind of instinct of survival experienced when facing the threat of bankruptcy, or at least very strong market pressures. And it didn't seem to me that Google had been facing these yet.

It's even more difficult than fighting for survival. A publicly-traded company can't just survive, turning a tidy profit every year. It needs to grow at a rapid pace or else those stocks (and therefore the executives' stock options) are useless.

Google didn't need to fight to survive, but they also didn't have the option of remaining "Google, circa 2004" or whatever we think their last year as a mostly-good company was.


I see.

I have to say I'm slightly surprised, because I thought that top SWEs, like the ones working at "elite" companies such as Google, enjoyed more freedom in choosing their employer.

Seems like it's hard for everyone then... (again, no judgement)


It's only partially true. Many could walk across the street tomorrow and work for another Fortune 1000 company, or for a well-funded startup in particular areas.

But I think the point he was trying to make is that because these mega-companies concentrate their people in certain cities, the economics of that takes a toll on everyone but the top executives, who are already getting paid big cheddar plus their enticements for being at the company: stock awards that, after even 5-10 years, can amount to a huge sum. Enough for said executives to make the same sorts of hops globally, since executives are interchangeable regardless of the product they produce.


By and large, Google tries to do good. No company is perfect.


Cite your sources.

I spent years at Google. The trajectory was obvious even a decade ago. There were many reasons to leave, and many of us did. I didn't have to wait for a line in the sand, because we knew what was on the horizon and leaving was easy. (I did not have a family then, which is a consideration and is why I left one of my last startups: baby on the way, acquired by a big corp, and I knew it would be easy to become chained to that desk.)


I worked for a corp in a similar "golden cage" position, where switching would mean a pay cut and working harder for longer hours. Eventually I changed companies. I had looked at the bureaucrats around me and realised I'd end up like them: my tech skills on the decline, learning only how to manipulate and cover my ass.

We grow in the face of adversity and challenge. I think as an ex-Googler you'll have a choice of interesting positions that will benefit your career in the long run.


I believe that ethics are incompatible with capitalism, so this kind of personal choice doesn't change anything. You can change jobs from one evil company to another.

What would work is enough people willing to organise politically. Unions, people. This is a synchronization problem.


> Unions, people.

Yes. If Google knew, say, 25% of its highly-skilled engineering workforce would strike in the face of unethical business practices, Google would not engage in unethical business practices. Tech companies are assholes because we allow them to be.

There are also companies that happen to make money doing something more or less good - for example, life-saving medical devices, or even a company like Roland, which makes music equipment. I think those things contribute to a meaningful life, and they're a good place to land if you're jumping ship.


> And the borrowing was deliberately encouraged because people in debt are slaves to their employers

https://www.youtube.com/watch?v=qX-P4mx1FLU


And the employers are a slave to how modern money is made. https://www.youtube.com/watch?v=XcGh1Dex4Yo


Please don't feel bad about yourself. Very very very few people can have high standards in life and continue living up to those standards where no one will ever criticize them (heck, even Mother Teresa got a lot of criticism[1]).

> All of life is moral compromise

So true. I wonder what sort of life critics of FANG/big-tech live.

[1] https://en.wikipedia.org/wiki/Criticism_of_Mother_Teresa


You should focus on clearing your debts


If this person has been at Google for years, it's almost certainly a massive mortgage, which isn't really debt in the traditional sense.


Agreed, I expect that is what they meant. "Massive debt" implies credit cards, gambling, or at least student loans. A mortgage isn't a massive debt like that because it is against the value of the house that you own. If you want you can just sell your house and be debt-free. You can't do that with credit cards.

I looked into moving to Silicon Valley for the enormous salaries, and even when you take into account the insane rent, you're still taking home more than you would anywhere else in the world. I suspect it is the prospect of a pay cut keeping him there rather than actual massive debts.


Thanks for your reply, and sorry to hear you feel trapped in your company right now.

Do you feel like this feeling is common among Googlers?

Also, I'd be very interested in hearing what you think of the reasons from Googlers defending the company, since you probably hear them a lot by working at Google.


Many, many Googlers are happy. Also, many have stopped coming here because of the circle-jerk echo chamber of post after post against Google. Some posts clearly state how the author has it in for Google after not clearing the interviews "they deserved" to clear.


if they only knew what their company is doing... but they don't. ignorance is bliss.


snapchat for sourdough starters exists?! what is it called?


Just as a counterpoint, many happy Googlers will not come here and post "We like it." Sorry your work sucks. You can transfer to more interesting teams.


I’ve had eight managers and six teams across three PAs, and those weren’t especially frequent changes. I assure you this is not an uncommon opinion.


Have you talked with your wife about it?


Using the word “evil” to describe these companies when today there are people literally dying from COVID-19 because of the policies of their employers and governments seems like a lack of perspective to me.

The framing of your question has no good answer like the classic “are you still beating your wife?” where either a yes or no is still a bad look.


I hear your point, but I was interested precisely in Google/FB employees' feelings about their own company, not in objectively defining "evil".

Not sure I see the point you want to make with COVID-19 though... You mean there is "worse" evil going on somewhere else?


> ... when today there are people ...

This sounds a lot like whataboutism.

> The framing of your question has no good answer like the classic “are you still beating your wife?”

I don't agree your analogy holds.

1. That famous question was rhetorical, this one isn't;

2. That was a riposte to a grand claim that questions can be answered with a simple "yes" or "no" and no explanation whatsoever; the "framing" of this one — i.e., the expanded text after the headline — clearly asks for explained responses and not a simple "yes" or "no".


I don't think it's a lack of perspective. Any moves they make that cause us to lose some of our freedom on the internet (and in real life now) will not be temporary. They will outlive the COVID-19 crisis.


Ex-Googler. I felt that on the majority of issues, both as an entity and as a collection of people, the intent was to do good.

I was much warier of trusting Google with my data pre-Google than post-Google. It's constantly drilled into you how sacred user data is, and the technical protections seemed sound.

The elements I was most uncomfortable with were the profit-shifting/tax optimization, i.e. the Double-Irish Dutch Sandwich.


As a Googler in payments (opinions are my own) who gets to see that side of things, I'd disagree with you.

In running the Irish haven, Google was and is abiding by all laws. Google was doing what the lawmakers of various countries allowed, and they allowed it for a long time. It also only affected a portion of Google's overall revenue. And it led to Google having a fairly massive presence in Ireland, which I'm sure the people employed there are happy with.

As countries have been figuring out how to tax companies like Google, they have been forcing them to create local entities so all business with people in those countries can be taxed locally. I can tell you this was not a simple thing for companies like Google to conform to (massive amounts of work by lots of people to bill locally).

You can see if you are billed locally based on the company your invoice or payment is collected under.


Not the fact that the US military intelligence apparatus has real-time warrantless access to the data they pretend is sacred?


Blame the US govt and your senators for that, not a law-abiding corporation.


I'm not of the belief that "I was just following lawful orders" is a legitimate defense of infringing upon someone's human rights.

Historically, it also seems a shaky defense argument.


Yes, completely forget how they shut up shop in China. It's easy to write all this from a keyboard when the US govt is not after you.


You clearly don't live in the real world. Every company has to comply with these.


There are alternative options.

https://en.wikipedia.org/wiki/Joseph_Nacchio


Here you go - https://www.forbes.com/sites/thomasbrewster/2017/05/21/googl...

Now find some obscure niche case and blame Google. It's fun when the anti-Google narrative here is spoilt.


I'm a Googler in Ads. This is a multi-faceted issue and there are tons of legitimate points both in its criticism and in its defense, but IMO the backlash itself is deserved, especially for a number of anti-competitive business practices. But probably the exact same criticism can be applied to Apple, Amazon, and Microsoft, so I consider this to be more of a general tech sector problem, where network effects tend to be amplified by their nature and are perhaps better handled by regulation.

From a privacy perspective, I'm not very satisfied with the status quo, but at least for Ads, privacy is now the top priority with a strict deadline, so the situation may improve. But I still think collecting less (or no) user data solves just a part of the problem; users still don't understand what the real trade-off is between their privacy and the benefits, to themselves and to the overall web ecosystem, of the ads they're watching. Without this information, users cannot really make informed arguments and decisions about their privacy and ads. IMO, this is one of the fundamental reasons perceptions of ads differ so significantly across people. But as this is not just a technical problem like third-party cookies but more of a subjective issue, it might be a much harder problem to tackle.


What's your opinion on advertising in general?

I kinda consider ads a violation of human rights, given they're literally made to influence your decision making, therefore taking away a bit of your free will


Well, almost every aspect of human civilization tries to affect your behavior, and I think translating this into a binary question of free will is very likely an oversimplification. The historical argument over free will is a very complex one and prone to errors, so I won't step on that landmine.

Free-will arguments aside, I see the huge impacts, both positive and negative, that ads have on society and the economy, and since there's no practical way to undo them, it's pretty important to guide them in the right direction. Yes, you can argue that it would have been better not to have them in the first place, but that's not the reality, so the only option is making them better.


I work in ads at Google, and I don't consider the company evil. It does an enormous number of things which range from very positive to very negative, but on balance I think the company is and has been positive.

I think some of the backlash is justified and pushes the company to do better, but there's also some amount of what feels like backlash for backlash's sake where it just gets everyone here paying less attention to what critics think.

(speaking only for myself)


Thanks for replying.

2 questions:

- what's the most negative in Google's actions in your opinion?

- do you have examples of routine critiques of Google that you feel are unjustified?


> what's the most negative in Google's actions in your opinion?

One that I was pretty angry about a few months ago was soliciting the business of ICE/CBP, and firing workers who tried to raise awareness of this internally: https://www.nytimes.com/2019/11/25/technology/google-fires-w...

> do you have examples of routine critiques of Google that you feel are unjustified?

One relatively recent one I remember was when Chrome announced that they were deprecating Chrome Apps in favor of Progressive Web Apps, and there was a lot of "Google is killing something again" backlash. But Chrome Apps were a Chrome-only solution Google came up with to make Chromebooks usable, and PWAs are the cross-browser replacement. Proprietary things that only work in Chrome should be replaced by standardized things, and this is actually a really good change.


Ex-Googler. I had my gripes with the company, but ethics was not one of them. They constantly strove to make fair and ethical decisions, and they treated their people extremely well.

A company that’s as large and fundamental as Google will always have detractors. They pay too much. They pay too little. They collect too much data. They don’t make their data open enough. There will always be something.


The Economist Asks - Has covid-19 killed the big tech backlash?

https://www.economist.com/podcasts/2020/04/16/has-covid-19-k...


Might also hold for big pharma, if they find a cure/vaccine.


I think a better question is: Is there a google employee that doesn’t use Google products for personal use (search, youtube, android), and is there a fb employee that doesn’t have a personal fb account?


Recent article I read from someone who works on Android: https://jakewharton.com/removing-google-as-a-single-point-of...


A few years ago I was considering applying for a job at FB. The very first stage was linking your FB account. No thanks.


Pretty sure FB won't even hire you unless you are a fanboy to begin with, so it's a weak barometer.


Well, wouldn’t fanboys who stop using FB after they get an inside peek be a good barometer?


Googler here. Yes, there are many google employees that do not use Google products for personal use. Firefox, for instance, can be used on work machines.


I use Google's products because I have to. The alternative is spending huge chunks of my time to "work around" Google. Same goes for Facebook, except there is no work-around.

This does not mean I support either company, it just means they have manipulated culture to ensure they are the only choice.


Not sure if it's still the case but it used to be that employees used their FB accounts as logins to work systems.


Not any more. When I started three years ago, it was a mix of people doing things the old way and people with split accounts. Less than a year later, it went to all split accounts. Also, FWIW, the number of my coworkers who actively use "public Facebook" seems very small.


Googler here.

I don't consider the company evil. In fact, I think it's leading the way that we talk about ethics in computing. People talk about Maven, but it was an internal revolt--not external influence--that challenged the issues with the project.

Secondly, there's a lot of criticism about supposed violations of "Don't Be Evil". As a multi-national company, it's almost impossible these days to avoid morally complex issues. Microsoft, on the other hand, has recently been seen in a much better light due to Nadella's leadership in the post-Ballmer era. But Microsoft never had "Don't Be Evil" and isn't getting criticism for, e.g., its defense contracts. I don't consider Microsoft to be an immoral company either.

Concerns about privacy and data handling are warranted. There should be stronger consumer protections in this area, and legislation like GDPR is good and should be more widespread. I honestly don't think Google is the company to worry about here (or at least the biggest threat). It's the companies that aren't as closely watched that are the highest risk to you. Google is pushing the state of the art here, with efforts like differential privacy. From what I can see, Google respects your PII and has more advanced internal mechanisms for handling it than any other company I know about.

IMHO, Google tries to be as transparent as possible in this area. Concerned about data collection? Review your activity on myactivity.google.com. Want to move all of your data out, or back it up? takeout.google.com. Location data? Google literally emails you a regular report on your location summary so you know what's going on.

Is it perfect? No. Is there valid criticism? Sure. Could it do a lot better in many different areas? Absolutely.

Is it evil? I honestly just don't see it. Feel free to ask about individual issues if you disagree.


Thanks a lot for your reply. Do you think Google is serious about differential privacy?

Also, I wonder whether the fact that most comments on this thread come from Google (ex-)employees, and not FB's, is saying something...


> Do you think Google is serious about differential privacy?

IMHO, yes, very much so. There are lots of other initiatives as well (can't find a public link at the moment).

> and not FB's, is saying something...

I don't think they are an evil company either, but they were somewhat cavalier with handling PII until recently. This is why I support stronger personal and data privacy laws. It's very easy to mishandle info, even with the best of intentions. Take a look at Zoom, which prioritized ease-of-use over security and privacy. This is another example of why I think people shouldn't worry (first) about FAANG, they should worry about the much smaller players that don't receive nearly as much scrutiny.


To add more flavor on the differential privacy efforts inside Google: yes, when someone tries to launch something non-trivial, privacy reviewers will very likely ask them to apply differential privacy techniques where possible. This causes a lot of friction for engineers who aren't familiar with privacy issues, but in a good way.
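
For anyone unfamiliar with what "differential privacy techniques" look like in the simplest case, here is a toy sketch of the standard Laplace mechanism. To be clear, this has nothing to do with Google's internal tooling; the function and the numbers are made up purely for illustration.

    import numpy as np

    def dp_count(true_count, epsilon):
        # A counting query has sensitivity 1 (adding or removing one user
        # changes the count by at most 1), so Laplace noise with scale
        # 1/epsilon gives an epsilon-differentially-private release.
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # e.g. report roughly how many users hit a feature, without the released
    # number revealing whether any single user is in the data.
    print(dp_count(true_count=1042, epsilon=0.5))

The smaller epsilon is, the more noise gets added and the stronger the privacy guarantee, which is the basic accuracy-vs-privacy trade-off involved.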


> People talk about Maven, but it was an internal revolt--not external influence--that challenged the issues with the project.

The revolt started internally despite a lot of effort to conceal the project, even from engineers working on it. And Google management started to step back only after the press got involved.


Well, that's one way to read into the facts.

The point I'm trying to make is that there are few other companies where the rank-and-file have as much pull with executive leadership and can influence strategy based on moral reasoning.


Google seemed comfortable losing any number of employees over Dragonfly, and firing organizers against it and other issues. It seemed like Congressional pressure, not employee activism, was what actually made the decision for them.

Obviously Maven wasn't shut down due to government pressure, but I feel like you drastically exaggerate how concerned Google is about valuing its employees.


I don't think comfortable is the right word. Jeff Dean personally talked to some SWEs to try and convince them to stay. Two people can take a valid moral stance and come to opposing viewpoints on the same subject.

> I feel like you drastically exaggerate how concerned Google is about valuing its employees.

I don't know what I would have to gain by such an exaggeration. It's not like there aren't a lot of current and ex-Googlers who are highly critical [1] of the organization for a number of reasons. I'm not afraid of being critical, and I have nothing to gain by talking positively.

Are there other companies that regularly survey their employees (see Googlegeist) and actually make some progress on issues? Do those companies hold regular meetings with the CEO (TGIF) where anybody can ask a question? Tough questions aren't rare; they are quite common. Which other companies have both of those and a survivor benefit? I'm asking seriously, because I think it's quite rare.

Do I have criticisms of the company? Absolutely. But compared to corporate America, yeah, I will go out on a limb and say that Google values its staff.

[1] http://steve-yegge.blogspot.com/


> I don't know what I would have to gain by such an exaggeration.

I think Googlers like to believe their employer cares about them, and the internal activists in particular like to believe they are effective at creating real change. Ultimately, though, I believe the Congressional hearings created that change, and more Congressional action will be necessary to curb their other issues.

> Are there other companies that regularly survey (see Googlegeist) their employees and actually make some progress on issues?

This is actually incredibly common. My employer does it, and it's one of hundreds of companies in my local geographic area that participate in the given employee survey system. At our all hands meetings, the best and worst results are shared, as well as a plan to address where employees most feel improvement is necessary.

> Do those companies hold regular meetings with the CEO (TGIF) where anybody can ask a question?

No, but then again, neither does Google after they started getting real questions that they didn't want to answer: https://www.cnbc.com/2019/11/15/google-cancels-tgif-weekly-a...

(Side note, reading Steve Yegge's original platform rant was an incredible highlight of my early days on G+!)


As a user of Facebook: the privacy problems aren't what they should be most upset about. A developer at Facebook should be ashamed of helping to build the absolute garbage that is the Facebook advertising suite. It's a maze of bugs.


So you would consider that a user has the same responsibility as a FB/Google developer in using these services?


You'll have to be more specific about "the current backlash"


You're right. My question was to be taken from a privacy standpoint, and I was referring to some critics today (both inside and outside the tech community) of how Google and FB specifically handle privacy.


Purely personal opinion, but to me the more immediate and pressing issue now is how these platforms do very little to combat the spread of clear and obvious dis/misinformation, for either political or just completely idiotic purposes.


I would go as far as to say the _majority_ of information spread through Facebook is fake news. So when does "protecting people from dangerous information" become censorship?

No, the real solution is that people actually need to be taught critical thinking skills at school. It's not Facebook's role to be the next nanny state.


Do you think they might have an interest in spreading them, besides user engagement? Also, I'm not implying anything, but what makes you think they are not trying to fight this spread?


I would have to assume OP is referencing the plethora of pre-COVID19 pieces relating to the issues of enabling disinformation spread and privacy.


Which mainly stems from the mainstream media itself, which is competing for attention and ad revenue against Google and Facebook.

Now, one could argue that yes, Google and Facebook did profit financially from "fake news" during the last election because they make money from "attention", and did nothing to stop websites relaying misinformation while displaying Google ads, or misinformation relayed on Facebook pages. Was there a "conflict of interest"? Always.


Actually the funny thing is I wasn't thinking about COVID-19 at all... (maybe I should get out of my cave sometimes)


> No judgement, just curious.

The issue is what does qualify as "evil"?


Maybe I should have been more precise in my post, but I was interested in people's subjective definition here, not in providing a definition myself (not feeling qualified for that).


I spent a decade at Google and I see it as rotten to the core. I used to think this was all Eric Schmidt, but given the things we have learned about in the last 2 or 3 years, I am now convinced that all the founders are to blame.

There are a lot of employees there who want to do the right thing, which is why activism has been such a big deal for so long.

The tax avoidance thing used to annoy me but that's nothing compared to everything else that has happened. Even Vic Gundotra's incompetent management of Google+ looks tame now.

Maven was handled really poorly by Diane Greene who seemed to live in her own bubble during the whole thing. Urs Hölzle didn't fare much better in this regard. Moving all of SRE under cloud was a big mistake and it has backfired spectacularly because there's enough in SRE who are not happy with the whole "We want to work with the military" thing.

The shift towards cloud was a huge change and a lot of us would have preferred if Cloud had become its own company.

The high profile sexual harassment cases that we learned about did not do much to improve trust in Google execs. And the fact that the founders of the company just vanished into thin air speaks volumes about how bad things were.

Kent Walker in particular represents everything that is wrong with Google. He protected David Drummond, he's been pushing for more work with the US military and is as morally bankrupt as it gets. At one TGIF he tried to justify forced arbitration because it was better for employees to not have to go to court and when asked why not let them choose what they wanted to do he just walked off the stage. That's Kent Walker for you.

Heather Adkins is not much better than Kent. We all thought she could be trusted until two things happened: she actively sought to destroy every copy of the Dragonfly investigation document that delroth@ had put together using searchable information, and then she claimed that the privacy review had proceeded as usual, which we all know is not true, given that Yonatan Zunger eventually left because of Dragonfly.

Another interesting character is Laszlo Bock, whom many people used to look up to. When Eric Schmidt was caught colluding with other companies, all he had to say was: "We do not believe we did anything wrong." That kind of moral compass, or lack thereof, is what defines Google's DNA.

Rachel Whetstone used to be loved here as well, for some mysterious reasons given her political background.

Google execs have managed to alienate many high value employees who either left on their own accord or were retaliated against until they had no choice but to go: Erica Joy, Liz Fong-Jones, Laura Nolan, Kelly Ellis, Claire Stapleton, Chelsey Glasson, Meredith Whittaker, Laurence Berland, Rebecca Rivers and so on and so forth.

This company could have been a profitable version of Xerox PARC. Instead it became SV's version of Monsanto.

It is a great place to work at. You will meet lots of ridiculously smart people and learn a lot. So put Google on your resume, learn a lot, and then run far, run fast.


No.


I think backlash is important, it keeps big tech accountable.

That being said, I do think big tech has been scapegoated for years. Everything is blamed on these companies: eroding privacy, Brexit, Trump, fake news, social breakdown -- at some point we as a society need to take responsibility as well because frankly these platforms are a reflection of us as a species.

I understand the skepticism, but after seeing how things work on the inside I trust my employer far more.


Is it really scapegoating though? Tech companies have radically restructured how business is done globally, ensuring that first and foremost, they profit at every step along the way. Other concerns, important concerns to society, have taken a back seat.

I mean, these are the two companies nearly every other company on the planet is in business with, and the negotiating terms are pretty one-sided.

I am not sure any other industry in human history has ever held so much power, and it's largely wielded by incompetent kids with little understanding of how what they do translates to the global stage.


So to pick up on my specific arguments (politics, privacy)...

You could make this argument about any revolution in tech/communications: the printing press certainly raised similar concerns (politics + propaganda), same with the telephone ("people are listening in"). Powerful, shady people have always been in charge of giant telecom networks - I'm not saying whether that's good or bad, just that it's how it's always been.

The biggest difference I can see is that the web (and the networks big tech maintain) are far more open and free. That comes with downsides, like fake news spreading even more easily, but I see this as an issue not with big tech but with people. We've always been like this; the internet just makes it easier.


I believe everything you listed is true, so it's not really scapegoating. And saying society at large should be held responsible for the actions of a few in power is absurd. "Taking responsibility" would be direct action against those in power. And I agree.


I personally believe Facebook (and others) are _not_ responsible for what people post on their networks; to believe they should be is dangerous. On the flip side, the media have kept up this relentless attack blaming them for stuff they hate, like Trump; FB in turn try to moderate content, and the right come at them for censorship (because there are invariably false positives in that process).

So what are Facebook supposed to do in this situation? Just roll over and die?

How are they supposed to take responsibility for the words the entire western world puts onto their platform?


A FANG employee here. I don't think my company is evil. In fact, I believe very, very few companies are truly evil. A lot of criticism seems to come from competitors who are hurt and are determined to tar your reputation (newspapers vs FB/Google), or from ideologues who simply hate your existence (climate activists vs oil/energy companies). Sometimes the criticism is shallow, in that people will say they don't like your actions but will still buy your products (e.g. Apple profiting off slave labor in China or Amazon's treatment of warehouse employees doesn't affect their market success), or it comes from a very tiny but very loud minority (climate activists against air travel).

You show me any sufficiently big company and I will show you enough reasons to call them evil.


Are there any realistic attempts, fictional or otherwise, at describing a market that encourages a more balanced society while reasonably, but not excessively, rewarding successful risk takers?


I switched from Google (YouTube) to Facebook in 2019 after 2.5 years at Google.

When I joined Google I was very influenced by the 'Google is the good guy' PR. I never liked the Ads business model, but there were a few Google products I really liked (Inbox, Maps and YT, notably).

It's hard to pinpoint the wake-up call, but I'd say it was project Maven (aka let's make an AI to help US drones kill more people), and all the lying and mislabeling there was around it (our AI doesn't kill people, etc…).

Wrt Cambridge Analytica, you have to go back to the mindset of 2016. At the time, FB was mostly attacked because it was a walled garden hoarding all the precious data. So FB allowed users to share data with external apps. Then one app managed to get data on 100k Americans and their friends and used it to help Trump. And then 'data portability' was evil.


Companies exist to make money. So "evil" is probably the default when you get down to it. Anything that heads that off is temporary at best. CEOs with strong personalities, specific niches that allow for enough money without "evil" tactics, etc. Basically, the structure isn't geared toward anything better. Eventually things will come down to money vs whatever.


I see a lot of pseudo-communists point this out constantly, how evil it is to make money.

I would call them pseudo-communists because they continually engage in the same self-centered practices that corporations do, except it’s “okay” for individuals or hipster small businesses but not for larger companies.

Money is seen as some kind of evil, but all it is is an exchange method. It’s a way to convert one type of good or labor into another.

The evil practices tied into money are human evils. Money is impartial and neutral, therefore trying to gain money is not inherently evil or good.


This is simplistic. The problem is not money per se; the problem is the hoarding of it. Would you hoard oranges? No, because they would go bad. What happens when you hoard toilet paper, for example? Other people can’t have it. What happens when you hoard money? Other people can’t buy oranges, apples, etc.; they have a lower quality of life. The problem, then, is that the current system puts very little control on this hoarding, and 99% of humanity is worse off.


I'm far from the "hipster" view on this. There are plenty of situations where making money isn't evil. But if it's always the #1 driver, it eventually collides with something "evil". I'm saying there isn't anything inherent about a company that heads that off.

So it's more interesting to contemplate what does head that off. Laws, incentives, competition, unions, etc.


It's not that it's evil to make money. It's that the most effective ways to make lots of money tend to be unethical, so a company whose sole mandate is to make as much money as possible will tend towards evil acts.


This is a straw man. The core of the argument isn't about money being evil but rather that in large enough quantities it overrides ethics, especially if it's combined with lack of responsibility.


Google is far more advanced and mature in its handling of data than FB. FB had many years to copy Google's practices and still let Cambridge Analytica happen. Google is a net positive in this world. Search alone has been the starting point of most great things since the 2000s. I'm proud of Google.


Reminds me of this: "Of the 186 Facebook employees polled, only 21 percent said they would entrust their data to the social media giant."

(https://decrypt.co/12708/most-facebook-employees-dont-trust-...)


And as an Amazon/Inditex/Walmart engineer?


Googler: I haven't been with the company long (~1 year), so I don't have a good handle on the internal politics yet. From what I've heard, it takes a few years to get disillusioned/cynical here. With that said: in general, I'm fairly impressed with how Google behaves internally and how much emphasis there still is on being nice and good.

One thing that seems obvious to me is that the Google of the 2020s will always be held to the standard of the Google of the 2000s or 2010s, and that is a comparison it just cannot win. It's very different to run a company of thousands of employees vs one of 100k people. Trying to not be evil just doesn't scale so easily: there is definitely going to be internal politics, and you are confronted with harder choices. You _are_ going to hire some bad apples, even if you try your hardest and manage to ensure that 99.9% of your hires are nice, exceptional people. The numbers are against you, and so is public scrutiny, so anything bad that happens will get reported anyway. And you can't be honest with your employees all the time if all your internal communication is going to be published on Bloomberg and has to be written with that in mind. You have multiple, sometimes conflicting interests that you need to handle.

So in general: I think the Google of 2020 is less good than the company it was 10 years ago. But overall it's still a very, very good company: I am very impressed with the general quality of the people who surround me, both at the grunt level and higher up the totem pole. Obviously no one's flawless, and I'm definitely unhappy with a lot of internal and external choices, decisions, strategies, and communications. But I still feel that people (all the way up to leadership) are trying to do the right thing: not just to do right by large stakeholders or themselves, but to actually figure out what the Right Thing is and follow through on it. I don't see a lot of greedy behavior in leadership, or people who don't care or are doing a bad job. They seem to care about keeping Google a "good" company, both to its users and internally. There were enough scandals, but I still feel like leadership is trying to sort things out and improve internal policies to make sure bad incidents don't repeat themselves. I think most people in the company know that in the long term, it's better to work with users and put users' interests (including privacy) first.

The general backlash against Google is the obvious one: Google collects a ton of data about you. Google is an ad company, and it makes as much money as it does BECAUSE it can put ads in front of the users where they will be most effective. It can only do that because it knows a lot about users; it's Google's core business. The trade-off makes a lot of sense to me: users get to use a really good product for free (it is hard to impossible to compete with Google search), and in return Google gets to serve them targeted ads that it sells to other businesses. That's a fairly obvious and IMO very good deal, it's easy to see how this happened historically, and it's likely not going to change -- people are too used to getting good search for free, and you simply can't charge money for a search engine these days. I think the issue is that users aren't always conscious of this transaction taking place (you get free search results and email, and Google gets to use the data it collects when you use the services).

But the good news is that, as far as I can tell, Google takes privacy concerns very, very seriously. It's mind-boggling (but technically obvious) how much Google _could_ know about you, since it has your location data, search results, and theoretically even your emails (as far as I know, Gmail data is fairly sacred and not exploited to better sell you ads). The main reason I think Google is still a good company (and doesn't deserve the backlash) is that, simply, Google doesn't exploit all the data it could collect about people for nefarious purposes, AFAICT. Now granted, I'm working in a role that is far away from those decisions, but it seems to me that as a whole, Google tries to be nice and fair, and the protocols and technical solutions appear sensible and carefully designed to put users' privacy first. Given the position Google is in, I think it is trying very hard to do the right thing and protect its users' rights and data, and does not exploit them in any "evil" way, as far as I can tell.

Like I said, I think this is a decision that makes economic sense: even as it is, Google faces enough scrutiny that it's in its interest not to exploit all the data it could collect. I'm sure all the scandals Facebook has had have hurt its bottom line. At Google, I trust leadership not to fuck this up.


"And you can't be honest to your employees all the time if all your internal communication is going to be published on bloomberg"

Why can't a company be honest when their statements are shared with Bloomberg? In fact, can you trust what they're saying if they're afraid of it making it to Bloomberg?

Generally when companies keep secrets, like salaries, it's so they can get away with treating people unfairly or lying to one group or the other.


There are lots of small reasons why. You can't tell your employees about that new phone feature you're developing if it lands on Bloomberg a year ahead of release, because then your competitors might one-up you. You might not be able to openly divulge ongoing security risks, because external actors will definitely exploit them once they land on Bloomberg. You might not even tell them the intricate details of search-engine improvements or YouTube spam-detection changes, because if it lands on Bloomberg the external SEO market goes crazy and destroys all your hard work... I could go on, but there are a lot of things that would arguably make a lot of sense to divulge to your employees, but that you just cannot if you can't rely on them sticking to their NDAs.


Very interesting, thanks for taking the time.

"Trying to not be evil just doesn't scale so easily": your remark reminded me of this quote from Leopold Kohr:

"There seems to be only one cause behind all forms of social misery: bigness. Oversimplified as this may seem, we shall find the idea more easily acceptable if we consider that bigness, or oversize, is really much more than just a social problem. It appears to be the one and only problem permeating all creation. Whenever something is wrong, something is too big. … And if the body of a people becomes diseased with the fever of aggression, brutality, collectivism, or massive idiocy, it is not because it has fallen victim to bad leadership or mental derangement. It is because human beings, so charming as individuals or in small aggregations, have been welded into overconcentrated social units."

(The Breakdown of Nations, 1957)


> Gmail data is fairly sacred and not exploited to better sell you ads

This is new as of 2018, IIRC. Before that, emails from free accounts were scanned for ad tailoring. They had problems selling GSuite because clients were confused about the difference in privacy between the free and paid tiers, so they estimated it would be more beneficial to stop using mail data for all accounts and simplify GSuite sales.


I don't work anywhere near gmail, so I might be wrong.


Facebook employee. My ethical relationship with my employer is ... complicated.

First, I believe in our basic mission to make the world more open and connected, and I think in many ways we do work toward that. We're especially seeing that now. Tons of people are using FB (including IG and WA) as a way to stay connected during this crisis, and it makes me proud.

On privacy: yeah, we've made some mistakes. Were they evil mistakes? Let's put that in context. I was around when "information wants to be free" was everyone's mantra. When Facebook was being criticized for not making information they had available to third parties. I do not accept the gaslighting about attitudes toward privacy always being like they are now. They weren't. Should FB have put in better controls, and more strongly enforced those controls? Almost certainly. Was the lack of such efforts "evil"? Only if you apply today's standards of diligence to events and decisions in a very different time. Right now, I can see some of the problems that occur as we apply rigorous access control to user data even as it moves between internal systems. How many of our critics have ever needed to deal with such issues? There will always be more to do, but I'd say we're a bit ahead of the industry in that area now.

On disinformation: this is the real "damned if you do, damned if you don't" scenario. There's no logically-consistent reason why Facebook should exercise more control over content than Verizon does over the content of phone calls. To do so is to invite accusations of censorship, and run the risk of being treated as a publisher rather than a carrier. Nonetheless, we're devoting more human and computer time to detecting disinformation than most companies have in total. Vast data centers' worth. I'm in storage, so I see only the edge of this, in the form of (insanely complex) analysis pipelines and models. That's enough to get a feel for the scale of these efforts, and it boggles my mind. People are honing these techniques all the time, and having some successes. These successes are drastically under-reported compared to failures, for reasons I won't get into here, but they are reported. Anybody who's actually paying attention can see thousands of accounts being banned at a time for inauthentic content. Think for a moment about the computational complexity of detecting a thousand-node subset within a billion-node graph. Most people who accuse FB of taking this issue lightly just don't grasp the scale at which we operate and the additional difficulty that entails.
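
To make the scale point concrete: the toy version of "find a suspiciously tight cluster of accounts in an interaction graph" fits in a few lines. The sketch below is purely illustrative, with made-up account names and an arbitrary density threshold; it is not how FB's detection actually works.

    import networkx as nx
    from networkx.algorithms import community

    # Toy interaction graph: nodes are accounts, edges are interactions.
    G = nx.Graph()
    G.add_edges_from([
        # four accounts that all interact with one another (a clique)
        ("a1", "a2"), ("a1", "a3"), ("a1", "a4"),
        ("a2", "a3"), ("a2", "a4"), ("a3", "a4"),
        # ordinary users with sparse interactions
        ("u1", "u2"), ("u2", "u3"),
        # one bridge edge between the two groups
        ("a4", "u1"),
    ])

    # Partition into communities, then flag any that are unusually dense.
    for group in community.greedy_modularity_communities(G):
        sub = G.subgraph(group)
        if len(group) >= 3 and nx.density(sub) > 0.8:
            print("possible coordinated cluster:", sorted(group))

On a handful of nodes this runs instantly; at billions of nodes, even storing the interaction graph is a distributed-systems problem, never mind repeatedly mining it for structures like this.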

Does FB still do things that make me cringe? Sure does. The disrespect for user choice (e.g. chronological-timeline settings constantly being reset) pisses me off to no end. The demands put on moderators of large groups, and the dearth of tools made available to them, is inexcusable. Likewise for the lack of decent support or appeal processes when users are adversely affected by our own screwups. I've come to the conclusion that even though the people I work with directly are great, there are some other people at the other end of the company (user-facing product rather than deep infra) and at a different level who ... well, let's just say I wouldn't get along with them so well. It does give me pause sometimes, but not enough to outweigh my belief in the basic mission.

You want evil companies? How about those who make much of their money from contracts with ICE/CBP/NSA? How about those that have helped hollow out the economy with their anti-labor "gig economy" BS, leading to millions of unemployed right now? Or the worse half of the finance or defense industries? How about those helping to destroy our health or our planet? There are absolutely better companies I could work for. My ideal would be to do what I do at a company that's making vaccines or something, but they didn't seem interested in hiring me. But there are other companies that were interested and I turned them down because I don't approve of what they do. There are a whole lot of pots calling the kettle black.


Thanks for this long and well-thought reply.

> First, I believe in our basic mission to make the world more open and connected

Real question: do you think the top management feels the same? In general, do you see a lot of cynicism in workers around you?

> You want evil companies? How about...

Not saying GOOG/FB were worse than others, but people and the MSM began to increasingly question their moral alignment in the past few years, which is why I'm asking.


> do you think the top management feels the same?

Honestly I don't know. If I had to guess I'd say they probably do agree with the words but might have different ideas of what they mean. Should "open" mean literally anyone can see literally anything? Clearly not. I might put limits where they wouldn't. Maybe the converse is true sometimes.

> do you see a lot of cynicism in workers around you?

Nope. TBH that's mostly because they're heads-down solving technical problems. Or maybe they're focused on social problems other than the ones that apply to the company from an external perspective (e.g. diversity is a big one). To the extent that I see these company-direction issues discussed at all, I do see people trying sincerely to grapple with them and do the right things. Are those people just fooling me? Are there others who take a more cynical position? Don't know and can't know. I can just say what I see.


I hate the double standards in our society. We have people who actually work in the "killing people" industry. We call it "the military", and whenever one of them says they identify as a member, we tell them "thanks for your service", not "hey, do you think your org is evil?". We honor them before sports events. We make the president the CEO of the killing people industry, FFS. And these people are unquestionably, explicitly in the "invading countries and killing people" industry.

So are you seriously asking people who work in tech companies that spread cat videos and enable video chats if their companies are "evil"? With a straight face? Come on.

This thinking betrays both that people in the industry take themselves far too seriously, and that we're so brainwashed to accept these kind of glaring double standards, we don't even stop for a second to question them.


It might be because tech companies have spent the past 20 years trying so hard to look like "the good guys". So it might just have something to do with that.

Also, the moral problems arising from having a military force have been studied for centuries, so nothing new here. It doesn't mean we cannot be critical of the military at the same time (and it's not all bad either btw).

Finally, the tech industry has a huge power on us. This power is rising every day, and has been invisible to most until recently. So questioning its morals, intentions and agenda today matters.


If you're not happy with tech, feel free to not join a tech company or not use tech at all. The problem with people here is that "FAANG is evil", but "I love their services and want their really high comp".


> We honor them before sports events.

FYI: they buy that, with tax dollars. Same way they get editorial control over Hollywood movies in which military hardware is featured.

It’s part of the advertising budget. It’s no accident that these mass events seem to be widely aligned with the military. It’s an explicit taxpayer-funded propaganda campaign.

That said, the military and private surveillance companies can both do evil, and their evils can indeed synergize: tech databases can be used to determine who to mass murder, e.g. Palantir, or PRISM, resulting in more murder and terror than without the tech involved.


Google doesn't directly murder people so it can't be evil. Got it.


Companies are not living beings and as such they can not "be" evil. The evil or good stems from us, living human beings and our behavior.


I'd disagree on that. Humans are made of cells, and yet we are ready to talk about humans, not just cells. It's a higher level of abstraction with its own decisions. In the same way, companies are made of humans, yet they have power structures that make them behave in unique ways. They make decisions based on processes that maybe no single human would approve of. Sure, there is often a catalyst leader, but the organization can move on without them.


That's nowhere near a reasonable analogy: humans are agents in their own right; we have a brain.

A company has no will of its own; anything that is in the interest of the firm but not in the interest of any of the employees won't happen.


The book Creation by Steve Grand has great arguments in favor of the analogy. I came across it as a favorite book of Jeff Bezos.

Sorry, I know this is a weak response to your argument but it would be too much to cover here, I don't have the time and I'm not sure I should convince anybody anyway. If you want to shake your beliefs on what life and intelligence are, go read the book.


Appreciated, I will check it out


I get your point, but what prevents us from considering "evilness" an emergent property of organizations...?


What makes certain agglomerations of atoms "living beings"? I don't see a fundamental distinction between "alive" and "not alive" that would apply to humans yet would not also apply to larger organizations.

Now, the question of whether certain companies should be considered "evil" may not be something I want to spend a lot of time on, but the arbitrary philosophical distinction that only humans within those companies can be "evil" makes the discussion even less interesting.


Companies have a culture, which evolves over time and is driven by humans and their morals/priorities, reflected in their actions and handed down through "generations".

Yes, they can be evil.


Odd to consider an institution incapable of being beneficial or harmful to society - my best working definition of evil not steeped in religion.


Aren't companies "moral entities" by definition?


You are wrong! Corporate personhood is a thing. It just turns out that the definition of a corporation could also be the definition of a sociopath.



