Hacker News

Many, if not most, HN users create software and internet infrastructure. Collectively, we have so much power.

Yet much of what we make (directly or indirectly) is what the surveillance state is built on. It relies on us to build it, make it work, and keep it running.

If we care at all about privacy, we should think carefully about the privacy impact of what we make, and try to make a positive difference (or at least do no harm).




I highly doubt that's a stable or dominant strategy. As long as there's pressure to build systems, they will find people to build them. Not to mention, those sorts of jobs can offer fun, intellectually challenging problems.

Take Narus: we probably agree it's not in the world's interest for companies or governments to have such technologies. But offered money and a chance to work on such a system, I'd work on it in a heartbeat.

It's unlikely that there will be a shortage of qualified people or that the extra money required will make any impact. And you could always take such jobs, and donate to the EFF.

This is such a common argument, I'm sure it has a name.


"offered money and a chance to work on such a system, I'd work on it in a heartbeat"

You would. But that doesn't mean that everyone would.

I am appealing to those of us who value privacy and ethics above making a quick buck or getting to play with neat toys.

It's not like there's some huge shortage of interesting technology jobs in this second internet/startup bubble. And not all of these jobs are for companies that want to spy on their users. Many of us still have a choice.

Even if you are at a company which spies on its users, you could try to make some positive change from within, or at least avoid advocating for going down the road of ever more surveillance and spying.

Way too many developers, VCs, and founders either don't consider the privacy implications of what they're doing, or are only too happy to collect and sell data about their users to the highest bidder.

This mercenary mentality is not some unchangeable part of human nature, but is a learned attitude that can be countered and rejected.


My point is that it's not even remotely practical to convince enough of the population to "value privacy" so much that these things won't be built or to even remotely hinder them. The population of earth is just too large. "HN readers" aren't some magic special bunch that cannot be replaced.

> This mercenary mentality is not some unchangeable part of human nature

No, it's just basic game theory. The more people who refuse to sign up for these "unethical" things, the higher the reward for those who do. And those rewards are very small compared to the pressures involved.

So even if you succeed in convincing a ton of hackers to join your cause, you've done what? Raised the salary from $250K to $750K a year for the people that do defect? That's nice, but the actual effect on privacy is zero.
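The supply-and-demand claim above can be sketched with a toy simulation (my own illustration; all numbers are made up and nothing here is from the thread): remove most of the willing workers and the clearing wage rises, but the few positions still get filled.

```python
# Toy labor-supply model (hypothetical illustration, made-up numbers):
# each hacker has a "reservation wage" below which they refuse
# surveillance work. An ethics campaign removes most willing workers;
# the employer still fills its few positions, just at a higher wage.
import random

random.seed(1)
POSITIONS = 20  # a surveillance project needs only a few engineers

def clearing_wage(workers, positions):
    """Wage the employer must pay to fill `positions`: the reservation
    wage of the positions-th cheapest willing worker."""
    return sorted(workers)[positions - 1]

# 10,000 hackers with reservation wages between $100K and $500K.
pool = [random.uniform(100_000, 500_000) for _ in range(10_000)]
before = clearing_wage(pool, POSITIONS)

# A campaign convinces 95% to refuse the work entirely.
holdouts = random.sample(pool, len(pool) // 20)
after = clearing_wage(holdouts, POSITIONS)

print(f"wage before: ${before:,.0f}, after: ${after:,.0f}")
# The project is staffed either way; only the price moved.
```

This is, of course, exactly the commenter's point restated: shrinking the supply raises the defectors' pay without leaving the positions unfilled. Whether raising the cost counts as "zero effect" is what the rest of the thread disputes.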


"My point is that it's not even remotely practical to convince enough of the population to "value privacy" so much that these things won't be built or to even remotely hinder them. The population of earth is just too large."

When you get enough people educated and caring enough about some issue to do something about it, what you have is a social movement. Such social movements have a long history of bringing about significant change, especially when they are well organized.

If enough HN users care and do something about privacy, there is no doubt in my mind that positive, significant change will come about. HN users might not be "magical", but they are special in one respect: most of them are very technologically savvy (especially compared to the typical internet user), with a deep understanding of the very technologies that make the surveillance state so effective.

Knowledge is power, and we have to recognize that collectively we hold a lot of power in our hands. Our collaboration with the surveillance state or our opposition to it, our advocacy for and work to build privacy-respecting alternatives could be a major game changer.

Quite apart from the effectiveness of such opposition are the ethics behind it. Some of us believe that we should do what's right even when the odds of success are against us.


The argument of inevitability is the argument of a fatalist.

Yeah, so what if the incentives increase on the other side?

That's what hard work is made of. No one said this is easy. Far from it, it's hard.

And as for inevitability: note that the government's success in passing CISPA and its variants was also considered inevitable by some.

And hackers have generally been at the forefront of keeping the net safe.

It's only after the government started going after them, and SV built a narrative of interesting challenges, talent, and just-but-outsized rewards, that the equation shifted.


Inspiration produces better everything (software included) than lucre. But if you want to talk $ rewards, it's obvious there's a mass market for privacy developing.


Please expand.


> Basic game theory

The guy who came up with game theory was mad, and in real life people don't comply with it, except economists and psychologists.


There will always be developers who will be willing to do this either because it's challenging or because they need to put food on the table. You can't win that fight. The people who take those jobs also have the potential to become whistle-blowers and dump 6+ months of internal emails on $POPULAR_TORRENT_SITE, showing all kinds of interesting dialogue and/or corruption.

Concentrate on getting laws passed, or on building technology to stop or avoid the tools of "Big Brother", and on educating the general public about what's going on with the internet and BigCorp/government. What I would like to see is a browser extension or app that makes PGP easy for the masses, so they can encrypt emails, messages, etc. Open source, of course. We don't need any secret backdoors.


"There will always be developers who will be willing to do this either because it's challenging or because they need to put food on the table."

But those people don't have to be us. Each of us has a choice to make: to cooperate with the construction and maintenance of the surveillance state or not.

Perhaps I am too much of an optimist, but I'd like to think that many of the most talented and capable of us want to make positive, constructive changes in the world. We don't want to be a part of making the world worse. Many of us are lucky enough to be in an industry where we actually have a choice in this respect.

If the most talented and capable of us deny the surveillance state our talent and capability, they'll have to make do with whatever ethically challenged people they can scrounge up. I'd like to think that they'll be the worse off for it, and that the resulting surveillance state will not be nearly as powerful and effective as it would be if everyone participated unhesitatingly and willingly, or sold themselves to the highest bidder without a thought for the consequences.


> There will always be developers who will be willing to do this either because it's challenging or because they need to put food on the table.

Of course, but that's bullshit, and they're being immoral. Evoking romantic images of providing for your beloved family is misplaced here. No programmer is left with the choice of either working for evil scumbags or starving.


"You would. But that doesn't mean that everyone would."

Sure, but those things only need to be built once.


The more talented people work on privacy-invading technology, the more powerful, pervasive, and smoothly-running the surveillance state will be.

Most of us have the choice to deny the surveillance state our own talent. We can work at more privacy-respecting firms and on more privacy-respecting projects. We can try to educate people about privacy, and advocate for positive change. We can even actively work to counter the surveillance apparatus by building crypto, stego, mesh-networking, and other privacy- and anonymity-enhancing technologies. Most of us probably don't have the luxury of doing the latter as our day job, but certainly as a side project -- if we care enough and want to make a difference.


I think the cost can be raised, at least, if people not only refuse to take such jobs themselves, but attach some social and professional stigma to those who do. Sure, with enough money you'll likely be able to find someone who will do pretty much anything unscrupulous; even extreme examples, like getting someone to program a human-trafficking back-office app, can be completed if you offer enough money and look in the right places.

But the idea is to reduce the supply of willing workers for surveillance-state applications, and to raise the cost of those who can still be hired, by making sure it's seen as a somewhat shameful job rather than just another "regular" one. That at least incrementally slows things down and drains the resources of the people building such systems. That would have a real effect, except against an adversary who actually has infinite resources and can solve any staffing problem by just throwing more money at it, without compromising anything else.


> As long as there's pressure to build systems, they will find people to build them.

Right. But you could find those people easily, or only with great difficulty. Imagine a whole generation of "ethical" hackers who refuse to become pawns of the military-industrial complex and/or the surveillance sector. Who will build the police state then?

My point is that "the system" depends heavily on the humans and the best way to fight the system is not to work for the system. Just don't join //them// and (A) you will sleep better at night and (B) you will be helping the world.

The argument "someone was going to do it if I don't" doesn't hold any water. Just stick to what you believe is right and don't worry about the others.


"Just don't join //them// and (A) you will sleep better at night and (B) you will be helping the world. ... Just stick to what you believe is right and don't worry about the others."

There is another benefit to standing up for what's right (especially if you do so loudly and proudly), and that is providing an inspiration to others and leading by example.


> The argument "someone was going to do it if I don't" doesn't hold any water.

Can you elaborate or suggest some further reading on this? If someone possesses rare talent or technology, I can understand it. But that's hardly the case in what we're talking about.

I see approximately zero difference in the world if I write surveillance software or if someone else does. It will get done. Government and telecom companies aren't going to shrug and drop the issue. So why should I allow someone else to benefit from my refusing a contract?

I am honestly asking for the logical steps in arriving at your outcome.


You don't see a difference because you don't seem to have a strong feeling about the ethics of these kinds of jobs. I don't say this to attack you or to judge you.

To understand what others are saying here, you need to substitute some other ethical question that you do have strong feelings about.

Regarding your game-theory comments -- you're implicitly assuming that "winning" involves maximizing your income. For me, how I feel about myself is part of assessing any potential win or loss.

Fundamentally, you can't really answer ethical or moral questions using market or game-theoretic thinking, without considering how you feel about what you're doing as part of the win/loss metric. If you don't care about an issue, you don't care about it.

So, "how will my caring about this make a difference" isn't really the right question. Instead, ask "why should I care about this?"


Yes I might lack empathy for other people's privacy, especially as it seems that most people just don't care. Those that care can use technology to benefit themselves. You're right though, I have a hard time answering why I should care, personally. If I eliminate emotion from the question, I cannot come up with any rational reasons.

But more than that, blaming a single low-level actor in a system seems kinda pointless. It's like hating an individual DEA agent instead of the idiotic system that creates the DEA. Shaming DEA agents will accomplish next to nothing; the effort would be better spent where it might make an impact (like on elected officials).

The number of hackers you need to implement surveillance (or land mines) is pretty low, and you can draw on the worldwide population. Not to mention there are going to be a fair number of patriots who believe what they're doing is good, anyway.

The only question is which individual will profit and how much from actually building it. In this hypothetical situation, I can't see any reason to let someone else (who may have values I don't like) get paid to build the technology.


> This is such a common argument, I'm sure it has a name.

Someone who is willing to do the unethical dirty jobs that take a toll on society, because they pay so well and because, if he didn't take them, they'd just hire someone else to get it done?

Yes, the name for that is "mercenary".

Adjective: Motivated by private gain.

Noun: A mercenary is a person primarily concerned with making money at the expense of ethics, [most often used to refer to a soldier who fights for hire].

Interestingly, being "fun and intellectually challenging" doesn't really factor into the pattern much, except as a justification certain people need to make the job seem more glamorous or honourable. But as it happens, even with that factor absent, if they don't take it, somebody else will, right? For the right price, of course. The difference between those prices is literally the monetary value of your honour. Most people won't set a price on that, but you're a mercenary now, so you already did. The funny thing is that the difference isn't exactly traded on a free market; it could even be negative, whatever gets the job done for your employers.


That's despicable. The way we look at the inventors of mustard gas, lobotomies, and torture wheels is the way history will look at you.

You must be a terrible person to be willing to directly further evil in the name of money and "fun".


Those were all technologies that were "known" to be "evil". But to get to that point, there were probably scientific advancements that were pretty benign in isolation. And those advancements were probably used to develop other technologies that were "known" to be "good". The people working on those earlier advancements had no way (necessarily) of knowing what would be derived from their work. The same is true of technological advancements today. The people who first started working on the internet probably had no idea (for the most part) of the privacy issues that would arise. Ditto for satellites, GPS...


What's being referred to is the development and implementation of a very specific technology produced by Narus, not some open-ended research project. See here for more info:

http://en.wikipedia.org/wiki/Hepting_vs._AT%26T#Background_a...

I would not keep company with anyone desiring to be involved in a project like that.


I'm actually sort of surprised by the negative sentiment here. A project with government backing and legal cover will get done, regardless. This is hardly a case where you need some 4-sigma genius (which I am not) to conceive of some major breakthrough.

But creating something like Narus must require all sorts of interesting research and engineering. I wrote a packet analyzer for a single protocol and was just barely able to parse at 1Gbps speed, off a loopback device. Actually getting it deployed at much lower speeds was rather difficult.

Doing the same at 10Gbps, multiprotocol, actually doing neat stuff to the data? Unless their boxes have incredibly expensive hardware inside, they must have done some pretty neat hardware and software designs.
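For a sense of the per-packet work involved, here is a minimal sketch in plain Python (my own illustration; nothing like Narus's actual hardware-assisted design) of the fixed-header decode an analyzer repeats for every packet. At 1 Gbps with small packets that's on the order of a million decodes per second, which is why real analyzers push this loop into C or hardware.

```python
# Minimal sketch (hypothetical illustration, not any vendor's design) of
# the hot loop in a packet analyzer: decoding the fixed 20-byte IPv4
# header from raw bytes. Real line-rate systems do this, plus TCP/UDP
# reassembly and application-layer parsing, for every packet.
import struct

def parse_ipv4_header(packet: bytes):
    """Decode the fixed 20-byte IPv4 header into a small dict."""
    if len(packet) < 20:
        raise ValueError("truncated packet")
    (ver_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s",
                                                     packet[:20])
    return {
        "version": ver_ihl >> 4,
        "ihl": ver_ihl & 0x0F,     # header length in 32-bit words
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,         # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built IPv4 header: version 4, IHL 5, total length 40, TTL 64,
# protocol 6 (TCP), 10.0.0.1 -> 10.0.0.2, checksum left as 0.
pkt = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                  bytes([10, 0, 0, 1]), bytes([10, 0, 0, 2]))
hdr = parse_ipv4_header(pkt)
print(hdr["src"], hdr["dst"], hdr["protocol"])  # prints: 10.0.0.1 10.0.0.2 6
```

Even this trivial decode allocates a dict and several strings per packet; multiply by millions of packets per second, across many protocols, and the engineering challenge the parent describes becomes clear.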

The way to fight this type of surveillance isn't convincing people to avoid working on such projects. That's hopelessly naive or incredibly optimistic. Change the rules of the game: either make it illegal and enforce that, or spread proper privacy software like cryptography.


You're missing the point. Social conditioning and shame are very powerful.

It's likely that designing land-mines involves many interesting technical problems. Regardless of whether there are still engineers who will work on land-mines, there's a social good to shaming them (assuming you think land-mines are bad).

If you don't have strong feelings about land-mines, or you don't think they're all that bad, then this argument won't make any sense to you.

People can choose to do or to not do things regardless of their effectiveness.

Do you realize that you're implicitly adopting an amoral viewpoint when you raise pragmatism above ethics or morality? That's a choice -- not a given. This is why you're getting the strong negative reaction, here. History is littered with examples of how this leads to bad outcomes and evil systems.


"The way to fight this type of surveillance isn't convincing people to avoid working on such projects. That's hopelessly naive or incredibly optimistic. Change the rules of the game: either make it illegal and enforce that, or spread proper privacy software like cryptography."

Why not simultaneously push for change in government, develop powerful and easy-to-use cryptographic tools, AND refuse to work on projects that are very likely to have a negative impact on society?

I don't deny what you're saying, that these projects will end up being built by someone (hopefully someone less qualified though, and at a higher cost to the contractee), but by the logic in your first post in the chain, anything could be justified morally as long as someone is paying you to do it. What's wrong with pulling the lever in the gas chamber? If you don't, the next guy will, right?


Funny, I looked up the people that discovered mustard gas, and all of them seem extremely well respected and have had illustrious scientific careers.


You're a disgusting person.


I think your solution would suffer from the tragedy of the commons--it would only take a few defectors to ruin the whole scheme.

My guess is that the only durable way to increase digital privacy protections is through law. To achieve what you want, you would need to get Congress to pass a bill, or the Supreme Court to hand down a favorable decision, or maybe both.


Laws are mutable. Durable things must be invariant under human law.

The Riemann Hypothesis has not been proven; Congress cannot legislate its truth or lack thereof.



