
The content becomes a part of the problem when it is liable to spread the "disease" (to continue the allegory).

I get the feeling we're discussing this on differing levels of severity, here.

I'm not talking about dissenting political opinions or pseudoscience or conspiracy theories or counter-culture or drugs or anything silly like that, but things more impactful and corrosive to humanity.

The persistence of certain content, when it isn't circulated for study, arguably has the effect of normalizing both itself and whatever it resulted from.

---

I've been avoiding semantics, but let's go with a relatively specific example:

A person was raped. It was recorded. The gruesome story was detailed as entertainment for a sick (yeah, letting my bias out here) group of fans and interested or curious parties. The guilty party is found and apprehended, sentenced, and punished according to local law. Justice was served according to society's mandate. We consider this good.

Now consider the persistent content. It's in the hands of hundreds, then tens or hundreds of thousands of people globally. Curious parties become interested parties, interested parties become fans, fans become culprits.

And all the while the victim has to live not only with the experience, but with the knowledge that swathes of people enjoy viewing it again and again, and that it will never go away.

---

It's a big 'what if'. I could say the same of your position: that unless we have the total, open ability to publish any information at any time and have it persist forever, we will succumb to evil, oppressive forces. It's speculative.

I don't want to come across as if I'm coming down on the IPFS platform, or the interface Stavros ingeniously developed for it. Quite the opposite. I think that not enough thought goes into how powerful the platform has the potential to be.

Software and network developers have a lot of inherent power to move large ideas with relatively few resources, and it should be recognized more often. And it should be discussed, honestly, whether one implementation or another is the best we can do before releasing it on society.

I'm of the opinion that ethical questions should be asked of engineers, and that empathy needs a larger role in the process of [not necessarily research, invention and development, but] implementation.

---

"We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented."

Elie Wiesel

---

Feynman on a Buddhist saying:

“To every man is given the key to the gates of heaven. The same key opens the gates of hell.

And so it is with science.”




Comparing wrongthink to disease has a sorry history.

If you don't want to see bad things on IPFS, don't access them.

If you wish to force other people not to see them either, the process is the same as for anything else on the Internet.

There's no need for moral panic.


Who's panicking?

I was posing the question of how the community could deal with something horrific. At this point in time the power is in the hands of the community developing the technology. Surely there are lessons to be learned from the implementation of the web.

I posed it again because each time there is no discussion of potential improvements, only responses crying censorship in pseudo-Orwellian lingo.

There's a line someplace. Nobody is preemptively stopped from producing snuff, but we don't support it by allowing it to be stocked at the local library in the name of anti-censorship.


Exactly, and this applies for all "levels of severity". There is no content so "bad" that censorship becomes an appropriate solution.



