
What I'm saying is that bad things, things that are so horribly, terribly bad that they could alter the course of human civilization for the worse, permanently, are happening right now and are being largely ignored by technologists, while they are worried about preventing something that might happen, someday.

And when people level the criticism at the technology industry that it is currently mainly focused on creating trifles for already-wealthy people, the inevitable response is that incredible technological innovation, the kind we could use to solve actual life-threatening problems, might come out of the effort to create those trifles, so it's not worth pursuing actual problems directly.

What's so interesting about the obsession over the singularity is that there is massive effort and capital being directed at directly solving a problem that is purely theoretical while, for example, climate change is already creating mass social instability all over the world, and companies working directly on possible technological solutions for climate change have to fight hard for every penny.

These days it definitely feels like the priorities of the owners of capital are located somewhere in an alternate reality the rest of us can only scratch our heads at.




So, you're saying a few things here which I disagree with.

First of all, the basic argument, at least in the "existential risk" community that I frequent, is that compared to humanity's extinction, nothing else that's happening now is quite as bad (unless, of course, it is something that would also lead to human extinction).

More importantly to your point, you seem to be operating under the assumption that "there is massive effort and capital being directed at [solving this problem]" (paraphrased), as opposed to, say, something like climate change. This assumption is wrong.

There was recently an incredible victory for something called the "Future of Humanity Institute", which had just received $10 million from Elon Musk. This was an extraordinary sum for a charity dealing with existential risk, which is very decidedly a niche topic. Even if you look at all the charities dealing with x-risk, I doubt you'll be looking at more than $100 million raised or so, and that's something of a stretch IMO. (If anyone has any real figures on this - please let me know!)

As for something like climate change, it's hard to find good sources, as most of my Google searches return mostly criticisms of climate change activists, but I would be shocked if the amount of money spent on climate change isn't in the tens of billions of dollars.

The argument of the people concerned with x-risk is that, considering how little money is actually spent on x-risk research at the moment, more money needs to be spent. And since these technologists are the only ones really aware of (or at least talking about) the dangers, they need to try to get money invested in this issue.

Btw, I will mention two other minor things:

1. I'm not trying to defend the technology industry against the criticism that it "is currently mainly focused on creating trifles for already-wealthy people" - I consider this a really separate topic, since it's usually different people involved in x-risk charities vs. trying to make money.

2. A lot of the community that talks about x-risk is also part of the "effective altruism" movement, which concerns itself greatly with solving more immediate issues.


I don't understand how climate change isn't at the very top of the list of existential risks to human civilization. If your concern is warding off existing and urgent existential risks, and you end up fucking around worrying about killer robots instead of solving the problem that is literally at your doorstep right now, then something has gone very, very wrong.


Err, I thought I covered it in my post, but let me make it clearer -

the question in this case isn't what is or isn't a risk; it's where it is better to spend more money. Considering the fact that climate change gets billions in funding and other x-risks get almost nothing, the argument is not that those risks are more important, but that they need the money more.

(Btw, climate change might not be an existential risk, because it won't necessarily kill the entire human race.)


Climate change is immeasurably more plausible as an extinction event than the singularity, though, and is actively being caused right now. There is no evidence whatsoever of historical mass extinctions being linked to any technological singularity; rather, they are all in some way linked to climate change. Worth keeping in mind before dismissing it as a non-existential risk.


"Climate change gets billions in funding" is a totally meaningless statement. Do you mean funding for basic research into the systems that cause climate change? Do you mean funding for lobbying efforts? Or are we talking about funding for people working on practical solutions?

If you believe, like I do, that every dollar spent lobbying governments to "do something" about climate change is a dollar wasted, then the funding picture looks pretty bleak.



