
There are a couple of other issues that could be added to this list.

Credit

One of the sayings that I know to be (mostly) absolutely true is, "there is no limit to what you can accomplish if you don't care who gets the credit." I have a (much more successful and respected) friend to whom it is something of a mantra.

On the other side, seeking credit has a lot of pitfalls. The obvious one is taking credit for someone else's work; that's just bad and leads to bad results. But further, by aggressively taking credit for things you've done, you can actively force other people out of the field you're working on. Has anyone been bitten by the offhand "yeah, we took a look at that several years ago" comment?

Further, becoming the face of a project means that you, warts and all, come to represent the project and its goals. Take care.

Goals

Choose the goals of your project carefully. For one thing, they can take on a life of their own: our modern financial services system has the excellent goal of allocating resources where they can do the most good, but it has become so complex as to be a maze with great freakin' bear traps all over the place.

Then there's opportunity cost. Some goals are laudable, but take on too much emphasis at the expense of other, more reachable, more effective goals. Take the "reducing extinction risk" mentioned (repeatedly) in the article. Sure, somebody should probably worry a bit about the risk of human extinction, but...

"Many experts who study these issues estimate that the total chance of human extinction in the next century is between 1 and 20%.

"For instance, an informal poll in 2008 at a conference on catastrophic risks found they believe it’s pretty likely we’ll face a catastrophe that kills over a billion people, and estimate a 19% chance of extinction before 2100."

The risks they came up with are molecular nanotech weapons, nanotech accidents, superintelligent AI, wars, nuclear wars, nuclear terrorism, engineered pandemics, and natural pandemics. (I'm surprised global warming didn't make the list in 2008.)

Here's the dealy-o, though: what actually is the risk of human extinction before 2100? 19%? (1 in 5, really?) Their conservative 3%?

"Nanotech" currently is at most an OSHA problem. (Don't breathe in the microparticles!) The risk of "grey goo" is likely pretty damn low, given the history of the last 30 years of nanotechnology. (First thought on hearing of the possibility of nano-machines? "Ya mean, like proteins?")

Conventional wars don't actually kill that many people; populations tend to disperse too easily. I'm even given to understand that the effect of major wars is an increase in the rate of population growth. Nuclear weapons, on the other hand, are very, very bad...for cities. But they're unlikely to do anything noticeable to people in sub-Saharan Africa, South America, the Australian outback, or Mongolia.

Pandemics have been a problem before and they'll be a problem again, but I'll let somebody else describe the problems with an infectious agent capable of killing all of its hosts. Likewise, climate changes have been problems before and have led to bad outcomes. But killing everyone has never been on the table. As for AI, I'm more concerned with the AI that runs your car off the road because it's not actually able to perceive the lane markers.

Individually, each of those is bad. Each could kill billions of people and possibly lead to the collapse of civilizations; some of them have done so before. But complete extinction is incredibly unlikely, and "ending all life on Earth" is just silly.

But human extinction is an issue that will get attention. It'll sell newspapers. And anything beyond some minimal level of resources spent on it means fewer resources for other issues. Like, say, identifying and addressing actual problems with nanomaterials or wars or infectious agents.




I agree; humans are probably the hardest form of multicellular life to wipe out entirely. Not because we're hardier than cockroaches, or because we hibernate when it's cold.

There are just so many of us, and every last one of us would be hell-bent on survival. At this point in our technology and understanding, I bet there could be survivors of an extinction-level event equaling the K-T extinction that wiped out the dinosaurs.

People already have bomb shelters that can last them months, and growing food does not require sunlight, only dirt, water, lamps, and electricity. Nuclear winter can't stop the flow of electricity; survivors of any event can jerry-rig surviving wind turbines, nuclear power, geothermal, or hydro to create life-sustaining electricity.

Would the survivors be smart enough to accomplish such feats? Of course, assuming they could read. The internet may collapse, but the abundance of printed material, even doomsday vaults containing encyclopedias, can provide entertainment and education through the long post-apocalyptic nights.

These things exist: https://www.cbsnews.com/pictures/amazing-doomsday-bunkers-of...

When people fret and worry about the fate of the human race, I'm not worried. People will adapt and figure it out like they always have. The only thing people need to worry about is their own survival in harsh times.


> survivors of any event can jerry-rig surviving wind turbines, nuclear power, geothermal, or hydro

Excepting nuclear, where do you think the energy for those things comes from?



