One person can make one command decision and make #2 a reality. That's all it takes. No matter how many people drool over AGI, the bandwidth and processing just aren't there. We barely have control systems as intelligent as a bacterium.
To my knowledge, this is correct. In theory a command from e.g. the President is authoritative and should legally be carried out. In practice, you'd require every person along the chain to implicitly agree to effectively ending the world. Even at the bottom of the chain, there are likely multi-key requirements so even there you don't have just one person with his hand on the trigger.
This makes buggy software one of the biggest threats to humanity's survival. If a software glitch on one side or the other suggests nuclear weapons are incoming, they will almost certainly retaliate in kind. And once the bombs are away, the other side will definitely respond, and that's pretty much gg humanity. In fact this exact thing very nearly happened in 1983. Soviet instrumentation indicated that the US had attacked the USSR with multiple nuclear weapons. The officer on duty, Stanislav Petrov, [correctly] judged that what was happening was an instrumentation error, and disobeyed protocol - refusing to escalate it to his superiors, who very well could have ordered retaliatory strikes.
For some context, this happened at one of the highest points of tension between the US and the USSR. This was in the era of the 'Star Wars' program, which was to be a US missile defense shield, leading the Soviet Union to become paranoid that the US intended a preemptive nuclear strike. Those tensions had already led to the downing of a 747 that had inadvertently veered into Soviet territory - while carrying a US Congressman, at that. Given this context it's highly probable that his superiors would have ordered the retaliation. So he's one of very few people who can reasonably be said to have saved the world.
In America this is not actually a useful safeguard. America's ICBM silos have technicians sitting in front of consoles, and basically every day they get a command of keywords/numbers to punch into the computers. The technicians never know whether the numbers are a test or a real launch. This is purposely designed to prevent a low-level button pusher from blocking a US launch.
If the president wants to launch a nuke, the only thing that stops them is their cabinet simply not relaying the order. This happened several times when Nixon got shitfaced.
Like nearly everything in American government, this is just a "norm". The only thing preventing an insane American president from doing pretty much anything, at least for a while, is "norms" and customs. Nixon wanted to nuke pretty much all of Vietnam pretty much all the time, and it was only his cabinet that prevented that. But there is no precedent, or legal mechanism, that actually forces that to be the case. So what happens when you get a president who has surrounded himself with a cabinet that explicitly thinks Nixon was right - that Nixon had the right to burgle the Watergate hotel because the president should be able to do anything (this is a real faction in US politics right now) - and who has openly put together a plan to fire damn near everyone on day one and replace them with explicit yes-men?
This seems to run contrary to basically everything I've read on this topic. For instance, during Trump's presidency there was a tremendous amount of fearmongering about him possibly wanting to nuke North Korea. This led to numerous generals and others opining on whether they would or would not obey such an order. Here's some rando link. [1] The sort of system you're describing would enable the President to unilaterally carry out illegal orders independent of the military. I just find it difficult to imagine that they would concede that, not only because of the power concession, but because of the ethical or even constitutional implications. Notably, officers don't swear an oath to their commanding officer or anything of the sort - they swear it to the Constitution.
As for Nixon, he was literally playing the madman as part of a strategic goal [2] (arguably the exact same thing North Korea is doing in modern times), but privately was the one who ultimately scrapped a plan that involved nuking Vietnam. [3] A nice quote from the Madman article (from Nixon), "I call it the Madman Theory, Bob. I want the North Vietnamese to believe I've reached the point where I might do anything to stop the war. We'll just slip the word to them that, "for God's sake, you know Nixon is obsessed about communism. We can't restrain him when he's angry—and he has his hand on the nuclear button" and Ho Chi Minh himself will be in Paris in two days begging for peace."
> Notably officers don't swear an oath to their commanding officer or anything of the sort - they swear it to the Constitution.
The Project 2025 agenda targets the Pentagon as being "woke" and full of "Marxists" and will likely result in a purge of generals and officers. Anyone who has ever signed off on anything they can call a diversity program will get ousted on day 1. The vacancies will then be filled with Trump supporters.
Tactical nukes are sometimes under the command and potential authorization of battlefield commanders, who in theory could operate without the authorization of a president. Consider the risk of this happening in the context of a border war between two nuclear-armed states with low spillover risk to the rest of the world - e.g. Pakistan and India.
The commander cannot initiate it himself. No single person in the United States, that I know of (no access to classified info) can physically do it by themselves. The commander would have to order someone else to do it at least. This is a huge component of the whole doctrine as far as I'm aware.
There will always be billions in losses from something, especially when global wealth is estimated to be around 450 trillion. The Dust Bowl was a disastrous incident of climate change, yet it didn't occur because of CO2. The damage caused by the California wildfires is just as much a function of expanding development into forests, immediately putting out small fires (leading to the accumulation of fuel over time), and opposition to controlled burns. Property damage and deaths from flooding are mostly a function of population growth in the developing world without sufficient corresponding investment in water management infrastructure.
Regarding the solution, stratospheric aerosol injection would be the most immediate and effective response to rising temperatures. It's been estimated that the increase in CO2 to date has a radiative forcing effect of about 2 watts per square meter, compared to the total solar irradiance of 1361 W/m2. If CO2 levels doubled to 800 ppm, it's estimated this would have a radiative forcing effect of 6 W/m2. That scenario would require mitigation strategies like stratospheric aerosol injection to reduce solar irradiance by about 0.4%. In the context of plant growth, this reduction in sunlight would be negligible given that photosynthesis is only 1 to 2% efficient. If anything, we should see plant growth accelerate by about 10 to 50% due to the CO2 fertilization effect at 800 ppm.
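A quick sanity check of that 0.4% figure (a minimal sketch, using the comment's own numbers and its simple ratio of forcing to total irradiance; a more careful treatment would average incoming sunlight over the sphere):

```python
# Figures from the comment above (estimates, not authoritative):
forcing_800ppm = 6.0       # W/m^2, estimated radiative forcing at 800 ppm CO2
solar_irradiance = 1361.0  # W/m^2, total solar irradiance

# Fraction of sunlight an aerosol layer would need to block to cancel the forcing
dimming = forcing_800ppm / solar_irradiance
print(f"required dimming: {dimming:.2%}")  # ~0.44%, i.e. "about 0.4%"
```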
I'd be curious to see the science on enhanced plant growth; my understanding is that the benefits of increased CO2 fade very quickly. I believe we can already see plant pores evolving to be smaller as they were in previous times when the Earth had greater CO2 levels.
About the only thing that gives me hope is SRM, but it's a half-assed solution at best. A world with 800 ppm CO2 and a sun dimmed via aerosols is not the Earth that I was born to; it is probably not possible to fully understand the effects.
There is no perfect solution. One of the benefits of stratospheric aerosols is that they only stay suspended for a few years, reducing the risk of long-term effects. They are also an exceptionally cheap strategy, on the order of a few billion a year to cool the entire Earth. Compare that with the many trillions needed just to get to carbon neutral, while significantly reducing economic growth and living standards.
The world we were born into will not exist in any scenario. As the saying goes, you never step in the same river twice. CO2 emissions show no sign of decreasing, especially as China, India and Africa continue developing. Furthermore, we are on the cusp of AGI within the next decade or so, which will radically change reality far more than the Earth possibly getting a bit warmer in 100 years or so. The IPCC does not even predict major cataclysm, and expects sea level rise of only a couple feet in the worst-case scenario.
Seems plausible at least, and you are of course correct about the world; the CO2 level of the atmosphere when I was born was well below 400 and we'll not see that again as long as I live. That's what I tell those who respond viscerally to the idea of messing about with the atmosphere purposefully - we are already altering it regardless.
And then we have to do it again, and again, and again; once you grab that tiger's tail, there is no letting go. In the meantime, sense of urgency abated, we would most likely keep on burning fossil fuels and putting more CO2 into the atmosphere, locking ourselves further into the loop.
It is a terrible idea which will almost certainly happen.
Entropy is inexorable and must always be resisted. Humans need to drink water and eat food, again, and again, and again. Unimaginable amounts of time, effort, and resources are spent every day maintaining civilization. If we build out solar panels, we will have to replace them in 20 years. There is no free lunch.
We don't know how bad the climate will be at 800 ppm. 300 million years ago, during the Carboniferous era, there were vast forests and very high CO2 levels, around 1000 to 5000 ppm. If the Earth gets too hot, we will simply do stratospheric aerosol injection. If climate change turns out to be overhyped, then we get the CO2 fertilization boost for free - win-win.
Regardless, the real solution is next-level energy production from advanced fission, deep geothermal, and ultimately fusion power. With a vast surplus of energy we could do wildly impractical things like filter sea water for gold, and of course extract extremely diffuse gases from the atmosphere. Renewable energies have very low energy density, and are more akin to farming from a physics perspective. They make sense in certain situations, but are not reliable for powering an advanced industrial civilization.
I also suspect people's intuitions are out of perspective in terms of time scales. It seems likely AGI will arise within the next 20 years. Climate change is nothing compared to the singularity and rise of superintelligence.
Calcium carbonate aerosols are proposed as a good alternative to sulfur dioxide since they are basic and do not react with ozone. Calcium carbonate is used as an additive to soils to reduce acidity and is used by organisms to construct shells and skeletons. Furthermore, these aerosols would be extremely diffuse. It's estimated that we'd need 5 million tons of sulfur dioxide per year to reduce temperatures by 1 degree centigrade. Assuming we need 10 million tons of calcium carbonate that eventually descends to the Earth, that would come out to 0.0196 g per square meter. Effectively negligible, and it may even be beneficial for areas that have slightly acidic rain.
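The deposition figure is simple arithmetic (a sketch; the assumption that the aerosol settles out uniformly over the whole planet is mine):

```python
# 10 million tons of CaCO3 per year, assumed to settle out uniformly
# over the Earth's entire surface.
tons_per_year = 10e6
grams_per_ton = 1e6            # metric tons to grams
earth_surface_m2 = 5.1e14      # total surface area of the Earth

deposition = tons_per_year * grams_per_ton / earth_surface_m2
print(f"{deposition:.4f} g/m^2/yr")  # ~0.0196 g per square meter per year
```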
> If anything we should see significantly accelerated plant growth by about 10 to 50%
If that’s true, there should be global evidence of this happening already - are you aware of any publications confirming this effect?
You say accelerated growth like it’d be a good thing, but I don’t think we should wish for it or expect it to mitigate any damage… getting to the point of doubled CO2 would probably be extremely bad. If CO2 levels doubled, we’d for sure lose a significant amount of our ice sheets, and some coastal cities along with them. Some of that is already happening anyway, but tripling the radiative effect will make it go much faster and much farther. As it stands, any increase in plant growth isn’t in any way making up for the rate at which we’re cutting down and paving over all the plants, and it’s not clear that 10 to 50% accelerated plant growth would make up for it either… even assuming that accelerated plant growth actually leads to more plants and a greater volume of oxygen cycled, and not just earlier blooms.
"From a quarter to half of Earth’s vegetated lands has shown significant greening over the last 35 years largely due to rising levels of atmospheric carbon dioxide, according to a new study published in the journal Nature Climate Change on April 25."
Interesting observation: trillions of dollars have been spent to date on emission reduction, with many measures incurring a consequent lowering of quality of life. Some would say this is inevitable, but better to survive than the alternative. To date (as far as I can tell) there has been no discernible reduction in the rate of increase in CO2 as measured at the Mauna Loa station. When will this paramount metric start to reflect the measures taken? Any guesses?
"The beneficial impacts of carbon dioxide on plants may also be limited, said co-author Dr. Philippe Ciais, associate director of the Laboratory of Climate and Environmental Sciences, Gif-suv-Yvette, France. “Studies have shown that plants acclimatize, or adjust, to rising carbon dioxide concentration and the fertilization effect diminishes over time."
The beneficial effects of increased CO2 will also be more than counteracted by other problems as the world temperature rises, such as greater heat stress and lower soil moisture.
> Trillions of dollars have been spent to date on emission reduction with many incurring a consequent lowering of quality of life
It's cheaper to avoid the warming in the first place rather than deal with the consequences, so complaining about the cost is a false economy.
> To date (as far as I can tell) there has been no discernible reduction in the rate of increase in CO2 as measured at Mauna Loa station
It's possible that we've reached peak CO2 emissions:
Are you suggesting that trying to reduce CO2 isn’t worth it? What alternative are you proposing? What lowering of quality of life are you referring to? How do you know what Mauna Loa would have measured without the efforts to date? And what is the expected delay between CO2 reduction and effect?
Seems like we're moving directly and rapidly towards a real solution. Perhaps not rapidly enough, but the only way to change that is billions more in "losses"
The idea that we are going to technology our way out of climate change is hilariously naive. That would require us to remove more CO2 from the atmosphere each year than we add, which is laughable to imagine achieving within the required time frame of the next 15 years or so.
It would have to be both cultural/political and technological on that time frame. With current CO2 removal technologies, we could completely remove all of the CO2 we currently add to the atmosphere, without even any reduction in our use or release of CO2. The cost of such an effort with current technology would be slightly less than, for example, the cost of World War II (adjusted for inflation, etc.).
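A back-of-envelope version of that comparison (a sketch only; the per-tonne cost is an optimistic direct-air-capture estimate I am assuming, and the WWII figure is the commonly cited inflation-adjusted US spending):

```python
# Assumed figures, for illustration only:
annual_emissions_t = 37e9   # ~37 billion tonnes of CO2 emitted globally per year
cost_per_tonne = 100.0      # USD/tonne, optimistic direct-air-capture cost
wwii_cost = 4e12            # USD, rough inflation-adjusted US WWII spending

annual_removal_cost = annual_emissions_t * cost_per_tonne
print(f"${annual_removal_cost/1e12:.1f} trillion/yr vs WWII ~${wwii_cost/1e12:.0f} trillion")
```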
If people cared enough to make huge sacrifices, we could absolutely do it right now. Now the technology to make it effortless, so it happens automatically without people doing anything expensive or difficult? E.g. pulling carbon from the air to make things becomes cheaper and superior to pulling oil from the ground? Indeed, we are a ways from that.
There are likely technologies that could mitigate the effects while we work on a solution. Geoengineering could help reduce the temperature of the planet, although that has its own problems.
There are all kinds of tipping points that are plausible in the next decade that would lead to accelerating warming: permafrost melting releasing large amounts of methane, the Gulf Stream redirecting, extended drought in climate-critical regions, large ice shelves collapsing, etc. It's gradual until it's not.
The problem is that we are adding too much CO2 to the air. In fact, we have already added too much CO2 to the air, to the point that we will have noticeable bad effects within the next few decades. More importantly, all this stuff has huge inertia, so if we stopped producing CO2 right this second, we would STILL blow past CO2 goals.
If we EVER want to get back to today's climate - you know, "normal" - it requires removing gigatons of CO2 from the air. From a pure chemistry, energy-of-reaction standpoint, that will be enormously expensive. There is zero technology that can change the physical fact that combining CO2 with something that will sequester it takes more energy than we ever got out of it.
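To put a rough floor under that claim, take "sequester" to mean fully reversing combustion (the enthalpy is a textbook value; the emissions and electricity figures are assumptions of mine):

```python
# Burning carbon to CO2 releases ~394 kJ/mol, so splitting CO2 back apart
# costs at least that much - before any process inefficiencies.
kj_per_mol = 394.0
molar_mass_kg = 0.044                        # kg per mol of CO2
j_per_kg = kj_per_mol * 1e3 / molar_mass_kg  # ~9 MJ per kg of CO2

annual_co2_kg = 37e12                        # ~37 Gt CO2 per year (assumed)
min_energy = annual_co2_kg * j_per_kg        # joules, thermodynamic floor

world_electricity = 30_000 * 3.6e15          # ~30,000 TWh/yr in joules (assumed)
print(f"~{min_energy / world_electricity:.1f}x annual world electricity output")
```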
Petroenergy is explicitly a loan. We have to pay it back.
I understand that, but we may have to adapt to a new normal with noticeable bad effects. That's where technology can help. I'm not sure we can remove that much CO2. Maybe if fusion becomes a thing, it would give us enough cheap energy.
Not every green advocate wants this, but I think a few would be really upset if a technical solution was found that didn't involve the complete restructuring of society in the name of "fairness".
Number 3 is literally an article of faith. And like all matters of faith, not only is there no evidence for it, the evidence that we do have runs counter to the belief.
Let’s be honest here. Climate change was a completely predictable problem with known causes and solutions. It was detected early and alarms were raised in a timely manner. Literally nothing happened. In fact, even today, nothing happens. Why? Money. Because the billionaires would rather have us all die than not buy a tenth superyacht. (NB: Go Team Orca!)
The problem was not, and is not, science. It’s money and who has it. Now those same folks and their apologists promote quite laughable ideas like carbon capture and geoengineering. Could we have gotten cheaper lithium batteries earlier with more investment? Probably, but what did we get instead? Fracking. Literally the opposite of what was needed.
Science is not magic. Even if it were, the nature of the problem means it gets exponentially more difficult with every day that passes. You managed to repeal the second law of thermodynamics and somehow got the carbon pollution industry to spend enough money to make it viable? Great! Oh, oops! The permafrost melted and now 10x-more-potent methane has been released, so… yeah. Too late!
It’s just Pollyanna talk that is actively harmful, as it not only diverts resources from mitigation and migration to wasteful handouts but also gives cover for just staying the course.
But hey. Maybe I’m just a doomer, and those Titanic passengers should have just scienced up more lifeboats instead of drowning. Maybe all of this just… “needs more study”.
I think the implication is that the nuclear war would exist between sovereign rulers, not necessarily between the physical spaces themselves. Wipe out a capital city in an attempt to behead a sovereign government so that their land is available for take over.
Apparently, though, the bombs that landed on Hiroshima and Nagasaki did not create much fallout, making the space livable relatively quickly.
It seems less destructive long term than a nuclear reactor meltdown which may be a tempting but misleading proxy.
A very real danger is a nuclear power that has a deranged person in charge. They know they don't have much time left, they don't care what happens after they are gone, they decide to fire off all their nukes.
Chernobyl had tonnes of radioactive material; a nuclear bomb has much less, measured in kilograms. After Chernobyl blew its lid off, the reactor core lay open. Witnesses said they could see a beam of light going into the sky - the radiation ionizing the air. A helicopter that was trying to drop sand onto the reactor accidentally flew too close over it, giving the pilot a lethal dose.
Maybe not "accidentally". That pilot may have known the danger, and done it anyway. I seem to recall reading that he did, but I can't point you to a source.
Regardless of how realistic you think AGI risk is, it’s not marketing hype - it has been a human fear, and central in sci-fi, for a long time. Movies like 2001, Terminator, and The Matrix were hugely popular long before AI was profitable. Arguably, ancient myths about golems and genies are essentially the same fear and concept. AI companies are afraid of public fear over AI, as it might lead to regulating them out of existence... They are actively creating marketing hype in the opposite direction, to convince people that AI is safe and useful.
Fear of AGI is very real. But our proximity to AGI is marketing hype. In terms of existential crisis #2 and #3 are serious and already worth investing in mitigation. But #1 (AGI) is science fiction.
In 2022, the median ML researcher surveyed thought that there is a 5% or 10% chance of AI leading to "human extinction or similarly permanent and severe disempowerment of the human species," depending on how the question was asked.
Fiction is much more than just entertainment... science fiction has been pivotal in both shaping and preparing our society for changes caused by technology. Much of the best science fiction has been so prescient that it's hard for modern audiences to even understand that these things were written before the world was like this. When I share a lot of old sci-fi with my young son, he finds it unremarkable, as they seem to simply be mundane stories about our present reality. Authors like Vernor Vinge wrote about AGI risk in stories because they were personally worried about it, and trying to share something they felt was important with others.
You can easily disagree with the warnings and fears shared in particular sci-fi, but to dismiss fiction as categorically irrelevant to our reality is just ignorant.
There is some important history here your comments suggest you are unaware of. One reason AI companies talk so much about AI risk despite it being bad for their bottom line isn't marketing hype, it's because many people in those companies are genuinely afraid. One could argue that this is because many of them have been exposed to the rationalist community- which could be (uncharitably?) seen as a doomsday cult obsessed with AGI risk. The founders of many AI startups including OpenAI were heavily influenced and active in this community.
I neither said nor implied that AGI being firmly in the realm of science fiction is a bad thing. There's nothing wrong with thinking about the implications, and they would be significant. It's just such a stretch from where we currently are that it's not worth investing significant resources in mitigation. The threat from AI that is present and real is janky systems being depended upon for life-and-death decisions. Authors like Vernor Vinge (and I love his stories) depend on bending the rules of both physics and computation: physics, in that computation is somehow fundamentally different in different areas of the universe; and computation, in that known computability limits no longer apply.
It's easy to re-imagine science fiction as being closer to reality after the fact.
AI safety is an issue whether AGI is 10 or 100 years away.
Out of curiosity, could you recommend any Science Fiction written before the nuclear age that talks about nuclear proliferation? Or to take something from our present world, what Science Fiction stories would you recommend to talk about the effect of drone tech on trade and shipping?
That wasn't what I was looking for, but I appreciate the honest response. UniverseHacker claims "science fiction has been pivotal in both shaping and preparing our society for changes caused by technology" and I want to understand that claim more. Let me explain:
By my reckoning, there are two driving factors in how the understanding and control of nuclear reactions affects our world: first, the huge step change in energy density over chemical reactions; second, the fact that you can cause explosive chain reactions. If there had been a smooth gradient of progression towards the energy density of Fat Man, the world would have been a very different place. If the science had panned out such that we couldn't cause a nuclear chain reaction, the bomb/energy-source duality would never have emerged.
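For scale, that step change is roughly seven orders of magnitude (a minimal sketch; textbook energy-density values, rounded):

```python
tnt = 4.2e6    # J/kg released by TNT, a benchmark chemical explosive
u235 = 8.2e13  # J/kg released by complete fission of U-235

print(f"~{u235 / tnt:.0e}x")  # roughly 2e7: a factor of tens of millions
```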
I'm curious if there is any fiction out there that contended with either of these factors prior to us realizing that the disparities between chemical and nuclear reactions are what they are. I've not read Alas, Babylon, and I'm not trying to deny that it had important influence, but I'm pretty sure we knew that it was possible to end the world via nuclear conflict prior to 1959. By your estimation, how did the book shape or prepare our society for the changes caused by nuclear technology? We're on the other side of the cold war and the conflict between capitalism and communism isn't a thing anymore, right? If we die in a nuclear fire, it will not be anything like, to quote wikipedia, "The explosion is mistaken for a large-scale US air assault on the military facility and, by the following day, the Soviet Union retaliates with its planned full-scale nuclear strike against the United States and its allies."
I'm not trying to be snarky, I'm trying to understand what I'm missing.
----
To me these claims of SF's prescience fall short, especially when it comes to the dangers of AGI. As an example, I Have No Mouth, and I Must Scream, as wonderful as it is, doesn't have much to say about preparing our society for AGI. AM will not emerge from GPT-4o, and I'm pretty sure that if AGI does happen, it won't be through a supercomputer built to coordinate militaries. I enjoy reading SF, but I think its wider influence is as much miss as it is hit, and I think we can plot better courses when we treat SF as mostly entertainment. People who develop technology would be better served studying the history of technological development.
The unintended consequences of new powerful weapons falling into the wrong hands and new trade technology are very old themes in fiction even if the specific technical details are not.
I noticed you edited your initial response for tone before I could respond. Regardless, I'll respond to your initial statement to say that no, I'm not trying to make a snarky rebuttal. I'm trying to understand your statements better because my own personal experience and opinions disagree with what you claim, and I'd like to integrate your perspective. I don't give a shit about internet points. You specifically mention Vernor Vinge, whom I haven't yet read. My understanding is that his work depends on there being an intelligence explosion with the development of AGI, which seems incredibly unlikely given how AI technology has developed thus far. I think we've seen amazing growth in capability but nothing close to an intelligence explosion as I've seen portrayed in SF. As someone working in this space, what lessons would I learn by reading Vinge's work as opposed to spending that time reading more about the history and happenings of the real world of computer security?
Now, to respond to your post-edit: I feel like my disconnect with your positions in this thread is around the importance of fiction. I think fiction is a minor muscle in the body that drives technology, if you'll allow the metaphor, whereas you claim "science fiction has been pivotal in both shaping and preparing our society for changes caused by technology." Perhaps you could explain what you mean more concretely? Another confusion I have: the claims of prescience and of the themes being very old seem to be at odds. If I develop an understanding of the dangers of runaway technology through a story about genies, why do I need more modern work? What if I instead gain these understandings through actual history? The real world has so many more wrinkles than a story, and so many more lessons to give about technological development, that I balk at words like pivotal. There is no pivot point, only hundreds and hundreds of articulations.
Anyways, I've written enough for now, and I don't expect you to follow up.
My apologies for the covert edit making you rewrite your reply.
Why is sci-fi useful when we could just look at technology history? It's the range of imagination involved- exploring big possibilities and unintended consequences that are unlike anything we've experienced before. A lot of it is also noticing how historical patterns could play out in a new context, and sharing those ideas.
I am reluctant to list out prescience examples, because on one hand they seem so numerous I couldn't do them justice with just a few, and on the other hand people will disagree with the importance of them.
Overall, I feel like it is a simple fact that lots of tech people and inventors love sci-fi, grow up reading it, and then it influences what they think about and choose to work on. I suspect we wouldn't have had rockets or space travel when we did if we didn't have so many Victorian-era books imagining space travel. The numerous billionaire pet-project space companies we have now seem a lot like people trying to live out fantasies from reading Heinlein. Even little things like the devices we have - my Kindle looks and acts exactly like the e-book readers in The Next Generation, for example.
When my son sees old Star Trek episodes very little is remarkable to him except the big things like warp drive and transporters. The "advanced tech" they use day to day is all stuff that is a normal part of his life- 3D printers, tablets, ebook readers, video conferencing, mobile communicators/phones, etc. Same with the "Back to the Future" movies- they travel to a future filled with various electronic gadgets that are very similar to those we now have, but didn't exist at the time... and my son doesn't see how remarkable that seemed to me as a kid, when those didn't exist.
As a researcher in biotech, some of my current research is definitely inspired by sci-fi, especially biotech and nanotech from Neal Stephenson books.
Sorry for the slow response, and sorry for responding so briefly, but I just wanted to say thank you for your earnest and thoughtful response. You rock! I don't have the free time to properly articulate my rebuttal, but I will say we are on the same page in many regards. I think the power of story is very important in the process of technological development, but I think 1) all stories, not just sci-fi, are important for developers, and 2) stories are the least important and most pre-determined aspect of the process. Humans tell and learn from stories, and that has purpose and use everywhere, but stories are also easily misinterpreted and used as inspiration for badness. Most of the billionaire pet projects fit under this umbrella. IMO great sci-fi inspiration should be the sprinkles on top of a layer cake of development prowess, experience, thoughtfulness, and continued education in the craft.
Where would I go if I wanted to learn more deeply than Wikipedia about how Hindu mythology talks about nuclear proliferation? Any directions I should search?
Spiders have been a human fear for two million years, but I'm not too worried about them causing our downfall as a species. Just because we're afraid of AI doesn't mean it's actually worth fearing.
Yes, even if something is worth fearing, it's not simply because people are afraid of it already. Did something I write make you think I was arguing otherwise?
As an aside, I would argue that this deep rooted human fear of AI is related to the general ancient fear of humans being wiped out by other bands of more technologically advanced humans. Which for most of human history has been the single biggest danger by far.
People also have deep instinctual fears of things like spiders and snakes for the same reason- they were serious hazards in the environment where humans evolved.
One could say we have evolved these instincts to fear them, because they are worth fearing... not the other way around.
> Did something I write make you think I was arguing otherwise?
At the risk of having boring meta-commentary about comments, I feel like I see people often forgetting that their comments are in the context of a larger conversation and are going to be read as responses to the parent comment that they are, well, responding to. And then they get confused when their comments are interpreted in that context, rather than as completely standalone statements. So let's backtrack:
(A) AGI is a threat to human civilization. (B) Actually, its risk is marketing hype. (You) It's not marketing hype. Our fear is deep-seated.
It's not that weird to interpret the 3rd comment as disagreeing that the fear is just hype. When in fact, you were disagreeing only with the word "marketing". It's "innate" hype, not "marketing" hype. That's a much -- for lack of a better word -- weaker comment, because it is nitpicking about a minor word choice ("it's marketing hype") rather than the sentiment of the comment ("it's just hype"). So now you're asking "why did you assume my comment was interesting, and not just a little nitpick!?"
> One could say we have evolved these instincts to fear them, because they are worth fearing
But now you're back to saying AI is worth fearing, right? Isn't that what you're telling me I was wrong to assume you're saying? This feels like a needlessly difficult conversation.
All that aside, your additions to the conversation are interesting. I actually really like your take on it being a fear of technologically superior tribes. It just doesn't feel like it needs to be quite so adversarial. You established an adversarial attitude with "it's not marketing hype".
Technologically superior tribes always needed land. Their technology was advanced, but not in a way in which land is rendered useless, i.e. land price equals to zero.
Two reasons mainly for land price not equating to zero. First: buildings could not be extended upwards. Second: chloroplasts could not be produced just with air and sun.
These two problems will both be solved in less than 10 years. Land prices worldwide will collapse to zero, or near zero. Buildings will be built using a flexible material, very strong and lightweight, e.g. graphene. Food will be produced using genetically modified spirulina, along with pharmaceutical substances and narcotics.
Also, just to reply to a parent comment: The Matrix was a philosophy movie, Terminator was a comedy with some scary parts, and as for 2001, I didn't understand any of it. Was I supposed to be afraid of something in 2001?
The seriousness of AGI risk is a big conversation our society is having right now, and I am not interested in laying out my position on it in detail on here... only that dismissing the fear of AGI as marketing hype is a straw-man argument - and ignores all of the history, discussions, and arguments on both sides of this debate going back decades, or arguably even thousands of years.
Sure, it is fair to suspect that, because I am making that point, that I might also think AGI is a real threat to human civilization. The debate is so widespread right now, with mostly aggressive and dismissive bad faith arguments on both sides, that it sounds pretty exhausting and pointless to discuss.
I share your concern, and objection to making discussions needlessly adversarial. In general, I've seen a huge rise recently in internet discussions mostly reducing to people yelling at each other that the other person is a narcissist. Even in niche technical and hobby forums, it is becoming the norm for disagreements. I feel it is infecting me as well, and it might be time to take a break from the internet and talk to people I actually like in real life more.
None of the movies and stories you mentioned were about the "rise of AGI". The closest you can get might be The Matrix via Animatrix? You can go look up what the authors meant by their pieces, there's plenty of analysis of each of those stories. In addition I would love a citation on "the rise of AGI" being a fear in sci-fi for a long time, I've enjoyed quite a bit of science fiction literature and I can only think of a handful that even touched on the idea.
Skynet??!!?! It seems so obvious that I am having a hard time understanding your objection... it would feel condescending to try to explain that Terminator is about AGI risk.
The modern conceptualization of AI existential risk comes largely from Vernor Vinge, in particular his essay "The Coming Technological Singularity" but it is also a central theme in most of his works, especially the Zones of Thought series.
I think it's about as much about AI risk as Transformers is.
In terms of Skynet, I think you'd be referring to the third one? But that one is about drone strikes, and that's what we do today without AGI - we send out invincible killer robots to wipe out villages from above because all males above the age of 16 are enemy combatants. The end of the movie could just as easily have been that they find a person behind the Skynet controls killing everyone.
The overall plot of Terminator is that an AGI called Skynet becomes self-aware and unstoppable (in the film's canon, in 1997), and humanity's only hope of survival is to use time travel to destroy it before this happens. I’ll admit this isn’t very well explained to the audience, and the focus is mostly on Arnold blowing things up.
I don't think anyone is confused about Terminator's plot, but people in this thread are definitely confused about what theme means and how to do literary analysis (not surprising given the context of this website).
I understood "rise of AGI" as if preceded by "uncontrolled": a runaway intelligence that ultimately turns against its creators. In that sense, yes, those movies do include this theme.
Your comment makes sense if you intend "rise of AGI" as just growth. But then why should it be a concern?
You have massively confused "AGI as a plot point" with "what the literary work is about". For example, Neuromancer is about trauma and identity. If I'm being charitable I could give you Erewhon, but that's only because a major theme is anthropocentrism. The vast majority of the book is a criticism of Victorian society.
Ellison's analysis of his own work is pretty well known and while not specifically about AGI is definitely about technology misuse, so you win there.
This is an absurd level of nitpicking... the phrase "is about" means both of those things in common speech, and everyone can tell which you mean depending on context. Language is defined by regular people through its use and context; scholars only describe it after the fact - to their extreme dismay. You might be technically correct within the jargon of, e.g., an academic literature course, but you aren't correct to insist the words mean that unless you are in that context, with that audience, where it has that meaning.
Moreover, I don't think you are correct, even in the literary sense, when saying, for example, what Neuromancer is about. I don't really care what an author says it's about, most artists are channeling something unconscious and often have very little conscious understanding of what their work is actually about. I've read plenty of Gibson books. The guy is an absolute luddite, and writes on a typewriter. He is terrified of how dangerous and dehumanizing technology is, and all of his books are "about" an all-consuming fear of technology and the changes it causes.
I don't really care about your opinions on academic literary analysis, it's OK to be wrong and have your own thoughts on the matter, but prepare to be confused when people don't agree that the Transformer movies are about the dangers of aliens turning into vehicles.
For someone who doesn't care what Gibson thinks of his own artistic works you sure do have a strong opinion on Gibson's artistic work.
You are misrepresenting people who intuitively understand that a phrase can have different meanings depending on context as being fundamentally confused and muddling the two concepts.
Nobody would say Transformers is "about the dangers of aliens turning into vehicles", but people would say it is "about aliens that can turn into vehicles." They might also say something like it's "about the struggle to live day to day without being sure if everyday things are truly as they seem" - if that were what it's about. I'm not actually familiar with Transformers.
As an academic and a human (both!) I think it is important to remember how to keep the jargon from whatever you study distinct in your mind from common usage, so you can still communicate with people that didn't study whatever you studied.
You're welcome to make assumptions about what I'm doing, but at the end of the day you have to contend with the fact that I read a comment at face value, in the context, and then multiple people tried to back that original comment up. Maybe if you were the original author your opinion on what they were "trying to say" would matter, but it doesn't.
If not Neuromancer (although I feel differently) then Queen of Angels (Greg Bear). It's the most fascinating take on AI I've ever read - the entanglement with the themes of justice and punishment is thought-provoking.
The Turing Option (Harry Harrison, Marvin Minsky). I can't stress strongly enough, do not read this book. It's terrible. But it does exist.
It's about an AI struggling to become conscious! Alarms go off when it uses the word "I"! The whole point of that subplot is why that AI can't achieve consciousness, but the one sent to Alpha Centauri[?] can. It writes essays meditating on its circumstances and the way humans have treated it and its sibling; it writes an essay exploring whether punishment of others is a gateway to self-awareness...
You want to tell me the theme of that thread is human identity? It's our responsibility to our creations. It's what it means to be sentient. It's the ethics of using a self-aware AI as a tool. (Amongst many other things.)
It contains an AI struggling to become human (to gain identity), if I remember correctly.
Either way, we're way off course from "it was a great fear of scifi that AGI would become a risk" to "a book with AGI as the vessel for metaphor about humanity exists".
Crypto is crunching random meaningless numbers. AI is crunching real-world data. They are not the same. But AI should not be viewed as a problem. The real problem is that some people don't want others to have the same stuff as they have. They want to work 1 hour while the other guy has to work 100. Interest on interest.
Unlike race, the fear of global warming is a social construct. Logically, if you are worried about humans, you should be worried about aging, cancer, heart disease. And if you are worried about the rest of nature, you should be worried about geology and human influence (of which climate is a minor part and self-limiting if significant).
I also think it's bewildering when people say things like that. Are they thinking that humans are incapable of causing climate change no matter what they do? Or that we are currently not in trouble and fine for decades/centuries to come?
I think you are <strike>crazy</strike> unreasonable if you think we are not fine for decades or centuries to come, climate-wise. Barring big volcanic eruptions.
And how is hypothetical global warming a threat? So let's assume it's happening. Let's assume some ice melts and the ocean level rises. Let's assume some coastal cities get flooded. Just move to higher places, eh? Why do you live on the coast anyway? What a weird thing.
I'd welcome global warming; I'm tired of paying $50/month every winter just to keep my house warm.
Your comment is so misguided I can’t tell if you’re joking, trolling, or serious. I don’t know what else to say other than please keep the following in mind: CO2 is transparent to incoming solar radiation, but opaque to the IR wavelengths at which Earth radiates heat back to space.
> #3 is happening and could cause #2 if wars for livable land and water resources break out.
Seems more likely to me that attempts to solve #3 will lead to #2.
If we actually attempted to curtail fossil fuel consumption as much as activists claim we should, there would be such a drop in agricultural and industrial productivity that there would be mass starvation, particularly in the developing world.
I don't think those countries would peacefully go along with that plan.
'Mother' Gaia will wipe all complex life from this planet within 1 to 1.5 billion years if her 'undisciplined' puppies don't find a way to leave this ball of dirt.
She's also got quite a lead with killing off 99.99999% of all species that ever lived, when compared to us.
Everyone hell-bent on leaving is actually distracted from the real, less immediately glamorous challenge, which is learning to sustain life and exercise restraint.
>Everyone hell-bent on leaving is actually distracted from the real, less immediately glamorous challenge, which is learning to sustain life and exercise restraint.
Unless you're willing to exterminate people who don't comply, the incentives are such that maintaining the status quo will give economic (and military) advantages.
Besides, over long time scales there's more to fear than a few degrees of climate change.
They’re probably referring to the earth getting enveloped by the sun in a billion or so years as it expands.
I still think you’re right though. The better plan is staying on earth. The trick is moving it outward as the habitable zone expands with the sun. Only have to convince humanity to sling a giant meteor just outside earth’s orbit every year for millions of years without messing up. What could go wrong?
Yeah in a billion years. We should figure out the common issues (tragedy of commons, etc) that will follow us wherever we go -- before we try to leave. A billion years is plenty of time to focus on that.
The earth hasn't always existed in its current state, or for that matter existed at all.
One day the earth will almost certainly cease to exist, and intelligence will have to find a new home of some kind. We have probably got a couple of billion years, though, if we are careful, and I have no idea what intelligence will evolve into over that timeframe.
Definitely not the solution. Ending capitalism for some other form of economy is the only way, in my opinion. Not that I don’t think people should be rewarded for the products and services they offer; it's just that the incentive to make cheap shit and sell an upgrade every year is definitely harmful to our earth. The problem I see is that I don’t know what type of economic solution would fit.
I think the tools to solve the challenges of waste, environmental damage etc. already exist within the framework of capitalism. Mostly they are just unpopular and seen by many as a government overreach.
1. taxes that force corporations and individuals to pay for the negative externalities / social costs of their actions
2. regulation (e.g. stop allowing planned obsolescence, mandate the right to repair, etc.)
3. government spending into R&D, incentives and subsidies for renewables etc.
Anyway, my point is that the issue is basically one of co-ordination and political will. It obviously doesn't help that many Americans (and Australians too for that matter, where I live) don't accept the basic facts of the situation (before we can even discuss solutions).
>Anyway, my point is that the issue is basically one of co-ordination and political will
Again, what does "political will" mean? What are you going to do to those that disagree? Lock them up? Exterminate them? What is the solution to force people to do your bidding, and has it ever worked?
I assume they mean convince enough people to implement the proposed policies that they can fix things through normal, legal means. "Forcing people to do your bidding" normally consists of winning elections and then implementing and enforcing legislation. This is how we force people who want to shoplift, cheat on their taxes, or murder to do our bidding. It doesn't work perfectly, but it only has to work well enough.
Theft is also a problem of political will. If people would just not steal, the problem of theft would be solved. For some definition of "solution", it is a solution. But not a useful or realistic one. It's just not going to happen in any reasonable timeframe. Only if human nature itself changes in some distant future. Same thing applies to environmental damage.
I have yet to hear of an economic model that humans have discovered which is better than free market capitalism.
The issue isn't the cheap junk; it's the demand for the cheap junk. Things would be far more sustainable if people focused on reducing their consumption habits, as producers would be run out of business.
The free market is probably the best we are going to get, but we need to address some of its known failure modes: externalities, monopolies, and the imbalance of power between employers and employees.
Externalities: any negative externality upon an involuntarily third party can become illegal via law. This can cover things like littering, servitude, etc.
Monopolies: the free market has yet to produce a monopoly that increases prices for consumers where there isn't a natural monopoly. The government deals with the allocation of naturally constrained resources such as radio frequencies.
Imbalance of power: just save more. Save enough so you can wake up comfortable with the idea that you were fired overnight. It dissolves the power imbalance when your boss needs you as much as you need the income.
Luckily we have 1 to 1.5 billion years to figure out how to survive outside of this ball of dirt... (cataclysmic asteroids and other similar events notwithstanding)
Climate change is analogous to a scenario we see all the time in nature: a species finds massive success, causing its population to spike; the spike degrades the supporting environment, and the species risks extinction.
We dodged this in the 20th century with the green revolution but the risk remains. We still haven’t figured out how to live within the limits of our environment, instead we continue to extract from it and degrade it. If we don’t figure this problem out then our species is done for.
Compared to this the other two are barely even risks.
#1 isn't happening anytime soon, nothing we have is on a path in that direction.
#2 people always worry about, but it's almost certainly not going to happen - insane dictators or no. Even in much worse scenarios we avoided it.
#3 Real, but slow. I think we'll see some coastal communities devastated over the course of the next ~50 years, but the rest of the world will adapt and/or sweep it under the rug as best they can. The bigger threat is economic. Companies will no doubt try to use the changing situation as an excuse to skyrocket prices and keep everyone broke.
#2 People aren't evaluating this risk correctly. If the war in Ukraine has taught me anything, it's that assertions like "Putin would never do..." are wrong.
#3 The real risk is the refugee crisis. The world is going to be split into two groups: refugees, and those trying to keep the refugees away. Large countries are going to experience crises from both internal and external migration. Think people from Mexico flooding into Texas, while people from Texas flood north into the Great Plains. It's going to happen slowly, then all at once.
There's a big difference between an invasion and launching nukes. There is no amount of reparations, apologies, victim blaming, empty promises, propaganda, etc. that can un-do a nuke. Putin wants to remain in power - which doesn't happen if there's no country left to govern.
Droughts are here now and will get worse in the coming decade. The amount of energy required to melt ice sheets (if that's what you're alluding to) is extremely large, and while it is certain they'll melt in a business-as-usual scenario (and possibly even if we stopped all emissions now...), it'll take hundreds if not thousands of years to get there.
Whereas extreme droughts are here today, now, as we speak.
On the topic of “Mother Gaia” being a bloodthirsty bitch (how we humans like to anthropomorphize rather than grasp things as they are), see Tiptree’s excellent story: https://en.m.wikipedia.org/wiki/The_Last_Flight_of_Dr._Ain. We may have come close to this being reality four years ago, perhaps.
You forgot the H5N1-in-raw-milk thing, with its 50% mortality rate. There is an article on Ars about raw milk drinkers actively seeking infected milk in the belief it will immunise them against H5N1. I’m not a biologist, but this feels like it could be very dangerous to everyone.
I understand why some people might not weight the probability and risk of AGI as highly as others, but to deny the risk entirely, or to act as if it's ridiculous to be concerned about such things, in my opinion is just an intellectually ignorant position to hold.
Obviously near-term AGI and climate change present us with near zero visible harm today, and therefore certain groups will dismiss both. This is despite clear trends plotting towards very clear risks; because these individuals are yet to see any negative impact with their own eyes, it's all too easy for them to dismiss the risk as some kind of hysteria. But these risks are real and rational because the trends are real and clear. We should take both seriously.
The nuclear risk is real too, but unlike AGI and climate change the risk isn't exponentially increasing with time. I think other weapons like bioweapons and drone weapons potentially fit that risk profile, however.
> deny the risk entirely, or to act as if it's ridiculous to be concerned about such things in my opinion is just an intellectually ignorant position to hold.
This would take someone actually articulating what this risk you refer to is. All I have to go off of is Terminator, which is patently ridiculous. People in power would never relinquish this power to a computer without being able to dictate whom it benefits.
Letting off a few nukes could slow climate change a bit right? I vaguely recall that all the dust kicked up by the hundreds of tests in the 50s and 60s had an effect.
Possibly, but you have to keep doing it. Kicking up dust into the atmosphere tends to have a significant short-term impact on the global climate (see e.g. volcanic winters: https://en.wikipedia.org/wiki/Volcanic_winter). Then the dust settles shortly after and everything returns to normal.
But volcanic eruptions that cause a winter like that tend to be quite a bit more powerful than your average nuclear weapon. More importantly, not all dust is created equal: sulphur-containing compounds tend to have the biggest cooling effect. One neat (in a terrifying sort of way) geo-engineering idea is continuously injecting large amounts of sulphur dioxide into the stratosphere.
It works best if the nukes hit densely populated areas, but I hope we manage to implement the solutions to climate change that don’t require killing billions.
The temperature problem can be solved relatively easily and cheaply by putting a bunch of reflective film all over the planet (mainly deserts).
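For scale, here is a rough sketch of the area involved (every number here is an assumption of mine, for illustration only):

```python
# Rough scale check of the reflective-film idea (all assumptions mine):
forcing = 6.0          # W/m^2 to cancel (the doubled-CO2 figure used upthread)
desert_flux = 250.0    # W/m^2, assumed average sunlight absorbed by dark desert
albedo_gain = 0.5      # assumed increase in reflectivity from laying film

earth_surface_m2 = 5.1e14
area = forcing * earth_surface_m2 / (desert_flux * albedo_gain)
print(f"~{area/1e12:.0f} trillion m^2 (~{area/earth_surface_m2:.0%} of Earth)")
```

Under those assumptions it comes out to roughly 2.5x the area of the Sahara, so "easily" depends heavily on the film and deployment costs.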
Getting CO2 out of the atmosphere is actually expensive. But we will have to do it. High concentrations of CO2 reduce IQ and make humans actually uncomfortable.
We are not at risk of runaway climate change. That's a common misconception and a convenient scare tactic.
We're also not at risk of runaway AGI any time soon. LLMs are a joke. Minds are computationally irreducible, and all our supercomputers can barely hold a baby fruit fly's connectome.
Unless you mean biological AGI. We're going to figure out genetic engineering far sooner than we'll get the hardware necessary for inorganic AGI. Once we master biology, we can genetically engineer organic super intelligences or, far sooner, superviruses whose genome can be uploaded into a nucleotide synthesizer in someone's garage. Then it's game over. Unless we've colonized Mars (which also helps with nuclear conflict).
Gaia isn't really a thing. There are only natural processes, chemistry, and physics, and there is definitely nothing motherly about them; they are ruthlessly efficient.
You don’t understand the enormous immediate risk of climate change if you think AGI is a comparable risk. Climate change is happening now and is killing people every day.
Nature is out to kill us. We’re doing a far better job in this fight than we ever have. I’d rather deal with problem of climate change than the problems of our ancestors.
That's not true at all. Remember CFCs and the ozone layer? That was a comparable problem, except people actually stopped that one, by no longer emitting the gases causing the issue.
I don't know that it's comparable. The ozone fix required manufacturing changes; it didn't require an upheaval in how we live. Sure, it's "don't emit the bad gas", but the gases come from very different sources.
The primary difference was that the ozone depletion didn't gain much traction as a political wedge and world leaders were able to take the threat seriously. 14 years after scientists published basic research warning of potential risk, the Montreal Protocol was signed. 25 years later there was a 98% reduction in release of ozone depleting substances and the ozone layer has begun to heal. Throughout all of that, DuPont lobbied and testified that ozone depletion was a hoax / fake news / scientists making stuff up / etc.
Contrast that to today, where the entertainment news outlets have people, who don’t even cook, up in arms that someone will take away their god-given right to a gas range, and who in turn view it all as a hoax and conspiracy for corrupt politicians to profit.
I’m not sure the world would be able to pull off the Montreal Protocol today, even though it largely amounted to manufacturing changes and having to find a new hairspray brand.
I have the sense that people back in the '60s-'80s had a bit of an innate trust in scientists, born out of the rapid technological progress that preceded that period, but that trust has since gone away.
Things like CFCs were taken seriously. Things like radiation were taken seriously (for better or worse, yielding our insane regulatory landscape around building new nuclear power plants).
The last major thing that scientists warned about that was really taken seriously (in the sense that something was done about it before it had/would have had massive negative effects) was world overpopulation, with the publication of things like "The Population Bomb" and China's one-child policy, etc.
Unfortunately, that one was wrong; we now know that without any intervention, world population tends to moderate itself, and we won't actually see mass starvation due purely to too many people. I think that error was the first major blow to people's trust in catastrophic predictions.
I wonder what the world would be like if instead climate change was put forth as a catastrophic issue with the same fervor back then.
With the ozone layer, we already had alternatives to CFCs that were viable. A handful of companies lost out on product lines, but they're all still doing fine today. Individuals didn't have to do anything.
With the problem of CO2 emissions, lifestyle changes are required to fundamentally solve the issue, and people aren't willing to make them. Yes, it's possible that geoengineering can buy us some time. It's possible there will be a battery revolution. Renewable energy is increasingly widespread. But there's nothing right now that's a drop-in replacement. The only sure-fire solution that we have right now is a widespread reduction of consumption and mobility, and very few people are on board with that.
AI doomers don’t even care about its harms today, such as its use for automated American death-panel decision-making - something the Victims of Capitalism Memorial Foundation will recognize someday.
Here's another source using data from the "EM-DAT International Disaster Database"[0]. Excerpt from the article[1]:
> As we see, over the course of the 20th century there was a significant decline in global deaths from natural disasters. In the early 1900s, the annual average was often in the range of 400,000 to 500,000 deaths. In the second half of the century and into the early 2000s, we have seen a significant decline to less than 100,000 – at least five times lower than these peaks. This decline is even more impressive when we consider the rate of population growth over this period. When we correct for population – showing this data in terms of death rates (measured per 100,000 people) – then we see a more than 10-fold decline over the past century.
Well, for starters, the graph lopped off the first twenty years of the data set so that it could "start" from the massive peak in the 1920s and 1930s. The top reply to your original tweet is a retweet of two videos that rebut the graph.[0]
It's further skewed by the use of decadal averages, which hide the fact that the greatest peaks included deaths that were either the direct result of, or greatly worsened by, conflict and a handful of specific policy decisions: food production failures during conflicts such as the Zhili-Anhui War in 1920-21[1]; floods during the Chinese civil war, which dramatically worsened response and recovery; the Holodomor in Ukraine in 1932-33 and the Soviet famine of 1930-33 more broadly; the 1938 Yellow River flood[2] following the intentional destruction of dikes in an attempt to slow the Japanese Army's advance; World War 2 more broadly in the 40s, with both massive wartime interruptions to food production and explicit acts of mass starvation; the Great Chinese Famine of 1959-61, considered one of the largest man-made disasters in history;[3] etc.
The graph falsely suggests that we've somehow stumbled upon a viable adaptation strategy that makes climate change nothing to worry about. Since 1900, we've seen massive medical advancements, improved early warning systems for at least some types of disasters, transportation networks and technology that help move people away from disaster zones both before some disasters and in their aftermath, the ability to rapidly move large amounts of food to disaster areas, and more.
Those are all great achievements, but the largest factor in the decline your chart suggests (albeit through data misrepresentation) is that we no longer have massive conflicts on the scale of the first half of the 20th century, genocidal dictators looking to quickly wipe out millions of people through starvation, or political ideology driving inane agricultural policies that killed tens of millions because the autocratic rulers of some of the most populous nations on Earth read pseudoscientific drivel (Lysenko and others managed to inspire not only the Soviet famine in the 30s but also the Great Chinese Famine in the late 50s) and decided it sounded ideologically sound. We still have conflict and famine, but nothing on the same scale.
Trying to take that and spin it as climate adaptation is, well, absurd. Even by climate skeptic standards, that argument's a real stinker.
No matter how you want to spin it, there are relatively few climate related deaths today compared to the past. Climate change is not causing a rise in climate related deaths, which is what OP was essentially claiming.
Fundamentally, just because the current value is at a low point doesn't make something not a threat.
An easy way to think about it is by handwaving with the half-life of an element. You start with 100 and end up with 50: a 50% survival rate, but also a raw loss of 50. Each of the remaining 50 still faces that same 50% survival rate, even though the next raw loss is only ~25.
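A minimal sketch of that intuition (illustrative numbers only, not data):

    # Illustrative only: a constant 50% per-period hazard keeps halving
    # the population, even though each period's raw loss keeps shrinking.
    population = 100.0
    for period in range(1, 5):
        loss = population * 0.5
        population -= loss
        print(f"period {period}: lost {loss:.1f}, remaining {population:.1f}")
    # period 1: lost 50.0, remaining 50.0
    # period 2: lost 25.0, remaining 25.0
    # period 3: lost 12.5, remaining 12.5 ...

The per-period rate - the actual threat - never changed; only the raw losses did.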
But yeah; you can always challenge the source of the argument as invalid, as opposed to challenging the argument itself.
I don't know about that; I could see everyone drifting into a conflict no one wants, WW1 style.
Recent events also are incentivizing small countries to go nuclear, which will increase the risk of nuclear war. I expect South Korea, Japan, and maybe even the Philippines and Vietnam to acquire nuclear weapons at some point. Japan already has a capability to acquire nuclear weapons quickly, with their separated reactor-grade Pu stockpile.
Acquiring nuclear weapons is not the same as using them.
Nuclear weapons aren’t a combat armament. They are an insurance policy. They are a way to get adversaries to leave you alone or bring you to the negotiating table.
I believe it was Kim Jong Il or Un or maybe both who directly stated this to insiders in their regime. They weren’t ever intended to be used, and those leaders were fully aware that any real conflict would result in their regime’s annihilation.
Their #1 goal was preservation of their family monarchy. Nuclear war is the opposite of that.
The argument "war won't occur, it's irrational" isn't a good one. Both WW1 and WW2 were irrational, but they happened anyway. If actors were perfectly rational no wars would occur.
We all work in complex systems and see first hand how fragile and error prone they are. Yet you think the risk of nuclear war is low even though the programs rely on similar types of systems?
Human-driven systems, embedded in the real world, are still inefficient enough[0] that they are much more resilient than the kind of systems we work with. At the very least, each part of the system has the ability to independently evaluate the process they're a part of, and stop it in its tracks if they think a fatal mistake is about to happen. See e.g. Stanislav Petrov and be thankful his job wasn't automated, or we'd not be here to talk about it.
I'd put a higher risk on nuke-throwing starting between smaller countries, like Pakistan or Iran. Putin strikes me as more of a coward than a suicide bomber.
Putin is quite old and will die soon. My concern is the chaos which follows, as Putin has killed all strong contenders for power.
It shows how little he cares, for the best thing a leader can do for a nation is to ensure a strong successor. The resulting civil war in Russia is where the nukes may appear.
1. AGI doesn't even exist, and Sam Altman and I agree on one thing: GPTs will not get us there. Saying AGI is a risk is like saying the sun exploding is a risk.
2. Nuclear conflict is real; it's astounding Pakistan or Russia hasn't used the bomb yet, but they will when the US backs them into a corner.
3. Climate change? At this point, who cares? We know it's happening and happening VERY slowly, so we have PLENTY of time to get ready. There's no real risk other than failing to move away from the coast in the next 100 years. International shipping and big ag are the largest polluters by far, and those ain't stopping.
Fear of AGI is not based on Altman’s latest brews. Looking at the progress curve of AI in general is one way to get a feeling for it. The other one, which I prefer, is looking at existing AIs in domains where we thought human ingenuity was top notch, and how they are beating us left and right. Strategy, creativity, cunning - all human notions that are being slaughtered. Slowly, but surely.
It is “just” a matter of combining these capabilities. I think it’s an “engineering problem” by now. Which is not to say that it will happen soon, just that if we set our minds to it, it won’t be that big of a deal.
AGI by 2030 is not a reach statement at all anymore.
The LLMs we have today weren't supposed to show up until 2030 at the earliest, and 2050 more realistically. If you go back 10 years and read what people were writing about AI progress, you will find that they were _completely_ off the mark. It's prudent to assume a very liberal timeline for AI progress, based on what's been happening now.
I'm loving this absolutely breathless parroting of "AGI in the next 5 years" and yet: we don't even know what general intelligence is. We don't know how/why/where consciousness arises. It's not even a clearly defined concept. But here we have "Workaccount2" prophesying the rise of general intelligence :D
Please enlighten us sensei, what is actually General Intelligence, and how will you know it's here?
I realize my reply is a bit flippant, but it's tiring to see how certain people are about something that nobody truly understands.
Consciousness appears to be a form of intelligence. Is it required for general intelligence (whatever that means)? We don't know. Would humans be classified as having general intelligence if we weren't conscious? In any case, I perhaps worded that poorly. I wasn't saying that consciousness is the same as intelligence, it was just an example of how little we actually understand intelligence.
> AGI simply means that we can create a type of machine that can do any kind of mental task that a human can, at least at a human level.
What does this even mean? What is human level? The ability to write a symphony? Or the ability to calculate 2+2=4? These are such poorly defined metrics that I don't understand how we can then start throwing around very precise (and short!) timelines for attaining it.
The statement: "we'll send a human mission to Mars in the next 5 years" is 10x more believable than this breathless AGI hype, because at least the former is well defined.
Ironically I often post the exact same sentiment that you are sharing. But you are conflating two different things.
AGI doesn't require consciousness. We likely will have no idea whether or not the first AGI computer is conscious. What we will know, though, is that it is solving novel problems that humans struggle with, and that, from the outside, the only way to discern it isn't another human is that it is way too smart to be one.
I strongly suspect that the first computers which actually are conscious (AGI or not), whenever that happens, will have to fight a heavy uphill battle to prove it. And there will always be a group of people who will never believe an AI (is it even artificial at that point?) is conscious.
I didn't say that AGI requires consciousness, just that we don't understand what intelligence truly is, along with consciousness (which appears to be a form of intelligence).
And I'm not particularly keen on the reasoning "it is solving novel problems that humans struggle with", because a pocket calculator can solve problems that humans struggle with. A computer can play chess better than a human can; does that mean it's an AGI?
We're still in the same AI hype as 10 years ago. You just stopped noticing for a while. It really hasn't stopped since AlexNet back in 2012.
Personally, since I first started to learn about Neural Nets in the 90's, I've always predicted that they would bring AGI around the time they matched human brains in complexity and compute power. Back then, assuming Moore's Law held, that meant somewhere between 2030 and 2040.
Now, there was always the prospect of a serious breakdown of Moore's Law, which might delay it, but since that didn't happen (compute power just moved from CPUs to GPUs), we're now approaching the final stretch. Human-brain-scale neural nets are already being planned, if not already being built in secret.
25 years ago, I did a back-of-the-envelope calculation during a party and came up with roughly 2040. I didn't know that Kurzweil had already predicted 2029. Once I heard about it, I felt Kurzweil was over-optimistic, and stuck to 2040. Recently, though, it has started to seem that AGI before 2029 is more likely than 2040 or after.
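For what it's worth, that kind of estimate has roughly this shape - every figure below is an assumption I'm choosing for illustration, not a settled number:

    import math

    # Assumed figures for a 1999-style back-of-the-envelope estimate:
    brain_ops = 1e16        # assumed brain-equivalent ops/sec
                            # (~1e14 synapses x ~100 events/sec)
    pc_ops_1999 = 1e9       # assumed ops/sec of a late-90s desktop
    doubling_years = 2.0    # classic Moore's-law doubling period

    doublings = math.log2(brain_ops / pc_ops_1999)   # ~23 doublings
    print(1999 + doublings * doubling_years)         # ~2045

Nudge any of those assumed figures by an order of magnitude and the answer slides a decade or so either way, which is exactly why such estimates ranged from Kurzweil's 2029 to 2040 and beyond.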
Well, also, plastic waste getting dumped in water ways, and in the USA, steel and coal waste being dumped into the Ohio (causing a huge dead zone in the Gulf). There are serious pollution problems everywhere, and I kind of hate the climate change narrative for taking people’s eyes off of those problems.
Exactly. Plastic will be the same at some point. Natural selection would probably lead to plastic-eating bacteria anyway. DeepMind's enzymes are just a way to speed it up.
Oh well, some concerns around AGI might be attempts at marketing and attracting attention, though some caution probably isn't too bad to have. AGI might become a thing, it's just that there's no guarantee that it will happen in our lifetimes. On the other hand, even LLMs will put some folks out of a job.
Relations between nuclear powers are anyone's guess but there's no reason why Russia couldn't be forced to back off, as opposed to "being backed into a corner". Wouldn't be the first proxy war and most likely won't be the last.
I care about climate change; there are certainly others who do. No idea whether much of a change will be made in our current course, but there's no reason to be super dismissive. If anything, recycling, eating a bit less meat, or doing lots of the other common recommendations improves my own quality of life, even if the legislation that goes after the big corporations is nowhere to be seen yet. No reason not to make the world a better place, or at least try to, in whatever ways are viable, like donating towards planting trees or something like Wren.
> failing to move away from the coast in the next 100 years
Unless you're a time-traveler from the 50s who has somehow managed to post here, there is no excuse for this type of disinformation these days.
Mentioning higher sea levels is also a red herring; a massive collapse in agricultural yields will be an issue long before (like, this century) sea levels become a major problem.
More than this, we have now reached levels of atmospheric changes that put actual near-term Human extinction (not to mention that of most sea and land species) on the table.
Despite the IPCC being very conservative/optimistic in their scenarios, this is in fact in line with their reports; admittedly my comment above might be badly worded - when I said "near-term", I meant "in the upcoming two centuries or so".
The IPCC reports say we might reach 4C, 5C or even more; based on the historical record, such a major change in such a short time - several orders of magnitude faster than previous CO2e-gases-linked mass extinction events - likely cannot be adapted to by the majority of species (there will of course also be a few evolutionary winners), resulting in potential extinction. I also quote the latest draft report from that same IPCC, leaked to newspapers about two years back by concerned scientists involved, before the usual step where political stakeholders are allowed to reword the parts they deem too disturbing or against their interests:
"Life on Earth can recover from a drastic climate shift by evolving into new species and creating new ecosystems," it says. "Humans cannot."
The IPCC has also consistently underestimated the effect of sulphur in the atmosphere: if you look at this and previous years, we've already reached this year the level of warming they projected for 2030-2035.
> GHG emissions will lead to increasing global warming in the near term, and it’s likely this will reach 1.5°C between 2030 and 2035.
> Global temperatures have been exceptionally high over the past three months – at around 1.6C above pre-industrial levels – following the peak of current El Niño event at the start of 2024.
> The past 10 months have all set new all-time monthly temperature records, though the margin by which new records have been set has fallen from around 0.3C last year to 0.1C over the first three months of 2024.
You can't grow food for a large population when average planetary temps are at +5C. It doesn't mean it's locally always +5C warmer than it used to be; it means you're seeing insane temperature swings in a matter of days, constantly - in both directions, it just so happens that the average is +5C.
Not to mention that at +5C it is all but certain that shallow methane hydrate deposits (those stabilized by temperature, not pressure) all over the world are outgassing CH4, along with several other similar tripwires, likely not all of which we've even identified.
Edit: I can't seem to reply to your comment below, not sure why. You're absolutely wrong about those Siberia figures, and they're not supported by current scientific consensus.
Much of the soil in Russia that will warm is indeed acidic and largely inexploitable, which I suspect is what you were getting at; to their credit though, they picked an area that isn't (IIRC). Not that it changes much.
Most soil can be used for some type of farming with some help from fertilizers. Temperature and access to water are more critical. Higher CO2 also increases crop yields, in isolation.
We're also talking about a time span of 100-300 years. Keep in mind how much more efficient farming is now compared to even the 1950's.
1) Is real but will have a net positive impact on humanity (of course there will be losers, but as with most new technologies, humans will get more for less effort)
2) It is real but unlikely to end humanity. Much more likely to create large areas that are unlivable, but a lot of work has gone into avoiding world ending scenarios.
3) Is mostly hype. Doom predictions on this topic in the last 20 years have been very wrong. The climate is always changing, and humans will have to adapt as always (a +1 to 2C increase over 100 years is not world-ending). The long-term problem will be prevented by a) running out of cheap carbon-based energy, b) geo-engineering, c) the rise of cheaper non-carbon-based energy (related to a).
Asteroid impact or a pandemic / virus / man-made bioweapon are all higher-probability humanity-ending risks, IMHO. Not enough thought & energy is put into those scenarios.
Without even going into the unsubstantiated assertion with #1, your comment on number 3 shows a dramatic misunderstanding of how compounding effects work. You can't use the last 20 years to linearly project like this. It is true that most scientists agree that humanity will likely not go completely extinct, but it is also true that most scientists agree that many, many individual humans will be impacted. It is tough to say just exactly how humans will be impacted, but think famine, war, major societal upheaval.
He said 20 years of doom predictions that haven't come true. Compounding effects don't apply to the accuracy of academic predictions. It's not like academic accuracy automatically gets exponentially better over time. Linear projection when attempting to guesstimate the accuracy of future predictions by a group of people who have also made predictions in the past is a fairly reasonable thing to do, unless you have some specific evidence that they significantly improved their methodology.
Your citation is merely an advocacy piece, not science. For example the first diagram contains charts of fertility rates, institutional assets divested and world GDP whilst claiming they are "climate related human activities". Presenting a nearly random collection of metrics as evidence for your argument isn't a sign of robust thinking or argumentation.
When someone says "20 years of doom predictions haven't come true", I charitably assumed that claim was about scientific consensus predictions, but perhaps I can't assume that everyone shares knowledge of what that is.
What doom predictions from the last 20 years haven't come true? If someone says that doom hasn't happened yet, I guess what I want to say is that they haven't waited long enough.
I think the climate scientists are frustrated and giving up. https://www.nytimes.com/2022/03/01/climate/ipcc-climate-scie.... My initial link was an attempt to show where the Overton window is regarding the experts in this field, more than anything else. This comment is probably not the right place to bring someone up to speed with the climate science field when they can Google it themselves.
Although it's not well known, you unfortunately can't use temperature data to judge whether climatological predictions are correct. That's because the databases of temperature data that are presented as "global temperature" (a fundamentally statistical product) are themselves maintained by climatologists. It's a bit like asking a CEO whether his products are good, and he cites his own private data on customer happiness to prove that it is. Lots of people wouldn't accept this as evidence because it's not independent. The data might be correct, but his salary is at stake and so there's the risk of shenanigans. You'd want a truly independent assessment.
Climatologists like to claim that they are of course far better than that and would never abuse their monopoly position on such data, but they also regularly change those databases in ways that retroactively make failing predictions correct. Like here [1] where they declared a new record-breaking temperature that was lower than their previously announced record. They didn't mention that anywhere but the previous press release was still on their website and somebody noticed.
Anyway, you're right, let's Google things. Here are a few failed predictions from 20 years ago that can be judged without using temperature databases:
• Dr David Viner, climatologist, March 2000. "Children just aren't going to know what snow is". David Parker, climatologist, same article. "British children could have only virtual experience of snow." [2]
• "Australia faces permanent drought due to climate change", 2003 [3]. Dr James Risbey, Center for Dynamical Meteorology and Oceanography at Melbourne's Monash University, says "the situation is probably not being confronted as full-on as it should". Current data shows no drought [4]
• Pentagon report, 2004 [5]. By 2020 the weather in Britain "will begin to resemble Siberia", by 2007 violent storms have rendered parts of the Netherlands uninhabitable. "A ‘significant drop’ in the planet’s ability to sustain its present population will become apparent over the next 20 years". "Immigrants from Scandinavia seek warmer climes to the south." None of that is even close. "Senior climatologists, however, believe that [the author's] verdicts could prove the catalyst in forcing Bush to accept climate change as a real and happening phenomenon."
There are hundreds more like this. It's inevitable that people take this history into account, and kinda unfair to demand that people don't. If there had been rigorous investigations of what went wrong in these cases, and clear evidence of learning or regulation of the field in the same way as happens in other areas of life after big failures, then people's confidence might be higher.
Average increase of temperature has negative consequences, but the accompanying variance increase has a lot more catastrophic impact. Storms, forest fires, floods, droughts are some of those things where increased variance can screw societies over.
A lot of this can push insurance costs so high, that insurance companies stop working. There are only so many times insurance can payout damage due to extreme storms, or crop failure. Once insurance stops, there is rapid deterioration of infrastructure.
This is what passes for common sense when you live in a conservative media bubble. The mainstream of climate science as represented by the IPCC has tended to be overly conservative about how much change we should expect over a given period of time.
Right-wing media outlets focus on the predictions of lay people like Al Gore and Greta Thunberg and use those to try to dismiss the entire field of climatology. Which is a non sequitur.
I'm not that worried about nuclear conflict because it's a scenario where literally nobody wins. There's always the "madman" hypothesis, but really that's no way to do geopolitical analysis. Nobody is truly "crazy" (IMHO). Remember, those orders have to be carried out by someone. There have been studies on this with missile silo operators, and they had a disturbing or comforting (depending on your POV) tendency to not launch.
It's really the "slow death" scenarios that are a much bigger risk.
While I'm firmly in the AGI camp, I'm both fatalistic about our willingness to do anything about it and highly skeptical of the "runaway climate change" doomsaying. The Earth has been around for ~4.5 billion years. While it's only been similar to what it is now for the last 300 million or so, that's still a really long time. We've had periods where ice extended to the equator (~500 million years ago). We've had much warmer periods.
The Earth will be fine. We however might fall by the wayside. There's really been such a long period of time that if runaway climate change were going to happen, why hasn't it happened already?
> There's really been such a long period of time that if runaway climate change were going to happen, why hasn't it happened already?
I don't think runaway climate change or even something like a reversal of the Gulf Stream were ever any of the mainstream scenarios from IPCC. The worst scenarios, IIRC, were a global warming of +6 to 10C.
And that is over several centuries, provided we do nothing to stop it.
This century, the worst scenarios are about 4C hotter than today.
Also, it's not like the temperature is going up that much everywhere. For instance, heating an area near water from 34 to 38C means a lot more water evaporation, and thus more cooling. Also, stronger winds mean the humidity may be blown away more quickly.
Now, even 4C of heating, provided we don't develop any technologies to either counter it or cope with it, could cause a disaster with disruptions and forced displacements similar to WW2 - but the world didn't end because of WW2. It was only a minor speed bump in the grand scope of things.
Anyway, the probability that it should take 100s of years to reach AGI seems quite slim. And once we have AGI/ASI, the world is going to be so fundamentally different that I'm not sure how much some warming really means, at least for humans.
At 4C of heating, you cannot grow food in any reasonable quantity with any reliability, period. 4C is the collapse of modern civilization at the very least (in fact likely earlier due to increased geopolitical instability and tensions due to dwindling resources, combined with the availability of nuclear weapons), with a massive die-off of humans along with it.
We're at +1.5C, more or less, and growing the usual grapes in France - of all places - is already becoming much harder (they keep dying of frost after waking up due to wild temperature swings).
At +5C, there is no growing food in any substantial amount outside of high tech, low yield approaches; approaches that depend on complex planetary supply chains (both for initial deployment and maintenance), which will have disappeared by then.
> At +5C, there is no growing food in any substantial amount outside of high tech,
It seems you have some specific geographic region in mind. Earth has a lot of regions that are more than 5C colder than the most fertile regions.
People adapt. People move. Sometimes they fight wars about it. This has happened many times before.
We're currently living in the most peaceful, prosperous and safest period that humanity has ever experienced (despite what social media is tricking our brains into believing). In the future we will surely live through periods that are closer to the average. But I'm not seeing any extinction-level events due to climate change within the next few hundred years.
AGI or nukes, on the other hand, they both DO have the potential to end us as a species.
You seem to think that a couple of degrees just applies uniformly across the planet, or to a specific location, transforming somewhere cold into something which is now suddenly hospitable. That isn't how climate change works. It doesn't mean that Canada will suddenly be nice and balmy year-round; it means that the climate will fluctuate more wildly and wreak havoc on our agriculture, as described in the comment above. The temperature change is a global average, and your local experience is going to be a lot worse at the extremes.
> You seem to think that a couple of degrees just applies uniformly across the planet
I don't believe that at all. I even studied the IPCC for how their various scenarios lead to different levels of increased heat in different places.
When it comes to fluctuations, there are several types. An obvious one is wind, which will probably become noticeably more chaotic with more energy. Another is temperature.
Temperature variations generally depend on humidity and wind. As winds get stronger, that in isolation leads to some increase in temperature variations.
For humidity (both at ground level and in the atmosphere), increased humidity leads to lower temperature variations.
There is also precipitation. Higher temperatures lead to heavier rain (when it rains), and can increase the likelihood of hailstorms.
There are also extreme weather patterns that become more common when it gets colder. While tropical storms and hurricanes increase in frequency in hot weather, more laminar storms ("winter gales") get more common when the weather is colder. I believe this is because the LACK of turbulence/chaos means there are fewer factors that can break up such storms.
This last type is common in places like Canada, Scandinavia or Siberia now, and come almost exclusively during winter.
Btw, the impact of increased temperature on weather is something that we can already observe on Earth today, simply by travelling between different weather zones. While SOME of the extra energy can affect areas far away from where the heating occurs, a lot of the effects are local or regional.
That means it's likely that temperate-zone weather is going to shift a bit to the north and include a greater proportion of Canada, Scandinavia and Russia. These areas will then get weather more similar to places like the US, Germany or China today.
The southern parts of the temperate zone are likely to see weather patterns that resemble tropical (or desert) weather zones. Much of the US could be more like Mexico, France more like Morocco or Greece, Southern China more like Thailand, and so on.
This means that areas that get warmer AND drier (like Spain, Italy and France, probably) will get some of the variations currently seen in the Sahara.
But it doesn't mean that the temperature fluctuations get greater everywhere. Some areas become more humid, and that means lower fluctuations.
Btw, for humans, drier weather can be an advantage, since it allows us to dissipate heat much more easily. For farming it's less ideal. Places like Saudi Arabia could go in the opposite direction, with higher humidity and more rain; farming could become easier, but the risk of wet-bulb events could also go up.
Anyway, while it is true that more energy in the atmosphere ON average increases the frequency of most types of extreme weather, it is not true that it will increase all types of extreme weather everywhere.
Ok, but I fail to see why we would want to increase extreme weather on average? Like, if we happen to make some part of the planet a little better for agriculture by accident, while ruining the rest of it, how is that a good thing?
Do you actually believe people think the warming is a good thing?
People want to drive their car, heat or aircon their house, go on vacation, and use or consume all sorts of products that require energy to produce.
And even just scrolling facebook means a lot of energy is used in some data center.
This has side effects. Most cars will spew poisonous gas out the back, into the city where people live. Some of them die from those fumes, most don't. People still drive cars, because they think the benefits outweigh the costs.
"5C warmer" doesn't mean a uniform increase in temperature of 5 degrees at all times. It means "5 * the thermal mass of the earth's biosphere" worth of extra energy in an extremely chaotic system that is currently in a local stable point, but doesn't have to stay there.
Just to clarify. The reason I didn't respond to this one:
> "5C warmer" doesn't mean a uniform increase in temperature of 5 degrees at all times.
Is because I thought it was completely obvious that this is correct. Several responses seem to have assumed I was arguing that 5C of warming would be the same everywhere, while what I meant by "5C warming" was "the effect of a global 5C warming".
What I DID think was the main message was this one:
> It means "5 * the thermal mass of the earth's biosphere" worth of extra energy in an extremely chaotic system that is currently in a local stable point, but doesn't have to stay there.
While I agree that a "5C global warming scenario" may mean that the average temperature in Lyon, and even the VARIANCE of the temperature in Lyon, may go up quite a lot, I did have objections to the hypothesis that chaos would be the main factor driving variations in temperature.
While, for the global average, increased energy in the atmosphere may lead to SOME increase in temperature variation, I don't think variations in temperature depend nearly as much on the energy in the atmosphere as other types of extreme weather do - hurricanes, heavy rain, hail storms, etc. (and their effects, such as destroyed crops, damage to property or flooding).
Changes in humidity seem to be a much greater factor in the variability of temperature than this extra energy is.
If you check the IPCC projections for changes in precipitation patterns and maximum and minimum yearly temperatures, you will find that in areas where precipitation is expected to increase, the minimum yearly temperature goes up a LOT more than the maximum does, especially in the sub-arctic part of Eurasia (like Siberia).
Meanwhile, in areas that are expected to get dryer (including Spain, France and Italy), the minimum yearly temperature hardly increases at all, while the maximum temperature goes up a lot more than the global average.
Basically this means that some vineyards in France, Italy and Spain may have to move to more robust crops, like maybe olives. But it also means that new areas open up that may become more favorable to vineyards, for instance in Germany, Poland or even southern Sweden.
A 5C warming is indeed likely to make a few areas uninhabitable. I'm not saying climate change is not a problem. I'm just saying it's not an extinction event.
But keep in mind that the areas that tend to get the greatest warming tend to be the dry ones. In such places, sweating will still allow human bodies to regulate body heat, if air conditioning breaks down.
From a different comment whose main thrust you also ignored (you are all over this thread with the same fallacy):
>It doesn't mean it's locally always +5C warmer than it used to be; it means you're seeing insane temperature swings in a matter of days, constantly - in both directions, it just so happens that the average is +5C.
"Wild temperature swings" is already quite common in dry places. Sahara can be below freezing during the night. The IPCC predicts that Southern Europe will get dryer, so they will have larger variations.
More humid areas, especially if they're near coasts tend to be a lot more stable.
I think these tendencies will remain true.
While introducing more energy to the atmosphere is likely to generate more winds (including hurricanes), it doesn't seem plausible to me that (given constant humidity) this will be enough to cause enough variation in temperature to make farming impossible in most places.
If you have some reference (preferably something like IPCC, as opposed to something that could be fringe), I would be willing to reconsider.
> you are all over this thread with the same fallacy
Maybe you could state exactly what fallacy you think I'm advocating? I'm not saying global warming is something good, or that we don't need to worry about it.
I just don't think it's a likely extinction-level threat, like an asteroid, rogue AI, nuclear war, an alien invasion etc. Unlike those others, it's almost certain that we will experience some degree of global warming, but if that's the worst we will face, humanity will survive as a species.
Also, there are a couple of other factors:
First of all, I think many who worry about climate change (beyond those who honestly think it will lead to extinction) really care mostly about all the pain and suffering that global warming could bring at some point, at least to some large minority of humanity. Maybe also that it will cause a non-trivial fraction of the population to die from famine, wet-bulb events, etc.
I don't think that's impossible (though maybe a bit pessimistic, see below), but I think the fallacy these people make is to assume that we would otherwise have a future world without such events. Historically, bad things have happened from time to time.
The collapse of the Roman Empire caused half the population to die off (partly due to colder weather). The Black Death caused a similar percentage to perish in many places. The Mongol conquests left large parts of the Eurasian steppes so depopulated that they still haven't recovered. Then there were the world wars, the Bronze Age collapse, and the list goes on.
The future is likely to bring similar events, too. Climate change could possibly be such an event. But usually, these events are not those we expect, but rather some kind of Black Swan that surprises everyone.
The second fallacy that some climate change fanatics seem to commit, and this one by choice, it seems: we're still very much in a kind of exponential technological development. 200 years is a very long time, and unless technological development suddenly grinds to a complete halt, we will have a lot of new options, both for minimizing global warming and for surviving any warming we're not able to prevent.
People seem to choose to ignore this based on a better-safe-than-sorry philosophy. That's ok when dealing with a risk that we aim to reduce to zero, as long as the cost is low. Kind of like putting on a seatbelt when driving.
What many don't seem to realize is that this is a luxury belief / first-world concern. For someone less privileged, like most countries in South Asia or Africa (and also working-class people in the West), access to cheap energy now is seen as really important. That means for such people, some risk is acceptable.
Kind of like if the seatbelt on your car is broken, and the nearest grocery store is 20 km away. Do you walk there, or do you drive the car regardless. To do that risk evaluation, you want to know the real risk of driving without the seatbelt.
Similarly, for those most affected short term by for instance ending most fossil fuel use, the REAL risk associated with global warming is relevant.
And to evaluate that, it's actually really relevant to factor in that humanity is likely to grow a lot in technological capability to face new challenges over the next 100-300 years. How much is a matter of opinion, but zero is unreasonable.
As far as I can tell, the most likely scenario is that climate change is going to be a challenge for humanity. My best guess is that in most places on Earth, people will find ways to deal with this challenge. But I'm open to the possibility that 100s of millions might die because of it.
I also realize that global warming could be a factor leading up to a nuclear war. I really don't think it would be the main factor, though. I consider nuclear war as a separate risk category.
Most wars are caused by nationalism or religious conflicts, especially between the kinds of countries that are likely to have large arsenals. The obvious current example is Ukraine. It's not famine that drives Putin; it's a desire to Make Russia Great Again.
Compared to reasons such as nationalism, religion or even ideological conflicts, I think global warming would be a significantly smaller risk factor in terms of how it increases the risk of global war.
Another huge uncertainty is what population Earth will have in the future. Already, it's pretty clear that the population in most developed parts of the world is going to decline rapidly over at least the next 50 years. Population growth is mostly restricted to South Asia and Africa now, and even in South Asia there are indications that it's going down.
It's certainly possible that this can be reversed completely, but if the current trend continues (and assuming Africa also has this trend eventually), the population could be halved every century. That means we will be only 1 billion by 2300.
On the other hand, if the trend reverses back to exponential population growth, we may be up to 30 billion or so by 2300 (unless prevented by starvation).
This is a huge gap! With only 1 billion, it would be far easier both to minimize global warming and to survive it. With 30 billion, it would be very difficult to prevent global warming and also much harder to deal with it when it comes.
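A quick sketch of how wide that gap is (both trajectories are pure assumptions, starting from roughly 8 billion today):

    # Two assumed trajectories from ~8 billion in 2023 out to 2300:
    start, years = 8e9, 2300 - 2023
    shrinking = start * 0.5 ** (years / 100)   # halving every century
    growing = start * 1.005 ** years           # ~0.5%/year growth
    print(f"{shrinking:.1e} vs {growing:.1e}") # ~1.2e9 vs ~3.2e10

That's a factor of more than 25 between the two scenarios.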
And all of this hinges on us being unable to develop AGI/ASI this century, which is starting to seem unlikely. Most of the researchers in the field seem to expect AGI some time in the range from "within 5 years" to "several decades".
If we DO develop AGI before 2100, the exact circumstances around it are going to matter way more than global warming. At the optimistic end, AGI may find ways to completely end global warming.
Or it could cause human extinction before it matters.
So it's not that I don't "believe" in global warming. I generally accept the scientific consensus in most fields, as long as the field actually uses something like the scientific method and is not just a cover for some ideology.
It's just that it seems to me that many "True Believers" in climate change turn it into something more similar to a religion than the actual science it's based on. And that this causes them to only see this single issue, while generally ignoring almost all other risk factors we're likely to face in the future.
Stability in a chaotic system is precarious. Changes, even if small or seemingly trivial, can cause massive cascading effects from positive/negative feedback loops.
Sigh, I don’t mean to sound like a dick, but if that’s gibberish then you might want to strengthen some of the foundational understandings around systems.
I'm familiar with chaos theory, and not rejecting issues related to chaotic systems. But this part doesn't really seem well formed:
> It means "5 * the thermal mass of the earth's biosphere" worth of extra energy
Here it seems that the units were missing, at best. The extra energy would be "5K * the heat capacity of the biosphere (in J/K)".
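To put a very rough number on that (treating the ocean as the bulk of the biosphere's heat capacity; both figures below are approximations I'm assuming for illustration):

    # Back-of-the-envelope: energy needed to warm the oceans by 5K.
    ocean_mass = 1.4e21   # kg, approximate mass of Earth's oceans
    c_seawater = 3900     # J/(kg*K), approximate specific heat of seawater
    delta_t = 5           # K
    print(ocean_mass * c_seawater * delta_t)  # ~2.7e25 J

For scale, that's on the order of a hundred million times the yield of the largest nuclear device ever detonated.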
> in an extremely chaotic system that is currently in a local stable point, but doesn't have to stay there.
First of all, I don't think the current state is THAT stable. And when it comes to any disturbance to the stability caused by the extra energy, that's what we have climate scientists for. Specifically, they are the ones who need to assign specific probabilities to the various scenarios available.
For instance, while it is POSSIBLE that a result of the chaotic dynamics of the system would be that the Gulf Stream reverses, it seems the current consensus is that this is highly unlikely.
However, there are a lot of other effects that would be highly likely, such as an increased rate of hurricanes. But hurricanes are not a state of the whole system; they're basically just a type of weather that gets more common.
So, maybe "gibberish" was not the perfect word to describe it, as I was able to parse it. It's just that it didn't say anything specific. It was more at the level "It's getting chaotic, and chaos is scary", without making any specific predictions or producing references.
Really? You’ve never met someone with the “if I can’t have it, nobody can” mindset? Never heard of a custody dispute where the losing parent kills the kids in order to deprive the other parent of them?
Vladimir Putin is exactly that kind of person. When backed into a corner and knowing he will die, he would rather nuke the whole world than accept his fate. And those orders will be carried out by people in heavily restricted information silos, who only have the explanation of their commanding officers as reasoning. For all they know, this is a test, or the US already has launched their nukes, or if they don’t launch the nukes their mom will be dropped out of a window.
So this is a learning opportunity for the differences between "idealism" and "materialism".
"Idealism", which unfortunately underpins all mainstream political discourse, is simply that certain actors (people, states, etc) are good because they're good or they're bad because they're bad. It's really the Marvel view of the world. There are good guys and bad guys. Idealism is a completely inadequate way to view the world because it simply reflects the propaganda that's effectively sold the idea of who the good guys and bad guys are.
"Materialism" is a broadd term that originates in the philosophy that people affect the material world and the material world affects people. More specifically to this point, it's the idea that people do things for a reason. That doesn't mean the reason is justified or sound or good. But there is a world view that underpins any actions by any actor.
So when it comes to wars, the idealist will say one side are the good guys and the other side are the bad guys and never go any deeper than that. The materialist will derive that all wars come down to the desire for resources and land. Not religion. Not ethnicity. Those are simply pretexts to motivate the citizenry and foot soldiers.
So Putin dons the Duginist [1] hat, just like European kings in the Crusades wrapped themselves in Christianity. Putin may even be a true believer, but that's not why he's doing what he's doing. He's securing a land bridge to Crimea, a warm-water port on the Black Sea, and control of territorial waters in the Black Sea and the resources contained therein.
He probably figures the West will eventually lose interest (which is already happening), and in the meantime he can keep selling oil and gas to China and India, so the sanctions are inconvenient but can be weathered.
There's nothing crazy about it.
But even if he was crazy, to launch a nuclear attack, someone would have to carry out those orders. Bomb loaders, generals, submarine commanders, pilots, missile silo operators, etc. Look up the history of Vasily Aleksandrovich Arkhipov.
Thanks for sharing your blogpost-level understanding of philosophy, while completely ignoring the fact that Putin has already said exactly what he is doing: trying to rebuild and even grow the Russian Empire. He has explicitly said this multiple times, and he even likens himself to Peter the Great. Land bridges to Crimea aren't an objective, they are merely strategic checkpoints in his long term goal of taking over as much of the world as he can.
And yes, he absolutely is the type of person who would destroy something if he can't have it. He did it with Grozny, he did it with Mariupol, he did it with Bucha, he did it with Bakhmut, he did it with Avdiyivka, and he will continue doing it with the rest of the world. So far, he has sent hundreds of thousands of Russians to their death, and they continue to do it. There are literally thousands of Russians that know they're gonna die, and they do it anyway. Your belief in the humanity of people running nuclear bunkers is already betrayed by their own actions in every conflict under Putin's watch, and somehow you think 100% (because it only takes one person!) out of the 3000+ nuclear silo operators will suddenly grow a conscience when Putin tells them to send the missile. It's fucking absurd.
- The risk of the rise of AGI
- The risk and madness of Nuclear conflict
- The risk of runaway Climate change
My money is on Mother Gaia. She will brutally and swiftly discipline her puppies.