I guess we should take this as a lesson in communications. The "breakeven" thing is a red herring that should have been left out of the message, or at least only mentioned as a footnote. The critical ELI5 message that should have been presented is that they used a laser to create some tiny amount of fusion. But we have been able to do that for a while now. The important thing is that they were then able to use the heat and pressure of the laser-generated fusion to create even more fusion. A tiny amount of fusion creates even more fusion, a positive feedback loop. The secondary fusion is still small, but it is more than the tiny amount of laser-generated fusion. The gain is greater than one. That's the important message. And for the future, the important takeaway is that the next step is to take the tiny amount of laser fusion to create a small amount of fusion, and that small amount of fusion to create a medium amount of fusion. And eventually scale it up enough that you have a large amount of fusion, but controlled, and not a gigantic amount of fusion that you have in thermonuclear weapons, or the ginormous fusion of the sun.
To give some more detail: hydrogen-to-helium fusion (even with intermediate steps) is extremely unlikely to happen. That's part of why the sun will last for billions of years. And that's also why the first human attempts at fusion are not trying to use straight-up hydrogen as the fuel.
Good old Wikipedia has this gem:
> The large power output of the Sun is mainly due to the huge size and density of its core (compared to Earth and objects on Earth), with only a fairly small amount of power being generated per cubic metre. Theoretical models of the Sun's interior indicate a maximum power density, or energy production, of approximately 276.5 watts per cubic metre at the center of the core,[63] which is about the same power density inside a compost pile.
Another fun fact: there's a decades-old design for a gadget that fits on top of your desk and does nuclear fusion. You could build one yourself, if you are sufficiently dedicated. Unfortunately, no one has ever worked out how to run one of them as a power plant, i.e. how to get more useful energy out than you have to put in.
If it produced a quarter of the heat of the human body per volume, its temperature would be lower as well (less than 37 degrees Celsius).[1] This is obviously not the case.
[1] Obviously heat and temperature are not the same, I know that. But when something’s temperature is higher than another thing’s, then heat is exchanged along that gradient. Meaning if the sun produced less volumetric heat than the human body, a human body placed within the sun would warm the sun and cool the human.
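As a rough check on that "quarter of the heat of the human body per volume" comparison, here is a back-of-the-envelope calculation (the metabolic rate and body volume are assumptions, not measurements):

```python
# Compare volumetric power densities; the human figures are crude assumptions.
human_power_w = 100.0     # typical resting metabolic output, watts (assumed)
human_volume_m3 = 0.065   # ~65 kg at roughly the density of water (assumed)
human_w_per_m3 = human_power_w / human_volume_m3   # ~1500 W/m^3

sun_core_centre_w_per_m3 = 276.5  # the Wikipedia figure quoted above

ratio = sun_core_centre_w_per_m3 / human_w_per_m3
print(f"human body: ~{human_w_per_m3:.0f} W/m^3")
print(f"sun core centre: {sun_core_centre_w_per_m3} W/m^3 (~{ratio:.0%} of the human value)")
```

With these assumed numbers the core-centre figure comes out to a bit under a fifth of the human value, so in the same ballpark as the "quarter" above.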
For both the sun and a human on earth there are two processes going on:
1. Heat production per unit volume.
2. Heat loss per unit surface area.
The volume to surface area ratio for the sun is much larger than for the human, for a minor reason (the sun is a sphere) and a major reason (the sun's linear size is much bigger). So the equilibrium temperature of the sun in the same ambient outside environment is higher than the human's.
Your thought experiment about placing a human inside the sun would in fact work as you say, if a human body continued to produce heat once it had achieved thermal equilibrium with the surrounding plasma.
Yeah, this is a morbid analogy, but if you get a bunch of people enclosed in a small area, the people in the middle will get so hot that they suffer heatstroke, even if it's freezing outside. See the recent Korean crowd-crush disaster.
Don't give them ideas, harnessing people power would solve all the major problems. Overpopulation, global warming, energy crisis,... Reminds me of that 'Mitchell and Webb - Kill all the poor' sketch.
Well, plant the idea that there is a fusion reaction going on in mitochondria at a very low scale. Throw in terms like "proton gradient", "electron transport chain" and create a science conspiracy.
Can you say more about the ways in which fusion reactors need to surpass stars, and why people believe it's feasible that we can sufficiently get to that point?
(Also - thanks for sharing one of the most interesting comments I've read on the internet in quite a while.)
That, but also mimicking the pressure of the sun here is just not possible (yet? :D), which is why we need to play with higher temperatures, with their own problematic consequences.
> The highest instantaneous pressures we can obtain here on Earth are in the Fusion reactor at the National Ignition Facility and in Thermonuclear weapon detonations. These achieve pressures of 5 x 10^12 and 6.5 x 10^15 Pascal respectively. For comparison, the pressure inside our Sun’s core is 2.5 x 10^16 Pascal.
Total power radiated by a black body per unit surface area scales as T^4 (in Kelvin).
So for black bodies with identical shape, linear dimensions R1 and R2, and identical power production per unit volume, both in thermal equilibrium with whatever is outside them, you would expect:
R1/R2 = (T1/T2)^4
(because power produced scales as R^3 and power radiated as R^2 T^4; setting them equal gives R proportional to T^4).
Pretending humans are spheres with radius 1m and the sun is a sphere with radius 7*10^8m, you would expect the sun to have ~160 times the temperature of a human at equilibrium in vacuum. It's going to be lower because not all of the sun is power-producing, of course. But higher because a human is not 1m in radius. And again higher because humans are not spheres and lose heat more than a sphere would for the same volume (more surface area).
The sun is about 6000K on the surface. That would give us ~40K for the equilibrium temperature of a human in vacuum, which at least seems truthy.
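A quick numeric check of that scaling argument, using the same round numbers as above (a sketch, not a solar model):

```python
# At equilibrium, power produced (~ R^3) equals power radiated (~ R^2 * T^4), so R ~ T^4.
r_sun = 7e8             # metres, as above
r_human = 1.0           # metres; the deliberately crude "spherical human" from above
t_sun_surface = 6000.0  # kelvin, round number

temp_ratio = (r_sun / r_human) ** 0.25
t_human = t_sun_surface / temp_ratio
print(f"T_sun / T_human ~ {temp_ratio:.0f}")                   # ~163, the "~160x" above
print(f"equilibrium T of the human sphere ~ {t_human:.0f} K")  # ~37 K, i.e. the "~40K"
```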
TL;DR: the sun is big, with a small surface area compared to its volume, because it's big.
> and not a gigantic amount of fusion that you have in thermonuclear weapons,
Ironically, this experiment was designed primarily to simulate the fusion you have in thermonuclear weapons. That's the NIF's purpose and the purpose of this experiment. From Nature https://www.nature.com/articles/d41586-022-04440-7
"Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”
How would it help build better weapons? I thought the existing thermonuclear bombs do the job perfectly fine. Sure, the military might want to make them a bit smaller or a bit cheaper, but is it really such a big deal as to warrant a major announcement?
"Nuclear stockpile research" is only making weapons "better" in the sense that they are reliable despite not having been tested in decades. There's probably a component of "unemployed nuclear weapons designers is a bad thing" also.
The fusion for power experiments are using the same laser equipment but different targets and sensors.
I guess they also don't want to blow up random civilians by accident again.
When Castle Bravo was tested, we didn't know that lithium-7 fusion was possible and that it would generate energy. The bomb had a lot of lithium-7 because it was cheaper than lithium-6. Castle Bravo then proceeded to explode with way more power than intended; it vaporized the measurement instruments, ruined the test site, damaged civilian property, and caused a horrible amount of fallout that screwed over an enormous number of people from more than one country.
Even during war, I suppose you want your explosions to behave in the way you expect... so you need to figure out all the physics related to them.
Given the limitations on nuclear research for weapons purposes, any information that can be gleaned from these experiments that is 'dual use' is more than welcome with the parties that are currently stymied by various arms control agreements. This is also why you will see a lot of supercomputer capacity near such research: it allows simulation of experiments with high fidelity rather than running the experiments themselves. These are all exploitations of loopholes. The biggest value is probably in being able to confirm that the various computer models accurately predict the experimental outcomes. This, when confirmed at scale, will allow the computer models to be used for different applications (i.e. weapons research) with a higher level of confidence.
Presumably these computer models are mostly useful for creating new designs (since the old designs were proven by real tests). Would such new designs be convincing enough to the adversary to fulfill its role as a strategic deterrence?
When (in XX years?) almost all US nukes are only simulated on computers and not actually tested, the Russians may start wondering if the US arsenal actually works, no? That would be a horrible outcome, since it means the Russians would be taking somewhat greater risks in their decision-making. Wouldn't that far outweigh any operational or financial benefits the newer designs offer?
I suppose one could argue that if the loss of confidence in strategic weapons matched the actual loss in reliability, it might be a "no op" (although even this is arguable). But if the Russians think the US simulations suck, while the US is actually building really good simulations, the loss of confidence would be greater than the actual loss in reliability. In the extreme case, the nukes work great, but everyone thinks they are scrap metal.
Of course, the same happens in reverse: if the Russians are upgrading their weapons to untested designs, the US may start underestimating the risk.
> the Russians may start wondering if the US arsenal actually works, no?
If anything, the last year or so has probably had the reverse effect, and the US and its adversaries likely both have very high confidence that the US arsenal actually works.
Or: the US finds that after X years bombs degrade in unexpected ways, and while we were able to figure this out and fix it, we speculate that the Russians probably haven't fixed theirs in the same way, so their bombs aren't good anymore. Which means the risk of a nuclear war just jumps up, since MAD is compromised.
To learn about their degradation modes and how to maintain them (since full-scale nuclear tests are now verboten, and subcritical experiments only deal with the fission part of the entire assembly -- we now also need some experimental setup to test the fusion part: radiation pressure, X-ray reflection, ablation modes, etc. NIF is this setup).
Also, there is a constant need to improve the fusion/fission ratio in the total energy output, and perhaps eventually design pure fusion weapons, though this is still probably out of reach.
The USA no longer has the technical capability to manufacture the necessary parts to maintain the current stockpile. The whole strategic arms reduction treaty regime is basically a fig leaf to cover for the fact that we have to cannibalize some legacy weapons to maintain the rest. If nothing changes the USA probably won't have an effective arsenal within a century. Given the likely state of the USA by that time, that's probably for the best.
> > USA probably won't have an effective arsenal within a century.
> If other countries joined, it would be a great outcome.
Why the optimism? Without MAD, it's nearly certain that we'd have a world war at some point in time. Sooner or later, it will surely happen. If you think it won't happen, or won't cost millions of lives, or won't employ re-developed nukes eventually, please tell me why you think so. (No sarcasm.)
I don’t think MAD is all that effective. At this moment in time we’re seeing:
* Several concurrent arms races in the Middle East, Asia, and Europe
* A high intensity conflict in Ukraine
* China threatening a land invasion into Taiwan
MAD might be preventing a country like Poland from jumping into the Ukraine conflict, but more likely it’s because of its involvement in NATO.
I think collective security organisations are a far more potent force for peace than nuclear weapons. If countries abided by their security agreements in WW2, then we’d have nipped the entire thing in the bud.
> I don’t think MAD is all that effective. At this moment in time we’re seeing:
> * Several concurrent arms races in the Middle East, Asia, and Europe
> * A high intensity conflict in Ukraine
> * China threatening a land invasion into Taiwan
I mean... Only one of those is an actual fight. And there MAD doesn't apply because the defender doesn't have the Assured Destruction capability needed.
> Several concurrent arms races in the Middle East, Asia, and Europe
Which is to say, possible future proxy wars between the great powers where MAD will supposedly restrict conflict intensity. See below.
> A high intensity conflict in Ukraine
What's going on in Ukraine is a bog standard cold war style proxy war. The NATO plan is basically to turn it into another Afghanistan for the Russians. It's the exact thing that MAD is meant to keep from spilling over into a world war between the principals.
> China threatening a land invasion into Taiwan
This is more interesting. US conventional forces almost certainly have no hope of beating China that close to home. Therefore, any effective US response would require nuking China and China is presumably deterring that with their nukes. There is an argument to be made here that a non-nuclear Chinese military would be in Taiwan's best interests. However, I see no scenario where either a nuclear or non-nuclear China and a non-nuclear USA is in Taiwan's best interests. So while the MAD case isn't the best case for Taiwan here, it's also not the worst.
There are a lot more armed conflicts throughout the world that are proxies or partial proxies between the various powers in the world (US, Russia, and more). See Yemen, Syria, and many more throughout Africa. World powers tend to get on one side or another as they see their interests align and oftentimes this prolongs the conflict rather than bring it to any resolution.
I'm not really optimistic about this option, just a bit of wishful thinking about peace upon the world and such. OTOH, even with MAD we still have wars and probably much more than 100K people on average get killed in wars every year. Even with MAD, NATO is pushing conflict with Russia well beyond a proxy war at this point. MAD doesn't work if the world goes mad...
> 1. How does this research help address this problem?
I see it the other way around, this problem makes me doubt that this research will ever actually lead anywhere, supposing it's even as good a result as it first appears.
> 2. What are the sources for your opinion?
It's a fact. And my source is dead tree media. I don't recall all the details, but there are some very finicky parts that go into a state of the art warhead and we have lost the capability to manufacture them. Is this really so surprising? We can't even build new F-22s anymore!
The most famous key component that was speculated about or leaked was the Styrofoam that holds the primary and surrounds the secondary: we lost both its makeup and the reasoning behind it.
This research successfully initiated fusion using a capsule of hydrogen made of some material, surrounded by something, with an outer layer. This outer layer is turned into X-rays by the laser, which then ablate the hydrogen capsule's casing, causing the inward pressure. You could speculate that they just found the makeup for something that would replace the Styrofoam, or simply improved upon it.
From what I can tell this is mostly a useful cover story as a fallback for the moon-shot type energy potential. But are there serious downsides to having this be the case? Does it significantly distract from the pursuit of energy? Or somehow fundamentally constrain the project's scope?
Exactly the opposite: their task, assigned by Congress, is weapons work. Period.
But pretending to work on "carbon-free energy" is good for funding, just now. Four years ago, being all about weapons opened the tap.
Make no mistake, there is no story here. There will be no "unlimited free energy" from this, or any other fusion project. The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.
We know the way to get unlimited free energy: solar. Build more, get more. It doesn't have bomb scientists making inflated claims; it just works, and better every year.
> The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.
You seem to be awfully certain of that. I'm not an expert in this area, but my understanding is that the MIT ARC reactor is planned to use FLiBe as a liquid coolant that absorbs heat/neutrons/etc. from the fusion reaction and is pumped into heat exchangers to boil water to run turbines. I mean, maybe there are some details not worked out and maybe I'm misunderstanding how it works, but it seems like a plan to generate electricity to me.
There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant, but there's no conceptual reason they couldn't do it. (Not that ITER is a great example; the design is already antiquated before it's even finished.)
Driving steam turbines, even with other costs at zero, leaves you uncompetitive with renewables. But other costs would be very, very far from zero. Extracting the grams of tritium at PPB concentration dissolved in 1000 tons of FLiBe every day so you have fuel for tomorrow is an expensive job all by itself.
Making a whole new reactor every year or two because it destroyed itself with neutron bombardment is another.
You seem to forget that storage also has no operating expense.
The cost of operating a steam turbine far exceeds the cost of the coal or uranium driving it. But the steam turbine would not be the only operating expense for fusion. We don't know exactly what it would cost to sieve a thousand tons of molten FLiBe every day to get out the tritium produced that day, because no one even knows any way to achieve it at all. But it would certainly be a huge daily expense, if achieved.
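For a sense of scale on the concentration involved, a quick bit of arithmetic (treating "grams of tritium" as roughly 3 g, purely as an illustrative assumption):

```python
# Mass fraction of one day's bred tritium in the FLiBe inventory (illustrative numbers only).
flibe_mass_g = 1000 * 1_000_000   # 1000 tonnes of FLiBe, expressed in grams
tritium_per_day_g = 3.0           # assumed "few grams" of tritium bred per day
ppb = tritium_per_day_g / flibe_mass_g * 1e9
print(f"~{ppb:.0f} parts per billion by mass")   # ~3 ppb
```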
Combined-cycle gas turbines are fine for backing up renewables and storage. They are used for that today. As the amount of renewables and then storage built out increases, the amount of time the gas turbines must run, and thus total operating expense, declines.
It should be clear that to build out storage when there is not surplus renewable generating capacity to "charge" it from would be foolish. The immediate exception is to time-shift renewable energy generated at midday peak for evening delivery, as is being done successfully today.
Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.
Capital expense of renewables is very low already, and still falling. Even substantial overbuild to charge storage from does not change this. Cost of various forms of storage is falling even faster. By the time much storage is needed, it will be very cheap.
> Combined-cycle gas turbines are fine for backing up renewables and storage.
So are plain steam turbines, if you have cheap steam.
> Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.
Huh? Combined cycle setups use steam turbines as part of the system. Steam turbines can ramp up and down plenty fast. It's traditional heat sources that don't ramp well.
> By the time much storage is needed, it will be very cheap.
That would be nice but I'm not depending on it, and I'm definitely not going to assume that long term storage will ever be cheaper than steam turbines.
> There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant
Interestingly, that's not really true: IIRC the Japanese team working on the WCCB breeder module (that uses supercritical water as coolant) plans on connecting the water loop to a small turbine. If they succeed it would be the first ever electrical power produced from fusion.
At least ITER has a plausible, in a sci-fi story sense, way of creating a sustainable fusion reaction to which more fuel could be given. The fact the neutrons produced, if it did reach the high Q ratios, would irradiate and break down everything involved, is probably 2070's scientists and engineers problems to solve.
Why won't fusion still be a worthwhile goal even if we do have abundant solar/wind?
Abundant and cheap are relative terms. Solar and wind could be abundant in a "powers everything we have now and for the foreseeable future of population growth" sense, but maybe not in a "gigantic power-hungry megaprojects which aren't remotely possible today" sense?
Most likely, spacecraft will rely on power delivered via laser.
If anybody succeeds in working out D-3He fusion, that could work in a spacecraft. (D-T, no.) We could probably scare up enough 3He to use for that, if there weren't too many of them.
If anybody in the universe is doing interstellar travel, I think they would have developed D-D fusion, which is somewhat more difficult than D-³He or D-T fusion but probably possible with a scaled-up version of the same machine.
Outside the frost line there is a lot of water and a higher percentage of D relative to H, so it seems possible to "live off the land" between the stars without being dependent on starshine. A D-D reactor would produce ³He and T; a lot of those products would burn up in the reactor because the reaction rates are high, but it would probably be possible to separate some of them out and use it as a breeder reactor that makes fuel for D-³He and D-T reactors elsewhere. I could picture the big D-D reactor running on a large comet or dwarf planet like Pluto, producing D-³He for smaller reactors on spacecraft. (D-T not only produces a lot of neutrons, but the T has a half-life of 12 or so years and won't last for long journeys.)
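On the "won't last for long journeys" point, the decay arithmetic makes it concrete (the 100-year transit time is just an illustrative assumption):

```python
# Fraction of tritium remaining after a long transit, given its ~12.3-year half-life.
half_life_years = 12.3
journey_years = 100.0   # illustrative interstellar transit time (assumed)
fraction_left = 0.5 ** (journey_years / half_life_years)
print(f"{fraction_left:.2%} of the tritium remains after {journey_years:.0f} years")  # ~0.36%
```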
My guess is that interstellar travelers would develop a lifestyle that works around the frost line, where generic bodies above a certain size have liquid water inside. If they were grabby they might consume Ceres or Pluto but might not really care about dry, idiosyncratic worlds like the Earth and Mars.
Anybody doing interstellar travel should hang their collective head-analog in shame if they haven't mastered aneutronic p-11B fusion yet. (They will need to have figured out how to reflect X-rays.)
Having got used to spending interminable ages out in the infinite chill void, they probably have come to prefer being there, so have no desire to roast deep in a stellar gravity well. Their equipment might not even work if warmed too much.
I think parent was talking about space applications.
Anyway, unlike fusion, seasonal thermal storage is viable and available now, and will be scaled up in the immediate future. Also, with electric vehicles inducing massive investment into the grid, there will be both pressure and resources to solve the rest.
Since we're talking in present tense that's the remaining 1%.
A Moon base can be fine with power beamed from a satellite, or plain mirrors in orbit; there is no atmosphere in the way. That might end up being cheaper than hauling a nuclear reactor there plus all the infra to reliably dump waste heat from it.
It works super-great, collected in the tropics and shipped in chemical form. Before you object to depending on imported liquid fuel, consider that most of the world does already.
The main difference is that literally anybody can make it, not just "oil exporting countries" and "fuel refiners". And, will. And export excess production when local tankage is full.
I do software for a living so that gives you my level of ignorance of the real world.
I do have two questions about solar:
- is it “drivable” / “pilotable”?
meaning reacting to surges in the grid? My understanding is that this feature is highly desirable for a grid.
- can we actually build enough solar panels, physically?
Don’t we need some rare-earth thingy that is not available in sufficient quantity on our planet, as far as we know? (Follow-up: if there is enough, will there be enough in 200 years?)
The "rare-earths thingy" is a common, transparent lie told frequently about both solar and wind. No rare-earths are used in any solar panel. Some wind turbine generators have used them, but not the big ones. And, "rare-earths" are not in fact rare. So, a double lie.
Solar panels provide cheap power generation on a schedule. For dispatchability, you rely on storage. There are many different kinds of practical, efficient storage; which are used where will depend on local conditions. Which will be cheapest isn't clear, but probably not batteries. Batteries used won't need lithium, or rare-earths, either.
The lie most frequently repeated is that storage needs some sort of "breakthrough". Second is that the small amount built out means more than that there is not enough renewable power yet to charge it from; when there is will be time to build it. In the meantime, we fill in with NG burning. The third is that "pumped hydro", the most common used just now, needs "special geography". Hills are very common.
The lie most frequently repeated about solar is that there is any shortage of places to put it. It is most efficiently floated on water reservoirs, where it cuts evaporation and biofouling, although efficiency is only one consideration. It shares nicely with pasture and even crop land, cutting water demand and heat stress without reducing yield.
There will never be any shortage of wind or solar: need more, build more; materials needed are all abundant. Likewise storage. Costs are still falling as fast as ever, but are already lowest of any energy source ever known.
They are in ores, but are mixed with other lanthanides that they are expensive to separate from. Two of them, yttrium and scandium, are not lanthanides and are relatively easy to separate out.
A new powerfully magnetic iron-nickel alloy may eliminate much of the market for several of them.
In regards to your first question, the word you're looking for in google-able energy industry jargon is "dispatchable". And yes, dispatchability of intermittent generation is achieved in a couple of ways in contemporary electricity networks:
1. Deliberately backing off wind or solar generation from full capacity to provide reserves for demand spikes, transmission/generator outages, etc. This means other generation that may otherwise not have generated at all over that period, is brought online to cover the shortfall.
2. Co-locating grid-scale batteries at intermittent generation sites ("hybrid generation facilities" in energy industry jargon) to cover short-term contingency events.
Thank you, not my industry and not my language, so “dispatchable” is a valuable keyword for me. (It would be “pilotable” in French, if you ever have to discuss that abroad with my snotty kind.)
Anyway. What I read is: having something else on the side can make solar dispatchable.
Realistically, what would those other things be?
Nuclear doesn't like to be turned on/off.
Wind has the same issue… are we saying the good ol' coal-burning kettle?
If they can get a continuous fusion reaction going, converting the heat energy from that to electricity won't be a problem. Getting a contained fusion reaction that gives out more energy than input is the problem; how to convert that into electricity is not going to be a problem.
> If they can get a continuous fusion reaction going, converting the heat energy from that to electricity won't be a problem.
Even accepting the qualification that's not just a mere matter of engineering, capturing that heat from a source that hot is not without trouble. A bit like how there is plenty of energy in a single lightning strike and yet we can't easily catch it even though in principle 'just build a large enough capacitor and connect it to a lightning rod' is a workable recipe.
> Getting a contained fusion reaction that gives out more energy than input is the problem
Not in the least because the container itself is a very hard problem to solve.
> how to convert that into electricity is not going to be a problem.
It is also a problem, albeit a lesser one.
The better way to look at all of these fusion projects is a way to do an end run around arms control limitations with as a very unlikely by-product the possible future generation of energy. But I would not hold my breath for that. Meanwhile, I'm all for capturing more of the energy output by that other fusion reactor that we all have access to, and learning how to store it over longer periods. Preferably to start with a couple of days with something that doesn't degrade (think very high density super capacitor rather than a battery), but I'll take advanced battery technology if it can be done cheap enough per storage cycle. We're getting there.
It doesn't make heat. It makes fast neutrons. Turning them into usable heat is a project of its own.
Turning dumb heat into electric power is expensive. Nothing that depends on doing that can ever compete with wind and solar, anymore.
Tritium doesn't grow on trees. Making it by blasting those hot neutrons into a thousand tons of FLiBe is easy enough. Getting your few grams a day, at PPB concentration, out of that thousand tons of stuff is... nobody has any idea how. But you need to, to have fuel for tomorrow.
No, there won't be any of that. It would be fantastically more expensive than fission. Fission is not competitive, and gets less so by the day. Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).
Saying that there are unknown engineering challenges is kind of a "duh"; otherwise we wouldn't be researching, we would be implementing. As you also mentioned, there are other alternatives we could consider besides tritium.
> Fission is not competitive, and gets less so by the day.
> Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).
We genuinely don't know if fusion is a money pit or not, because we don't have any idea how much a successful form will cost. Tritium blankets may be easy or not. Maybe Helion's D-3He will have a breakthrough. Maybe it's ICF.
I've not seen suggestions by anyone that wind and solar build-outs stop, or get diminished. Indeed, at this point, because the costs are low, industry will continue to invest in them regardless.
However, we will need a lot more energy production than folks think. We need to decarbonize the atmosphere. And that's going to require a lot of power.
All that aside, solar and wind are not getting you to mars in a timely fashion. We have reasons to research fusion that escape large commercial power generation.
We certainly need a lot more energy production, but to decarbonize the atmosphere there has to be a target level with an evidential basis on which to proceed. If it's considered (as by many) that we have a climate emergency (though this is not reported at all in the IPCC report), then any decarbonization at all will obviously serve for starters. If this is not the case, then there are other considerations, such as the fact that as CO2 levels have risen so has global food production - about 30% over the last 30 years.
Simulations with multiple global ecosystem models suggest that CO2 fertilization effects explain 70% of the observed greening trend, followed by nitrogen deposition (9%), climate change (8%) and land cover change (LCC) (4%). CO2 fertilization effects explain most of the greening trends in the tropics, whereas climate change resulted in greening of the high latitudes and the Tibetan Plateau.
https://sites.bu.edu/cliveg/files/2016/04/zhu-greening-earth...
This is not a surprise given that carbon is needed for plant growth, a fact well understood by commercial growers who pipe CO2 into their greenhouses. So one issue might be, if decarbonization is successful then what might be the acceptable level of reduction in global food supply?
Another issue relates to temperature. From an analysis of 974 million deaths in 384 locations across 13 countries it’s been concluded that twenty times more people die from the cold than from the heat. https://composite-indicators.jrc.ec.europa.eu/sites/default/... A recent paper (Dec 12 2022) regarding heart attacks states “extreme temperatures accounted for 2.2 additional deaths per 1,000 on hot days and 9.1 additional deaths per 1,000 on cold days.” Circulation. doi.org/10.1161/CIRCULATIONAHA.122.061832.
Do any of the reports present an ethical problem? No, they do not given an extreme interpretation of climate models.