A tokamak is a kind of fusion reactor, and basically the most prevalent.
"100M degrees" (Kelvin) corresponds to 10 KeV (kilo electron volts), which is an important figure to exceed for D-T fusion. D-T fusion which is the kind of fusion the ITER Tokamak (a forthcoming fusion reactor and international megaproject) intends to demonstrate.
An older fusion experiment, JET (Joint European Torus) reached these levels, so this does not break new ground, but it is important if this Chinese Tokamak is going to provide data useful for ITER.
I will note that it's rather unusual to refer to plasma temperature in Kelvin rather than in KeV. I edited this comment with a few more details to try to make it easier for laypeople to understand.
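For anyone who wants to sanity-check that conversion, here's a minimal sketch (my own, not from the article), assuming the usual plasma-physics convention of quoting k_B·T in electron-volts:

```python
# Rough sketch of the conversion above: plasma "temperature" quoted as k_B * T,
# expressed in electron-volts. Order-of-magnitude only.
K_B_EV_PER_K = 8.617e-5  # Boltzmann constant in eV/K

def kelvin_to_kev(t_kelvin):
    """Characteristic particle energy in keV for a plasma at temperature t_kelvin."""
    return K_B_EV_PER_K * t_kelvin / 1e3

print(kelvin_to_kev(100e6))  # ~8.6 keV, i.e. roughly the 10 keV quoted above
```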
My Aussie friend, a scientist, said that after about 15 years in the US he still used C for temperature in the lab but F for weather, and it took him a while to realize he had flipped and that the two things had basically no relationship in his head.
Kind of a tangent, but I still firmly believe that for Earth weather conditions, Fahrenheit is a more intuitive scale. 0 is a really, really, really cold day and 100 is a really, really, really hot day. 50 degrees is neither especially warm nor especially chilly, 70 degrees is warm, 30 degrees is chilly, 20 and below are truly cold, 80 and above are truly hot, and if you go into negative numbers or above 100 degrees then you're in the "extremes".
Cold and warm are relative. 0 for freezing / snowing and 100 for boiling gives you a much more understandable range. Honestly, US metrics don’t make any kind of sense anymore...
While it's true that 100C doesn't have any weather meaning, 0F being "very cold" isn't particularly objective. I just looked it up, and apparently 0F is only about -18C. Whether -18C is "very, very cold" depends a lot on where you're from and what you're used to. I'm Canadian and I certainly wouldn't characterize -18C as "very, very cold". -30C maybe. 0C has a fairly objective interpretation in terms of weather: it's the point at which puddles start to freeze into ice slicks.
Yes, they are relative. 0 for freezing and 100 for boiling works well for chemical reactions and cooking. 0 for extremely cold weather and 100 for extremely hot weather works well for knowing what to wear before going outside. Celsius requires smaller numbers with decimals for similar weather precision.
Some. Not the physicists or astronomers, but some Americans who work with data from non-scientists regularly stay with the units in the data. Some engineers, those who are dealing with older equipment, stick with the units used when the equipment was built. US pilots and aerospace generally still talk in knots and feet of altitude.
Mountain climbers in the US typically also use feet for elevation. Compasses sold in the US also have different markings for measuring the Universal Transverse Mercator grid (1:24,000 and 1:50,000 in the US) and rulers (inches, feet). 1:24,000 is used on USGS 7.5-minute maps, and I believe 1:50,000 originates from an older map series, both of which have topographical lines marked in feet instead of meters. Altimeters sold in the US also customarily use feet, although the digital ones can be switched to meters.
You're right, the degree was removed in 1967, because the kelvin is considered an absolute unit. However, one could argue that, since it has a scale, each "step" or "unit" can be considered a "degree", as on a literal scale. In informal conversation, we should not be so pedantic. Even some modern scientific textbooks still include the degree.
I remember seeing an infographic a while ago where some very high temperature of around 10 million kelvin was displayed as "9999728 degrees". I thought it was mildly amusing.
It must've been, "99,999,999,726 C, the temperature inside a newly formed neutron star"! It's even more preposterous than I remembered. It's not even rounded correctly!
But then again I just noticed that I was off by one in my parent comment, so I shouldn't throw any stones I suppose.
You seem to know what you're talking about - do you know what the end goal of the work in this field is? Is it working towards nuclear fusion as an energy source?
The specific EAST reactor [1] mentioned in the article is a testbed that will enable new technologies to be used on the ITER project. The ITER project [2] is currently the largest fusion power research project underway (and the largest reactor under construction). ITER's goal is to provide research that enables new technologies to be used on the DEMO project. The DEMO project's goal [3] is to provide commercially available power plants utilizing nuclear fusion.
From ITER's wikipedia page:
>The goal of ITER is to demonstrate the scientific and technological feasibility of fusion energy for peaceful use. It is the latest and largest of more than 100 fusion reactors built since the 1950s. ITER's planned successor, DEMO, is expected to be the first fusion reactor to produce electricity in an experimental environment. DEMO's anticipated success is expected to lead to full-scale electricity-producing fusion power stations and future commercial reactors.
And from DEMO's wikipedia page:
>As a prototype commercial fusion reactor, DEMO could make fusion energy available by 2033.
Any idea on how much money we're spending on this?
Also, I can imagine it's a joint project only partially because we can share the cost, I imagine another reason to work together is so that no one country gets this technology first.
Several billions of Euros, with Europe paying the largest share, but the reactor is in France so we get much of the secondary benefit (i.e. scientists and engineers spending their salaries there, construction work).
The initial budget was €5bn. The current budget is four times that, and with completion nowhere near, estimates of the final cost run as high as $60bn. Go figure.
If by "we" you mean US's share, that's 9% of total costs. China, India, Japan, Russia, South Korea, and the US are paying 9% each and EU is paying 46%.
With 'we' I meant every country that's involved, not just my own country.
$60bn is actually surprisingly little for research that could change the future of energy production and possibly society as we know it. To put it into context, the Apollo program cost $200bn in today's money and a high speed train between LA and SF is projected to cost $100bn.
$1 trillion is not a lot of money for the US. the Congressional Budget Office estimates that interest spending on US public debt will hit $915 billion in 2028[1].
That doesn't mean 1 trillion isn't a large amount of money. That means the US spends a trillion dollars servicing its debt. As it so happens, the US will be spending a lot of money servicing debt.
That's a very misleading figure, as inflation both directly reduces the real burden of the debt payments and makes spending in 2003–2011 hard to compare with spending in 2028.
Except inflation is directly related to the amount of new debt issued, because that's the only source of M1 money supply. If the US Treasury didn't borrow from the Federal Reserve and instead printed its own currency, this would be different.
It's interesting to note that on the ITER FAQ page, they mention that 90% of contributions are to be delivered "in-kind", in the form of parts and buildings, instead of cash.
> The DEMO project's goal [3] is to provide commercially available power plants utilizing nuclear fusion
That's not true unfortunately. Yep it's planned to be attached to the grid, but it won't be a production-ready power station. That will be PROTO [1].
Basically the whole schedule slipped further, the US pulled out of ITER so they had to scale it down, then it was delayed, so as things stand now DEMO will still be a testbed. Some recent DEMO design notes can be found here [2]. A bit dated, but if anything an optimistic outlook can be found here [3] at page 8. Note that DEMO is still thought to need to "resolve" some issues.
I think calling 2033 highly unlikely would be an understatement.
https://en.wikipedia.org/wiki/ITER: ”Initial plasma experiments are scheduled to begin in 2025, with full deuterium–tritium fusion experiments starting in 2035.”
So, according to Wikipedia, DEMO will build on ITER’s results, but will produce energy before ITER’s first real fusion experiment starts.
If global warming turns out to be more significant than is predicted now, I believe they will suddenly invest 100x more in this technology. This and nuclear plants are the only reasonable ways to have energy available for 'CO2 removal' projects...
More money means you can buy better gear, hire more workers, complete projects faster and run multiple parallel sites to complete various goals at the same time.
Of course there is some point where adding more funding will not advance the speed as much anymore, but I doubt we're even close to that point at the moment.
Why is it so underfunded? The first country to build one will be taking a big step toward reducing its dependence on other countries for energy. I can understand big oil exporters not wanting to push it too much (although it's unlikely to come to fruition for the current generation of politicians), but for others isn't it a no-brainer?
Historically, fusion has been bad at predicting when it's going to be ready for prime time, and many of the very early fusion experiments turned out to be duds. Just think of the cold fusion hype (on a side note: cold fusion does work, µCF replaces electrons with muons in the hydrogen atom, which shrinks the radius of the atom such that fusion becomes possible at room temperature and far below, but generating the necessary muons is a fool's game) and the various other nuclear fusion failures.
Fission on the other hand could report a lot of results and success and, at the time, seemed to be infallibly safe.
Same reason preventative medicine is not practiced. The upfront costs are great and the outcome is not certain, leaving many hesitant to invest in something they may never benefit from. Especially if your country has more immediate pressing matters that you could fix right now with that money.
Another (completely stupid) reason is that it's "nuclear" energy. The general public has a hard time understanding that fusion and fission have very different risk profiles. I recall that after Fukushima some countries cut their budget for fusion research because nuclear energy is clearly bad and unpopular.
- Oil/coal/natural gas lobbyists and interests which hinder tax payer funded research.
- The fact that there is no guarantee we will ever figure it out and no idea whatsoever as to how much it will cost to figure it out. Investors like returns, in their lifetime, leaving largely tax payer funded research as the greatest source of funds... see above.
And then with government-funded research... if a government figures out fusion, what do you do with it? Do you license it to private industry? Do you make state-owned power plants?
If you give it to private industry, it's going to get to other nations. If it gets to other nations, you lose non-electrical power and create potential strategic issues, which means you are motivated NOT to share the technology.
It sucks.
I wish we could all just get along, fund stuff like this and space exploration, and get over petty politics before our species goes extinct.
I'd argue fusion is actually overfunded, if you go by its likelihood to actually deliver a competitive source of energy. A clean sheet energy R&D program would invest very little in nuclear fusion.
Given climate change, giving it to other nations is exactly the right thing to do, even from a purely self-interested perspective. Nobody wins if coal plants in China tip us into runaway warming.
And that's another reason ITER et al are very broad international projects: everyone wins when the project wins, and nobody can stall one project by poaching Von Braun for their own scheme.
Because we already have a functioning fusion reactor.
Utility-scale PV now costs only $43/MWh. Investing in developing fusion reactors makes very little economic sense compared with capturing the output of the fusion reactor we already have.
The research should still be done, of course. It can have benefits to a future interstellar civilization - but until we're interstellar, PV is far, far more compelling.
Cheap intermittent sources are sufficient to destroy the economic case for expensive baseload sources. The latter have to be able to sell their output most of the time or else their economic case collapses entirely.
Nine women can't have one child faster, but on average they can have about nine times as many children in nine months as one woman. It really depends what your goal is and the bottleneck in reaching that goal whether or not that analogy holds.
Sustainable high-temperature plasma is what we need to create fusion as an energy source. Current fusion processes rely on pulsing a very short instant of high temperature. This creates a brief instance of fusion, but takes heaps more energy than that quick moment of fusion releases. Thus, it's a net negative of energy. One way to do this that the US uses for nuclear weapons research is to zap a small bit of gas with a tonne of high energy lasers all at the same instant.
A high temperature plasma represents a continuous supply of fusing atoms. The current research at ITER, this place, etc. is an attempt to create a persistent environment for fusion. If they can do that, then it creates an environment where research can focus on 1) reducing the energy required to hold it at that temperature (which includes limiting how much plasma leaks out, since leaking plasma drops the temperature), and 2) working out ways to extract the energy created by fusion.
As I understand (and I could be wrong, it's been years since I last read about it), ITER plans to generate a net-negative energy situation (i.e. it'll never produce energy, just consume it) but hopes to create a sustainable plasma field at temperatures that cause fusion.
ITER will never produce electric energy but they plan to achieve Q=5, ie, output 5 times more energy than put in.
The more important work on ITER is work around enabling actual power stations using Q=5 and developing the tech to maintain operating fusion reactors (remote robots, etc.)
At that point can't you just store the output current in supercapacitors for later use anyway? You can even feed it into the machine itself. Supercapacitors easily have better than 20% efficiency for each cycle.
No, ITER doesn't have any planned way to make electricity, the power it outputs will be largely thermal and radiative. Engineers will likely put up a cooling system to measure the thermal output, especially after putting up the various ways to turn the major neutron radiation into heat (which is a fun way to produce energy with rare materials as a byproduct).
DEMO, to my knowledge, will then include an actual electric generator to be hooked up to the fusion core.
As I understand, this represents sufficient energy to overcome the Coulomb barrier [1], which naturally pushes nuclei apart. To cause fusion, you need to push particles together either hard enough or fast enough that they push through this repulsion and fuse. The repulsion is a product of the electrostatic repulsion of the positive charges of the nuclei (essentially like pushing the matching poles of two magnets together).
The temperature of a gas is essentially a measure of the constituent particles' kinetic energy. Higher kinetic energy = higher temperature. 10 keV represents enough kinetic energy for the D and T nuclei to collide fast enough that they overcome the Coulomb repulsion and fuse together.
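To put a number on how big that barrier is compared with the 10 keV average, here's a hedged back-of-envelope sketch (my own touching-spheres estimate, not a figure from the thread):

```python
# Very rough estimate of the D-T Coulomb barrier height, modelling the two
# nuclei as touching spheres of radius r0 * A^(1/3). Real barrier calculations
# are more involved; this is just to show the scale (~hundreds of keV).
E2_MEV_FM = 1.44  # e^2 / (4*pi*eps0) in MeV*fm
R0_FM = 1.2       # nuclear radius constant in fm

def coulomb_barrier_mev(z1, a1, z2, a2):
    r_fm = R0_FM * (a1 ** (1 / 3) + a2 ** (1 / 3))  # separation when surfaces touch
    return z1 * z2 * E2_MEV_FM / r_fm

print(coulomb_barrier_mev(1, 2, 1, 3))  # ~0.44 MeV, i.e. ~440 keV for D-T
```

So the barrier sits at roughly 40x the average particle energy, which is why the high-energy tail and tunneling (discussed below) matter so much.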
A system of particles in thermal equilibrium will contain a small fraction whose kinetic energy exceeds the average by a factor of 10. Also, the energy available in the center-of-mass frame of two colliding particles is higher if they happen to be moving in opposite directions. That could give you another factor of up to 2. In any event, I don't think you want your fuel fusing all at once!
Expanding on this, this 10 keV temperature is the average of all particles, and there's a distribution around this average. Some will be higher, and thus more capable of colliding with high energy.
In addition to that, whether two nuclei fuse is also dependent on how squarely they collide. A glancing blow intuitively allows both nuclei to push each other away a lot easier than if they experience a head-on collision.
A graph of the fusion rate such as [1] shows that even at much lower average temperatures, fusion will occasionally happen when two higher-than-average nuclei collide head-on. As the temperature gets higher, this rate increases as more particles have enough energy and fewer require those head-on collisions. The peak rate for D-T according to this graph is at about 70 keV.
I can't speak to why the grandparent's link referenced 100 keV, as it's been a decade since I last studied this and I'm very rusty.
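A minimal sketch of that tail argument, assuming an ideal Maxwell-Boltzmann energy distribution and treating the quoted temperature as k_B·T (real tokamak distributions deviate, so this is illustrative only):

```python
import math

# Fraction of particles in a Maxwell-Boltzmann gas with kinetic energy above
# x = E / (k_B * T). Derived from the upper incomplete gamma function Gamma(3/2, x).
def fraction_above(x):
    return math.erfc(math.sqrt(x)) + 2.0 * math.sqrt(x / math.pi) * math.exp(-x)

# At a 10 keV temperature, x = 5 and x = 10 correspond to 50 keV and 100 keV.
for x in (1, 5, 10):
    print(f"E > {x} * kT: {fraction_above(x):.3%}")
```

Even at ten times the average energy there's still a small but non-zero population, and that tail is where most of the fusion reactions come from.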
How does one control the chain reaction of the fusion process? I understand fission reactors using control rods to absorb some of the neutrons to prevent those neutrons from hitting other fissile particles, but this seems like a harder problem. Those particles fusing at a "low" temperature in the distribution of equilibrium cause additional fusion reactions at higher temperature thresholds in other particles because of the energy released, if my mental model is correct? And assuming the fuel is gaseous, the idea of a control/absorptive retarder seems like a much harder problem. Edit: Oh! Maybe they reduce the strength of the magnetic containment field? Which reduces pressure inside the reaction chamber and thus reducing temperature?
In addition to this there is also a quantum effect called quantum tunneling [1] which allows for a very tiny probability for particles with insufficient energy to fuse upon collision anyway.
The Wiki entry also mentioned "two effects that lower the actual temperature needed", one being average kinetic energy and the other quantum tunneling "if [nuclei] have nearly enough energy." The term "nearly" isn't precisely defined though.
There isn’t a hard limit. But as you get farther away from the amount of energy that would be “enough” without tunneling, the probability of tunneling falls off exponentially.
Quantum tunneling effects are generally (maybe always?) exponentially unlikely, in the sense that the probability of tunneling will decay exponentially with the discrepancy between energy-available and energy-required.
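To illustrate how steep that exponential falloff is, here's a sketch using the simple WKB/Gamow estimate for two bare nuclei (non-relativistic, no electron screening; the D-T constants are my assumptions):

```python
import math

# Tunneling probability scale exp(-2*pi*eta) for two colliding nuclei, where eta
# is the Sommerfeld parameter. A crude sketch, valid well below the barrier.
ALPHA = 1.0 / 137.036                                   # fine-structure constant
AMU_MEV = 931.494                                       # 1 u in MeV/c^2
MU_DT = (2.014 * 3.016) / (2.014 + 3.016) * AMU_MEV     # reduced mass of D-T

def tunneling_probability(e_kev, z1=1, z2=1):
    e_mev = e_kev / 1000.0
    v_over_c = math.sqrt(2.0 * e_mev / MU_DT)           # relative speed (non-relativistic)
    eta = z1 * z2 * ALPHA / v_over_c                    # Sommerfeld parameter
    return math.exp(-2.0 * math.pi * eta)

for e in (1, 10, 50):
    print(f"{e:>2} keV: ~{tunneling_probability(e):.1e}")
```

Going from 1 keV to 10 keV of centre-of-mass energy buys roughly ten orders of magnitude in tunneling probability, which is the whole game.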
They may be trying to differentiate between high temperature as "fast", like in this case, and high pressure as "hard". Though both can lead to a particle kinetic energy above the Coulomb barrier, in the former case the time between collisions will be lower (AFAICT), which may be important. I'm not a plasma guy though.
Sorry, I should've been much clearer, that was really badly worded as I was distracted also doing my day job :) As a sibling comment pointed out, I wasn't using any specific terminology as I was trying to talk at a layman level. There's not really a difference. I should've said 'hard enough and fast enough'.
Conceptually, there's two main methods of fusion - inertial confinement, and magnetic confinement.
In inertial confinement, lasers are shot at the plasma to squeeze it together so the pressure (and thus temperature) increases until fusion occurs. This is also what happens in stars, except they use gravity, not lasers. Conceptually, this is what I alluded to when I said pushing it 'harder'.
In magnetic confinement, the pressure/temperature is increased by shooting electrical currents through the plasma, among other techniques that I'm not as familiar with. The volume hasn't really decreased, but the kinetic energy and pressure of the plasma have increased, so it's the same principle; this is conceptually what I alluded to when I said pushing it 'faster'.
In either sense, the idea is to somehow increase the plasma temperature which increases the kinetic energy of the particles enough that they overcome the electrostatic barrier. It was just really badly worded by me and I apologise for that!
I guess 'hard' is what happens inside a star, where matter is compressed by gravity, and 'fast' is when you're accelerating matter. In the end it's the same, only you can't use gravity in small reactors to replicate star conditions yet, so you must accelerate particles.
SciFi likes to talk about antigravitation. But supergravitation would be cool as well :)
GP isn't using standard terminology, but I would presume "hard" was in reference to more mass, which results in higher kinetic energy at lower temperatures.
I don't know about CRTs, but 10 keV is not particularly energetic. A CT unit may operate in the ~100-200 kV range (so a max energy of 200 keV), while radiation therapy units operate in the 6-18 MV range (so electrons accelerated up to 18x10^6 eV).
I don't know much about fusion, but my guess is the 10keV is impressive because it is a self-sustaining fusion reaction rather than being impressive because of the absolute energy of the reaction?
edit: someone down the page mentioned that containment is the issue. In a CRT you are just accelerating an electron across a few thousand volt potential and slamming it into the screen.
>> edit: someone down the page mentioned that containment is the issue.
Right, and that was my understanding. It's easy to accelerate a particle or a stream of them to high energy. It's another thing entirely to contain a gas/plasma at those energies. The distinction becomes more obvious when they flip back and forth between impressive sounding temperatures and simple KeV measures.
Considering that multiple particle colliders can accelerate particles over 1 TeV (the LHC topping 13 TeV), yes, 10 keV isn't a very high number taken out of context.
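Right, the keV number only sounds small out of context. Here's a hedged sketch of the real distinction: one accelerated particle vs an entire plasma held at that energy (the particle count is my assumed order of magnitude, not a figure for EAST):

```python
# Compare the energy of a single high-energy accelerator particle with the total
# thermal energy of a plasma where *every* particle sits around 10 keV.
EV_TO_J = 1.602e-19

n_particles = 1e20                                  # assumed tokamak plasma particle count
plasma_energy_j = n_particles * 10e3 * EV_TO_J      # everything at ~10 keV
lhc_proton_j = 6.5e12 * EV_TO_J                     # one 6.5 TeV LHC beam proton

print(f"plasma thermal energy: ~{plasma_energy_j / 1e3:.0f} kJ")
print(f"one LHC proton:        ~{lhc_proton_j * 1e6:.1f} microjoules")
```

Accelerating a handful of particles to enormous energies is routine; keeping ~10^20 of them hot and confined at once is the hard part.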
The main challenge in working with these high temperature plasmas is confinement. In order to achieve nuclear fusion matter needs to be heated to immense temperature, so that the kinetic energy of nuclei colliding can overcome the electrostatic force of the protons pushing each other away and "fuse" into larger nuclei (held together by the "strong force"), converting a fraction of the reaction mass into a relatively large amount of energy in the process.
In order to keep the plasma at the temperatures where fusion can occur, rather extreme measures have to be taken. In the Tokamak approach, the plasma is placed in a toroidal vacuum chamber, and "suspended" in the center of the torus by using electromagnets that line the Tokamak chamber's walls. At such high temperatures the plasma is so energetic that it is very hard to contain such fast moving particles. If the plasma "escapes" the confinement and contacts anything (ie. the walls of the Tokamak) it rapidly cools down to temperatures below where fusion can happen.
The immense engineering challenge here is to heat plasma to ridiculous temperatures, and keep it confined in a very small volume at great temperature and pressure to mimic conditions that give rise to nuclear fusion in the center of stars.
>The immense engineering challenge here is to heat plasma to ridiculous temperatures, and keep it confined in a very small volume at great temperature and pressure to mimic conditions that give rise to nuclear fusion in the center of stars.
This is not exactly true. Inertial confinement fusion has conditions that are similar to stars. The engineering challenge for magnetically confined fusion is to keep the low-density plasma confined for long enough durations for fusion to occur.
For anyone interested in further reading, look up the Lawson Criterion.
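Seconding the Lawson criterion pointer; it's exactly why temperature alone isn't the headline number. A minimal sketch of the triple product, with an assumed D-T ignition threshold of roughly 3e21 keV·s/m^3 (a textbook ballpark) and made-up example parameters:

```python
# Lawson-style "triple product": density * temperature * energy confinement time.
# The threshold below is an assumed textbook ballpark for D-T ignition, and the
# example plasma parameters are illustrative, not measurements from EAST or any machine.
DT_IGNITION_TRIPLE_PRODUCT = 3e21   # keV * s / m^3 (assumed)

def triple_product(density_m3, temperature_kev, confinement_s):
    return density_m3 * temperature_kev * confinement_s

example = triple_product(density_m3=1e20, temperature_kev=10, confinement_s=1.0)
print(f"n*T*tau = {example:.1e} keV*s/m^3 "
      f"(~{example / DT_IGNITION_TRIPLE_PRODUCT:.0%} of the assumed threshold)")
```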
> If the plasma "escapes" the confinement and contacts anything (ie. the walls of the Tokamak) it rapidly cools down to temperatures below where fusion can happen.
Sounds relieving. I used to think that «if the plasma "escapes" the confinement and contacts anything (ie. the walls of the Tokamak) it rapidly…» disintegrates everything around or, when the power is huge enough, causes an apocalypse…
One of the beautiful things about nuclear fusion reactors is that they are inherently unstable at STP. In the event of a catastrophic failure, they will simply stop working (potentially after some large bangs).
Nuclear fission reactions can continue on their own for quite a while. This is one of the reasons they can be so dangerous.
At those temperatures, it will disintegrate whatever it touches. It's just that, unlike fission, fusion is unstable[1] so it quickly fizzles out, and damage will be local.
[1] Unstable in the sense that it is hard to maintain fusion conditions, not in the Hollywood sense that it blows up if you look at it sideways.
That description of destruction is entirely correct, it's just that the amount involved is tiny. Just like a big enough firecracker could destroy anything, but the ones we make just go pop.
This is not a very useful comparison. You can say that there is less energy in a stick of dynamite than in a chocolate chip cookie, and yet that stick of dynamite should still be handled carefully.
Yep even if it's hot enough it still needs the correct densities. It's akin to trying to compress a balloon with your hands. If enough 'hands' all push simultaneously it can work, but you can imagine the instabilities.
- When two hydrogen nuclei combine, they produce an enormous amount of energy. That process is known as nuclear fusion.
- Light nuclei have to be heated to extremely high temperatures, so it is challenging to create a controlled, safe fusion reactor that produces more energy than it consumes. Once we have one, we'd have a near-limitless source of clean energy.
- Nuclear fusion does produce radioactive waste. However, in contrast to fission-produced waste, it is short-lived and decays to background levels in a very short time.
- While the products of the fusion reaction are short-lived, operating a fusion reactor will activate materials in the reactor and create some longer-lived radioisotopes.
- Unlike a fission reactor, which is loaded with months to years worth of fuel, a fusion reactor would have fuel constantly injected. So operator action to stop injecting fuel would stop the nuclear reaction.
There is ongoing research about what to use as the chamber wall material. The difficulty is that the material needs to be able to withstand high temperatures, minimize the impurities released when hit by particles from the fusion plasma, and ideally produce short-lived and/or harmless isotopes when activated by the fusion radiation.
Unfortunately, the metals most used and best known in engineering have the tendency to produce pretty nasty isotopes. The current best candidates are tungsten based alloys.
Longer-lived would mean thousands of years for the steel structure until it can be handled by hand, and 50-100 years for remote handling.
> It was shown that wait times are required in the order 50–100 years for the remote handling recycling option and hundreds (Li4SiO4) to thousands (Eurofer) years for hands-on handling.
I recall from a talk a couple of years back ( so not sure I got it correctly) that after 50-100 years you can basically walk into the fusion reactor without having to worry about radiation.
As there seems to be quite a lot of confusion in this thread about what this is, here's an excellent video giving an overview of the state of the art in fusion energy research that is understandable by a lay audience: https://www.youtube.com/watch?v=L0KuAx1COEk
(somebody posted that video on another recent HN fusion thread)
Is it just me, or have we converged on 1 reactor design (the tokamak) relatively early on in the process? I appreciate that funds need to be concentrated in order to have an impact, but we have built dozens of this design since the 1950s, yet here we are.
And YC is invested in Helion Energy using pulsed inertial fusion. Though the tokamak seems closest to being functional. The MIT ARC design is theoretically supposed to be able to work, at least on paper:
>The ARC design aims to achieve an engineering gain of three (to produce three times the electricity required to operate the machine) while being about half the diameter of the ITER reactor and cheaper to construct. (Wikipedia)
It's not been built because of the $5bn or so cost, though given global warming / saving the planet type issues I'd be happy enough as a tax payer to have governments fund one and maybe knock 1% off the defence budget to counter that. It'd probably do more for world peace than churning out some more f35s.
The ARC reactor would have a net power output of something like 190 MW(e). At $5B, that's $26/W(e), which is an order of magnitude too high, especially given how high the operating costs are likely to be.
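For reference, the arithmetic behind that figure (whether ~$26/W really is an order of magnitude too high depends on what you compare it against, which this sketch doesn't settle):

```python
# Capital cost per watt of net electrical output for the ARC design, using the
# numbers quoted above (~$5B build cost, ~190 MW(e) net output).
arc_cost_usd = 5e9
arc_net_output_w = 190e6

print(f"${arc_cost_usd / arc_net_output_w:.0f}/W(e)")  # ~$26/W(e)
```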
So, the mainline effort is focused on Tokamaks with good reason. A huge population of the smartest physicists around have been debating this topic for several decades. Not to be rude, but armchair opinions here aren't worth much vs that.
Yes we have been building dozens of these. But that's also because it's fundamentally a very difficult project. Do you somehow expect airplanes to go from the Wright flyer to a jumbo jet in less than a dozen steps?
There's also been plenty of research into alternatives. Germany is building a stellarator.
There's also a whole cohort of startups looking at more speculative ideas.
I'm not saying there can't or shouldn't be more research into more alternatives. I am saying it's unreasonable to look at tokamak research as some sort of dead horse we're flogging.
I think there are a number of startups building on the foundational research done for the other, larger experimental reactors that are trying to build small-scale fusion reactors, some with quick iterations on their designs (i.e. a reactor generation only takes half a year instead of 15 years). Some of them are based near the JET lab, afaik. TAE Technologies also has its own approach, for example.
Since I don't see any comments mentioning this, I've heard a lot of talk on other forums about skepticism regarding Chinese scientific breakthroughs. I know on the state level, their numbers are not considered reliable (economic data for instance). Does anyone have thoughts on the reliability of this?
Let me ask a different question to the knowledgeable folk here. It has been noted that producing the temperature is not the hardest part but confining the plasma for long periods of time is.
On this note, do we have any reason to be particularly confident that magnetic confinement will ever break even and produce surplus energy? In nature fusion seems to occur through gravitational compression, so what makes us sure that we can simulate this by other means that will ever amount to more than just demonstrations?
ITER is designed to address exactly what you have asked.
"The ITER thermonuclear fusion reactor has been designed to produce a fusion plasma equivalent to 500 megawatts (MW) of thermal output power for around twenty minutes while 50 megawatts of thermal power are injected into the tokamak, resulting in a ten-fold gain of plasma heating power."
Basically, we have good enough materials to withstand ignition; we need better materials to keep it running long enough for a demonstration, and even better materials to make it approved and viable.
As far as I know temperature is one of the important but not the only important parameter. Density and confinement time can be taken into account as well to get a number that better characterizes the performance of a reactor, iirc.
If we can build plants that are as huge and complex and expensive as fission plants, then about the same effect, but without the fuel / proliferation / waste problems (which is nothing to be sneezed at).
If the cost is low, a brave new world with fusion replacing fossil generation as quickly as they can be built. Energy-inefficient processes like desalination and cracking water for hydrogen become attractive.
If the cost is very high, it may be that renewables have stolen fusion's market slot. In that case we'll see some national prestige projects, but against a broader renewable energy market.
I'm not sure that it would have a profound effect at all. Fission is very power dense as well and fission fuel cost is not the primary driver of overall costs in existing nuclear designs.
Fusion might end up looking a lot like fission, but hopefully with a lower perceived safety risk and thus more public acceptance.
Granted, modern fission designs aren't actually unsafe, but that doesn't matter for PR purposes.
100M seems insanely high, beyond what anything man made would be able to contain. Is it extremely short lived? Or over a very small area? Very interesting.
It is incredibly high. We hold it in not with any material, but with magnetic fields. Plasma has the handy feature of being electrically charged, so it can be confined by magnetic fields. The Tokamak[1] design used in ITER and this experiment uses a whole heap of magnets to hold the plasma in a doughnut-shaped region.
Fundamentally it doesn't have to, because, like a weight resting on top of a ladder, magnetic fields store energy and do not dissipate it. In practice it does, because our ways of producing and maintaining strong magnetic fields require a lot of upkeep. This, like most things in fusion, is an example of where the realities of present-day engineering are far crueler than the laws of physics.
In inertial confinement fusion, it's a very small volume for a very short time. But this is magnetic confinement, which means it's a pretty large volume (probably cubic meters in this one), and they want to keep it going as long as possible (I think 30 seconds is the record so far). But the plasma is very thin; the atoms move really fast but there aren't many of them, so the actual amount of heat isn't remarkable.
Well.. assuming the room is about 3 meters high, that's a room volume of 15 cubic meters. Assuming it's filled with air, and air weighs about 1.29 kg per cubic meter, that's 19.35 kg or 19,350 grams of air. 15 degrees C is 288 Kelvin, so that's 19,350*288 = 5,572,800 gram-kelvins of energy in the room initially. Now we add 100,000,000 gram-kelvins (the one gram of hot stuff) to it, and assuming this energy distributes over the air contents, our air now has a heat energy of 105,572,800 gram-kelvins. Dividing it back the same way (over the 19,350 grams of air) gives us 105,572,800/19,350 = 5,456 gram-kelvins per gram of air, so a temperature of 5,456 kelvin or 5,183 degrees C. Still pretty hot.
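Same estimate written out as a sketch, so the big assumption is explicit: the gram of plasma is treated as having the same heat capacity per gram as the air it mixes with, which is why the "gram-kelvins" can simply be pooled.

```python
# Back-of-envelope mixing estimate, mirroring the parent comment's arithmetic.
room_volume_m3 = 15.0
air_density_kg_m3 = 1.29
air_mass_g = room_volume_m3 * air_density_kg_m3 * 1000    # ~19,350 g of air
room_temp_k = 288.0                                       # 15 C

plasma_mass_g = 1.0
plasma_temp_k = 100e6

# Pool the "gram-kelvins" and divide by the total mass (equal heat capacities assumed).
total_gram_kelvins = air_mass_g * room_temp_k + plasma_mass_g * plasma_temp_k
final_temp_k = total_gram_kelvins / (air_mass_g + plasma_mass_g)
print(f"~{final_temp_k:.0f} K (~{final_temp_k - 273:.0f} C)")
```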
Once we achieve sustainable fusion, will it be possible to "share" the energy with everyone else to create more independent fusions? Kinda like keeping the candle burning so as to light more candles because matches are too costly.
Now, I don't expect politics to allow sharing of fusion energy to help other countries.
Great design. I have a friend who took it a bit further and built the Fusor with Lego bricks. He's never worried about energy bills since; he went completely off the grid!
> I don't expect politics to allow sharing of fusion energy to help other countries.
You are very wrong. ITER is a $20B international collaboration to construct an energy positive fusion reactor. The participants, including China the underwriter of the experiment detailed in this thread's article, are very much sharing.
Isn't a fusion reactor basically infinite energy? I mean, sure, it makes total sense not to share it with others because you can sell your free energy for cash, but considering that a lot of scientists around the world work on it and most of the findings are published, I don't think such a strategy would last long.
Just like previous predictions that the introduction of fission power plants would result in electricity 'too cheap to meter', I suspect the same will happen for fusion. Even if the fuel is nearly free, you still have to build and maintain the power plant, the grid etc., neither of which is cheap.
But yes, the first ones to 'crack' the problem will have a head start in the commercial fusion power plant market, but I don't think it will last very long. As you say, most of the research is being published, and even if somebody manages to initially keep that final 'dot on the i' secret, it wouldn't take other researchers long to figure it out.
From what I've read [0], it will remove about a third of the cost (probably less). About a third of energy cost goes on fuel (now almost 0), a third on the grid, and a third on the station (now probably more).
Whilst a 20-30% reduction in energy cost would be great, the really important change is that it is much more scalable and easier to get. You don't need to dig up coal or drill for oil, all of which can be limited. With a working fusion reactor, a country like Singapore can be energy independent in a way it never could with any other type.
This independence would mean no more need to mess around in areas with these resources (think the Middle East) or have your country's energy rely on a third party you'd rather not rely on (think Germany and others and their reliance on Russian gas).
[0] A Piece of the Sun: The Quest for Fusion Energy - thoroughly enjoyed this book.
Even if the cost of energy only drops by about a third, fusion still generates so much more power that the sheer scale can overcome lots and lots of bottlenecks.
For instance, it is chemically possible to combine water (either atmospheric or from a normal water source) with atmospheric CO2 to chemically synthesize hydrocarbons. It just takes energy. So you could have a single, absolutely massive fusion plant next to an atmospheric fuel refinery and use the hydrocarbons for fuel storage and distribution. You could do likewise with hydrogen and oxygen if you wanted to make fuel cells or rockets. And this would probably be cheaper than refining fossil fuels. You could actually run OPEC out of business this way, make the electric car obsolete, make airlines carbon-neutral, etc., etc.
Climate change? Just extract the atmospheric CO2 using cheap fusion energy and turn it into an easily sequestrable form. Nitrogen fertilizers? You can make those from the air too. Drought? Use fusion power to run desalination plants. Arable land becomes a non-issue with fusion because vertical farming becomes easy. Every city could just grow whatever they needed in exactly the right climate conditions in a set of enclosed vertical farms, though they might not need to because energy will be so cheap that there's no problem shipping the stuff from Honduras anyway.
Computation scales with power, too. More power, more computation, except computation produces heat, which you need even more power to chill.
That's a great explanation of some of the benefits! These are the most important in my opinion: "You can make those from the air too. Drought? Use fusion power to run desalination plants. Arable land becomes a non-issue with fusion because vertical farming becomes easy." You could also use this water to stop desertification, which would be a great thing.
I do disagree with you on hydrogen fuel cells replacing electric cars though. I think battery powered cars will be better than hydrogen powered ones. The power storage is more reliable, and the power transfer is simpler (lines as opposed to truck delivery or pipeline; and if we assume it would be done on-site at stations, then they would already have power lines, so just use those!)
Even if the fuel is pretty cheap the cost of fuel isn't a huge factor in the operation of traditional nuclear power plants - a fusion power plant will still have enormous capex and opex - so in no sense will the power be "free".
Not really; it will wear out due to radiation, so you need to change the vessel periodically, unless you use other fuel like helium-3, but that has its own problems: a much higher fusion temperature, and it's very rare on our planet (there's a lot of it on the surface of the Moon though; it would be worth mining if we had working reactors).
> I don't think it'll be economically viable. I'll be happy to be proven wrong though.
I don't think current designs like the one discussed here will be. But I'm hoping it will lay the groundwork for much more useful reactors in the future.
Stories like this scare me. With all of the precautions, even things like Fukushima failed and will poison our ocean for millennia. What happens if we have a runaway fusion process through some pathway that was unexpected?
With all the talk about the LHC possibly producing mini blackholes or magnetic monopoles that could potentially cause protons to decay spontaneously, I don't have enough nuclear physics background to know whether we are inherently safe, or if there is a real risk here.
Uh, worst case it blows up pretty conventionally, lots of heat (but not really, not a lot of actual material is used), and, uh, as far as byproducts... helium? Maybe some lithium or boron or carbon or something if things get really wild?
Basically it's really really safe as far as byproducts. And yeah, it's a teeny sun, that instantly goes out if you stop feeding it juice, which sounds bad, but it's all the tiny stable particles, not the big slowly decaying scary ones.
I read in another thread that the radiation in that dissipates in about 30 years, vastly preferable to the tens of thousands of years of nuclear fission waste. I don't know how strong the radiation is either, whether it's more or less dangerous than fission waste.
The way radioactivity works, the faster a substance decays, the more dangerous it is. Something being slightly radioactive for ten thousand years is much better than some searing-hot exotic isotope sitting around for 40.
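Per fixed number of atoms, that tradeoff is just activity scaling as 1/half-life. A quick sketch with made-up isotopes (it only captures activity, not biological hazard or the isolation-time questions raised in the reply below):

```python
import math

# Specific activity (decays per second per gram) of a pure isotope:
# A = N * ln(2) / t_half. The isotope parameters below are hypothetical.
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

def activity_bq_per_gram(half_life_years, molar_mass_g_per_mol):
    atoms_per_gram = AVOGADRO / molar_mass_g_per_mol
    decay_constant = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    return atoms_per_gram * decay_constant

# Same molar mass, 40-year vs 10,000-year half-life: the short-lived one is 250x more active.
short = activity_bq_per_gram(half_life_years=40, molar_mass_g_per_mol=100)
long_ = activity_bq_per_gram(half_life_years=10_000, molar_mass_g_per_mol=100)
print(f"40 y half-life: {short:.2e} Bq/g, 10,000 y half-life: {long_:.2e} Bq/g")
```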
Not really. The big issue is with long-term deterrence, not short-term. It is fairly easy to isolate something for 30/50/100/200 years. It is much harder to isolate something from future humanity with a high level of certainty for 200,000 years. Even things with low levels of radioactivity will kill you if ingested and do a lot of harm if kept near. The problem of communicating this to future humans is a big one. How do you keep future humans who may not have the level of technology we have from deciding that the magic, glowing, heating stone is a source of healing and prosperity instead of something to be avoided?
Sustaining fusion is very difficult (we really haven't solved it yet!). Runaway fusion is only really possible if you can simulate stellar conditions, which requires extreme amounts of energy that must be fed in continuously to maintain temperature and confinement. Plus, the radioactive byproducts are much shorter-lived, and so pose much less of a risk of causing lasting ecological damage. Also, just as an FYI, the LHC has not produced any mini black holes or magnetic monopoles. Anything of the sort would be a major accomplishment, far surpassing the discovery of the Higgs boson.
Fusion processes cannot go "runaway" (unless you manage to recreate an entire star in the lab by accident, which requires a lot more mass than we have).
If a fusion reactor is breached, the plasma will likely dump its thermal energy into the air (this will likely cause a minor detonation; its strength depends on the energy in the reactor and the amount of fuel). Additionally it'll leak some short-lived isotopes and maybe create a few long-lived ones.
All in all, a detonated fusion reactor is likely safe to walk into the same year it exploded, if not significantly earlier.
There's most likely not even a risk to the experiment itself from runaway plasma. Experiments like these use very low-density plasma (mbar range), so despite being extremely hot, it's a fairly low mass of heated plasma overall. These experiments regularly lose confinement of the plasma within the magnetic fields, and it collides with the inner walls of the tokamak and cools back down. This is the main thing these experiments are designed to study: how to keep the plasma stable for longer. Even if there were a larger and hotter mass, we don't have the pressure from gravity here that makes fusion work in the center of stars, so it will just kinda melt things and then cool off, if that.
https://news.ycombinator.com/item?id=18004980 Calculations have been done in general about runaway fission/fusion already. Hawking showed that mini black holes "evaporate" very rapidly, before anyone would even notice.
As for some unexpected pathway, energies like this are reached in stars all the time, and they don't spontaneously explode on a regular basis. If you're talking about a basic science experiment being the Great Filter -- well, maybe. I'm skeptical.
> As for some unexpected pathway, energies like this are reached in stars all the time, and they don't spontaneously explode on a regular basis.
Well, they kind of do…
But really, the only reason why stars manage to stay in hydrostatic equilibrium is because of gravity, which we really can’t do on Earth so we settle for fancy magnetic confinement vessels.
Wasn't said ever-present background radiation only a thing after the nuclear bomb tests? I read a while ago that steel from old ships that sunk deep is highly valuable because it's not irradiated yet.
To a first approximation all matter on earth radiates because all matter is present with some isotopes that will eventually decay. Don't forget that we're sitting on top of a huge ball of molten metal kept hot by ongoing nuclear decay. That process produces all kinds of isotopes that eventually make their way to the surface.
The bomb tests did increase background radiation for us; irresponsibly so in my opinion. But so does burning coal. We have been lifting the level of background radiation over the natural level for centuries by now.
I don't see how sunk steel would be any more valuable than steel from freshly mined ore. But I like to be surprised about these things :-)
> I don't see how sunk steel would be any more valuable than steel from freshly mined ore. But I like to be surprised about these things :-)
Steel production uses air from the atmosphere. Thus it picks up the increased background radiation while it is refined. It may be possible to scrub the radioactive components from the air to avoid contaminating the steel, but I expect the cost would be prohibitive (at the very least, more expensive than getting it from old battleships).
It is low-background steel ([0]), not no-background steel. Potassium-40, for example, is a naturally occurring radioisotope. See Banana equivalent dose ([1]).
Amount of seawater on Earth: 1.338 billion km^3 (according to Wikipedia). Assuming 1.0 kg/L, this is 1.338 * 10^18 ton.
Seawater is about 0.04% potassium, so this is about 5.35 * 10^14 ton of potassium.
Potassium contains naturally occurring radioactive isotope (40K): the radioactivity of potassium is 31 Bq/g. Hence the natural radioactivity of all potassium in seawater is 1.66 * 10^22 Bq.
For comparison, again according to Wikipedia:
> In May 2012, TEPCO reported that at least 900 PBq had been released "into the atmosphere in March last year [2011] alone"
...which is 9 * 10^17 Bq, or about 1/18,000 of naturally occurring radioactivity in the ocean due to potassium alone.
(We didn't even start on stuff that are commonly considered "radioactive", like all the uranium and thorium lying beneath where you are.)
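For anyone who wants to reproduce it, a sketch of the same arithmetic with the same inputs as above:

```python
# Natural K-40 activity of the oceans vs the quoted 900 PBq Fukushima atmospheric release.
seawater_volume_km3 = 1.338e9
seawater_mass_tonnes = seawater_volume_km3 * 1e9           # 1 km^3 ~ 1e9 m^3 ~ 1e9 tonnes of water
potassium_tonnes = seawater_mass_tonnes * 0.0004           # ~0.04% potassium by mass
potassium_activity_bq = potassium_tonnes * 1e6 * 31        # ~31 Bq per gram of natural potassium

fukushima_release_bq = 900e15                              # 900 PBq, figure quoted above
ratio = potassium_activity_bq / fukushima_release_bq
print(f"ocean K-40: {potassium_activity_bq:.2e} Bq, ratio ~1/{ratio:,.0f}")
```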
* Meanwhile we're busy burning fossil fuels, increasing the amount of CO2 in the atmosphere by ~30%.
D-T fusion, which is the easiest to achieve and perhaps sustain, directly produces He nuclei (that is, alpha particles) and rather energetic neutrons at 14 MeV. Those neutrons, aside from being a form of ionizing radiation themselves, are bound to transmute some of the surrounding material into radioactive isotopes. So, I don't think that "no toxic or radioactive byproducts" is correct. However, the results are easier to handle than those of fission reactors.
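For context on where those numbers come from, here's a quick sketch of the D-T energy split (non-relativistic, assuming the reactants' initial momentum is negligible compared to the released energy):

```python
# D + T -> He-4 + n releases ~17.6 MeV. With (nearly) zero net initial momentum,
# the two products fly apart with equal and opposite momenta, so the lighter
# neutron carries the larger share of the kinetic energy.
Q_MEV = 17.6
M_NEUTRON = 1.0   # rough mass ratio units
M_ALPHA = 4.0

e_neutron = Q_MEV * M_ALPHA / (M_ALPHA + M_NEUTRON)   # ~14.1 MeV
e_alpha = Q_MEV * M_NEUTRON / (M_ALPHA + M_NEUTRON)   # ~3.5 MeV
print(f"neutron: ~{e_neutron:.1f} MeV, alpha: ~{e_alpha:.1f} MeV")
```

The charged alpha stays trapped by the magnetic field and helps heat the plasma; the neutral neutron escapes, which is both how you would extract power and why it activates the surrounding structure.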
I don't know a lot about fusion, however «no radioactive byproducts» seems to be off. I just read Wikipedia, and the deuterium-tritium reaction produces neutrons. The radiation seems to be not as bad as for fission reactions, however it is NOT radiation-free.
The earthquake was the most powerful to ever hit Japan and the fourth most powerful in the world since modern record keeping began, and the investigation into the disaster showed that the safety precautions weren't adequate in the first place.
You can't dismiss the technology based on that incident. Just like we don't ban cars because a lot of people don't operate them properly.
When I was in high school we visited a small 5MW nuclear reactor. It was a few years after Chernobyl and we got a very long lecture about how this was all the fault of the terrible Soviet design and that it could never happen in a western-designed nuclear reactor.
As far as I remember from the news at the time, the tsunami was terrible, but not unprecedented. If this obvious risk was ignored, what other risks are being ignored elsewhere?
A Chernobyl accident can't happen in a western reactor, and it wasn't what happened in Fukushima either, so the lecturers weren't wrong.
I don't know the facts of the tsunami whether it was unprecedented or not, but considering the magnitude of the quake I'm guessing it was one of the largest tsunamis to ever hit Japan. This is speculation on my part.
They didn't ignore the risk of tsunamis, they had precautions against them but they weren't up to par. It was a series of malfunctioning safeties that caused the accident. The backup power generators conked out, and the backup to the backup was washed away by the floods. And the floods only managed to get that far because the protective walls weren't enough.
"Build better walls" seems like a trivial problem to solve, don't you think?
International regulatory bodies could also be more proactive in finding these flaws prior to accidents.
It's not a hard problem to solve in the long run. It'll be easier and quicker than finding a viable non-nuclear energy option anyway.
> A Chernobyl accident can't happen in a western reactor, and it wasn't what happened in Fukushima either, so the lecturers weren't wrong.
He was referring to the core meltdown, not the very specific fault mechanism. I should have made it clear. Otherwise, it would be an uninteresting technicality.
The entire Fukushima incident makes me suspect it's a result of defining an exact fault model and then optimizing to the model. This way, if the fault slightly exceeds the model, the result is not graceful degradation, but catastrophic failure.
Can we put the generators below sea level? Sure, the sea wall is high enough, no need to worry about that at all.
> "Build better walls" seems like a trivial problem to solve, don't you think?
It's also important to note that literally one person has died as a direct result of the reactor failing in Fukushima. Around 1600 died in the evacuation process, mainly elderly people.
The earthquake itself killed over 15000 people.
Imo it's a massively overblown disaster. Yes, it's bad, especially the environmental effects, but it's absolutely nothing compared to the earthquake and tsunami itself.
The distinction between direct and indirect deaths is completely irrelevant. The decision to evacuate will always be taken under conditions of uncertainty, when it is impossible to know the eventual scale of the disaster.
It is also irrelevant to compare the earthquake and tsunami to the nuclear disaster. Earthquake and tsunamis are unavoidable natural disasters, but the Fukushima Daichi disaster could have been easily avoided.
You're also ignoring the massive costs of the disaster and the monumental scale of the cleanup. The official costs put it at $188B and counting.
No one is dismissing technology. My point is that humans are terrible at assessing risk.
The Japanese built Fukushima to standards that they felt were acceptable, and they were wrong and now the ocean is being polluted for the next thousands of years with the continuous risk of things getting worse at Fukushima.
In the same manner, they may be building this fusion reactor or LHC with what they feel is acceptable risk, but could they be wrong, with extremely catastrophic results? This is something I don't know but would love to know the answer to.
Low quality news aside, the LHC never had a risk of producing micro black holes that could cause any damage. Very tiny black holes evaporate instantly, before they have time to do anything. The LHC would have detected this, which has not happened.
Fusion was, and for the foreseeable future will be, a boondoggle. In the US it was a cold-war-era arms race program intended to scare the USSR and have them overextend, and now the Chinese are using it for propaganda and scientific Keynesianism.
The fact is, fusion generates neutron radiation that destroys the reaction vessel, making it an unviable technology. Nobody takes it seriously as a source of energy, aside from uninformed people. As cool as the idea of controlled fusion is, it is and will remain science fiction.
I dunno, my buddy who just got a degree in high energy plasma physics working on fusion reactors might disagree with you.
And really, you just sound like every crank ever who thought X technology was totally unfeasible and always would be -- until it wasn't. So current attempts haven't found a solution to the reaction vessel destruction problem. That does not mean someone in the future couldn't figure that one out.
It's not false. But it's also not a universal constant like the speed of light or something, absent a reason why it couldn't be planned for and dealt with, I'm inclined to treat it as an as-yet-unsolved engineering challenge.
Neutron radiation, plus the unfortunate geometric fact that the surface area/volume ratio of a fusion reactor will be low, compared to the fuel rod surface area/volume ratio in a fission reactor.
What this does is ensure that even operating right at the limits of neutron damage to the wall materials, the volumetric power density of a DT fusion reactor will suck. And that will destroy the economics.
Research into vessel liners is ongoing and promising. 7-X is getting carbon-carbon, JET has an upgrade going in about now. It's a materials-engineering challenge, but not a showstopper.
In MIT's ARC design, the reactor is designed so you can easily open it up and replace the inner vessel. The vessel is 3D-printed and replaced once a year. Surrounding the inner vessel is a molten salt mixture which breeds more fuel from lithium but is otherwise unaffected by neutron radiation.
This is a regular tokamak design, with a high chance of success since we understand tokamaks very well at this point. Various startups have more speculative designs that deal with the issue in other ways.
A single ARC reactor will use 40% of the world's annual production of beryllium.
The power density of an ARC reactor will be around 0.5 MW/m^3. In comparison, the power density of a PWR reactor vessel is 20 MW/m^3.
Replacing the entire inner vessel once a year would be an operational nightmare. For one thing, it ensures the building the reactor is in will have to be very large, with very large secondary bays where the intensely radioactive material of a spent reactor vessel can be moved and disassembled (generating radioactive fragments and dust).
Boron, however, is more easily concentrated (in evaporites). Beryllium is found in pegmatites, which are less common. The estimated resource (not reserve) of Be is 100,000 tons (USGS). This would be enough for ARC reactors supplying just 1% of current world primary energy demand. The estimated world resource of boron is in excess of 1 billion tons.
Somehow dealing with neutron radiation seems like the lesser problem between it and confining fusing plasma at 100 million K. You make it sound like there's literally nothing to be done about neutron radiation, but something tells me there are probably materials scientists interested in working on that.
Oh sorry, we didn't realize that you completely understand all of physics and can qualify, without any possibility of error, that any configuration, period, that utilizes fusion will necessarily be impossible because of this physical restriction.
"100M degrees" (Kelvin) corresponds to 10 KeV (kilo electron volts), which is an important figure to exceed for D-T fusion. D-T fusion which is the kind of fusion the ITER Tokamak (a forthcoming fusion reactor and international megaproject) intends to demonstrate.
An older fusion experiment, JET (Joint European Torus) reached these levels, so this does not break new ground, but it is important if this Chinese Tokamak is going to provide data useful for ITER.
I will note that it's rather unusual to refer to plasma temperature in Kelvin rather than in KeV. I edited this comment with a few more details to try to make it easier for laypeople to understand.