Are these standards being relaxed because they were initially set overly cautiously tight, or are they being routinely relaxed without appropriate engineering?
For example, the steam valve leakage. Apparently some leakage is acceptable. Over the years they have quadrupled the allowed leakage and gone from a "per valve" measure to a "total for all valves" measurement scheme. Is the measure there to predict valve failure? Is the measure there to control total radiation release? Cooling water loss? Steaming the staff? I'd have gone "per valve" if I was monitoring the health of the valves and "total aggregate" if I was concerned about the other effect. Did they decide that the "per valve" number was dominated by the other measure?
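A toy sketch (hypothetical numbers, not real limits) of why the two schemes answer different questions: a per-valve limit flags an individual sick valve, while an aggregate limit can stay green as long as total release is acceptable.

```python
# Hypothetical leak rates (arbitrary units) for five valves; one is failing.
leaks = [0.2, 0.3, 0.1, 0.2, 3.0]

PER_VALVE_LIMIT = 1.0   # made-up threshold for valve-health monitoring
AGGREGATE_LIMIT = 5.0   # made-up threshold for total release

# Per-valve check: identifies which valves are unhealthy.
failing = [i for i, rate in enumerate(leaks) if rate > PER_VALVE_LIMIT]

# Aggregate check: only asks whether the total release is acceptable.
total_ok = sum(leaks) <= AGGREGATE_LIMIT

print(failing)   # [4] -- the failing valve is visible
print(total_ok)  # True -- total release is still within the limit
```

Under the aggregate scheme, the failing valve at index 4 never gets flagged, which is fine if the measure exists to bound total release, and not fine if it exists to predict valve failure.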
The article cites an engineer who thinks the standards are being loosened too much, but if you have three engineers and can't find one who disagrees about something, you have too many engineers.
The motivation for moving from "per valve" to "total for all valves" was cost savings in monitoring. There are thousands of these in plants. Operators didn't want to have to check each one individually, so they got the rule changed to a total measurement.
To actually measure, they "randomly" sample their valves, monitor those, and then extrapolate to the rest. I say "randomly" because the best ones get checked repeatedly.
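A quick sketch (made-up population, Python) of why that kind of "random" sampling matters: if the best valves get checked repeatedly, the extrapolated total badly underestimates the true leakage.

```python
import random

random.seed(0)

# Hypothetical population: most valves leak a little, a few leak a lot.
valves = [0.1] * 90 + [2.0] * 10
true_total = sum(valves)  # 90 * 0.1 + 10 * 2.0 = 29

# Unbiased: sample uniformly, then scale up to the whole population.
sample = random.sample(valves, 10)
unbiased_est = sum(sample) / len(sample) * len(valves)

# Biased: keep "sampling" the best (lowest-leak) valves.
best = sorted(valves)[:10]
biased_est = sum(best) / len(best) * len(valves)

# The biased estimate comes out near 10, well below the true 29.
print(true_total, biased_est)
```

The unbiased estimate will bounce around the true value from run to run; the biased one is systematically low every time.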
The steam valves are designed with pressure relief valves. When pressure exceeds a specified limit, the release is opened to the atmosphere.
Increasing releases is a symptom of other issues in a plant, some very dangerous and some just a reflection on increased utilization.
I have a BS/MS in Electrical & Computer Engineering. For a while I designed/manufactured sensor systems. Our customers were nuclear plants, oil refineries, chemical plants, etc. My company had a line of products that monitored steam valves, among other systems.
For a bunch of reasons I decided to go back to law school and added that skill.
So now I operate at the intersection of law, management/operations, and engineering for small companies. So far it is working out as a good niche.
Are these standards being relaxed because they were initially set overly cautiously tight, or are they being routinely relaxed without appropriate engineering?
I think the problem is that no one really knows. No doubt the original standards were set with substantial safety factors. No doubt time and experience have given us a lot more data on what we can usually get away with.
But the keyword there is "usually". However unlikely a major accident has been, it is getting more likely. And we won't know just how much more likely until it actually happens.
Since the regulators have no "skin in the game" and will never be imprisoned for a dangerous decision - why exactly do they care? What's their motivation for decisions?
Whistleblowers are far more likely to be prosecuted than the regulators.
This is one very strong reason why we need new nuclear reactors. For safety lapses such as these, I blame the current moratorium against new nuclear facilities in the US.
Lobbying can only do so much. In the face of an ideology dedicated to putting humanity back in the stone age, it can do little:
"Complex technology of any sort is an assault on human dignity. It would be little short of disastrous for us to discover a source of clean, cheap, abundant energy, because of what we might do with it." -- Amory Lovins
"A massive campaign must be launched to de-develop the United States. De-development means bringing our economic system into line with the realities of ecology and the world resource situation." -- Paul Ehrlich
What is needed is a full refutation of the quotes above, and a system of ideas (Google "Alex Epstein") to show us how things could and should be. This is why I say that simple political lobbying, while necessary, isn't remotely enough.
The nuclear industry is pretty bad at lobbying: it has few workers and a bad image. It usually gets trumped pretty thoroughly by the coal industry lobby.
"Keeping them in service is much more profitable."
Perhaps, but for how long? Profit over time from honest, new, innovative work can be much higher than scraping from the bottom of the barrel or from milking old achievements. (Just ask RIM or Nokia about how well that worked out for them.)
But to do that, you have to have the freedom (from coercion, wherever it may come from) to engage in that honest, new and innovative work. For the most part, the computer industry has that freedom, but the nuclear industry does not.
The nuclear industry can't just drop their current equipment and move on to new stuff like the computer industry can. Getting rid of an old reactor is horribly expensive.
What moratorium??? There is no moratorium against new nuclear facilities in the US. It is true that there haven't been any new nuclear reactors in the states for a long time, but there is no moratorium. (I think a couple of individual states may have moratoriums, but there is no federal moratorium).
The reason why there haven't been any new reactors is probably commercial. Investors are realizing that nuclear power is actually very expensive and will not build new reactors unless the taxpayers somehow pay for them.
The regulations are such that building new nuclear reactors anywhere within the NRC's jurisdiction is a long, slow, uncertain, and excruciatingly costly process. It's a de facto moratorium.
>The Vermont Yankee plant has the same design as Fukushima Daiichi and was approved for a 20 year extension, ten days after the Fukushima meltdown
Are you suggesting that Vermont is likely to be hit by a tsunami?
The Fukushima plants did not 'fail', they were beaten to fucking death. For whatever reason, the regulations in Japan did not consider tsunamis of that magnitude possible, despite the fact that they have happened several times in recorded history in the same region.
>Also, the Nuclear Regulatory Commission (NRC) "has never turned down a plant relicensing"
That's probably literally true. The process involves a long period of examination and supervised investigation by parties with no stake in keeping a plant running. It would be obvious years before any formal 'No' was issued that a doomed application would be turned down, and continuing to pursue one would waste millions of dollars.
From the article, that plant is moving its spent fuel rods from the roof top pools to a passive containment system.
I guess actively cooled, rooftop swimming pools full of used fuel rods make sense from the standpoint of monitoring, but they carry more failure modes than anyone wants now.
Not terribly high, but that doesn't seem like a very good excuse for lackluster design specifications.
Also, it's worth noting that while Vermont isn't exactly the SF Bay Area when it comes to unfavorable proximity to fault lines, it's not entirely devoid of seismic activity.
Map is USGS, not TreeHugger original, but I couldn't find it on USGS.gov.
"As a result, the minimum standard was relaxed first by raising the reference temperature 50 percent, and then 78 percent above the original — even though a broken vessel could spill its radioactive contents into the environment."
Amazing. I know that there are many who still think nuclear energy is safe, but I think it's inherently unsafe, not because the original designs are flawed or anything technical like that, but because people seem to get greedy or too complacent around them, and that's what can ultimately lead to nuclear disasters.
I am well out of my specific expertise here, but consider these facts for a less frightening interpretation that would not sell newspapers…
• There are reactors in operation that were designed in the age of slide rules. Even in 1980, $5 million "supercomputers" ($15M in today's dollars) were less powerful than my phone.
• In "An Overview of Radiation Embrittlement Modeling for Reactor Vessel Steels" 1993, http://books.google.com/books?id=quzCvTWt9CMC&pg=PA99... the oldest work in the bibliography specifically on embrittlement that I see is from the 1970s with the bulk being in the 1980s.
• When faced with uncertainty, engineers set the limits high enough to encompass the worst case.
• When faced with approximation, engineers set the limits to encompass the error of the calculation.
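Those two bullets can be illustrated with a tiny made-up calculation (the numbers and margins are entirely hypothetical): each conservatism stacks multiplicatively, so a coarse model plus a worst-case allowance pushes the permitted operating range well below the best estimate.

```python
# Hypothetical limit-setting, stacking conservatisms (arbitrary units).
nominal_limit = 100.0     # best-estimate safe limit
model_error = 0.20        # coarse model: allow for 20% calculation error
worst_case_margin = 0.25  # allowance for worst-case material variation

# Each conservatism shifts the permitted limit further from the best estimate.
operating_limit = nominal_limit * (1 - model_error) * (1 - worst_case_margin)
print(operating_limit)  # 60.0 -- far below the best-estimate limit of 100
```

Relaxing such a limit later is not necessarily reckless; it may just mean one of the stacked margins was retired once the underlying phenomenon was better understood.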
A poorly understood phenomenon coupled with primitive, coarse grained models (in order to be computable by the existing resources) is going to get you an overly conservative operating range.
25 years of studying what actually happens in live reactors, 25 years of theoretical study, and unimaginably fast computers for verifying models and simulating scenarios are going to get you a different answer to the question.
I wish there had been a careful reexamination of standards based on new knowledge.
More likely is that bringing old plants up to standard was too expensive. Or even worse, taking them off the grid.
In my view this is the real danger of nuclear power: New plants are probably quite safe but they are very expensive to maintain when they get old. On the other hand, if you relax the standard they are more profitable the older they get. So at some point you end up with a lot of old plants that are extremely expensive to maintain or shut down.
On the other hand, the commercial pressures of operating complex metal and concrete structures way beyond their design-life may lead to pressure to relax standards, that when combined with aging materials, lead to a chain of events that creates a serious accident.
We've had quite a few very narrow escapes in the US. Here's just one:
Davis-Besse
http://events.nace.org/library/corrosion/NuclearIndustry/nuc...
>High pressure inside the reactor vessel pushed the stainless steel outward into the cavity formed by the boric acid. The stainless steel bent but did not break. Cooling water remained inside the reactor vessel not because of thick carbon steel but due to a thin layer of stainless steel.
Personally, I'll support nuclear power if based on a reasonably clean low-waste producing technology, and if it can be proven that a safe passive shut-down is possible under station black-out conditions, without human intervention.
I can't point to a "similar legacy of birth defects, suffering, and eviction from the land" in conjunction with any other kind of disaster, up to and including earthquakes causing millions of deaths.
The 'league' of a disaster in your head is not an important metric; the oil spill is responsible for far more environmental contamination, food supply reduction, and long term ecological damage than Chernobyl.
Coal plants kill more people every year without even being called 'disasters' - the birth defects caused by the radiation being put in the atmosphere by those plants is distributed over a far larger area, but it's way out of the 'Chernobyl league'.
You are attributing extra importance to Chernobyl because it causes an unusual type of localized damage, but it never caused that much of it.
For another stark example, compare them to the damage done by properly-operating coal plants every year. Mining coal, just mining it, directly kills more people every year (on the order of 10k) than Chernobyl will. Then there are the various mining-related lung diseases. Then there are the 10–20k (in the US alone, depending on which estimate you believe) deaths/year from the pollution.
The worst case scenario for coal is very close to the average case scenario, which is that tens of thousands of people die every year from pollution-induced heart attacks and strokes, as well as mining accidents. But that doesn't make exciting headlines.
This is a tragedy, of course, but nowhere near the level of a nuclear disaster like we're seeing in Japan right now.
Last I heard the current death toll from Fukushima was 2, and those workers were killed by the tsunami itself.
What really is the worst case scenario in nuclear meltdown?
I've understood that an actual fission explosion is nearly impossible, given the delicate configuration a weapon needs to make one happen.
A Chernobyl-style scene will not be seen again (except maybe in Russia) because graphite is a somewhat rare choice of moderator material.
So the scenario is something like a mass of radioactive material melting its way below ground. If it then contacts groundwater, a steam explosion can occur. How big would that be? How much radiation would spread? How probable is it?