Not-so-secret atomic tests: Why the photographic film industry knew (2013) (imaging-resource.com)
249 points by arprocter on Feb 8, 2016 | 77 comments



It's strange that the article is centered on Iodine-131 [1]. It has a half-life of 8 days, so if the corn is contaminated, it would be almost clean by the time it's distributed. It's different for milk, because milk is collected, sold and drunk within two or three days.

For the film industry I'd be more worried about Strontium-90 [2], which has a half-life of 29 years, so it can survive long enough to get collected, processed and used.

[1] https://en.wikipedia.org/wiki/Iodine-131

[2] https://en.wikipedia.org/wiki/Strontium-90


If you look through the comments on this article, somebody else made the same point, and claimed that it was actually cerium-141, and not Iodine-131:

<<Tim, Thanks for this article and I think you did a great job on a very important but little known topic. I've researched this quite a bit and I'm certain it was cerium-141, not iodine-131, that was lodged in Kodak's packing material that fogged their film, and the packing material was strawboard, not corn husks. (There's some disagreement on the internet - even a U.S. senator who had misspoken on the facts - but if you really dig deep, you'll find it was strawboard.) >>


There's the reason: the children affected were the ones who drank the milk, which is why Iodine-131 was the topic of this article:

http://www.ieer.org/latest/iodnart.html

"The exposure of millions of children is especially troubling because much of it could have been avoided."

"American children were actually exposed to 15 to 70 times as much radiation as had been previously reported to Congress."

The same article mentions the Kodak story ("The physicist's knowledge of the secret project was not altogether surprising: the Kodak Company ran the Tennessee Eastman uranium processing plant at the Oak Ridge National Laboratory") and has many more valuable and relevant details.


Half-life is not lifetime. If you have a very high dose of I-131, 7% of that dose will still be around after a month, and 0.04% after three months. That might be enough to still be very hazardous.
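For anyone who wants to check those figures, here's a rough back-of-the-envelope sketch (assuming I-131's ~8-day half-life and simple exponential decay):

    # Remaining fraction of I-131 activity after t days,
    # assuming a half-life of about 8 days and simple exponential decay
    half_life_days = 8.02
    for t in (30, 60, 90):
        remaining = 0.5 ** (t / half_life_days)
        print(f"{t} days: {remaining:.2%} of the original activity remains")
    # prints roughly 7.5%, 0.56% and 0.04% -- consistent with the numbers above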


Sr-90 is a huge hazard if it gets into your food supply. It replaces/binds calcium and pings your bone marrow with ~0.5 MeV betas (electrons).


It really was part of US politics at the time to present the atom bomb and the tests as "a good thing" in general:

http://www.slate.com/articles/arts/culturebox/2011/07/wish_y...

Also, the "bikini" famously got its name from the

https://en.wikipedia.org/wiki/Bikini_Atoll

"Between 1946 and 1958, 23 nuclear devices were detonated by the United States at seven test sites located on the reef, inside the atoll, in the air, or underwater.[4]"

"Building on this attention, in 1946, French engineer Louis Réard named his new swimsuit design the bikini, in hopes that its revealing style would create an "explosive commercial and cultural reaction" similar to the 1946 nuclear explosion at Bikini Atoll.[5][6][7][8][9]"

The press loved it.

I also remember reading about some US leaflets containing something like "if somebody comes to you showing the reading of a Geiger counter, don't trust him, he's spreading foreign propaganda and just trying to make our citizens feel less secure." Maybe somebody remembers that better?


Also:

"From 1951 to 1957, the United States military and the Atomic Energy Commission collaborated in a series of human radiation experiments called the Desert Rock exercises."

It seems they ordered soldiers to observe the explosion and then march through the site.

http://www.strangerdimensions.com/2012/07/09/the-united-stat...

The voice of the announcer in the video after the detonation: "these experiments are designed to dispel much of fear and uncertainty surrounding atomic radiation" "only 10% of injuries are due to the gamma rays"

The report of one of the soldiers:

http://www.angelfire.com/tx/atomicveteran/my-story.html

"When getting a reading almost anywhere near Ground Zero the Geiger needle just lay over on the maximum, so we started not to pay much attention to whatever the reading was on those Geiger Counters."


A large part of "Atoms for Peace" and "Our Friend the Atom" was a result of Eisenhower's deliberate attempt to normalize the use of nuclear weapons in combat.


The National Atomic Testing Museum in Las Vegas is a fascinating museum about the era of atomic testing, if the parent's post piques your interest.


My observation is that the government tendency to keep secrets apparently does not (or at least in this case did not) extend to keeping the secrets from those who have enough information to detect that there IS a secret. Kodak's scientists could tell that nuclear radiation bursts were occurring, and so the US government told them the details of the nuclear testing.

This has implications for the kinds of systems that we want to set up to protect against governments keeping secrets that they shouldn't. If we make sure that certain groups can tell THAT there are secrets (without knowing what the secrets are) perhaps that provides an effective check on government abuses.


"If we make sure that certain groups can tell THAT there are secrets (without knowing what the secrets are) perhaps that provides an effective check on government abuses."

I'm a group of one, and I know that there are secrets. I have no idea what the secrets are. How does this help? Maybe you meant to say that the groups should be able to detect the presence of certain types of secrets without knowing all the details -- evidence of nuclear testing, evidence of crypto backdoors, evidence of mass spying on the public, evidence of advanced aliens... But how general should this detection be, and what types and strengths of evidence should be granted to these privileged groups? How do you get a government with no intention of letting any evidence about their secrets get out to go along with the idea of allowing other non-government groups to get a flavor of their secrets? (Come on, downvoters, let's tease out the details of this way-too-vaguely-specified idea. For any given covert-but-shouldn't-be government activity X, how can we make sure X is done such that it leaves trails that could in theory be picked up by third parties to detect that something is going on without knowing all the details?)


Are you guessing that there are secrets, or do you KNOW that there are secrets?

Can you testify to Congress that there are secrets?

Notice how the story changed after Snowden revealed the fact that the US government was doing ubiquitous surveillance. Previously other people had reported that but not been believed. Some Senators knew about it but were prohibited from saying it was happening. Once it was revealed by a trusted source (trusted because his evidence proved he wasn't just making it up) the whole story changed.


I know in the Bayesian sense of knowing, and could testify to Congress that to the best of my knowledge, there are government secrets. My evidence is simply the existence of old secrets being revealed (in detailed form or not) indicating the government kept secrets, plus a steady stream of declassified secrets presently being revealed indicating that there are still secrets left. I have no idea about the character of these secrets, so this knowledge isn't very useful, compared to the knowledge Snowden leaked, which not only gave the character of the secret but many of its details too (and his dump included still-undisclosed material that the journalists have so far determined would be unwise to share, another piece of evidence that lets me know there are secrets).

What sorts of measures besides "don't do it" could have been taken up front to make the surveillance story less evil? Perhaps make Congress aware of it and able to audit the activities at any time, even setting up alerts? Jeff Jonas has written a lot about the importance of immutable audit tables, which would help other government parties (like Congress) verify that what the NSA is doing complies with the law and respects the common citizen's privacy. But having proper auditing in place probably would have made Snowden's leak much more difficult to pull off, so perhaps in this situation the better strategy is to hope for common morality to win out in the end and design systems that make it easier for whistleblowers and leakers to produce strong evidence about some covert activity, at the cost of short-term potential abuse from employees using the data for their own personal gain?
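As a toy illustration (my own sketch, not anything Jonas specifically describes), an "immutable audit table" can be as simple as an append-only log where each entry commits to the hash of the previous one, so an auditor can detect any after-the-fact edits or deletions:

    import hashlib
    import json
    import time

    class AuditLog:
        """Append-only, hash-chained log; tampering breaks the chain."""

        def __init__(self):
            self.entries = []

        def append(self, event: dict) -> str:
            prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
            body = json.dumps({"ts": time.time(), "event": event, "prev": prev_hash},
                              sort_keys=True)
            entry_hash = hashlib.sha256(body.encode()).hexdigest()
            self.entries.append({"body": body, "hash": entry_hash})
            return entry_hash

        def verify(self) -> bool:
            # Recompute each hash and check that every entry points at its predecessor.
            prev = "genesis"
            for e in self.entries:
                if json.loads(e["body"])["prev"] != prev:
                    return False
                if hashlib.sha256(e["body"].encode()).hexdigest() != e["hash"]:
                    return False
                prev = e["hash"]
            return True

It doesn't stop abuse by itself, but it makes "what was queried, by whom, and when" verifiable by an outside party (say, a congressional auditor) without exposing the underlying data.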


Kodak was the supplier of film for aerial surveillance cameras. They probably sold it as a national security risk if they couldn't manufacture usable film.


This article paints it as a dark conspiracy that Kodak was informed but not farmers or the general public. It seems to me that there's a simpler explanation: the dangers of radiation were not well understood at the time, and they probably legitimately thought that e.g. the dangers of malnutrition from disposing of contaminated milk were greater than the dangers posed by the contamination.


I'd buy that for 1945. Not for 1960.

A conscious decision was made that the value of convenient testing outweighed public welfare. IMO, secrecy leads to a lack of accountability which leads to bad behavior -- a pattern seen time and again with nuclear weaponry and supporting industry.


Absolutely not for 1960. For example, the plot of this 1954 movie is based on Jerry Lewis' character being expected to die merely for driving over an old nuclear test site:

https://en.wikipedia.org/wiki/Living_It_Up


The dangers of acute radiation poisoning were certainly well known by then. Two Manhattan Project scientists died in 1945 and 1946 from radiation poisoning, and of course there were a lot of Japanese civilian deaths.

The long-term danger of lower doses is a rather different thing, though, and that took a lot longer to be well understood. There are a lot of things which are dangerous in large doses but harmless or even beneficial in small doses, after all.


No. Even just the first page of the 1964 paper that reviews the previous work mentions work done at Hanford since the mid-forties (that is, since around 1945).

http://www.sciencedirect.com/science?_ob=PdfExcerptURL&_imag...

About that work at Hanford:

http://www.hanfordproject.com/atmospheric.html

"Classified studies done at Hanford revealed how radioiodine traveled through the food chain, increasing in concentration as it moved upwards into higher consumer species. Clandestine studies conducted at Hanford revealed how high dosage levels of ingested iodine-131 in sheep showed "virtually complete destruction" of their thyroids and those of their offspring. [3] Hanford officials repeatedly assured the public that, "not one atom" has escaped the facility. At one press conference officials stated bluntly that the facility was "safe as mother's milk" even as they were well aware of the possibility of compromising the health of both workers and the general public through Hanford's plutonium production activities. [4]"

And about Hanford as the facility:

"For more than forty years, Hanford released radioactive contamination into the environment while producing plutonium for the U.S. nuclear arsenal during the Cold War era. Although the majority of the releases were due to activities related to production, some were also planned and intentional."


That paper appears to discuss how iodine is absorbed and transported within the body, but not the long-term effects of it. Similarly, your italicized Hanford quote mentions acute effects from high doses, not long-term effects from low doses.

I certainly could be wrong, but you and the others replying to me do not seem to be doing a very good job distinguishing between short term, high-dose effects (which were pretty well known) and long term, low-dose effects (which seem to have taken a lot longer to figure out).


> not the long-term effects of it

We have known the structure of DNA since 1953, and we know that radiation particles damage it as soon as they hit it. There's no known mechanism under which the long-term effects can be anything but negative.

The only ones who "benefited" are comic-book heroes like Spiderman. But do I have to point out that they don't exist? The only others who "benefit" are those running the nuclear projects.

And I'm not against nuclear in general. If that's the best we can do to provide the energy we want to use, we have to do it. But effectively inventing health "benefits" is too much.


Although that's totally the common understanding, surprisingly there's rather a lot of evidence (https://en.wikipedia.org/wiki/Radiation_hormesis) at this point that it isn't actually true; it's just a little hard to say "a little radiation might be good for you" without people writing you off as some sort of crazy person at the outset, so it's not talked about very much.

What you're describing -- and what most people who don't have a specialized education in the nuclear field commonly believe -- is something called the "linear no threshold" hypothesis (https://en.wikipedia.org/wiki/Linear_no-threshold_model) of radiation exposure, and although it's commonly used because it's definitely the safest and most conservative way of responding to radiation exposure, it's rather controversial and probably wrong.

If you're curious and want to find out more, there's a lot of fascinating material about the subject available publicly, including some analyses of life expectancies in areas that have much-higher-than-normal background radiation (off the top of my head, Kerala, India is one such locality).
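And to be concrete about what the LNT model mentioned above actually asserts, here's a minimal sketch (the ~5% excess lifetime cancer risk per sievert is the commonly cited ICRP round figure, used here purely as an illustrative assumption, not an authoritative value):

    # Sketch of the LNT assumption: predicted excess risk scales linearly with
    # dose, all the way down to zero -- no threshold and no hormetic region.
    RISK_PER_SIEVERT = 0.05  # illustrative round figure only

    def lnt_excess_cancer_risk(dose_sv: float) -> float:
        return RISK_PER_SIEVERT * dose_sv

    print(lnt_excess_cancer_risk(0.01))  # 10 mSv -> 0.0005, i.e. ~0.05% predicted excess risk

Hormesis, by contrast, would mean the dose-response curve dips below zero (a net benefit) somewhere in the low-dose region.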


All of the research from the timeframe of atmospheric atomic testing is no doubt classified.

There is plenty of evidence suggesting that exposure to low levels of radiation for extended periods increases cancer probabilities.


I'm not disputing that there's a cancer risk. But was that known at the time?


> But was that known at the time?

Yes. They even knew they were going to fake the results:

http://www.ieer.org/latest/iodnart.html

"For example, in 1953, the Public Health Service was asked to obtain milk samples in St. George, Utah, near the test site. But the service took a sample from a carton of milk purchased in a store, not from a local farm or dairy -- at a time when the majority of residents of southwest Utah obtained milk from their own cows and many others purchased milk from neighboring farms.(12)

According to Morgan S. Seal, a fallout monitor with the Public Health Service, the testing procedure was not very useful either. "In the case of milk, we even treated it with perchloric acid to get rid of all the organic residue....we knew for a fact then that those oxidating techniques completely eliminated any iodine in the material that you were treating."(13)"


This once again appears to confuse short term acute effects with long term cancer risks.

It should be pretty clear just from looking at the timeline. How are they going to discover a long term cancer risk by studying the population only eight years after exposure began?


I'm sure there were strong suspicions. The practices of the era were... questionable in many moral, ethical and legal dimensions.

(http://nsarchive.gwu.edu/radiation/dir/mstreet/commeet/meet1...)

"From 1963 to 1965, at the Atomic Energy Commission National Reactor Testing Station in Idaho, radioactive iodine was purposely released on seven separate occasions. In one of these experiments, seven human subjects drank milk from cows which had grazed on iodine-contaminated land. This experiment was designed to measure the passage of iodine through the food chain into the thyroids of human subjects. In a second experiment, three human subjects were placed on the pasture during iodine release, and seven subjects were placed on the pasture in a third experiment. In addition, "several" individuals were contaminated during yet another experiment when vials of radioactive iodine accidentally broke. Cows grazed on contaminated land and their milk was counted in four of the experiments; in the remaining three, radiation measurements were made only int he pasture. (Category 10.001, Number 173)."

"During the 1960s, at the Los Alamos Scientific Laboratory, 57 normal adults were fed microscopic spheres containing radioactive uranium and manganese. These experiments were designed to determine how fast such spheres would pass through the human body after ingestion. It was believed that particles of this size could be produced by the atmospheric reentry and burnup of rockets propelled by nuclear reactors, or of radioactive power supplies. (Category 1.003, Number 106)."

"From 1961 to 1963, at the University of Chicago and Argonne National Laboratory, 102 human subjects were fed real fallout from the Nevada Test Site; simulated fallout particles that contained strontium, barium, or cesium; or solutions of strontium and cesium. This experiment was designed to measure human absorption and retention of these radioactive substances. (Category 11.001, Number 186, Part A)."


Trivial point of logic: knowledge of the long-term risk of low-level exposure was arrived at by having this kind of low-level exposure and compounding it over the duration of those long-term periods...


Radiation hormesis is still a debate:

https://en.wikipedia.org/wiki/Radiation_hormesis


So who actually tries to make this a "debate"? The world effectively agrees (from your link):

"Reports by the United States National Research Council and the National Council on Radiation Protection and Measurements and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) argue[16] that there is no evidence for hormesis in humans and in the case of the National Research Council, that hormesis is outright rejected as a possibility despite population and scientific evidence.[17] Therefore, estimating Linear no-threshold model (LNT) continues to be the model generally used by regulatory agencies for human radiation exposure."

There isn't even any logical argument for how radiation, which even at the lowest doses damages DNA whenever it hits it, could produce beneficial mutations instead of simply causing cancer. That chemical agents can be beneficial in low doses isn't the issue, because radiation isn't a chemical agent but a DNA-damaging one.

Note: "Radiation hormesis is the hypothesis that low doses of ionizing radiation (within the region of and just above natural background levels) are beneficial."

No proof of that.


You are misinformed. There is absolutely a debate about the effects of radiation at very low doses. This is not like climate change, where there is a scientific consensus. Look at, e.g., the April 2009 issue of Radiology, which featured the competing articles "The Linear No-Threshold Relationship Is Inconsistent with Radiation Biologic and Experimental Data" [1] and "Risks Associated with Low Doses and Low Dose Rates of Ionizing Radiation: Why Linearity May Be (Almost) the Best We Can Do" [2].

[1]http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2663584/

[2]http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2663578/


There is a "debate" like in all areas where there are big money interests. How much of scientists are on "there's nothing" and how much are on the "there's a debate" side?

Linearity in the observed statistics is absolutely not the same as "beneficial." Even if the statistics gets "fuzzy" for low enough doses who can actually claim the real benefits?

Also: the actual dosages matter. The low enough dosages are something where we simply don't have anything to compare to as "full absence". We get some level of radiation all the time. Even eating a banana gets you a little of the radioactive stuff. And we (as animals) get cancer all the time.

http://www.livescience.com/9680-cancer-kills-wild-animals.ht...


> The low enough dosages are something where we simply don't have anything to compare to as "full absence".

Which is exactly why the Wiki article I originally linked discusses laboratories specifically built to reduce background radiation in order to further this research. (Which is certainly a strange step to take, given that there's apparently no debate.)

https://en.wikipedia.org/wiki/Radiation_hormesis#Effects_of_...


No, independently of your claims, these proposed and never-performed experiments could theoretically provide insight into how many of the observed mutations occur due to the existing radiation to which we're permanently exposed. We have simply never observed organisms under radiation levels as low as those the experiments planned to use.


Really? That very quote says that they reject it "despite population and scientific evidence."

And from further down in the article:

"Until the [...] uncertainties on low-dose response are resolved, the Committee believes that an increase in the risk of tumour induction proportionate to the radiation dose is consistent with developing knowledge and that it remains, accordingly, the most scientifically defensible approximation of low-dose response. However, a strictly linear dose response should not be expected in all circumstances."

There are a lot of confounding factors that make studying low-level radiation doses hard, including background radiation and the long timelines involved. I'm perfectly fine with regulatory agencies working under the assumption of an LNT model until conclusive proof otherwise is provided. These agencies should be focused on safety and lean conservatively when in doubt. But attempting to call it not a debate is misleading at best.


Read the linked text, don't cherry pick. "Evidence" as in "one single study, contrary to all other studies" and as "the DNA was less damaged with the lower doses."

It's a long jump from "the damaged DNA is, percentage-wise, better repaired at low radiation doses than at higher ones" (if even that is provable, as opposed to simply "the DNA was hit less") to any reliable proof that low doses are actually beneficial.

Again, who actually has an interest in presenting this unsupported "beneficial" claim as a "debate"?


I'm not cherry-picking anything; these are quotes from the published opinion. And both leave space for doubt, because the science is not conclusive. As I said, I support their decision; it's definitely a "better safe than sorry" situation.

And why does there have to be an agenda, as opposed to simply being an unresolved scientific point of contention? We don't know if there's life on Europa either, and I'm sure there are scientists on both sides. But I wouldn't say any of them have an agenda except for making the best educated guess they can, lacking any solid evidence one way or another.

EDIT: Also, to reply to an earlier point:

> There isn't even any logical argument for how radiation, which even at the lowest doses damages DNA whenever it hits it, could produce beneficial mutations instead of simply causing cancer.

It can also just kill the cell. If it kills problematic or pre-problematic cells disproportionately more than healthy cells, then the net effect could be positive. Such disproportionate effects are the very basis of otherwise indiscriminate treatments like chemotherapy [0].

[0] https://en.wikipedia.org/wiki/Chemotherapy#Mechanism_of_acti...


> And why does there have to be an agenda, as opposed to simply being an unresolved scientific point of contention?

Because there's no way that radiation particles don't damage DNA if they hit it. The "debaters" don't have any scientific explanation for how it could even be "harmless," let alone "beneficial."

Chemotherapy is just "let's kill enough of this guy's fast-replicating cells to kill the cancer cells before they kill him." There's absolutely no connection there to radiation's effects on DNA.


We're constantly bombarded with radiation simply by living. We wouldn't make it to adulthood if all DNA damage were lethal. DNA damage is more likely to be repaired or to outright kill a cell than to cause cancer, just by pure numbers.

> Chemotherapy is just "let's kill enough percentage of this guys cells to kill the cancer cells before they kill him."

That's exactly what I said with "otherwise indiscriminate treatments." Our body is not a homogeneous mass of one type of cell; different cells react differently. If cancer cells react disproportionately negatively to a substance compared to other cell types, then that substance could be useful for treating cancer. All forms of hormesis are basically an extension of this idea: that the harm done by a substance can be otherwise offset by positive effects.


> If cancer cells react disproportionately negatively to a substance

Radiation is not a "substance," it's radiation: a destructive and completely non-selective bombardment of all the molecules it reaches. It's not a chemical effect at all, but a nuclear one.

I suggest that those who claim additional low doses are beneficial start exposing themselves to them. Somehow I doubt they'd do it. It's always for somebody else.


There's no substance here other than being pedantic and silly in equal measure.


It's really rare for me to see somebody deny knowing the difference between physical and chemical processes and then keep repeating it as their argument.


I was using the chemical process of chemotherapy as an example of a disproportionate effect of a process. The specific process chosen by me was immaterial to the argument, except that it was a convenient example of the effect I wished to describe. My use of the word substance was perhaps imprecise, as it is usually reserved for items with mass, which ionizing radiation does not have. But it was, again, immaterial to the idea of a disproportionate effect, which is what I was actually conveying. And I am choosing to label as pedantic the attempt to disarm my point by discussing the word chosen as opposed to the point being made.


Damaged DNA gets repaired. The proposed mechanism for hormesis is that low levels of radiation stimulate the repair process and get the immune system to step in earlier. I have no idea if this is really true, but it doesn't seem absurd on its face.


There's no observed instance of such "earlier repair"; it's just an invented "hypothesis." Only the non-linearity is observed, and not even by a majority of the experiments. The explanation for the observed non-linearity is that single impacted cells simply have time to die (or be targeted by white blood cells) at low levels, whereas at higher doses there are enough affected cells for cancer to progress more often. It's just a game of probabilities at levels where there is enough noise (low enough doses).


It seems like another possibility is that Kodak selling fogged film represented an intelligence risk. Kodak distributes film internationally; an adversary could probably correlate production dates with atomic tests and learn about fallout quantities, species, etc without ever entering the US.

So the decision may have been made solely on those grounds.


That doesn't explain why they continued to detonate bombs within the continental US around farmland. I agree that they couldn't roll back what had already occurred, but that ignores that this took place halfway through the most intense bomb testing.


I think this is a somewhat unfair representation of the article, which contains rather less speculation, at least proportionately, than your post here - it does, for example, discuss what was believed, and when, about the risks of fallout.


The article states that the AEC knew radioactive iodine would contaminate milk and enter the population, but is quite vague on the state of knowledge of the dangers of that contamination at the time. Skimming the 1998 Senate hearing linked in the article, the link between radioactive iodine ingestion and thyroid cancer was apparently unsure even in 1998, although considered likely by then.


>Skimming the 1998 Senate hearing linked in the article, the link between radioactive iodine ingestion and thyroid cancer was apparently unsure even in 1998, although considered likely by then.

Skimming certain recent senate hearings, one might conclude that global warming is quite possibly a dark conspiracy by climate scientists.

I have worked in the nuclear industry prior to 1998, and I can tell you that the risks were well enough understood by then.

If the risks were not suspected at the time of the tests, why would anyone be thinking of disposing of milk?


The article says they argued that disposing of the milk would cause malnutrition. An obvious conclusion would be that they thought the risks from the contamination were smaller than the risks of malnutrition. They may well have known of some risks but not understood the magnitude.

When was it understood that radioactive iodine caused thyroid cancer, and that it could be mitigated by using iodine supplements? I couldn't find any info on exactly when that whole chain was put together (not that I tried super hard).


> disposing of the milk would cause malnutrition

You don't actually need to dispose of it - Iodine-131 has a very short half-life, so just make powdered or condensed canned milk, store it for 2 months, and it's safe.
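As a rough check, using I-131's ~8-day half-life and the same simple decay math as earlier in the thread:

    print(0.5 ** (60 / 8.02))  # ~0.006 -- well under 1% of the original I-131 activity remains after two months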


It is not that obvious when you take human nature into account - in fact, there seems to be a strong implication that the statement was transparently self-serving, and some other statements by the AEC undoubtedly were.


I've always thought that the well-coiffed narrator in this "Report on the Effects of Radiation Fallout" (included in Peter Kuran's documentary Trinity and Beyond) perfectly encapsulates the attitude of the military establishment about the known fallout hazard:

https://youtu.be/fCJm0jBuTo0?t=3820

"The development of our nuclear power has been an absolutely necessary protection against the development of the Communist threat. In this view, the few casualties, if any, must be seen as those of unidentified soldiers in a war in the service of humanity..."


In that light, read the statement by Brzezinski (the President's National Security Advisor at that time) regarding the US support of Islamism in Afghanistan starting in 1979 (the same year the Islamic revolution happened in neighboring Iran):

"Q: And neither do you regret having supported Islamic [fundamentalism], having given arms and advice to future terrorists?

Brzezinski: What is most important to the history of the world? The Taliban or the collapse of the Soviet empire? Some stirred-up Moslems or the liberation of Central Europe and the end of the cold war?"

Quite consistent.


Brzezinski was right. Statesmanship is about tradeoffs and prioritization. This was a great bargain.


Reminds me of Sun Microsystems' 2001 issue with cosmic rays scrambling SRAM caches in customers' servers... crashing them.

http://www.sparcproductdirectory.com/artic-2002-jan-pb.html


Sure, if the radiation was caused by bomb tests where the government knew about the risks, and the machines were running a large percentage of life-support systems.


Or, you know, maybe a form of radiation creating initially confusing side effects within a commercial product.


"You should love your country— but never trust its government."


Extrapolating from the fallout patterns from the map, it appears that Canadians would have been among the more affected victims.

Can anyone comment on how the inquiry and response was handled in Canada? Do we know what the statistics were for the resulting cancers here?


Does this mean Kodak has a lot of nuclear testing related information that can be had without going through FOIA requests?


That's... a weird question. That's like saying 'maybe Kodak has some money I could get without going through applying for a tax rebate'.


It is comic that they would protect milk for children, when the DOE was experimenting on unwilling hospital patients: http://fas.org/sgp/othergov/doe/lanl/pubs/00326649.pdf

Not so much "Government for the People". Why such decision makers are above the law is baffling, really.


"There is something wrong with this picture."



Title should note this is from 2013.


There is a discussion on reddit about how many people got cancer from this and died way before their time because farms got so much radiation fallout from rain.


If you want to read more on this type of subject, check out "The Plutonium Files" by Eileen Welsome. Pretty scary.

http://www.amazon.com/The-Plutonium-Files-Americas-Experimen...


I hope people realize that every living being on the planet now has been contaminated by the atomic tests. Not a one of us escaped that one - if you were born after 1955, that is.


I wonder if the Morton salt company was subsidized to add iodine to the national salt supply as a prophylactic measure.


That's to avoid iodine deficiency. Adding that pretty much eliminated the condition.


Has anyone estimated how many early deaths were caused by the radiation exposure?


Indiana is a heck of a long way from Trinity, and Trinity was an atomic bomb with far less fallout than a hydrogen bomb. So while the latter clearly caused health problems for Americans, I'm skeptical of the claims that photos were affected by Trinity.


You would need to look at the weather patterns. As it turns out, there is a reasonably steady movement of atmosphere diagonally up and to the right from the Gulf of California into Maine. If you have been watching this year's weather you can see it in action. What is more salient is that moist air coming up from the Gulf of California begins to condense out into rain in the plains. The transport mechanism is fairly well understood and well documented.

That said, the film was susceptible because it was packed with corn husks over an extended period of time. The combination of sensitivity, and extended exposure, led to the film fogging.


So, you're suggesting it was one of the many other atomic bomb tests carried out in 1945? The article (and other related information I've read in many sources on the Manhattan Project) is pretty clear about the connection.

Fallout got into the corn being grown in Indiana, which was then used to make packing material for Kodak, and that fogged the film...


No, I'm suggesting the timeline is wrong and the fogging either is apocryphal or occurred later during hydrogen bomb testing.



