The Low Level Radiation Campaign (LLRC) Press Release (6/20/13)
A new review shows the conventional radiation risk model cannot be used to predict health effects of radioactivity inside the body.
On May 22 InTech (http://www.intechopen.com) published a review of evidence that DNA damage from inhaled and ingested man-made radioactivity is causing serious health effects. This is the first time such a wide-ranging review of the genetic mechanisms of harm from nuclear discharges has been published in the scientific literature.
The review, by Professor Chris Busby, is entitled "Aspects of DNA damage from internal radiation exposures". It appears in a book called "New Research Directions in DNA Repair". It vindicates the belief that incorporated (internal) radioactivity is more dangerous than predicted by the International Commission on Radiological Protection (ICRP). Much of the information reviewed has been in the literature for decades but has been sidelined or ignored.
The evidence shows that ICRP's use of "absorbed dose" is invalid for many radionuclides when they are internal. "Absorbed dose" is based on an external irradiation paradigm and therefore averages the energy of radioactive decays across large volumes of body tissue. By contrast, some forms of radioactivity expose DNA to high densities of ionisation. The review defines and discusses six situations where genetic damage is massively more likely than from external radiation at the same "dose": 1) biochemical affinity for DNA, 2) transmutation, 3) hot particles, 4) sequential emitters (the "Second Event Theory"), 5) low-energy beta emitters, and 6) the "Secondary Photoelectron Effect".
- Some substances (for example Strontium-90 and Uranium) have high biochemical affinity for DNA so a large proportion of what is inside the body will be chemically bound to DNA. For this reason the radiation events associated with them are massively more likely to damage DNA structures than the same dose delivered externally.
- Transmutation, where the radioactive decay of a radio-element changes it into a different element (e.g. Carbon-14 changing to Nitrogen), has mutagenic effects far greater than would be expected on the basis of "absorbed dose". This has been known since the 1960s but it has been ignored by risk agencies such as ICRP, UNSCEAR and BEIR.
- Hot particles, especially those which emit very short-range alpha radiation, have obvious implications for high local doses to tissue where they are embedded.
- The "Second Event Theory" concerns decay sequences in which a radionuclide decays to a short-lived daughter. Strontium-90 decaying to Yttrium-90 is an example: Yttrium-90 has a half-life of about 2½ days, so the first event (the Strontium-90 decay) may damage a cell's DNA, which then sets about repairing itself. The repair process is known to be very radiosensitive, and there is a finite probability that the second event (the subsequent Yttrium-90 decay) inflicts further damage which cannot be repaired.
- A good example of a low-energy beta emitter is Tritium. (Tritium is projected to account for 99.8% of the radioactivity in discharges from the "generic" design of reactor planned for the UK.) The review compares Tritium with Caesium-137. The very low decay energy of Tritium means that delivering the same absorbed dose as the Caesium requires 90 times as many radiation tracks from Tritium. This high track density, even at low doses, suggests a mechanism to explain experimental results showing that Tritium is a greater mutagenic hazard than ICRP would expect.
- Elements with large numbers of protons (e.g. Uranium, Plutonium) absorb external gamma radiation efficiently, re-emitting it as very short-range photoelectrons indistinguishable from beta radiation. This is known as the Secondary Photoelectron Effect (SPE). The review criticises papers which used Monte Carlo methodology in attempts to minimise the importance of SPE after New Scientist published a report on it in 2008.
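The timing argument behind the Second Event Theory can be put in simple probabilistic terms. The sketch below is purely illustrative and is not taken from the review: it assumes first-order (exponential) decay statistics, uses the standard Yttrium-90 half-life of roughly 64 hours, and assumes a hypothetical DNA-repair window of 10 hours, a number chosen here only for illustration.

```python
import math

# Illustrative sketch (assumptions flagged in comments, not figures from the review):
# probability that the second event (the Yttrium-90 decay) occurs within a
# DNA-repair window that opens when the first event (the Strontium-90 decay)
# damages a cell, assuming simple exponential decay statistics.

Y90_HALF_LIFE_H = 64.0  # Yttrium-90 half-life in hours (~2.7 days, standard value)

def prob_second_event(window_hours, half_life_hours=Y90_HALF_LIFE_H):
    """Probability that the daughter nuclide decays within `window_hours`
    of being formed, for the given half-life."""
    decay_const = math.log(2) / half_life_hours  # per hour
    return 1.0 - math.exp(-decay_const * window_hours)

# A repair window of ~10 hours is an assumption made here for illustration only.
p = prob_second_event(10.0)
print(f"P(second event within 10 h) = {p:.3f}")
```

With these assumed numbers, roughly one damaged-and-repairing cell in ten would receive the second hit inside the window, which is the kind of "finite probability" the theory turns on; the review's own treatment will differ in detail.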
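The track-density comparison for low-energy beta emitters follows from the definition of absorbed dose (energy deposited per unit mass): for a fixed dose, the number of decay tracks scales inversely with the mean energy per track. The snippet below is a minimal numerical sketch, assuming typical textbook mean beta energies of about 5.7 keV for Tritium and 187 keV for Caesium-137; these energies are assumptions for illustration, not figures taken from the review, and with them the track ratio comes out around 30, the same order of magnitude as the factor of 90 cited above (which presumably rests on a different energy comparison).

```python
# Illustrative sketch (not from the review): for a fixed absorbed dose,
# the number of decay tracks scales inversely with the mean energy per track.

KEV_TO_JOULES = 1.602e-16  # 1 keV expressed in joules

def tracks_for_dose(dose_gray, tissue_mass_kg, mean_track_energy_kev):
    """Number of decay tracks needed to deposit `dose_gray` (gray = J/kg)
    in `tissue_mass_kg` when each track carries the given mean energy."""
    total_energy_j = dose_gray * tissue_mass_kg
    return total_energy_j / (mean_track_energy_kev * KEV_TO_JOULES)

# Assumed mean beta energies in keV -- illustrative textbook values.
tritium_mean_kev = 5.7
cs137_mean_kev = 187.0

mass_kg = 1e-3  # 1 gram of tissue
dose_gy = 1e-3  # 1 mGy absorbed dose

n_tritium = tracks_for_dose(dose_gy, mass_kg, tritium_mean_kev)
n_cs137 = tracks_for_dose(dose_gy, mass_kg, cs137_mean_kev)

print(f"Tritium tracks:   {n_tritium:.3e}")
print(f"Cs-137 tracks:    {n_cs137:.3e}")
print(f"Ratio (T/Cs-137): {n_tritium / n_cs137:.0f}")
```

The point of the arithmetic is qualitative: the lower the energy per track, the more tracks are packed into the tissue for the same nominal "dose", which is the density argument the review makes for Tritium.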