[ RadSafe ] Re: Differences in Background radiation and disease incidence
Otto Raabe
ograabe at ucdavis.edu
Fri Feb 9 21:30:27 CST 2007
At 07:04 PM 2/9/2007, Eric D wrote:
>Has anyone addressed the problem of low dose effects by estimating how large
>the radiation risk coefficient would need to be to have statistically
>significant differences in disease rates between areas of differing
>backgrounds?
**********************************
Assuming the ICRP linear model of cancer risk from exposure to ionizing
radiation, with a risk coefficient of 0.05 per Sv (0.0005 per rem), and
assuming the average annual dose in the U.S. from natural background
ionizing radiation is 0.3 rem per year, a 70-year-old person would have
received about 21 rem from background over a lifetime. The lifetime cancer
risk predicted by the linear mathematical model would be 21 rem x 0.0005
per rem, or about 1%. About 35% of people in the U.S. develop cancer
during their lifetime, so a 1% "risk", even if it were correct, would be
lost in the noise.

As you can verify by checking the American Cancer Society tables, those
States with the highest background radiation tend to have the lowest
cancer rates. For example, Colorado has one of the lowest cancer rates
among the States, yet the background radiation exposure for residents of
Colorado is more than twice the national average. Obviously, there are
other factors besides radiation that are more important in determining
cancer risk. In fact, there is no known or expected risk associated with
normal levels of natural background radiation.
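
For anyone who wants to plug in their own numbers, here is a minimal
sketch of the arithmetic above; the dose rate, exposure period, and risk
coefficient are simply the values assumed in this post, not measured data:

    # Minimal sketch of the LNT (linear no-threshold) arithmetic above.
    # All inputs are the assumptions stated in this post.

    RISK_PER_REM = 0.0005      # ICRP linear risk coefficient (0.05 per Sv)
    DOSE_REM_PER_YEAR = 0.3    # assumed average U.S. background dose
    YEARS = 70                 # assumed lifetime of exposure

    lifetime_dose_rem = DOSE_REM_PER_YEAR * YEARS              # ~21 rem
    predicted_risk = lifetime_dose_rem * RISK_PER_REM          # ~0.0105

    print(f"Lifetime background dose: {lifetime_dose_rem:.1f} rem")
    print(f"LNT-predicted lifetime cancer risk: {predicted_risk:.1%}")
    # Prints about 1%, against a ~35% baseline lifetime cancer incidence.

The point of the comparison is that a roughly 1% predicted increment
sits on top of a ~35% baseline, which is why it disappears into the
state-to-state variation in cancer rates.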
Otto
**********************************************
Prof. Otto G. Raabe, Ph.D., CHP
Center for Health & the Environment
University of California
One Shields Avenue
Davis, CA 95616
E-Mail: ograabe at ucdavis.edu
Phone: (530) 752-7754 FAX: (530) 758-6140
***********************************************