
Note this HPS Newsletter article





About the Validity of the LNT Hypothesis



Gunnar Walinder, PhD, with John Ahlquist, CHP



A fundamental doctrine within the radiation protection community (the linear

no-threshold [LNT] hypothesis) originally arose in the 1950s. Two congresses

were held in Stockholm, in 1952 and 1956 (Advances in Radiobiology 1957).

The dominant personality at these meetings was the Nobel Prize winner

Hermann J. Muller. The report, Advances in Radiobiology, from the 1956

congress was dedicated to Muller.



In 1927 Muller reported that x rays could induce mutations in male fruit

flies. In his investigations Muller found that the mutation rate among the

male fruit flies was linearly related to the radiation dose. Furthermore, he

could not find any threshold doses, that is, doses below which mutations

could be excluded. The reputation of Muller helped lead to a widespread

adoption of this linear no-threshold model as a generally valid relationship

between radiation doses and genetic effects. Simple reasoning led to the

conclusion that this should also hold for basically genetic diseases such

as cancer. At this time, the general understanding of

carcinogenesis was that it is a process by which one normal cell, in one

step, transforms into a malignant state (a malignant phenotype).



Probably, under the influence of the target theory, the genetic effects (no

matter whether they lead to congenital damage or cancer) were considered to

be stochastic. They were the result of an "all or nothing" event.

Consequently, stochastic effects in irradiated populations could be studied

by statistical models. The statistical variations were accordingly not a

consequence of variations in the human sensitivity to radiogenic cancer. On

the contrary, they were considered entirely independent of such variations.

Ionizing radiation could increase the rate of tumors, but not their

severity. Thus, radiogenic cancers could be arithmetically added to the

existing cancer rate.



This was exactly what the target theory had predicted with respect to lethal

effects in irradiated prokaryotes and in isolated, eukaryotic cells.

However, by the 1950s the theory had already become discredited, since it

did not "work" in higher organisms or, perhaps more correctly, in the

presence of water. Accordingly, most of the genetic and tumorigenic effects

in higher organisms were not direct effects of the ionizations but mediated

by radicals produced by the ionizations. However, if so, the genetic effects

of the ionizations could not be stochastic. Instead, they were highly

dependent on the actual state of the cell and on the presence of other

intracellular substances, for example oxygen, amines, etc.



Muller carried out his investigations with male fruit flies. Had Muller used

female fruit flies, he would have discovered that no mutations whatsoever

would have appeared after radiation doses below about 800 mSv. That was what

Seymour Abrahamson later found in female fruit flies and W.L. and L.B.

Russell found in female mice. This has led the United Nations Scientific

Committee on the Effects of Atomic Radiation to present two entirely

different tables for genetic effects, one for men and another for women.

Considering Muller's great influence, would we then have had a radiation

protection doctrine stating that low radiation doses cannot induce

genetic damage and cancer? Is the idea of the absence of threshold doses

only a matter of chance? If so, this is only another example in

biological science of how general conclusions have been drawn from single

experiments and/or from single strains and sexes.



However, there were many people who were reluctant to accept the new idea of

the LNT. Rolf Sievert found it difficult to reduce complex biological

phenomena such as heredity and cancer to a straight line. He simply did not

believe in stochastic, biological effects and his arguments were very

similar to those later expressed by Lauriston Taylor. The same opposition

could also be found among the oncologists at Radiumhemmet in Stockholm, of

whom perhaps the most eloquent spokesman was Dr. Lars-Gunnar Larsson. They

claimed that the drawing of straight lines has nothing to do with biology

and such methods could never constitute a model of a biological process and,

least of all, the complex kind of dysdifferentiation that we call cancer.



Due to my youth and limited experience, I kept silent about the controversy.

However, Sievert noticed my hesitation and we had some discussions where we

aired our reluctance to accept stochastic biological models, though from

somewhat different starting points. At this time I was working as a

mathematical physicist with

some interest in physical epistemology. I was puzzled by the fact that

people seemed to be fundamentally unaware of what had been going on within

this field since the 1880s.



When one speaks with scientists about philosophy and epistemology, they

usually look somewhat absentminded and mumble something about metaphysical

blatherskite. However, exactly the same reluctance was the reason why the

physicists at the end of the 19th century criticized the old theory of

knowledge. The famous physicist and thinker Ernst Mach (1959; 1970) brought

his fist down on the table and said that we have to terminate all these

metaphysical theories of knowledge and system building. We have no need for

speculations. What we really need are objective facts and scientific

analyses. We cannot acquire knowledge of nature and the things

themselves but only of our sensory impressions of them.



In this respect, his views were soon approved by almost all physicists and

by many epistemologists. Some decades later, Niels Bohr pointed out the same

thing by saying that physics does not deal with nature but with what we can

know about it. 



In his comprehensive work Die Prinzipien der Mechanik, Heinrich Hertz (1894;

1956) was able to show that mathematics could be used as a consistent and

general model of mechanics. Boltzmann showed that Hertz's model could be

extended to thermodynamics (and other branches of physics) by means of

statistical mathematics. Mathematics is, of course, not identical with

reality-nature; however, it fulfills the necessary conditions (consistency

and generality) for such a model. I would say that without this model,

modern physics would have been impossible. Hertz also, very strongly,

pointed out the limitations of the mathematical model. For example,

mathematical and physical methods could not be applied to even the simplest

living organism. This conclusion has, again and again, been repeated by

subsequent physicists. It has puzzled me a great deal that no physicist

working in the field of radiation protection seems to have been aware of

this fundamental, epistemological principle of 20th-century physics.



Still, 50 years later, the basic doctrine in radiation protection is

expressed (for low doses and dose rates) by the simple formula:



N = 0.05 x D



where N is the number of radiogenic cancer cases and D is the collective

dose (expressed in manSv).
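The formula amounts to a one-line calculation. The sketch below (Python; the function and variable names are my own illustration, not from the article) simply restates N = 0.05 x D and shows the no-threshold property the article goes on to criticize: the predicted number of cases scales linearly all the way down to arbitrarily small collective doses.

```python
# Illustrative sketch of the quoted LNT collective-dose formula:
# N = 0.05 * D, where D is the collective dose in manSv (person-sievert).
# The 0.05 risk coefficient is taken directly from the formula above.

RISK_COEFFICIENT = 0.05  # predicted radiogenic cancer cases per manSv

def lnt_predicted_cases(collective_dose_man_sv: float) -> float:
    """Number of radiogenic cancer cases predicted by N = 0.05 x D."""
    return RISK_COEFFICIENT * collective_dose_man_sv

# Linear with no threshold: halving the dose always halves the prediction,
# and even a vanishingly small dose yields a nonzero predicted case count.
print(lnt_predicted_cases(1000.0))   # about 50 predicted cases
print(lnt_predicted_cases(0.0001))   # still a nonzero prediction
```

The point of the sketch is only that the model has no structure beyond a single multiplicative constant: it is independent of population, tumor type, and living habits, which is precisely the objection raised below.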



This formula is considered valid for all populations and independent of

living habits and other factors that normally are considered of significance

for tumor formation. Advocates of this equation cannot possibly have any

knowledge of the generic category of disparate diseases to which we have

given the common name cancer. Nor can they have any idea about the epistemological

prerequisites for using mathematical models. As a physicist I have, of

course, always applied mathematics to my problems. However, this mathematics

has to be adjusted to the specific task. To me, it is impossible to

understand how one and the same formula can be used as a collective model

for all disparate forms of cancer. How should we explain the fact that

various forms of cancer have different dose-response relationships, and that

some tumors cannot, on the whole, be induced by ionizing radiation (for

example, such common forms as uterine and prostate cancer)?

How can anyone believe that such extremely complex processes as the general

carcinogenesis can be adequately described by an equation of the first

degree? This model obviously does not fulfill any demands for consistency or

generality. The formula is not only generally considered valid, it is also

said to be applicable at "homoeopathic" radiation doses. What an unbelievable

pretension to knowledge: "We know everything and we are able to give

quantitative figures of infinitesimally small radiation risks." It reminds

me of Molière's comedies. Could we not hope that, in the reasonably near

future, such pretensions of knowledge will give rise to the same roar of

laughter as is the case with the precious figures in Molière's comedies? In

no other scientific field have such deeply unscientific claims been made.



Of course, we don't know everything and we can, on grounds of principle,

never achieve knowledge of everything. There are epistemological limits in

biology as in all other sciences. This must be realized, and we are

scientifically obliged to admit this fact. Radiation risks cannot be treated

in a way that differs from all other kinds of risk analyses.



As I have already mentioned above, the primary, interdependent actions

between the ionizations and various radicals within the cells preclude the

possibility of stochastic interpretations of the observed outcomes. Modern

oncology has also clearly shown that the transformation of a cell into a

malignant phenotype is a multistep process that demands several changes in

different parts of the genome. All these changes cannot be caused by a low

radiation dose. Thus, here too, the malignant contribution of the radiation

is dependent on the presence or future emergence of other, necessary genetic

effects.



However, the most fundamental uncertainty is connected with the fact that

cancer is a group of highly organismic diseases. This is often forgotten by

the cytologists who consider malignant cell transformations as synonymous

with cancer. However, as the Swedish oncologist Georg Klein (1979) has

pointed out, a malignant transformation in vitro is not synonymous with

cancer in vivo. A single malignant cell is not synonymous with cancer.

Before we can speak of cancer, this transformed cell has to give rise to

about one billion cells; that is, every tumor cell has, on average,

undergone about 30 divisions. This is a multi-iterative process with a

repeated synthesis of DNA. All iterative processes have two characteristics

in common: their outcome is unpredictable, and a small,

often undetectable factor (technical term: friction) can totally change this

outcome. For this reason too, the malignant outcome of a low radiation dose

is, on grounds of principle, fundamentally unpredictable. Murray Gell-Mann

has defined cancer as "a multi-iterative process in a complex, adaptive

system (organism)." This uncontrolled growth is the basic characteristic of a

malignant, fatal development.
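The arithmetic behind the "about one billion cells from roughly 30 divisions" figure is simple doubling. The short sketch below (Python; my own illustration) counts how many successive doublings a single transformed cell needs before its clone exceeds 10^9 cells.

```python
# Doubling arithmetic behind the "about 30 divisions" figure quoted above.
# A single transformed cell doubles repeatedly; count the doublings needed
# for the clone to reach roughly one billion cells.

cells = 1
divisions = 0
while cells < 1_000_000_000:  # ~one billion cells
    cells *= 2       # each division doubles the clone
    divisions += 1

print(divisions)  # 30 doublings: 2**30 = 1,073,741,824 cells
```

Since 2^29 is only about 5.4 x 10^8, it takes exactly 30 doublings to pass one billion, which is the origin of the figure in the text.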



It has often been said that applying a model that is safer than necessary

can only be of value in itself. However, this argument loses its

weight when it leads to an incautious cautiousness, that is, when the

measures taken to get below the maximum permissible radiation doses or to

keep action levels imply greater dangers than those connected with the radiation.

There are many examples. The World Health Organization and the International

Atomic Energy Agency have estimated that more than 100,000 entirely

unnecessary abortions were performed in Europe after the Chernobyl accident. Many

energy sources (including CO2- and SO2-producing fossil fuels) have been

preferred to nuclear power. After the Chernobyl accident some people were

evacuated to Kiev from low-contaminated, rural areas in the Ukraine, in

spite of the fact that the cancer rate in cities like Kiev is 20 to 30%

higher than that in rural areas. Many other examples can be given.



Conclusions



1. Mathematical models cannot be applied to living organisms or to

fundamental living processes like differentiation, including the

dysdifferentiations that may lead to cancer.



2. On grounds of principle, the outcome of multi-iterative processes like

cancer cannot be predicted at low levels of exposure.



The LNT hypothesis is thus a primitive, unscientific idea that cannot be

justified by current scientific understanding. As practiced by the modern

radiation protection community, the LNT hypothesis is one of the greatest

scientific scandals of our time.



___________________



References



de Hevesy G, Forsberg A, Abbatt J, eds. Advances in radiobiology; 1957.



Hertz H. Die Prinzipien der Mechanik (Gesammelte Werke III); 1894. American

edition: Cohen RS, ed. The principles of mechanics presented in a new form;

1956.



Klein G. Contrasting effects of host and tumor evolution. In: Fortner JG,

Rhoads JE, eds. Accomplishments in cancer research. General Motors Cancer

Research Foundation. 123-146; 1979.



Mach E. The analysis of sensations. New York; 1959.



Mach E. My scientific theory of knowledge and its reception by my

contemporaries. In: Toulmin S, ed. Physical reality: Philosophical essays on

twentieth century physics. New York; 1970.



Walinder G. Has radiation protection become a health hazard? Madison, WI:

Medical Physics Publishing; 2000. 


