
Re: Linear Model



Dr. Strom writes:
 
> I was going to stay out of this, but some of these comments require a response. 
> >Subject: Linear Model
> >Author:  HWADE@aol.com at -SMTPlink 
> >Date:    8/8/95 0:41
> 
> >1. Cohen's 1995 paper answers the previous criticisms that the "ecological 
> >fallacy" marred its conclusions. HP,v68,no.2.
> >Therefore, previous criticisms are no longer valid.
> 
> There are several issues regarding Cohen's work in addition to the ecologic 
> study: self-reported data not field validated, short-term measurements, 
> measurements taken long after causal exposures may have occurred, measurements 
> representing only a portion of an individual's exposure, no controls for 
> *individuals* on smoking, migration, etc.  

None of these "potential problems" has been shown to be an actual problem with
Cohen's study. Given the large database, with data that are not substantially
in question, and with the "retirement states" eliminated, no specific problem
on these topics has been identified that calls the results into question.

(And of course the results provide evident comparisons that would be totally
different if the "linear model" were true. But most importantly, the study shows
on its face that spending $20 billion to reduce radon in houses is a total waste,
and even a scam, separate from the argument about the precision of absolute
numerical values that may be affected by any supposed errors in these results.
Any other "conclusion" is to foster unjustified public fear for unseemly
purposes.)

> The ecologic fallacy is "An error in interpreting associations between ecologic 
> indices.  It is committed by mistakenly assuming that, because the majority of a
> group has a characteristic, the characteristic is related to a health state 
> common in the group" (Slome C, Brogan DR, Eyres SJ, Lednar W.  Basic 
> Epidemiological Methods and Biostatistics - A Workbook. Boston: Jones and 
> Bartlett, 1986, Chapter 9 & p. 306).  The problem with the ecological study 
> design is that it doesn't have individual doses linked to individual people.  
> No amount of "correcting" can get around this.

Again, this is a problem that may lead to incorrect results in specific cases.
The specific problems identified over the last 7 years have been addressed and
shown not to be relevant or to substantially affect the general results of
Dr. Cohen's study. To simply reject the study because of such a "problem", as
opposed to testing the validity of the study against the significance of the
condition, is to arbitrarily deny the data because you don't like the result.
(Also, these statements aren't Biblical truth, either as the authors write them
or as you use them here.  :-)  They are presented as a religious faith based on
no substance - not to denigrate religion, but it's not a basis for science,
which should address the data and the analysis. There's a lot riding on
epidemiology funding, the hyper-costly kind, here.)

> Studies with the ecological or geographical design are *hopelessly* flawed as a 
> basis for deriving causal inferences.  Counties don't get lung cancer, people
> do.  Furthermore, you can't "correct" for migration.  If you want to understand 
> lung cancer in Miami, then measure radon levels in New York and Havana 40 years 
> ago, where the people in Miami who are dying of lung cancer today would have had
> their causal exposures.  Cohen's *self-reported* measurements were made in the 

Your use of "hopeless" simply ignores the data rather than assessing their
validity. Miami is not in the database, for the valid reason you imply (as you
know). The remaining data, however, are valid unless shown to be affected in a
skewed way by the out-migration of people to Miami (and the other "immigration
centers").

> 1980's and 1990's, not in the 1950s.  One properly-done study linking doses 

Yes. A potential problem. But you don't show that radon levels (or more
precisely, _relative_ home radon levels) by geographic area are likely to have
changed substantively from the 1950s to the 1990s (and they obviously have not,
so this "problem" is shown to be not relevant to this study).

> (before diagnosis, with an appropriate lag-time to account for latency) to 
> individuals with known smoking histories is more cogent than one hundred 
> ecological studies.  

True. Where are they? Clearly the supposed "case-control" studies that seem to
have been performed to date are ecological studies in substance, since there
are no dose data, only home radon measurements that are imprecisely related to
the dose to the individuals in the home.

Since the precise radon inhalation/exposure histories of the selected
individuals do not allow specification of dose (time in or out of the house,
time in the various areas of a house with dramatically different radon
concentrations, imprecise time histories of the individuals in the residence,
life in previous residences, and life/exposure during time away from the
residence, etc.), there is a VERY weak association between the house
concentration and the individual dose.

Since the dose to a subject individual in a house with known radon
concentration is VERY uncertain, the group of case-control individuals in a
study has a VERY large statistical variation in presumed dose relative to the
home radon measurement.

This population, due to its small size (hundreds or thousands), can reasonably
be expected to have a larger statistical uncertainty than the variation of a
county's population relative to the county's mean radon exposure. There is more
statistical validity in comparing relative dose/radon levels between counties
than across the grotesque dose differences among the individuals in a
case-control study (which is likely why case-control studies cannot
consistently find any meaningful relationship, even though the typical study
reports very much narrower "uncertainties" in its statistics than the data
justify).
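
To make that statistical point concrete, here is a minimal simulation sketch
(my own illustration, not from either post; all numbers are assumptions chosen
for illustration only) of classical measurement error: when the home radon
measurement is only weakly tied to the true individual dose, the fitted
dose-response slope is attenuated toward zero, and a study of hundreds or
thousands of subjects can fail to see a relationship that is actually there.

    # Illustrative sketch: attenuation of a dose-response slope when the
    # exposure surrogate (home radon measurement) carries large random
    # error relative to the true individual dose. Hypothetical numbers.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1000                                  # case-control scale: 100s-1000s
    true_dose = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    true_slope = 0.1                          # assumed true dose-response slope
    response = true_slope * true_dose + rng.normal(0.0, 0.1, size=n)

    # Home radon measurement = true dose plus individual error (occupancy,
    # room-to-room variation, previous residences, time away from home).
    for error_sd in (0.0, 0.5, 1.0, 2.0):
        measured = true_dose + rng.normal(0.0, error_sd, size=n)
        fitted = np.polyfit(measured, response, 1)[0]
        print(f"dose-error SD {error_sd:3.1f}: "
              f"fitted slope {fitted:6.3f} (true {true_slope})")
    # As error_sd grows, the fitted slope shrinks toward zero: the study
    # loses the power to see any relationship, as argued above.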

> See two articles debunking ecological designs in the HPS
> Newsletter:  Conrath S.  Study Design as a Determinant of Radon Epidemiologic 
> Study Validity.  HPS Newsletter 18(7):1-5, July 1990.  Strom DJ. The Ecologic 
> Fallacy. HPS Newsletter 19(3):13, Mar 1991.  

As above, these comments make rhetorical arguments about potential problems
with "ecological studies" but do not show problems with this study. These
matters have been considered and addressed, and have not been shown to affect
the validity of the study, much less to "debunk" it. If these potential
problems could have been shown to affect the study results, they would be
addressed directly rather than through this kind of indirect rationalization.

> As I wrote to Dr. Cohen, January 3, 1990:  "The enclosed article by Alvan R. 
> Feinstein entitled 'Scientific Standards in Epidemiologic Studies of the Menace 
> of Daily Life' (Science 242:1256-1263, 2 DEC 88) may be of interest to you.  The
> way I read it, ecologic studies fail to meet four of Feinstein's five criteria 
> (ii-v).  Perhaps his work gives some insight on the unwillingness of mainstream 
> epidemiologists to accept ecologic studies."  The uses of the ecologic design, 
> if any, are for hypothesis generation or temporal trending, but not for 
> quantitative tests of dose-response models.

Obviously these concerns have been substantially addressed and are not
significant to the results of Cohen's data and analysis.

> >2. Cohen has now completed a case-control study that shows a negative 
> >correlation between radon concentration and case-control ratio. Annual 
> >Meeting abstracts, HP,supp to v68, no.6.
> 
> As Dr. Otto Raabe pointed out to Dr. Cohen in Boston, and despite the 
> fact that it has been accepted for publication in *Health Physics*, this 
> was not a case-control study.  In a case-control study, individual cases 
> are matched on age, gender, smoking, and other variables with individual 
> controls.  Dr. Cohen simply took a group of *self reported* diagnoses 
> (with no attempt to control for whether the lung cancer was a primary 
> tumor or a metastatic tumor, since no medical records were examined) of 
> cancers other than lung and compared their present radon levels with 
> those of *self reported* diagnoses of lung cancer.  This is not a 
> case-control study - ask any formally-trained bona fide epidemiologist.  
> Furthermore, with no histological data, no site of tumors, no medical 
> records, and all self-reported diagnoses, this work would have trouble 
> with peer review for an epidemiology journal.  

Though I'm not familiar with the work, you have clearly identified another
"potential problem" potentially affecting the result. You have not addressed
whether the result is therefore valid or not. Hopefully these questions can
also be tested, perhaps with data for any conditions that do warrant being
addressed, to sharpen the basis for the result.

> >3. The Japanese data are subject to the same "ecological fallacy" as other 
> >studies, yet we seem to be able to accept them, and indeed use them to 
> >build a case for linearity.
> 
> No, the life span study (LSS) does not have the ecological fallacy.  It is 
> not a great stretch to assume that those "not in city at time of bomb" were 
> not exposed; for those in the LSS cohort, each *individual* has had a dose 
> assessment performed (some 75,000 are in LSS).  Thus, individual doses are 
> associated with individual health outcomes (causes of death), making the 
> study far, far more cogent.

Yes, in part. But that's not true for the <1 cGy "not in city" control
population, since they WERE exposed to fallout and are not a true "unexposed"
population for "case-control" assessment at low-to-moderate doses. Nor is it
true for the lower-exposed groups, for which the dose estimates are more
imprecise, really only putting people into a rough range of doses, and which
also have an ambiguous contribution (or not) from fallout.

The high-dose group, identified by location with respect to the gamma and
neutron doses, is individually identified with dose estimates. However, even
this group, which has better "dose estimates" than any house radon
concentration/dose relationship in the "case-control" studies, is still only
identifiable by a relatively wide-range dose estimate based on geographic
location at the time.
 
> >4. The paper by the IARC study group on cancer risk among nuclear industry 
> >workers, Lancet, 344:1039;1994 fits the data to a linear model and therefore 
> >cannot be used as an argument for the validity of the linear model. (not 
> >unless you believe one can pull himself up by tugging on his bootstraps.)
> 
> If you're interested in arguments *for* the linear model, see the many 
> publications of the ICRP, National Academy of Sciences, UNSCEAR (in 
> particular UNSCEAR 1994, as reported at HPS Boston by Warren Sinclair and 
> Burton Bennett).  Also, see the Radiological Protection Bulletin of the UK 
> National Radiological Protection Board, July 1995, pp. 8-12, for a very 
> cogent review supporting the linear, non-threshold dose response model.  
> Also, see my upcoming letter in the HPS Newsletter.

First, the IARC study is an example of the fabrication of "linear" results
(see a separate note on the misrepresentation of the data by the IARC
Committee). And that doesn't even include the problematic aspects: the
historical lack of consideration of the known poor dosimetry data, the ignoring
of work histories and badging problems, and the ignoring of confounding
factors, especially internal contamination (radiological and chemical) in many
members of this workforce, all of which severely limit the scientific validity
of the data (but not its selective use).

There are almost no arguments *for* the linear model, since no good-quality
data show the linear result at less than roughly 20-50 cGy (except high
dose-rate x-rays greater than about 5 cGy to a fetus during cell
differentiation, 2nd trimester), while there are hundreds of studies with
relatively good dosimetry and health follow-up, showing null and positive
effects and now backed by growing biological evidence, that statistically
contradict and refute the very concept that the linear model could hold at
low doses.

These reports consistently _presume_ the linear model: they divide excess
cancers expressed at high doses by total doses, mostly from the high-dose
population, and report an effect/Sv (or per person-Sv, or person-Sv-year),
contrary to and misrepresenting the actual data at low-to-moderate doses
(even making graphs that obscure null and contrary results in the low-dose
region). They also explicitly suppress data that show null or negative effects
at moderate doses (e.g., 22,000 I-131 patients with 10-15 cGy whole-body who
show NO adverse effects while the linear model predicts more than a doubling,
and dozens of other studies -- of those that even got published, since such
work and results get cancelled when they contradict the model).
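
The arithmetic of that presumption is simple to show. Here is a minimal
sketch (again my own illustration; the slope, the high-dose numbers, and the
risk coefficient are hypothetical assumptions, not the actual study data) of
the extrapolation being criticized: a slope is fitted from high-dose excess
and then applied linearly to a low-dose cohort regardless of what that cohort
itself shows.

    # Illustrative sketch of the LNT extrapolation criticized above.
    # A risk-per-Sv slope is derived from high-dose data, then applied
    # linearly to a low-dose cohort. All numbers are hypothetical.
    excess_cases_high = 500        # assumed excess cancers seen at high doses
    collective_dose_high = 10_000  # assumed person-Sv in the high-dose group
    risk_per_sv = excess_cases_high / collective_dose_high   # 0.05 per person-Sv

    cohort_size = 22_000           # e.g., the I-131 cohort cited above
    dose_each_sv = 0.125           # 12.5 cGy, midpoint of 10-15 cGy whole-body
    predicted_excess = risk_per_sv * cohort_size * dose_each_sv

    print(f"LNT slope: {risk_per_sv:.3f} excess cases per person-Sv")
    print(f"Predicted excess in low-dose cohort: {predicted_excess:.0f} cases")
    # The criticism: this prediction is reported as an effect/Sv even
    # when the low-dose cohort itself shows no excess (a null result).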

> Frankly, I don't get my risk analysis from health physicists, I get it from 
> scientists and scientific organizations doing the work first-hand, as published 
> in peer-reviewed literature (such as Radiation Research or American Journal of 
> Epidemiology) or reports of NAS, ICRP, UNSCEAR, NRPB, etc.; and then, everything
> must be examined critically and taken with a grain of salt.  

And these don't include the most selective, biased, insider, self-promoting,
government-funded and -controlled sources, the religious fanatics, and the
threateners of funding, programs, and careers for those who dare to take on an
honest scientific debate on these subjects?

The beginning of the end of this linear house of cards is at hand. Note the
quote of the former UNSCEAR chairman, Zbigniew Jaworowski: "After 12 years of
deliberation, the UNSCEAR decided in March 1994 to publish its report on
radiation hormesis, the beneficial effects of radiation. The report, "Adaptive 
Responses to Radiation in Cells and Organisms," dispels the common notion that 
even the smallest dose of radiation is harmful." 

But don't take his word for it; just look at the data (that BEIR et al. have
not been using or reporting for more than 20 years). We need to do science
instead of just supporting government radiation protection policy and funding.

> Daniel J. Strom, Ph.D., C.H.P.
> Health Protection Department K3-56
> Pacific Northwest Laboratory
> P.O. Box 999
> Richland, WA 99352-0999 USA
> (509) 375-2626
> (509) 375-2019 fax
> dj_strom@pnl.gov