
Calibration of blood irradiator

Hello all!
 
Have any of you had any experience calibrating a blood irradiator?
We have just received one at our institution, and our other
physicist, in conversations with the company that makes the unit,
seems to have turned up a misunderstanding on their part.  They
give you a dose rate to the blood (or blood product) as their
calibration, which they determine in the following way:
 
They have a chamber that they can place at the position of the
center of the container that holds the blood.  This chamber has a
cesium-137 calibration factor and a buildup cap.  They take a reading
in air and another reading in the same place but with the container
filled.  Their dose to the blood is then calculated thus:
 
(R(air) x 0.87 x R(filled)/R(air))/100 = Dose to blood (Gy)
 
What they seem to have done here is to give a dose to _air_ along
with an attenuation factor of sorts (almost looks like a TAR, doesn't
it?).  The 0.87 is the exposure-to-absorbed-dose conversion for air
(rad/R), and the division by 100 converts rad to Gy.  I am reporting
this to you second-hand, but it appears that they consider the ratio
R(filled)/R(air) to be a sort of conversion to dose in the blood,
which we might call the dose to water in the radiation therapy business.
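 
Just to make the arithmetic concrete, here is a small Python sketch of
their calculation with made-up readings (the numbers are placeholders,
not measurements from our unit).  Note that R(air) cancels, so the
formula reduces to 0.87 x R(filled) / 100:
 
# Sketch of the vendor's calculation with placeholder readings (roentgens).
# These numbers are hypothetical, not measurements from our unit.

R_AIR = 105.0            # chamber reading at the container center, in air (R)
R_FILLED = 98.0          # same position, container filled (R)

F_AIR = 0.87             # exposure-to-absorbed-dose factor for air (rad/R)
RAD_TO_GY = 1.0 / 100.0  # 100 rad = 1 Gy

# Vendor's formula as reported: (R(air) x 0.87 x R(filled)/R(air)) / 100
dose_vendor = R_AIR * F_AIR * (R_FILLED / R_AIR) * RAD_TO_GY

# R(air) cancels, so this is just 0.87 rad/R applied to the filled reading,
# i.e. a dose to air at the measurement point, converted to Gy.
dose_simplified = F_AIR * R_FILLED * RAD_TO_GY

print(f"vendor formula:   {dose_vendor:.3f} Gy")
print(f"simplified form:  {dose_simplified:.3f} Gy")  # identical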
 
Any comments?  While you are at it, we are now in the process of
doing our own calibration with TLD (LiF) chips, so if you have any
sage advice to pass along regarding that we would appreciate it.
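 
In case it helps frame the question, a generic reading-to-dose
reduction for LiF chips might look something like the Python sketch
below; the readings, background, sensitivity factors, and calibration
factor are all placeholder numbers, not values from our chips:
 
# Generic TLD (LiF) chip reduction -- all numbers below are placeholders,
# not values from our chips or our irradiator.

# Raw reader output for chips irradiated in the blood container (nC)
readings = [41.2, 39.8, 40.5, 42.1]

# Reader background (nC) and per-chip sensitivity factors (dimensionless),
# determined from a uniform irradiation of the whole batch
background = 0.6
sensitivity = [1.02, 0.98, 1.00, 1.01]

# Calibration factor (Gy per nC), from chips given a known dose
# in a Cs-137 beam under the same readout conditions
cal_factor = 0.061

doses = [
    (reading - background) / sens * cal_factor
    for reading, sens in zip(readings, sensitivity)
]

mean_dose = sum(doses) / len(doses)
print(f"chip doses (Gy): {[round(d, 2) for d in doses]}")
print(f"mean dose (Gy):  {mean_dose:.2f}")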
 
Thanks for any responses!
 
Carl F. Landis, II                      clandis@neoucom.edu
Associate Medical Physicist
Southside Medical Center                216-740-4409
345 Oak Hill Ave.                       FAX: 216-740-6581
Youngstown, OH  44501-0990