By Dr Ananya Mandal, MD
Like all procedures that involve a radioactive substance, nuclear medicine techniques that use radiopharmaceuticals expose the patient to radiation that could have adverse effects.
According to current international guidelines, it is assumed that even the smallest radiation dose poses a risk to patients. Therefore, as a general rule, exposure during a nuclear medicine procedure is kept “As Low As Reasonably Practicable” (ALARP), meaning the smallest radiation dose that will still yield an accurate result is used.
The effective radiation dose varies according to the type of study performed. The dose may therefore range from less than or similar to a person’s everyday environmental background dose, to a dose that far exceeds it. Similarly, the dose may be less than, similar to, or higher than the dose received during a pelvic or abdominal computed tomography (CT) scan.
The radiation dose from a nuclear medicine procedure is measured in sieverts (usually millisieverts, mSv). The effective dose received during an investigation is determined by the amount of radioactivity administered, measured in megabecquerels (MBq), as well as the physical properties of the radiopharmaceutical used, its clearance rate, and how it is distributed in the body.
Previously, units of measurement included the curie (Ci); the radiation absorbed dose, or RAD; and the Röntgen equivalent man (REM). The curie has since been replaced by the becquerel, while the RAD and the REM have been replaced by the gray and the sievert, respectively.
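The relationships between these legacy units and their SI replacements are fixed conversion factors (1 Ci = 3.7 × 10¹⁰ Bq, 1 RAD = 0.01 gray, 1 REM = 0.01 sievert). As a quick illustration, the short Python sketch below applies these standard factors; the function names and example values are illustrative only, not taken from any medical standard.

```python
# Standard conversion factors between legacy and SI radiation units.
CI_TO_BQ = 3.7e10     # 1 curie = 3.7 x 10^10 becquerels (disintegrations per second)
RAD_TO_GY = 0.01      # 1 RAD = 0.01 gray (absorbed dose)
REM_TO_SV = 0.01      # 1 REM = 0.01 sievert (effective dose)

def curies_to_megabecquerels(ci: float) -> float:
    """Convert administered activity from curies to megabecquerels (MBq)."""
    return ci * CI_TO_BQ / 1e6

def rem_to_millisieverts(rem: float) -> float:
    """Convert an effective dose from REM to millisieverts (mSv)."""
    return rem * REM_TO_SV * 1000

# Illustrative values only, not typical doses for any specific procedure:
print(curies_to_megabecquerels(0.02))  # 740.0 MBq
print(rem_to_millisieverts(0.5))       # 5.0 mSv
```

Note that the becquerel and curie measure activity (how much radioactive material is administered), while the sievert and REM measure effective dose (the biological impact on the patient), so the two conversions are not interchangeable.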
Reviewed by Sally Robertson, BSc
Last Updated: Jan 12, 2015