By Dr Ananya Mandal, MD
Nuclear medicine has a long history, to which scientists from fields such as physics, medicine, chemistry and engineering have contributed over the decades.
This multidisciplinary involvement has made it difficult for historians to pinpoint the origins of nuclear medicine. However, researchers believe the birth of this medical speciality probably occurred somewhere between 1934, when artificial radioactivity was first discovered, and 1946, when radionuclides were first produced for medical use at the Oak Ridge National Laboratory.
Nuclear medicine first became recognised as a potential medical speciality in 1946, when Sam Seidlin described it in the Journal of the American Medical Association. Seidlin reported on the success of radioactive iodine (I-131) in treating a patient with advanced thyroid cancer. Later, the use of I-131 expanded to applications such as thyroid gland imaging, hyperthyroidism treatment and quantification of thyroid function.
By the 1950s, the clinical use of nuclear medicine had become widespread as researchers increased their understanding of detecting radioactivity and using radionuclides to monitor biochemical processes. Several researchers worked tirelessly to establish the efficacy, safety and diagnostic and therapeutic potential of this speciality.
Benedict Cassen's development of the first rectilinear scanner and Hal Anger's scintillation camera helped establish nuclear medicine as a fully developed medical imaging speciality. The Society of Nuclear Medicine was formed in 1954 in Spokane, Washington, USA, and in 1960 the society launched the Journal of Nuclear Medicine, which became the flagship journal of the field.
In 1971, the American Medical Association acknowledged nuclear medicine as an official medical specialty and in 1972, the American Board of Nuclear Medicine was formed.
Reviewed by Sally Robertson, BSc
Last Updated: Jan 11, 2015