The modern concept of cryonics was first brought to public attention by American physics professor Robert Ettinger in his 1962 book, The Prospect of Immortality. In it, he laid out the scientific argument for preserving people at low temperatures, earning him the title of "the father of cryonics".8
The theory was put into practice for the first time in 1967 with the cryopreservation of Dr. James Bedford, a 73-year-old psychology professor. His preservation marked a pivotal moment in the field, and his body remains in care at the Alcor Life Extension Foundation to this day.1 The early years of cryonics, however, were fraught with challenges. Several small organizations emerged in the 1970s but ultimately failed due to financial instability and a lack of robust long-term planning, resulting in the loss of a number of early patients.5 These failures underscored the critical need for stable, non-profit organizational structures capable of providing the indefinite care that cryonics requires.
A major scientific evolution occurred with the shift from "straight freezing" to vitrification. Early procedures simply froze patients, causing extensive cellular damage from ice crystal formation. The modern era of cryonics is defined by the use of vitrification, a process that uses high concentrations of cryoprotectants to solidify tissue into a glass-like state, avoiding ice formation and yielding much higher-quality structural preservation.1 This technique was first proposed for cryonics in 1984 by cryobiologist Gregory Fahy, and the first human to be vitrified was FM-2030 in the year 2000, marking a significant step forward in the potential viability of the procedure.1