Dating rocks using radioactive isotopes
Radiometric dating is a method for determining the age of an object based on the concentration of a particular radioactive isotope contained within it. For inorganic materials, such as rocks containing the radioactive isotope rubidium, the amount of the isotope in the object is compared to the amount of the isotope's decay products (in this case strontium). The object's approximate age can then be calculated from the known decay rate of the isotope. For organic materials, the comparison is between the current ratio of a radioactive isotope to a stable isotope of the same element and the known ratio of the two isotopes in living organisms. As it emits radiation, the radionuclide is said to undergo radioactive decay; the nucleus is more stable after the emission, but may undergo further decay.
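The parent/daughter comparison described above can be sketched as a short calculation. The function below is an illustrative simplification (the names are mine): it assumes no daughter isotope was present when the rock formed, which real rubidium-strontium work corrects for with isochron methods.

```python
import math

def age_from_daughter_ratio(daughter_to_parent, half_life_years):
    """Age implied by the daughter/parent ratio D/P, assuming no
    initial daughter: t = (T / ln 2) * ln(1 + D/P)."""
    return half_life_years / math.log(2) * math.log(1 + daughter_to_parent)

# Rubidium-87 decays to strontium-87 with a half-life of
# roughly 49 billion years (approximate value).
RB87_HALF_LIFE = 49e9

# A rock with 1% as much radiogenic strontium as rubidium:
print(age_from_daughter_ratio(0.01, RB87_HALF_LIFE))
```

Because the half-life is so long, even small daughter/parent ratios correspond to ages of hundreds of millions of years, which is what makes the rubidium-strontium pair useful for very old rocks.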
Starting from 1 gram of a radioactive material, 0.5 gram will remain after one half-life, 0.25 gram after two half-lives, 0.125 gram after three, and so on.
If one knows the half-life of the material and can determine how much of it has decayed (and hence how much was present initially), one can deduce the age of the object.
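That deduction is just the half-life relation inverted. The short sketch below (function name is mine) recovers the age from the fraction of the original material still remaining:

```python
import math

def age_from_remaining_fraction(fraction_remaining, half_life):
    """Invert N(t) = N0 * (1/2)**(t / T) to get
    t = T * log2(1 / fraction_remaining)."""
    return half_life * math.log2(1 / fraction_remaining)

# If a quarter of the original material remains, two half-lives
# have elapsed:
print(age_from_remaining_fraction(0.25, 5730))  # 11460.0
```

The example uses 5,730 years, a commonly quoted half-life for carbon-14, but the same arithmetic applies to any radioactive isotope.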
The method was developed by Willard Libby in the late 1940s and soon became a standard tool for archaeologists.
Libby received the 1960 Nobel Prize in Chemistry for this work.
The radiocarbon dating method is based on the fact that radiocarbon is constantly being created in the atmosphere by the interaction of cosmic rays with atmospheric nitrogen.
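Because cosmic-ray production keeps the atmospheric ratio roughly steady, a sample's age can be read off from how far its carbon-14 ratio has fallen below the living value. A minimal sketch, assuming a constant living ratio and a 5,730-year half-life (the function name and sample values are mine):

```python
import math

C14_HALF_LIFE = 5730.0  # years; a commonly quoted value

def radiocarbon_age(sample_ratio, living_ratio):
    """Years elapsed, given the C-14/C-12 ratio measured in the
    sample and the ratio assumed for living organisms."""
    return C14_HALF_LIFE * math.log2(living_ratio / sample_ratio)

# A sample whose ratio has dropped to half the living value is
# about one half-life old:
print(round(radiocarbon_age(0.5, 1.0)))  # 5730
```

Real radiocarbon work also calibrates against known fluctuations in the atmospheric ratio over time; this sketch ignores that correction.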
Prior to 1905, the best and most widely accepted estimate of the Earth's age was that proposed by Lord Kelvin, based on the time required for the Earth to cool from a completely molten state to its present temperature.