Wednesday, January 22, 2014

Question from a Reader: Radioactive Dating, Part 3

← Read Part 1 for an explanation of how radiometric dating works.
← Read Part 2 for the first two critical assumptions made in radiometric dating.

Critical Assumptions (continued)

Decay Rate
The third and most important assumption made by scientists using radiometric dating techniques is that the decay rate of radioactive isotopes has remained constant.  In fact, constant decay rate is the fundamental principle of radiometric dating; it is what makes this dating method theoretically possible.  Dating a specimen using an isotope with inconsistent decay rates would be like measuring a person's height using a ruler made out of warm Silly Putty; if the unit of measurement is changing in magnitude, the end result is meaningless.
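The dependence on a constant decay rate can be made concrete with a little math. The sketch below (my own illustration, not from the original article) computes the age implied by how much parent isotope remains, assuming the half-life has always been the same; change the assumed half-life and the inferred age changes with it.

```python
import math

def inferred_age(parent_fraction, half_life_years):
    """Age implied by the fraction of parent isotope remaining,
    assuming the decay rate (half-life) has been constant forever."""
    decay_const = math.log(2) / half_life_years  # lambda = ln 2 / half-life
    return -math.log(parent_fraction) / decay_const

# Hypothetical specimen retaining 25% of its parent isotope, dated with
# carbon-14 (half-life roughly 5,730 years): 25% left means two half-lives
# have elapsed, so the inferred age is about 11,460 years.
age = inferred_age(0.25, 5730)
```

If the half-life were actually shorter in the past, the same 25% measurement would correspond to far less elapsed time, which is the crux of the assumption.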

Radioactive decay has long been thought to be an entirely independent process, unaffected by any outside forces or conditions.  Scientific tests demonstrated that decay rates could be artificially changed only by a very small amount, not enough to affect the dating method.  Proponents of radiometric dating also cite the radioactivity of distant supernovae, which appears to match modern decay rates despite (supposedly) being millions to billions of years old.  Recently, however, numerous studies have found that the decay rates of several isotopes appear to be affected by solar activity, though no satisfactory explanation has yet been offered.  Admittedly, the measured variation in these studies is not enough to account for the difference between the old-earth and young-earth timelines, but it does demonstrate that radioactive decay is not an absolute and independent constant.  One experiment in 1996 also managed to drastically increase the decay rate of rhenium-187 by ionizing it (removing the electrons), reducing the half-life from 42 billion years to 33 years.  While creationists do not suggest that all matter was ionized at some point in the past (which would only affect certain types of radioactivity anyway), this study again demonstrates that radioactivity cannot be reliably assumed to be constant.
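To put the rhenium-187 result in perspective, the decay constant is related to the half-life by λ = ln 2 / T½, so a shorter half-life means proportionally faster decay. The snippet below (my own back-of-envelope calculation, using only the figures quoted above) shows just how large the change was:

```python
import math

def decay_constant(half_life_years):
    # lambda = ln 2 / half-life; a larger lambda means faster decay
    return math.log(2) / half_life_years

# Figures from the 1996 rhenium-187 ionization experiment cited above:
normal = decay_constant(42e9)   # neutral Re-187: ~42 billion years
ionized = decay_constant(33)    # fully ionized Re-187: ~33 years

speedup = ionized / normal      # over a billion-fold increase in decay rate
```

The ratio works out to roughly 1.3 billion, which is why this experiment is so often cited as proof that decay rates are not untouchable.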

Creationists propose that radioactive decay rates were higher in the past, possibly during Creation Week, during the Flood, or starting from Creation and steadily decreasing since.  This would explain why carbon dating works on very recent specimens, but not on older ones.  As many evolutionists readily point out, however, this creates a thermodynamics problem.  If billions of years' worth of radioactivity (by today's rates) was condensed into the 6000 years or so of the creationist timeline, it would produce enough heat to vaporize all the oceans.  Now, both evolutionists and creationists can agree that all of this radioactive decay happened at some point, and the oceans are still with us, so the point of contention is really how fast the earth can give off heat.  Most calculations use the current rate at which the earth radiates heat to space, a value that is estimated rather than exactly known.  It is likely that the Flood provided a much more efficient heat radiation mechanism, particularly with the "steam spout" hypothesis.  According to this creationist idea, the oceanic crust separated in the very beginning stages of the Flood, allowing the hot mantle to rise up to the ocean floor, vaporizing huge amounts of water.  The resulting steam was thrust to the surface and into the atmosphere, eventually falling as rain.  This model would provide an excellent means of transferring large amounts of heat from the interior of the earth to space.  However, no scientific investigations have yet tested this idea, as far as I know, to see if the physics, thermodynamics, and math work out.
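The scale of the heat problem can be sketched with simple arithmetic. Since the same total number of decays (and thus the same total heat) occurs either way, the average power released scales inversely with the time span it is squeezed into. The numbers below are my own assumptions for illustration: a conventional ~4.5-billion-year timeline versus the ~6000-year figure mentioned above.

```python
# Rough scaling argument for the heat problem described above.
# Total decay energy is fixed (same number of decays either way), so the
# average power released scales inversely with the available time span.
conventional_span = 4.5e9   # years (assumed conventional old-earth timeline)
creationist_span = 6000     # years (young-earth timeline from the text)

power_ratio = conventional_span / creationist_span
# The same decay heat would be delivered some 750,000 times faster,
# which is why the heat-dissipation rate becomes the crux of the debate.
```

This is only a first-order average; any model with decay concentrated into a shorter window (e.g. the Flood year alone) would make the ratio far larger still.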

Continue to Part 4 to see some scientific experiments that have tested the validity of radiometric dating →

1 comment:

  1. Good thing I always measure people with my cold, uncaring Serious Putty™.

    Keep these articles coming! I keep meaning to look into radiometric dating, but I never seem to get around to it...