If any man will do his will, he shall know of the doctrine, whether it be of God, or whether I speak of myself. He that speaketh of himself seeketh his own glory: but he that seeketh his glory that sent him, the same is true, and no unrighteousness is in him.  John 7:17-18
How Accurate Is Radiocarbon Dating?

Radiocarbon dating can be quite accurate, and the techniques improve yearly. However, before accepting a radiocarbon date, one should understand how the technique works, its limitations, and its assumptions. One limitation is that the radiocarbon technique dates only material that was once part of an animal or plant. To understand the other capabilities and limitations of radiocarbon dating, we must first understand how it works and consider the effects of the flood.

Most carbon atoms weigh 12 atomic mass units. However, about one in a trillion carbon atoms weighs 14 atomic mass units. This carbon is called carbon-14. It is also called radiocarbon, since it is radioactive. Half of it will decay in about 5730 years to form nitrogen. Half of the remainder will decay in another 5730 years, and so on.
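The halving described above can be sketched in a few lines of code. This is a minimal illustration, assuming only the 5730-year half-life stated in the text:

```python
# Fraction of carbon-14 remaining after `years`, assuming a constant
# half-life of 5730 years (the figure given in the text).
HALF_LIFE = 5730.0

def fraction_remaining(years: float) -> float:
    return 0.5 ** (years / HALF_LIFE)

print(fraction_remaining(5730))   # one half-life  -> 0.5
print(fraction_remaining(11460))  # two half-lives -> 0.25
```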

Cosmic radiation striking the upper atmosphere converts about 21 pounds of nitrogen each year into radiocarbon (carbon-14). Most carbon-14 quickly combines with oxygen to form radioactive carbon dioxide, which then spreads throughout the atmosphere. Plants take in carbon dioxide, and thus incorporate both carbon-14 and normal carbon-12 into their tissues in the same proportion as occurs in the atmosphere. Carbon-14 then moves up the various food chains to enter animal tissue--again, in about the same ratio carbon-14 has with carbon-12 in the atmosphere. 

When a living thing dies, it no longer takes in radiocarbon. Therefore, its radiocarbon clock begins "ticking," since the radiocarbon in its dead body steadily decreases with a half-life today of 5730 years. If one knew what fraction of the organism's carbon atoms were carbon-14 when it died, then one could attempt to date the time of death. The key question, then, is "Has the atmospheric ratio of carbon-14 to carbon-12 changed in the past, and if so, why and how much?"
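The "clock" reasoning above can be inverted to estimate an age. The sketch below assumes, as the standard technique does, that the organism's carbon-14 fraction at death equaled today's atmospheric value:

```python
import math

HALF_LIFE = 5730.0  # half-life of carbon-14, as stated above

def age_from_fraction(fraction: float) -> float:
    """Years since death, inferred from the fraction of the original
    carbon-14 still present. Assumes the atmospheric C-14/C-12 ratio
    at death equaled today's value."""
    return -HALF_LIFE * math.log2(fraction)

print(age_from_fraction(0.5))   # half remaining -> 5730.0 years
print(age_from_fraction(0.25))  # a quarter left -> 11460.0 years
```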

The assumption usually made (but rarely acknowledged) is that the ratio of carbon-14 to carbon-12 in the atmosphere has always been about what it is today--about one in a trillion. But that may not have been true in the ancient past. For example, a worldwide flood would uproot and bury preflood forests. Afterwards, less carbon would be available to cycle between living things and the atmosphere. With less carbon-12 to dilute the carbon-14 that is continually forming in the upper atmosphere, the ratio of carbon-14 to carbon-12 in the atmosphere would slowly begin to increase. If the ratio of carbon-14 to carbon-12 doubled and we did not know it, radiocarbon ages of things living then would appear to us to be one half-life (or 5730 years) older than their true ages. If that ratio quadrupled, organic remains would appear 11,460 (2 x 5730) years older, etc. Consequently, a "radiocarbon year" would not correspond to an actual year.
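The arithmetic in the paragraph above generalizes: if the atmospheric carbon-14 to carbon-12 ratio has increased by some factor k since an organism died, its apparent radiocarbon age inflates by one half-life per doubling. A small hypothetical illustration:

```python
import math

HALF_LIFE = 5730.0  # carbon-14 half-life, as stated in the text

def apparent_age_offset(increase_factor: float) -> float:
    """Extra apparent age (in years) if today's C-14/C-12 ratio is
    `increase_factor` times the ratio at the time the organism died.
    Purely illustrative of the text's claim, not measured data."""
    return HALF_LIFE * math.log2(increase_factor)

print(apparent_age_offset(2))  # ratio doubled    -> 5730.0 years too old
print(apparent_age_offset(4))  # ratio quadrupled -> 11460.0 years too old
```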

The flood would have had another consequence: it would have greatly diluted the carbon-14 to carbon-12 ratio. The precipitation of limestone during the flood involved the release of vast quantities of dissolved carbon dioxide from the subterranean water chamber. (See pages 84-115 and the technical note on page 235.) Since that carbon was isolated from the atmosphere before the flood, it would have been free of carbon-14. Much of that released carbon dioxide undoubtedly mixed with some of the carbon dioxide in the preflood seas before all the limestone precipitated. This would have diluted the biosphere's ratio of carbon-14 to carbon-12, resulting in artificially old carbon-14 dates.

If all of this is true, the ratio of carbon-14 to carbon-12 should have been building up in the atmosphere since the flood. In fact, it should still be increasing. This is precisely what recent measurements show. 

Radiocarbon dating of organic-rich, sedimentary layers worldwide has consistently shown a surprising result. Radiocarbon ages do not increase steadily as we go down into layers of old (but postflood) organic matter, as one might expect. Instead, they increase at an accelerating rate. In other words, the concentration of carbon-14 decreases rapidly with depth. The concentration of carbon-14 starts unexpectedly low in the lowest (earliest postflood) organic layers and increases more rapidly than expected in later layers. For the reasons mentioned above, the rapidity and direction of this change are what we would expect in the centuries after a worldwide flood.

One way to infer how the atmospheric concentration of carbon-14 changed in the past is by tree-ring dating. Some types of trees that grow at high elevations and have a steady supply of moisture reliably add only one ring each year. In other environments, multiple rings can be added in a year. The thickness of a tree ring depends on the tree's growing conditions, which naturally vary from year to year. Some rings may even show frost or fire damage. By comparing sequences of ring thicknesses in two different trees, a correspondence can sometimes be shown. Ring patterns will correlate strongly for two trees of the same species that grew near each other at the same time. Weaker correlations (or less confident matches) exist between trees of different species growing simultaneously in different environments. Claims are frequently made that wood growing today can be matched up with some scattered pieces of dead wood, so that tree-ring counts can be extended back more than 8,600 years. This may not be true.
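The matching step described above (cross-dating) can be sketched as a sliding correlation: one ring-width sequence is shifted along another, and the offset with the strongest Pearson correlation is taken as the candidate match. The ring widths below are hypothetical illustration data, not real measurements:

```python
# Minimal cross-dating sketch: slide a short ring-width sequence along a
# longer reference and report the offset with the highest correlation.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def best_offset(reference, sample):
    """(offset, correlation) at which `sample` best matches `reference`."""
    best = (None, -2.0)
    for off in range(len(reference) - len(sample) + 1):
        r = pearson(reference[off:off + len(sample)], sample)
        if r > best[1]:
            best = (off, r)
    return best

# Hypothetical ring widths (mm); the sample mimics reference[3:7].
reference = [1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.4, 0.7]
sample = [0.7, 1.2, 1.0, 1.5]
print(best_offset(reference, sample))  # best match at offset 3
```

Note that a high correlation at some offset is a statistical judgment, not proof of a match, which is precisely the concern raised in the next paragraph.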

These claimed "long chronologies" begin with either living trees or dead wood that can be accurately dated by historical methods. This carries the chronology back perhaps 3,500 years. Then the more questionable links are established based on the judgment of a tree-ring specialist. Standard statistical techniques could establish just how good the dozen or more supposedly overlapping tree-ring sequences are. However, tree-ring specialists refuse to subject their judgments to these statistical tests, and they have not released their data so others can carry out these statistical tests. 

Several laboratories in the world are now equipped to perform a much improved radiocarbon dating procedure. Using atomic accelerators, the carbon-14 atoms in a specimen can now be counted directly. This gives more precise radiocarbon dates with even smaller specimens. The standard, but less accurate, radiocarbon dating technique only attempts to count the rare disintegrations of carbon-14 atoms, which are sometimes confused with other types of disintegrations. This new atomic accelerator technique has consistently detected at least small amounts of carbon-14 in every organic specimen--even materials that evolutionists claim are millions of years old, such as coal. The minimum amount of carbon-14 found is so consistent among various specimens that contamination can probably be ruled out.

Since We See Galaxies Billions of Light-Years Away, Isn't the Universe Billions of Years Old? 

The logic behind this common question has several hidden assumptions. Probably the most questionable assumption is that starlight has always traveled at the same speed. Has it? Has the speed of light always been 186,000 miles per second or, more precisely, 299,792.458 kilometers per second? One simple test is to compare the historic measurements of the speed of light. 

Historical Measurements. During the last 300 years, at least 164 separate measurements of the speed of light have been published. Sixteen different measurement techniques were used. Astronomer Barry Setterfield of Australia has studied these measurements, especially their precision and experimental errors. His results show that the speed of light has apparently decreased so rapidly that experimental error cannot explain it! In the seven instances where the same scientists measured the speed of light with the same equipment years later, a decrease was always reported. The decreases were often several times greater than the reported experimental errors. This author has conducted other analyses that weight (or give significance to) each measurement according to its accuracy. Even after considering the wide range of accuracies, it is hard to see how anyone can claim, with any statistical rigor, that the speed of light has remained constant. 
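The weighting described above can be illustrated with inverse-variance weighting, a standard way to let more precise measurements count for more. The measurement values and errors below are hypothetical, chosen only to show the mechanics:

```python
# Inverse-variance weighting: each (value, reported_error) measurement
# contributes in proportion to 1/error**2, so precise measurements
# dominate. The data below are hypothetical, not Setterfield's.

def weighted_mean(measurements):
    """measurements: list of (value, error) pairs."""
    weights = [1.0 / err ** 2 for _, err in measurements]
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, measurements)) / total

# Hypothetical speed-of-light measurements (km/s) with reported errors:
data = [(299800.0, 50.0), (299795.0, 10.0), (299792.5, 1.0)]
print(weighted_mean(data))  # dominated by the most precise measurement
```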

M. E. J. Gheury de Bray, writing in the official French astronomical journal in 1927, was probably the first to propose a decreasing speed of light. He based his conclusion on measurements spanning 75 years. Later, he became more convinced and twice published his results in Nature, possibly the most prestigious scientific journal in the world. He emphasized, "If the velocity of light is constant, how is it that, invariably, new determinations give values which are lower than the last one obtained . . . . There are twenty-two coincidences in favour of a decrease of the velocity of light, while there is not a single one against it." [emphasis in original] 

Although the speed of light has only decreased a percent or so during the past three centuries, the decrease is statistically significant, since measurement techniques can detect changes that are thousands of times smaller. Of course, the older measurements have greater errors. However, the trend of the data is startling. The speed of light apparently increases the farther back one looks in time. The rate of change is high. Several mathematical curves seem to fit these three centuries of data. Projecting these curves back in time, the speed of light becomes so fast that conceivably the light from distant galaxies could reach Earth in several thousand years.

There is no physical reason why the speed of light must be constant. Most of us simply assumed that it is, and of course, changing old ways of thinking is sometimes difficult. Russian cosmologist V. S. Troitskii, at the Radiophysical Research Institute in Gorky, is also questioning some old beliefs. He concluded, independently of Setterfield, that the speed of light was ten billion times faster at time zero! Furthermore, he attributed the cosmic background radiation and most redshifts to this rapidly decreasing speed of light. Setterfield reached the same conclusion concerning redshifts by a completely different approach. If either Setterfield or Troitskii is correct, the big bang theory will fall (with a big bang).

Atomic vs. Orbital Time. Why would the speed of light decrease? T. C. Van Flandern, working at the U.S. Naval Observatory, showed that atomic clocks are apparently slowing relative to orbital clocks. Orbital clocks are based on orbiting astronomical bodies, especially Earth's one-year period about the sun. Before 1967, one second of time was defined by international agreement as 1/31,556,925.9747 of the time it takes Earth to orbit the sun. Atomic clocks are based on the vibrational period of the cesium-133 atom. In 1967, a second was redefined as 9,192,631,770 oscillations of the cesium-133 atom. Van Flandern showed that if atomic clocks are "correct," then the orbital speeds of Mercury, Venus, and Mars are increasing; consequently, the gravitational "constant" should be changing. However, he noted that if orbital clocks are "correct," then the gravitational constant is truly constant, but atomic vibrations and the speed of light are decreasing. The drift between the two types of clocks is only several parts per billion per year. But again, the precision of the measurements is so good that the discrepancy is probably real.
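A drift of a few parts per billion per year compounds over time. The sketch below uses the two definitions of the second given above and a linear approximation of the accumulated divergence; the drift rate passed in is purely illustrative, not Van Flandern's measured value:

```python
# The two definitions of the second mentioned in the text:
ORBITAL_SECONDS_PER_YEAR = 31_556_925.9747     # pre-1967 (orbital) definition
CESIUM_OSCILLATIONS_PER_SECOND = 9_192_631_770  # 1967 atomic definition

def drift_seconds(years: float, ppb_per_year: float) -> float:
    """Approximate cumulative divergence (seconds) between two clocks
    whose rates drift apart linearly by `ppb_per_year` parts per billion
    each year. Linear approximation: offset grows as years**2 / 2."""
    return ORBITAL_SECONDS_PER_YEAR * ppb_per_year * 1e-9 * years ** 2 / 2

# With a hypothetical 5 ppb/yr drift, the clocks diverge by roughly
# 13 minutes over a century:
print(drift_seconds(100, 5))
```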

There are four reasons why orbital clocks seem to be correct and why atomic frequencies are probably slowing very slightly. 

If a planet's orbital speed increased (and all other orbital parameters remained the same), then its energy would increase. This would violate the law of conservation of mass-energy. 

If atomic time is slowing, then clocks based on the radioactive decay of atoms should also be slowing. Radiometric dating techniques would give ages that are too old. This would bring radiometric clocks more in line with most other dating clocks. This would also explain why no primordial isotopes have half-lives less than 50 million years. Such isotopes simply decayed away when radioactive decay rates were much greater. 

If atomic clocks and Van Flandern's study are correct, the gravitational "constant" should change. Statistical studies have not detected these variations. 

If atomic frequencies are decreasing, then five "properties" of the atom, such as Planck's constant, should also be changing. Statistical studies of past measurements of four of the five "properties" support both the magnitude and direction of this change.

For these reasons, orbital clocks seem to be more accurate than the extremely precise atomic clocks. 

Many of us were skeptical of Setterfield's initial claim, since the decrease in the speed of light apparently ceased in 1960. Large, one-time changes seldom occur in nature. The measurement techniques were precise enough to detect any decrease in the speed of light after 1960, if the trend of the prior three centuries had continued. Later, Setterfield realized that beginning in the 1960s, atomic clocks were used to measure the speed of light. If atomic frequencies are decreasing, then both the measured quantity (the speed of light) and the measuring tool (atomic clocks) are changing at the same rate. Naturally, no relative change would be detected, and the speed of light would be constant in atomic time--but not orbital time. 
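The cancellation argued above is simple arithmetic: if the quantity being measured and the clock used to measure it slow by the same factor, the measured number is unchanged. A toy illustration (all numbers illustrative, not physical measurements):

```python
# Toy illustration of the cancellation argument: if light and the atomic
# clock timing it both slow by the same factor, the measured speed of
# light comes out identical. Numbers are illustrative only.

def measured_speed(speed_in_orbital_time: float, clock_rate: float) -> float:
    """Speed as read off an atomic clock ticking `clock_rate` atomic
    seconds per orbital second."""
    return speed_in_orbital_time / clock_rate

c_today = 299792.458                    # km/s, today's value
then = measured_speed(c_today * 2, 2.0)  # hypothetical past: both doubled
now = measured_speed(c_today, 1.0)       # today
print(then, now)  # identical: no change detectable in atomic time
```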

Misconceptions. Does the decrease in the speed of light conflict with the statement frequently attributed to Albert Einstein that the speed of light is constant? Not really. Einstein's theory of special relativity assumes that the speed of light is independent of the velocity of the light source. This is called Einstein's Second Postulate. Many have misinterpreted this to mean that "Einstein said that the speed of light is constant." Imagine spaceships A and B traveling away from each other. An astronaut in spaceship A suddenly shines a flashlight at spaceship B. Einstein claimed that the beam will strike spaceship B at the same speed as it would if the spaceships were traveling toward each other. This paradox has some experimental support. Setterfield, on the other hand, says that while the speed of light has decreased over time, at any instant all light beams travel at the same speed, regardless of the velocity and location of their sources. 

Some people give another explanation for why we see distant stars in a young universe. They believe that when God created each star, He also created a completed beam of light between that star and Earth. Of course, a creation would immediately produce completed things. Seconds later, they would look older than they really were. This is called "creation with the appearance of age." The concept is sound. However, for starlight, it is probably not an acceptable explanation, for two reasons:

Very bright, exploding stars are called "supernovas." If starlight, apparently from a supernova, were created en route to Earth and did not originate at the surface of the star, then what exploded? If the image of an explosion was only created on that beam of light, then the star never existed, and the explosion never happened. Only a relatively short beam would have been created near Earth. One finds this hard to accept. 

Every hot gas radiates a unique set of precise colors, called its emission spectrum. The gaseous envelope around each star also emits specific colors that identify the chemical composition of the gas. Since all starlight has emission spectra, this strongly suggests that a star's light originated at the star--not in cold, empty space. Each beam of starlight also carries other information, such as the star's spin rate, magnetic field, surface temperature, and the chemical composition of the cold gases between the star and Earth. Of course, God could have created this beam of light with all this information in it. However, the real question is not, "Could God have done it?" but, "Did He?" 

For these reasons, starlight seems to have originated at stellar surfaces, not in empty space. 

Surprising Observations. Starlight from distant stars and galaxies is redshifted--meaning that the light is redder than it should be. (Most astronomers have interpreted the redshifted light to be a wave effect, similar to the pitch of a train's whistle that is lower when the train is going away from an observer. The greater the redshift, the faster stars and galaxies are supposedly moving away from us.) Since 1976, William Tifft, a University of Arizona astronomer, has found that the redshifts of distant stars and galaxies typically differ from each other by fixed amounts. This is very strange if stars are moving away from us. It would be as if galaxies could travel only at specific speeds, jumping abruptly from one speed to another, without passing through intermediate speeds. If stars are not moving away from us at high speeds, the big bang theory is incorrect, along with most other beliefs in the field of cosmology. Many other astronomers, not believing Tifft's results, have done similar work, only to reach the same conclusions as Tifft. 

Atoms behave in a similar way. That is, they give off tiny bundles of energy (called quanta) of fixed amounts--and nothing in between. So Setterfield believes that the "quantization of redshifts," as many refer to the phenomenon, is an atomic effect, not a strange recessional velocity effect. If a property of space is slowly removing energy from all emitted light, it would do so in fixed increments. This would also redshift starlight, with the furthest star's light being redshifted the most. Furthermore, it would also slow the velocity of light and the vibrational frequency of the atom, all of which is observed. Setterfield is currently working on a theory to tie all of this together. PREDICTION 16: The redshifts of some specific, distant galaxies will undergo abrupt decreases. 

Another surprising observation is that most distant galaxies look remarkably similar to nearer galaxies. For example, galaxies are fully developed and show no signs of evolving. This puzzles astronomers. If the speed of light has decreased drastically, these distant, yet mature, galaxies no longer need explaining. 

A Critical Test. How can we test whether the speed of light has decreased a millionfold? If it has, we should observe events in outer space in extreme slow motion. Here is why. 

Consider a time in the distant past when the speed of light was, say, a million times faster than it is today. On a hypothetical planet, billions of light-years from Earth, a light started flashing toward Earth every second. Each flash then began a very long trip to Earth. Since the speed of light was a million times greater than it is today, those initial flashes were spaced a million times further apart in distance than they would have been at today's slower speed of light. 

Thousands of years have now passed. Throughout the universe, the speed of light has slowed to today's speed, and the first of those flashes--strung out like beads sliding down a long string--are approaching Earth. The distances separating adjacent flashes have remained constant during these thousands of years, because the moving flashes slowed in unison. Since the first flashes to strike Earth are spaced so far apart, they will strike Earth every million seconds. In other words, we are seeing past events on that planet (the flashing of a light) in slow motion. If the speed of light has been decreasing since the creation, then the further out in space we look, the more extreme this slow motion becomes. 
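The argument above reduces to a simple proportion: flashes emitted one second apart while light traveled k times faster are spaced k light-seconds (in today's units) apart, so after light slows in unison they arrive k seconds apart. A minimal sketch, with the speed factor hypothetical:

```python
# Slow-motion argument as arithmetic: emission interval scales by the
# ratio of the past speed of light to today's. The factor is hypothetical.

def arrival_interval(emission_interval_s: float, speed_factor: float) -> float:
    """Seconds between flash arrivals today, for flashes emitted
    `emission_interval_s` apart when light was `speed_factor` times
    faster, assuming the flashes slowed in unison as described."""
    return emission_interval_s * speed_factor

# Flashes one second apart, emitted when light was a million times faster,
# arrive a million seconds (about 11.6 days) apart:
print(arrival_interval(1.0, 1_000_000))
```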

As one example, galaxies would be seen in slow motion. Galaxies that appear to spin at a rate of once every 200 million years would be spinning much faster. This might explain the partial twist seen in all spiral galaxies. If the speed of light has not decreased, and there is no slow-motion effect, then why do billion-year-old spiral galaxies, at all distances, show about the same twist? 

Most stars in our galaxy are binary; that is, they and a companion star are in a tight orbit around each other. If there is a "slow-motion effect," the orbital periods of binary stars should tend to increase with increasing distance from Earth. If the speed of light has been decreasing, the Hubble Space Telescope will find that binary stars at great distances have very long orbital periods, showing that they are in slow motion. 

These calculations contain mathematical errors which, if corrected, would support the hypothesis that the speed of light has decreased. I have discussed these matters with each author. The following professional statisticians have verified my conclusions or have reached similar conclusions independently: 

Michael Hasofer, University of New South Wales, Sydney 2033, Australia.

David J. Merkel, 11 Sunnybank Road, Aston, Pennsylvania 19014, U.S.A. 

Alan Montgomery, 218 McCurdy Drive, Kanata, Ontario K2L 2L6, Canada. 

No physical law prevents anything from exceeding the speed of light. In two published experiments, the speed of light was apparently exceeded by as much as a factor of 100! The first experiment involved radio signals which, of course, are a type of light. Counterexplanations are being proposed for these surprising results, but so far, no one has repeated the experiment or shown it to be false. [Alexis Guy Obolensky, personal communication.] The second report referred to a theoretical derivation and a simple experiment that permitted electrical signals to greatly exceed the speed of light. This derivation follows directly from Maxwell's equations. The special conditions involved extremely thin electrical conductors with very low capacitance and inductance.

A strange quantum effect also causes light, in certain situations, to slightly exceed the normal speed of light. (The light beams discussed here are considered to be traveling in a vacuum; light travels at slightly slower speeds through any substance, such as air, water, or glass.)

Some who believe in an old universe have a different explanation for the absence of short-lived primordial isotopes: those isotopes are extinct because so much time has passed. However, this explanation raises a counterbalancing question: How did those isotopes, and 97 percent of all elements, form? The standard answer is that these elements appeared during supernova explosions. This is actually speculation, since essentially no supporting evidence has been found. Besides, all supernova remnants we see in our galaxy appear to be less than 10,000 years old. This is based on the well-established decay pattern of a supernova's light intensity in the radio-wave frequency range.

Another question concerns Einstein's well-known formula, E = mc², which supposedly gives the energy (E) released when a nuclear reaction annihilates a mass (m). If the speed of light (c) decreases, then one might think that either E must decrease or m must increase. Not necessarily.

In the universe, time could flow according to either atomic time or orbital time. Under which standard would E = mc² be a true statement? Mass-energy would be conserved under both; in other words, the energy or mass of an isolated system would not depend on how fast time passed. Obviously, E = mc² would be absolutely true in atomic time, where c is constant, but not in orbital time, where c decreases. Let's now see why E = mc² will be approximately correct even in orbital time.

Nuclear reactions convert mass to energy. Unfortunately, the extremely small mass lost and the large energy produced cannot be measured precisely enough to test whether E = mc² is absolutely true. Even if mass and energy could be precisely measured, this formula has embedded in it an experimentally derived, unit-conversion factor that requires a time measurement by some clock. Which type of clock should be used: an orbital clock or an atomic clock? Again, we can see that E = mc² is "clock dependent."

If c has decreased (the orbital time standard), neither length, electrical charge, nor temperature standards would change. Therefore, chemical and nuclear reactions would not change. However, the speed of nuclear reactions, and to a slight extent chemical reactions, would change, since the vibrational frequencies of atoms would change. Also, radioactive decay rates, which depend on the vibrational frequency of the atom, would decrease if c decreased. 


Layout and design Copyright ©2000 BURNING BUSH PRODUCTIONS