Godlike Productions - Discussion Forum
Time vortals and mandelbrot event boundaries and horizons.

 
FHL(C)

05/30/2005 02:52 AM

Time vortals and mandelbrot event boundaries and horizons.
Have any of you noticed that, at the personal level, there seem to be eddies of time? That is, supposedly accurate clocks and watches keep getting out of sync, and there no longer seems to be enough time to do what you used to be able to accomplish in the same time only a few years ago.
I have some other thoughts about this, as well as a few scriptural and scientific reasons as to why this might be (though not necessarily how).
YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
Anonymous Coward
12/08/2005 10:17 AM
Re: Time vortals and mandelbrot event boundaries and horizons.
Psychic hurricanes.
AA  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Have you noticed this personally, FHL(C)? I have not, since the only clocks I really use are on my computers or on my cell phone.
FHL(C)  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Yes.
AA  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
I am curious to hear your reasoning... Time is something I have always been fascinated by.
FHL(C)  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Lol, OK AA, but it will be over the course of time, not just this one session. I will start from Scripture and then move into what is considered by some to be controversial science.

Mat 24:22 And except those days should be shortened, there should no flesh be saved: but for the elect's sake those days shall be shortened.
AA  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Ok starting with that scripture... from what you have observed, have you "lost" or "gained" time?
AA  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Ok bedtime... I will check this thread in the morning. I am curious to hear your hypotheses, and am also curious to hear if anyone else is experiencing clocks being off as well. Interesting scripture to think about, since all time is basically a measurement of events. If there were no kinetic energy (things did not move) there would be no time. The fewer events you have left in your life or your existence, the less time you "have". So cliché, I know. In other words, one way to interpret that scripture would be that time would "shorten" as you neared the end.
ICF  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Expect more on this thread; lots of potential.

Time and space are the two polarities that make possible the manifestation of the universe, and consequently they are both expressions of inherent awareness; that is, the inherent awareness of the Unspeakable. Therefore time and space are specific states of awareness, each being the opposite polarity of the other, and in this respect are inseparable and inter-dependent, for without that awareness termed time there could be no awareness of space, and without that awareness termed space there could be no awareness of time. Mathematically the term "awareness of time" means "awareness multiplied by time", and the term "awareness of space" means "awareness multiplied by space". And yet, both time and space are in themselves states of inherent awareness, and therefore irrespective of whether we are referring to "awareness of time" or to "awareness of space", both terms amount to the same thing as saying "awareness of awareness", "awareness multiplied by awareness" or, more precisely, "awareness multiplied by inherent awareness".
.  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
Lost time
buddha bloke  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
.. icf... ever heard of indra's web? flower
Anonymous Coward
12/08/2005 10:17 AM
Re: Time vortals and mandelbrot event boundaries and horizons.
[link to fusionanomaly.net]
.  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
bump for AP(another poster)
Anonymous Coward
12/08/2005 10:17 AM
Re: Time vortals and mandelbrot event boundaries and horizons.
I have a windup clock. I wind it every night. By the end of the week it is 3 hours later than all the other clocks. It could be a gear problem, but I listen to it and it sounds very steady.
Who knows...
.  (OP)

12/08/2005 10:17 AM

Re: Time vortals and mandelbrot event boundaries and horizons.
OTHER QUESTIONS

The Speed of Light
The "missing helium" problem (with comments on the recent RATE paper, September 5, 2003)
Faster c and heat output
Biological processes
Does the Universe Show False Maturity?
Future possibilities
Hydrogen spectral lines
A Series of Questions from One Correspondent
Light and Subatomic Particles
Quartz Clocks





The Speed of Light



Question: What is it that causes light to go at 186,000 miles per second?



Setterfield: From Foundations of Physics, by Lehrman and Swartz (Holt, Rinehart, and Winston, 1969), pp. 510-511:

“An electromagnetic wave consists of a changing electric field that generates a changing magnetic field that regenerates the electric field and so on, indefinitely. The wave travels by transferring energy from the electric field to the magnetic field and back again, just as the wave in a spring travels by transferring energy from the potential energy of deformation to the kinetic energy of the spring’s mass and back. Once produced, the wave continues to travel away from its source even if the oscillation that caused it no longer exists. Maxwell was able to calculate the speed with which these electromagnetic waves travel in free space, by assuming that each change in the electric field generates a magnetic field and vice versa…The magnetic field resulting from a change in the electric field must be such as to oppose the change in the electric field, according to Lenz’s Law. Therefore, we may think of the magnetic property of space as a kind of inertial property, inhibiting the rapid change of the fields. The magnitude of this property is the magnetic constant of free space. The electric constant of free space must also be brought into the picture. Coulomb’s Law tells us that the field around a charge is proportional to the charge. A charge represents a kind of electrical distortion of space, which produces a force on neighboring charges. The relationship is analogous to the elastic relationship in a spring where a distortion of shape (charge) produces a force (field). In the electric case, the constant of proportionality is the electric constant of free space, which is therefore a kind of electric elastic property of space.”

In other words, the light is propelled by the changing electric and magnetic fields. The speed at which the light travels is dependent upon the electric-elastic property of space and the magnetic-inertial property of space. These properties are respectively called the permittivity and the permeability of free space.
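For readers who want to see that relation in numbers, here is a minimal Python sketch using the standard textbook constants; nothing in it is specific to the Setterfield model:

```python
# Minimal check: the wave speed follows from the electric constant (permittivity)
# and the magnetic constant (permeability) of free space.
import math

epsilon_0 = 8.8541878128e-12   # permittivity of free space, F/m
mu_0 = 1.25663706212e-6        # permeability of free space, H/m

c = 1.0 / math.sqrt(epsilon_0 * mu_0)      # c = 1 / sqrt(mu_0 * epsilon_0)

print(f"c = {c:,.0f} m/s")                 # about 299,792,458 m/s
print(f"c = {c / 1609.344:,.0f} miles/s")  # about 186,282 miles per second
```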





Question: What makes it go faster than that? Nothing, right?



Setterfield: If the electric and magnetic properties of free space alter, the speed of the wave will also change. The electric and magnetic properties of free space are determined by the zero point energy which fills all space. If there is a change in the strength of the zero point energy at any point in space, the speed of light will also change at that point. If zero point energy increases, the speed of light decreases. There is a helpful way of looking at this. Because energy exists in free space, and since Einstein’s equation shows that matter and energy are inter-convertible, the zero point energy allows the manifestation of what are called ‘virtual particle pairs.’ The greater the strength of the zero point energy, the greater the number of these virtual particle pairs in any given volume of space at any given moment. These virtual particle pairs act as obstacles to the progress of a photon of light. For a more detailed layman’s explanation here, please see A Simplified Explanation of the Setterfield Hypothesis and The Vacuum, Light and the Redshift.
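As a toy illustration of that last point, the sketch below assumes, purely for the sake of the example, that the permittivity and permeability of space each scale linearly with a "ZPE strength" factor; the factor values are arbitrary placeholders, not numbers from the Setterfield papers:

```python
# Toy illustration only: IF the permittivity and permeability of space both rise
# in step with the strength of the zero point energy (the "zpe_factor" below is
# an arbitrary placeholder), then c = 1/sqrt(eps * mu) falls as the ZPE rises.
import math

eps_0, mu_0 = 8.8541878128e-12, 1.25663706212e-6   # today's vacuum constants

def light_speed(zpe_factor: float) -> float:
    """Speed of light if eps and mu are each multiplied by zpe_factor."""
    return 1.0 / math.sqrt((eps_0 * zpe_factor) * (mu_0 * zpe_factor))

for k in (1.0, 2.0, 10.0):
    print(f"ZPE factor {k:>5.1f}: c = {light_speed(k):.3e} m/s")
# Doubling the factor halves c; ten times the factor gives one tenth of today's c.
```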





Question: What makes it go slower?



Setterfield: The presence or absence of virtual particle pairs determines the speed of light in space. See the above and the articles referenced.



New -- Question: Is the speed of light constant in a vacuum?

Setterfield: At any one time, assuming the properties of the vacuum are uniform, yes, the speed of light is constant in a vacuum. However, there is something called the Zero Point Energy, or ZPE (check the Discussions section on my website for further information on this), which produces virtual particles that flash in and out of existence, and they DO slow down the arrival of light at its final destination. So any change in the ZPE out there in the vacuum will change the speed of light. However, changes in the ZPE occur throughout the universe at the same time, except for some relatively small local secondary effects -- which I am becoming convinced are connected with gravity as we know it.





The "missing helium" problem



Question: One on the evidences that I use for a young earth is the apparent "missing" helium in the atmosphere. According to creationist literature, if the earth is really 4.6 billion years old, we are missing about 2000 times the amount of helium that should have accumulated by now.



As I understand it, this helium is being released into the atmosphere because of isotopic decay. I think your belief that the decay rate of isotopes was much greater in the past makes a whole lot of sense and I use it as well. The problem for me is reconciling the two. In short, if the decay rate was greater in the past, shouldn't all the "missing" helium be present in the atmosphere?



Setterfield: The answer to this “missing helium” problem is being sought by both evolutionists and creationists. This may surprise you since the problem that evolutionists face might not be expected to be a problem to creationists. However, the creationist RATE group, which is studying the whole problem of radioactive decay and radiometric dating, is slowly swinging around to the idea of accelerated rates of radioactive decay, at least during certain specific periods of earth history, such as the time of the Flood. The model that is developing from the RATE group is “a short burst of high-rate radioactive decay and helium production, followed by 6000 years of He diffusion at today’s temperatures in the formation” [D. R. Humphreys in “Radioisotopes and the Age of the Earth”, L. Vardiman et al Editors, p.346, ICR/CRS 2000]. The model that Humphreys and his RATE associates are working on envisages that “since Creation, one or more episodes occurred when nuclear decay rates were billions of times greater than today’s rates. Possibly there were three episodes: one in the early part of the Creation week, another between the Fall and the Flood, and the third during the year of the Genesis Flood” [Humphreys, op. cit., p.333]. This position contrasts with the lightspeed proposition of diminishing decay rates following the form of the redshift/lightspeed curve over a somewhat more extended period.



NOTE (Sept. 5, 2003): The RATE group has just published an excellent paper on the helium question which may be found here. My comments on this paper are below.



However, the effect of this accelerated decay, no matter which of these two models is being used, is similar. Effectively, there has been a high production rate of helium in the earlier days of the earth, which should yield higher amounts of helium in our atmosphere than observed at the moment. By contrast, the old creationist position was that radiometric dates were basically in error for one reason or another and that radioactive decay at today’s rate was all that existed over the last 6000 years. The result of that old creationist position was that there was no case to answer on the helium question, since none was “missing”. This older position was the one that led to the missing helium being an effective argument for a young earth. In his article in the above book by the RATE group, Humphreys indicates where some of the original thinking on this issue was incorrect through failure to note the evidence from the data. As a result of further investigation, recent creationist research has indicated that the situation may not be as straightforward as the older model had suggested.



This result then leads directly to the question that you posed. Where is the “missing” helium? The brief answer is that it is still locked up in the interior of the earth and in the rocks in the earth’s crust. Let me explain a little further. In 1999, an assessment of data collected in 1997 indicated that the bottom 1000 km of the earth’s mantle contained anomalous reservoirs of heat producing elements [“Researchers propose a new model for earth mantle convection,” MIT News Release, 31 March 1999. This is discussed in more detail in the appendix of A Brief Stellar History]. In other words, there is a layer in the earth’s interior where the radioactive elements were concentrated, and from which some have come to reside in the crust as a result of ongoing geological processes. This means that the majority of helium must still be within the earth’s mantle, not having had enough time to work its way to the surface on the contracted timescales offered by both the RATE and Setterfield Vc models.



However, that does not account for the radioactive material which has come to reside in the crust of the earth. Where is the helium emitted by radioactive decay from that source? The possible answer is a surprise. It had been noted in 1979 by R. E. Zartman in a Los Alamos Science Laboratory Report Number LA-7923-MS that zircons from deep boreholes below the Jemez volcanic caldera in New Mexico gave an age for the granitic basement complex there of about 1.5 billion atomic years. However, when zircons from this complex were analyzed by Gentry et al in 1982, they were found to contain very large percentages of helium, despite the hot conditions [R.V. Gentry, G.L. Glish and E.H. McBay “Differential helium retention in zircons…” Geophysical Research Letters Vol.9 (1982), p.1129]. In other words, these zircon crystals had retained within them the radioactive decay products of almost 1.5 billion atomic years, without very much diffusing out. Humphreys concludes that one of the strongest pieces of evidence for accelerated decay rates over a relatively short timespan is this high retention of radiogenic helium in microscopic zircons [op.cit., p.344, 350]. I find myself in agreement with this assessment.



Therefore the answer to your question based on these assessments is that

(1) radiogenic helium is trapped in the interior of the earth, not having had sufficient time to be brought to the surface from the radioactive layer that exists at a depth of about 1700 km, while

(2) much of that which was brought up to the crust has not diffused out from the host rocks, again because of insufficient time and also because of much lower diffusion rates than previously anticipated.

This position has at least some experimental data to support it.



Comments on the RATE paper, Helium Diffusion Rates Support Accelerated Nuclear Decay



The paper by Humphreys, Baumgardner, Austin and Snelling on helium diffusion rates in zircons from the Jemez Granodiorite is an important contribution to the debate on the missing helium in our atmosphere. The Jemez Granodiorite can be found on the west flank of the volcanic Valles Caldera near Los Alamos, New Mexico, and has a radiometric date of 1.5 billion years. That places this granodiorite in the Precambrian Era geologically. As Humphreys et al report in their Abstract, “Up to 58% of the helium (that radioactivity would have generated during the … 1.5 billion year age of the granodiorite) was still in the zircons. Yet the zircons were so small that they should not have retained the helium for even a tiny fraction of that time. The high helium retention levels suggested to us…that the helium simply had not had enough time to diffuse out of the zircons…” If indeed there was not enough time for 1.5 billion atomic years' worth of helium from radioactive decay to diffuse out of the zircons, this implied two things: first, that the radioactive decay rate must have been higher in the past, and second, that the Precambrian rocks in which they were found had an actual age much less than the atomic or radiometric age would suggest. Indeed, the authors state that the data “limit the age of these rocks to between 4,000 and 14,000 years.”



The analysis by these authors was mainly focused on zircon crystals from the granodiorite. As they point out, zircon has a high hardness, high density, and high melting point. It is also important to note that uranium and thorium atoms can replace up to 4% of the normal zirconium atoms in any given crystal. In all, seven samples were used in their analysis and were obtained from depths ranging up to 4310 metres and temperatures up to 313 degrees Celsius. An important conclusion became apparent to the authors immediately. They state: “Samples 1 through 3 had helium retentions of 58, 27 and 17 percent. The fact that these percentages are high confirms that A LARGE AMOUNT OF NUCLEAR DECAY did indeed occur in the zircons [their emphasis]. Other evidence strongly supports much nuclear decay having occurred in the past….We emphasize this point because many creationists have assumed that “old” radioisotopic ages are merely an artifact of analysis, not really indicating the occurrence of large amounts of nuclear decay. But according to the measured amount of lead physically present in the zircons, approximately 1.5 billion years worth – at today’s rates – of nuclear decay occurred. Supporting that, sample 1 still retains 58% of all the alpha particles (the helium) that would have been emitted during this decay of uranium and thorium to lead.” This statement is a pleasing development, and makes a refreshing change to some creationist approaches.



A glance at the analysis figures indicates that the helium retention levels decrease as the rock temperature increases. Another important component is the diffusion rates of the minerals surrounding the zircons, which are frequently biotite or muscovite, both forms of mica. Analysis disclosed that, in the temperature range of interest, these micas had a somewhat higher diffusion rate than the zircons. In other words, the material surrounding the zircons did not impede the outflow of helium as much as the zircons did. The analysis concluded that “the observed diffusion rates are so high that if the zircons had existed for 1.5 billion years at the observed temperatures, samples 1 through 5 would have retained MUCH LESS HELIUM THAN WE OBSERVE [their emphasis]. This strongly implies they have not existed for nearly so long a time…In the meantime we can say the data of Table 4, considering the estimates of error, indicate an age between 4000 and 14,000 years. This is far short of the 1.5 billion year uniformitarian age!”
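To make the shape of that argument concrete, here is a schematic sketch using one standard early-time approximation for diffusive loss from a uniform sphere. The diffusion coefficient and grain radius below are placeholders chosen only to show how retention depends on D·t/a²; they are not the measured values from the Humphreys et al. paper:

```python
# Schematic only. One standard early-time approximation for diffusive loss from
# a uniform sphere is: fraction_lost ~ (6/sqrt(pi)) * sqrt(D*t) / a, valid while
# the loss is small (D = diffusion coefficient, a = grain radius). The D and a
# used here are placeholders, NOT the measured values from the RATE paper.
import math

def fraction_retained(D: float, t: float, a: float) -> float:
    """Very rough helium retention for a spherical grain (early-time formula)."""
    lost = (6.0 / math.sqrt(math.pi)) * math.sqrt(D * t) / a
    return max(0.0, 1.0 - lost)   # crude clamp; the formula breaks down at large loss

a_cm = 30e-4          # 30 micron grain radius, in cm (placeholder)
D_cm2_s = 1e-21       # placeholder diffusion coefficient, cm^2/s
year = 3.156e7        # seconds per year

for t_years in (6e3, 1e6, 1.5e9):
    kept = fraction_retained(D_cm2_s, t_years * year, a_cm)
    print(f"{t_years:>10.1e} yr: retained ~ {kept:.2f}")
# A diffusion rate that leaves most of the helium in place after thousands of
# years leaves essentially none after a billion years -- the shape of the
# argument made in the paper.
```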



The final section of the paper considered four alternative explanations for the data, but the authors were able to dismiss most of them readily. The first of these was the assertion that temperatures in the Jemez Granodiorite before the Pliocene-Pleistocene volcanism were low enough to make the diffusion coefficients small enough to retain the helium. The analysis of this contention concluded that for this to happen, “the pre-Pliocene temperature in the granodiorite would have to have been about -100 degrees Celsius, near that of liquid xenon…[This] demonstrates how zircons would need unrealistically low temperatures to retain large amounts of helium for … eons of time.”



A second line of defense might be to claim that the helium 4 concentration in the surrounding rock is presently about the same as in the zircons. However, the authors point out that the measured values of the helium concentration in the surrounding biotite is much lower than in the zircons. Thus helium must still be diffusing out of the zircons into the biotite. In addition, they state “the Los Alamos geothermal project made no reports of large amounts of helium (commercially valuable) emerging from the boreholes, thus indicating that there is not much free helium in the formation as a whole.”



A third objection might be to claim that the team involved made a huge mistake, and that the actual amounts of helium were really many orders of magnitude smaller than reported. A discussion in their Appendix C indicates otherwise, as also does the fact that similar data have been obtained by others from that and other formations.



The authors took some trouble to refute the fourth possibility. Basically, that possibility makes use of the geoscience concept of a ‘closure temperature’ to claim that zircons below that temperature are permanently closed systems. Thus no significant helium would be lost by diffusion and the high helium content of zircons would thereby be explained. The authors respond in the following terms: “After the zircon cools below the closure temperature, helium begins to accumulate in it…Later, as the temperature levels off to that of the surrounding rock, the diffusion coefficient becomes constant…As the amount of helium in the zircon increases, Fick’s laws of diffusion (sect. 3) say the loss rate increases. Eventually, even well below the closure temperature, the loss rate approaches the production rate, an event we call the ‘reopening’ of the zircon….If the closure interval were long compared to the age of the zircon, then the zircon would indeed be a closed system. But [the data] gives us values [for the closure interval] between a few dozen years and a few thousand years depending on the temperature of the sample in the borehole. Those times are very small compared to the uniformitarian age of 1.5 billion years…Thus the closure temperature does not help uniformitarians in this case, because the closure interval is brief.”



If this new approach to the fourth problem holds up to closer scrutiny, then the authors have a strong case for their conclusion, namely that “The data and our analysis show that over a billion years worth of nuclear decay have occurred very recently, between 4,000 and 14,000 years ago…[Consequently] helium diffusion casts doubt on uniformitarian long-age interpretations of nuclear data and strongly supports the young world of Scripture.”



The groundbreaking research is of great value to the science community as a whole, not just the creation community. We need to be grateful for the work these men have put into this research.



Barry Setterfield, 1st September, 2003.





Faster c and heat output



Question: Wouldn’t the earth and all life have fried to a crisp when light speed was faster?

Setterfield: As far as radioactive heating turning the earth into a plasma is concerned, this is also incorrect. First, the Genesis 1:2 record indicates that the earth started off in a cold state with an ocean covering the surface. This contrasts with current astronomical paradigms, but I consider that we have an eyewitness account here. If we take this as our starting point, together with the more rapid rate of radioactive decay, the resulting heating can be calculated. It is currently accepted that there was/is a region in the earth's mantle from about 1700 km down to a depth of 2700 km which has the radioactive elements concentrated in it.



These elements came to be incorporated in the crust by later geological processes. On this basis, calculation shows that the temperature of the core today, some 7,200 years later, would be about 5,800 degrees, with 1,900 degrees now at the top of the lower mantle.





Biological processes



Question: What about biological processes?

Setterfield: In brief, biological processes remain essentially unaffected by variations in lightspeed, c. If the old collision theory of reactions were true, all reactions would proceed to the point of completion in a fraction of a second, even with the current speed of light. However, it can be shown that most reactions are controlled by a rate-determining step involving an activated complex. Reaction with the activated complex is dependent upon both the number of particles or ions approaching it, and the time these particles spend in the vicinity of the activated complex. As the physics of that step is worked through, it can be shown that even though more ions may come within reaction distance of the activated complex, proportional to c, the time that they spend in its vicinity is proportional to 1/c, so that the final result is that reaction times are unchanged with higher c values.



Consequently, brain processes, muscular contractions, rates of growth, time between generations, etc. all would occur at the same rate with higher c as they do today.
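The cancellation being described can be written out in a couple of lines. The sketch below is only a schematic of the argument, with the proportionality constants set to 1:

```python
# Schematic of the cancellation: encounters with the activated complex scale
# with c, the residence time near it scales with 1/c, and the reaction rate
# depends on their product. Proportionality constants are arbitrary (set to 1).
def relative_reaction_rate(c_ratio: float) -> float:
    """c_ratio = (speed of light then) / (speed of light now)."""
    encounters_per_second = 1.0 * c_ratio   # proportional to c
    residence_time = 1.0 / c_ratio          # proportional to 1/c
    return encounters_per_second * residence_time

for r in (1.0, 1e3, 1e10):
    print(f"c ratio {r:.0e}: relative reaction rate = {relative_reaction_rate(r):.1f}")
# Prints 1.0 every time: on this argument, biological rates are unchanged.
```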



Question: Photons possess many different energy levels, from radio waves to gamma rays. Are these "categories" dependent on wavelength or energy? If it is dependent on wavelength (as all radio technology would insist), then sometime in the future there will be much less light and fewer shorter-wavelength photons, and much more radio waves. Will we all someday be blind?

Setterfield: The energy of a photon E is given by [hf] or [hc/w] where h is Planck's constant, f is frequency, w is wavelength, and c is light-speed. Two situations exist. First, for LIGHT IN TRANSIT through space. As light-speed drops with time, h increases so that [hc] is a constant. It should be emphasised that frequency, [f], is simply the number of wave-crests that pass a given point per second. Now wavelengths [w] in transit do not change. Therefore, as light-speed c is dropping, it necessarily follows that the frequency [f] will drop in proportion to c as the number of wave-crests passing a given point will be less, since [c = fw]. Since [f] is therefore proportional to c, and [h] is proportional to [1/c], it follows that [hf] is a constant for light in transit. Since both [hc] and [w] are also constants for light in transit, this means that [hc/w] and [hf] do not alter. In other words, for light in transit, E, the energy of a photon, is constant, other factors being equal.

The second situation is that pertaining at the TIME OF EMISSION. When c is higher, atomic orbit energies are lower. This happens in a series of quantum steps for the atom. Light-speed is not quantised, but atomic orbits are. As light-speed goes progressively higher the further we look back into space, so atomic orbit energies become progressively lower in quantum steps. This lower energy means that the emitted photon has less energy, and therefore the wavelength [w] is longer (redder). This lower photon energy is offset by proportionally more photons being emitted per unit time. So the total energy output remains essentially unchanged.

As a result of these processes at emission, light from distant galaxies will appear redder (the observed redshift), but there will be more photons so distant sources will appear to be more active than nearby ones. Both of these effects are observed astronomically. (August 1, 1999)
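The in-transit bookkeeping in the first paragraph above can be checked numerically. The sketch below adopts the model's stated assumptions (wavelength fixed in transit, h varying as 1/c); the 500 nm wavelength is just an example:

```python
# Bookkeeping check for light in transit, under the model's stated assumptions:
# wavelength w is fixed, h varies as 1/c, and frequency f = c/w tracks c.
# Then hc and the photon energy E = h*f = h*c/w stay constant as c drops.
h_now, c_now = 6.62607015e-34, 2.99792458e8   # today's SI values
w = 500e-9                                    # an example 500 nm wavelength, fixed in transit

for c_ratio in (1e10, 1e3, 1.0):              # earlier epochs have higher c in this model
    c = c_now * c_ratio
    h = h_now / c_ratio                       # model assumption: h proportional to 1/c
    f = c / w                                 # frequency follows c because w is fixed
    print(f"c ratio {c_ratio:.0e}: hc = {h * c:.3e}, E = hf = {h * f:.3e} J")
# hc and E are identical on every line: the energy of light in transit is conserved.
```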



Question: I'm a Christian layperson who is very interested in this ongoing debate about the speed of light. First off, let me say that I too, like yourself, firmly believe that a natural reading of Genesis yields a young-earth creation. After that, since I'm not a scientist, I'm not quite sure what to say about this whole thing. The amount of material available on the internet concerning this subject is so voluminous it's difficult to get my hands around it. In any case, I've read enough to know most secular scientists disagree with you and so do most young-earth creationists; however, in the article written by your wife about the history of this debate she put forth a valid reason why many young-earth creationists are disagreeing with you (of course it's obvious why most secular scientists disagree).
To make a long story short, I've been reading stuff on your website and you seem to have considered many of the implications of a decrease in c, but since my knowledge of science is limited, I'm not quite sure how qualified I am to make such a judgment. Nevertheless, from what I've been reading, I respect your work and would appreciate a short response to a couple of questions. [note: the second question concerned relativity and may be found here. It is the third question down in this section.]
It seems that aside from accusations about the mishandling of the data (which your friend Alan Montgomery appears to have handled), some say that a high c in the past would have affected biological life processes. What is your response to them?

Setterfield: Thank you for your comments and question. Let me see if I can answer your points satisfactorily.

You point out that some claim that a higher value for lightspeed would affect biological processes. This is based on the fact that some atomic processes are faster when lightspeed is higher. However, as is shown on our website in the paper Atomic Quantum States, Light and the Redshift, in equations (77) and (78), this does not affect reaction rates, and hence biological processes remain unchanged. This is the case because energy is being conserved throughout any changes in the speed of light. In turn, this means that other associated atomic constants are also varying in a way that mutually cancels out any untoward effects. Another way of looking at it is to see biological processes as occurring at the level of molecules rather than at the subatomic level. At the molecular level, the basic quantity that is governing effects is the electronic charge. This has remained constant throughout time, and so molecular interactions remain unchanged.







Does the Universe Show False Maturity?

Question: Is the universe mature or does it just appear mature? Are there any ways to observationally differentiate between a mature universe and an apparently mature universe? If a globular cluster looks like it is 13 GY old and has a population of stars that give that appearance, it is 13 GY old for all intents and purposes. There is no difference. In this vein, why would God create a nearly dead star, a white dwarf, the core of a star that has had its atmosphere discharged in a planetary nebula episode after it has exhausted its nuclear fuel? Does God create all things new, or would he create "old" dead objects? Was the soil in the Garden of Eden filled with decaying vegetable and animal matter? Were there bones and fossils in the sediments below the soil? A dead star equates well with a fossil, I believe. Would God create either?

Setterfield: Inherent within the redshift data for cDK is an implied age for the cosmos both on the atomic clock and on the dynamical or orbital clock. These ages are different because the two clocks are running at different rates. The atomic clock runs at a rate that is proportional to light speed, and can be assessed by the redshift. Originally this clock was ticking very rapidly, but has slowed over the history of the universe. By contrast, the dynamical or orbital clock runs at a uniform rate. The atomic clock, from the redshift data, has ticked off at least 15 billion atomic years. By contrast, the orbital clock, since the origin of the cosmos, has ticked off a number of years consistent with the patriarchal record in the Scriptures. (1/29/99)

For further material on sedimentation and the early earth, please see "A Brief Earth History".





Future possibilities



Question: I have read some of your articles on the issue of the speed of light and I am fascinated by the idea that the speed of light may be changing. I am wondering if you believe that it will someday be possible to increase the speed of light for technological applications such as computers, communications and starships. Any of your thoughts would be greatly appreciated.

Setterfield: Yes, I guess it's theoretically possible to alter the characteristics of the vacuum locally in order to increase the speed of light, and this may have implications and applications for starships. However, these applications may be a considerable way into the future. We can alter the structure of the vacuum locally in the Casimir effect, in which the speed of light perpendicular to the plates increases. This effect, however, is small because only a limited number of waves of the zero point field are excluded from between the plates. Haisch, Rueda, and Puthoff are in fact examining a variety of possibilities with NASA in these very areas for spaceship propulsion. More locally, I envisage a time when it may be possible to change the characteristics of the vacuum in a large localised container in which radioactive waste may be immersed. As a consequence, the half-lives would be shortened and the decay processes speeded up, but there would be no more energy given off in that container than if the material was decaying at the normal rate outside. The energy density would remain constant.



What happens in this kind of thing is that the energy density is determined by the permittivity and permeability of free space. When the speed of light is high, the permittivity and permeability are low, proportional to 1/c. As a consequence, the energy density of emitted wavelengths is also proportional to 1/c. However, because radioactive decay effectively has a 'c' term in the numerator for each of the decay sequences, this means that there will be c times as much radiation emitted. However, because the energy density of the radiation is only 1/c times as great, the effective energy density remains constant.
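Written as a toy calculation, the claim is that the two proportionalities cancel; the numbers below are placeholders, not physical values:

```python
# Toy bookkeeping: decay rate proportional to c, energy density of the emitted
# radiation proportional to 1/c, so the power delivered is unchanged.
def relative_power(c_ratio: float) -> float:
    decays_per_second = 1.0 * c_ratio     # c times as many decays
    energy_per_emission = 1.0 / c_ratio   # each carrying 1/c times the energy density
    return decays_per_second * energy_per_emission

print([relative_power(r) for r in (1.0, 100.0, 1e6)])   # -> [1.0, 1.0, 1.0]
```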



Under these circumstances, some practical applications of this may be possible in the foreseeable future, but a lot of development would still have to occur.





Hydrogen spectral lines



Question: In a recent book I've read (Show Me God by Fred Heeren) they make an objection to CDK by stating that the spectral lines of hydrogen show no changes. They claim that the hydrogen spectral line at 21 cm should vary along with the speed of light, and since it hasn't, this rules out CDK. I have no idea how to answer this objection. Any thoughts?

Setterfield: Regarding the spectral lines of hydrogen question: this is a common statement by those who have not looked in any detail at the cDK hypothesis. It shows a complete misunderstanding of what is happening. When the speed of light is higher, there is no change in the wavelength of emitted radiation. So the wavelengths of hydrogen and its spectral lines will be unchanged by the changing speed of light. What does change is the *frequency* of light in transit, because light speed equals wavelength times frequency. Because wavelength is constant, frequency changes in lockstep with the speed of light. Therefore, as light speed changes in transit, the frequency automatically changes, so that when light arrives at the observer on earth, no difference in frequency of a specific wavelength will be noted compared with laboratory standards. At the moment of emission, the wavelength of a given spectral line is the same as our laboratory standard, but the frequency, which is the number of waves passing per second, will be higher in proportion to c. In transit those wavelengths remain constant, but as lightspeed decays, the frequency drops accordingly, so that, at the moment of reception, the frequency of light from any given spectral line will be the same as the observer's standard. The wavelength will also be unchanged.
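A short numerical illustration of that argument, using the 21 cm hydrogen line mentioned in the question; the factor by which c is assumed higher at emission is arbitrary:

```python
# Illustration: the emitted wavelength of the 21 cm hydrogen line never changes;
# only the frequency of the wave in transit tracks c, and at reception it is
# measured against today's c, so no shift shows up. The factor 1e6 for c at
# emission is arbitrary.
c_now = 2.99792458e8          # m/s today
w_21cm = 0.2110               # m, emitted wavelength (unchanged in this model)

c_at_emission = c_now * 1e6               # assumed higher c when the line was emitted
f_at_emission = c_at_emission / w_21cm    # correspondingly higher frequency then
f_at_reception = c_now / w_21cm           # frequency has dropped in step with c
f_lab_standard = c_now / w_21cm           # what our laboratory standard gives

print(f"frequency at emission:  {f_at_emission:.4e} Hz")
print(f"frequency at reception: {f_at_reception:.4e} Hz")
print(f"matches lab standard:   {f_at_reception == f_lab_standard}")
```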



One point should be re-emphasised in view of current scientific convention. Convention places emphasis on frequency rather than wavelength, but frequency is a derived quantity, whereas wavelength is basic. So if you have a constant wavelength at emission, and that wave train is traveling faster, you will inevitably have a higher frequency.





A Series of Questions from One Correspondent

Question #1: Setterfield assumes that the beginning of the universe is (from the moment we can observe anything at all) very small, hot, dense in energy and very rapidly expanding. Somewhere else he said (if I understood it right) that after two days the universe had its maximum size. Is that right?

Setterfield: I have assumed that at the very beginning of the cosmos it was in a small, hot, dense state for two reasons. First, there is the observational evidence from the microwave background. Second, there is the repeated testimony of Scripture that the Lord formed the heavens and "stretched them out." I have not personally stipulated that this stretching out was completed by day 2 of Creation week. That has come from Lambert Dolphin's interpretation of events. What we can say is that it was complete by the end of the 6th day. It may well have been that the stretching out process was completed by the end of the first day, from a variety of considerations. I have yet to do further thinking on that. Notice that any such stretching out would have been an adiabatic process, so the larger the cosmos was stretched, the cooler it became. We know the final temperature of space, around 2.7 degrees Absolute, so if it has been stretched out as the Scripture states, then it must have been small and hot, and therefore dense, initially, as all the material in the cosmos was confined in that initial hot ball.

Question #2: In expanding so rapidly there was a conversion of potential energy to the ZPE (what is the difference between zero-point energy and zero-point radiation?). This was not completed in two days but needed a longer period of time, about 3000 years (?)

Setterfield: Please distinguish here between what was happening to the material within the cosmos, and the very fabric of the cosmos itself, the structure of space. The expansion cooled the material enclosed within the vacuum allowing the formation of stars and galaxies. By contrast, the fabric of space was being stretched. This stretching gave rise to a tension, or stress within the fabric of space itself, just like a rubber band that has been expanded. This stress is a form of potential energy. Over the complete time since creation until recently, that stress or tension has exponentially changed its form into the zero-point energy (ZPE). The ZPE manifests itself as a special type of radiation, the zero-point radiation (ZPR) which is comprised of electromagnetic fields, the zero-point-fields (ZPF). These fields give space its unique character.

Question #3: Connected to this there was also a rapid decline in the speed of light, because a higher ZPE level puts a kind of "brake" on the speed of light through higher permeability and permittivity (OK?).

Setterfield: Yes! As more tensional energy (potential energy) became exponentially converted into the ZPE (a form of kinetic energy), the permittivity and permeability of the vacuum of space increased, and light-speed dropped accordingly.

Question #4: As more energy became available in space, matter had to run at a higher energy level, causing the emitted light spectrum to be shifted in the blue direction.

Setterfield: Yes! It has been shown by Harold Puthoff in 1987 that it is the ZPE which maintains the particles in their orbits in every atom in the cosmos. When there was more ZPE, or the energy density of the ZPF became higher, each and every atomic orbit took up a higher energy level. Each orbit radius remained fixed, but the energy of each orbit was proportionally greater. Light emitted from all atomic processes was therefore more energetic, or bluer. This process happened in jumps, as atomic processes are quantised.

Question #5: Let me think about what that means for the redshift case. Stars and galaxies that are not too distant already sent their original, more redshifted light to the earth a long time ago. Now they send only the "standard" spectrum.

Setterfield: Yes! The stars within our own Milky Way galaxy will not exhibit any quantum redshift changes. The first change will be at the distance of the Magellanic Clouds. However, even out as far as the Andromeda nebula (part of our local group of galaxies and 2 million light-years away) the quantum redshift is small compared with the actual Doppler shift of their motion, and so will be difficult to observe.

Question #6: Very distant galaxies have still not succeeded in sending us all their "redshifted" light, but this will reach us in due time (how long? Millions of years, I suspect?). This must be the reason that the redshift increases with greater distance of the stars.

Setterfield: That is correct! Because these distant galaxies are so far away, their emitted light is taking a long time to reach us. This light was therefore emitted when atoms were not so energetic and so is redder than now. Essentially we are looking back in time as we look out to the distant galaxies, and the further we look back in time, the redder was the emitted light. The light from the most distant objects comes from about 20 billion light-years away. This light has reached us in 7,700 years. The light initially traveled very fast, something like 10^10 times its current speed, but has been slowing in transit as the energy density (and hence the permittivity and permeability) of space has increased.
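As a rough consistency check only: the figures quoted above (roughly 20 billion light-years covered in 7,700 years, starting near 10^10 times today's speed) can be reproduced with a simple assumed exponential decay of c. This is not Setterfield's actual redshift-derived curve; the decay constant below is a placeholder chosen only to make the numbers match:

```python
# Back-of-the-envelope check with an ASSUMED exponential decay of c (a placeholder
# law, not the redshift-derived curve): integrate the light-years covered over
# 7,700 years when c starts near 1e10 times today's value.
import math

r0 = 1e10      # initial c as a multiple of today's c (figure quoted above)
tau = 2.0      # assumed decay constant in years -- a placeholder choice
dt = 0.01      # integration step, years

distance_ly, t = 0.0, 0.0
while t < 7700.0:                                     # 7,700 years of travel
    c_ratio = 1.0 + (r0 - 1.0) * math.exp(-t / tau)   # c(t)/c_now under the toy decay law
    distance_ly += c_ratio * dt                       # light-years covered in this step
    t += dt

print(f"distance covered ~ {distance_ly:.2e} light-years")   # about 2e10 with these choices
```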

Question #7: Is there any answer to the question if the light of even the most distant galaxies has reached the earth from the very beginning? Is there any reasoning for it?

Setterfield: Yes! There is an answer. When the speed of light was about 10^10 times its current speed, as it was initially, observers on earth could see objects 76,000 light years away by the end of the first day of creation week. That is about the diameter of our galaxy. Therefore the intense burst of light from the centre of our galaxy could be seen halfway through the first day of creation week. This intense burst of light came from the quasar-like processes that occurred in the centre of every galaxy initially.

Every galaxy had an ultra-brilliant, hyperactive centre; ours was no exception. After one month, light from galaxies 2.3 million light years away would be visible from earth. This is the approximate radius of our local group of galaxies. So the Andromeda spiral galaxy would be visible by then. After one year, objects 27 million light-years away would be visible if telescopes were employed. We now can see to very great distances. However, since we do not know the exact size of the cosmos, we do not know if we can see right back to the "beginning."

Question #8: That this redshift caused by the rising ZPE level goes in jumps is clear to me; it is caused by the quantum-like behaviour of matter.

Setterfield: Absolutely correct!

Question #9: Setterfield mentions the "highly energetic" beginning of the universe, but the ZPE was low at that time. How was this early energy represented? In a kind of "proto-matter" or something like that?

Setterfield: The highly energetic beginning of the universe was referring to the contents of the cosmos, the fiery ball of matter-energy that was the raw material that God made out of nothing, and that gave rise to stars and galaxies as it cooled by the stretching out process. By contrast, the ZPE was low, because the tensional energy in the fabric of space had not yet changed its form into the ZPE. A distinction must be made here between the condition of the contents of the cosmos and the situation with regard to the fabric of space itself. Two different things are being discussed. Note that at the beginning the energy in the fabric of space was potential energy from the stretching out process. This started to change its form into radiation (the ZPE) in an exponential fashion. As the tensional potential energy changed its form, the ZPE increased. (February 24, 2000)



Light and Subatomic Particles

Correspondent: I have made the mistake of reading several texts on physics/astronomy designed for lay readers (including "A Brief History of Time") and am now left with a few questions that I cannot find the answer to, at least in terms that I can understand. It's annoying me... These questions are probably completely trivial, so if you or one of your colleagues could suggest where I may find answers to them, I would be extremely grateful. [I do have a scientific background, but in organic chemistry (nothing mathematical).]



I'll try to write these questions as concisely as possible...



My understanding is that anything that interacts with the Higgs boson will have mass and the maximum (theoretical) speed a massive object can attain would be equal to the speed of light. This is the speed of light as measured in a vacuum [c]. I understand though that in other media, the speed of light can be less than c and it is therefore possible for certain massive particles to travel faster than the speed of light in that particular medium giving off Cherenkov radiation.



However, we're informed that a vacuum is not completely empty but consists of many real particles [e.g. hydrogen atoms, the Higgs boson (if it exists?)] and a whole gamut of virtual particles.



From this I wondered whether the speed of light 'c' measured in this far-from-empty vacuum was actually the fastest speed that light could attain, or whether light would travel even faster in a completely empty vacuum (were this creatable). If so, then if c is the speed of light in the medium we know as a vacuum, then would it be possible for a particle to have a speed greater than light in this medium? If so, then I guess we would have spotted its associated Cherenkov radiation and I have not read that this is the case. I've heard of tachyons, but I don't know if these are simply "postulated" particles.



The next question that sprung from this is as follows. As a particle accelerates, it becomes more massive in accordance with the theory of relativity, and therefore more energy is needed to continue the acceleration. As this particle approaches the speed of light, the mass approaches infinity and therefore the energy required approaches infinity. My question was: why does this limit occur when the speed is c? What's so special about c; why is light limited to this speed? If light does not interact with the Higgs boson then it must be massless, so what physical phenomenon actually limits it to the speed c?



And finally, if mass increases with speed then does this mean that an object interacts with the Higgs more strongly or does it mean that it interacts with more Higgs bosons?



Setterfield: First, the matter of the Higgs boson. Although it has been searched for, there is as yet no hard experimental evidence for its existence - we are still waiting for that. As for its role in mass through interaction, I strongly suggest that you read an important article entitled "Mass Medium" by Marcus Chown in New Scientist for 3rd Feb. 2001, pp.22-25. There it is pointed out that, even if the Higgs is proven to exist, it may not be the answer we seek.



Instead, an alternative line of enquiry is opening up in a very positive way. This line of enquiry traces its roots to Planck/Einstein/Nernst and can reproduce the results of quantum physics through the effects of the Zero-point Energy (ZPE) of the vacuum. Those following this line of study can account for mass through the interaction of the ZPE with massless point charges (as in quarks). Indeed, gravitational, inertial and rest-mass can all be shown to have their origin in the effects of the ZPE. This mass comes from the "jiggling" of massless particles as the electromagnetic waves of the ZPE impact upon the particles.



This links back to another query that you had, namely the reason for the speed of light in the vacuum. This speed, c, is related to the ZPE through the manifestation of virtual particle pairs in the paths of photons. As photons travel through the vacuum, there is a continual process of absorption of the photon by virtual particles, followed very shortly after by its re-emission as the virtual particle pairs annihilate. This process, while fast, does take a finite time to accomplish. It's akin to a runner going over hurdles. Between hurdles the runner maintains his maximum speed, but the hurdles impede progress. The more hurdles over a set distance, the longer it takes to complete the course. This is essentially the reason for the slowing of light in glass, or water, etc. Atoms absorb photons, become excited, and then re-emit the photons of light. The denser the substance, the slower light travels.
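The hurdles picture can be put into a toy formula: effective speed = path length divided by (free-flight time plus the total of the absorption/re-emission delays). All values below are arbitrary placeholders:

```python
# Toy "hurdles" model: between virtual-particle encounters the photon moves at
# its full speed, but each encounter costs a short absorption/re-emission delay,
# so more virtual pairs per metre means a lower effective speed. All numbers are
# arbitrary placeholders.
def effective_speed(bare_speed: float, pairs_per_metre: float,
                    delay_per_pair: float, path_length: float = 1.0) -> float:
    travel_time = path_length / bare_speed
    delay_time = pairs_per_metre * path_length * delay_per_pair
    return path_length / (travel_time + delay_time)

bare, delay = 1.0e9, 1.0e-12   # placeholder "bare" speed and per-encounter delay
for density in (0.0, 1e2, 1e3, 1e4):
    v = effective_speed(bare, density, delay)
    print(f"{density:>8.0f} pairs/m -> effective speed {v:.3e}")
# Fewer virtual pairs (as between Casimir plates) -> a higher effective speed.
```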



Importantly, the strength of the ZPE governs the number of virtual particles in the paths of photons. It has been shown that when the energy density of the ZPE is decreased, (as in the Casimir effect where the energy density of the vacuum is reduced between two parallel metal plates), then lightspeed will be faster. The reason is that there are fewer virtual particles per unit length for light photons to interact with. It has been shown that this process can account for the electric permittivity and magnetic permeability of the vacuum. A summary of some of this can be found in an article by S. Barnett in Nature, Vol.344 (1990) p.289. A more comprehensive study by Latorre et al in Nuclear Physics B Vol.437 (1995), p.60-82 stated in conclusion that "Whether photons move faster or slower than c [the current speed of light] depends only on the lower or higher energy density of the modified vacuum respectively". Thus a vacuum with a lower energy density for the ZPE will result in a higher speed of light than a vacuum where the energy density of the ZPE is higher.



On the matter of tachyons, an important item is currently being considered. Just as virtual particles exist with the ZPE, so also do virtual tachyons. A recent study has shown that the Cherenkov radiation from these virtual tachyons can account accurately for the microwave background [T. Musha, Journal of Theoretics, June/July 2001 Vol.3:3].



Your final question relates to the increase in mass with increase in speed of particles. Again the reason can be found in the ZPE. As mentioned in the New Scientist article referred to above, radiation from the ZPE essentially bounces off the accelerating charge and exerts a pressure which we call inertia. The faster the movement, the greater the pressure from the ZPE, and so the greater the inertial mass. This has been demonstrated in a mathematically rigorous way in peer-reviewed journals, and a consistent theory is emerging from these studies.



I hope this is of assistance. If you have further questions, please do not hesitate to get back to me.





Correspondent: As previously mentioned, your reply below has proved invaluable to me; once again, many thanks.



However, I do have two (hopefully final) questions arising from your reply that I'd like to pose to you.



The first is this: my knowledge of these virtual particle pairs is next to nothing. However, from my knowledge of chemistry (admittedly studied approximately 10 years ago and probably recalled incorrectly), I understand that when an atom absorbs a photon and then re-radiates it, the direction in which the photon is re-radiated is not necessarily the same as the direction of the original photon. There are of course factors affecting the direction, but for an essentially spherical excited atom that is not in an electric/magnetic field, it could be in any direction, particularly if the emission is not instantaneous. This obviously is not the case for these virtual particles, otherwise light would not be seen to travel in a straight line. Presumably, the current models explain this successfully - is there a layman's explanation you could give me?



The second: again from my chemistry background, I know that the absorption of photons is regulated by quantum effects and that the energy of the photon absorbed is the same as the energy difference between the ground-state and excited-state orbitals. It appears to be the case that all frequencies of light are absorbed equally by these virtual particles - if this were not the case then gamma rays and radio waves (and all frequencies in between) would not travel at the same speed, c. Quarks etc. have no discernible internal structure (as far as I'm aware) - so again, would it be possible to give a layman's description of the absorption/emission process occurring?



Many, many thanks again!





Setterfield: Your first question is in fact two-fold – it concerns the behaviour of a light beam transmitted by atoms in a transparent medium compared with the behaviour of light transmitted through the vacuum that includes virtual particles.



In the first instance you state “that when an atom absorbs a photon and then re-radiates it, the direction in which the photon is re-radiated is not necessarily the same as the direction of the original photon.” My response to this follows the approach adopted by Jenkins and White, Fundamentals of Optics third edition section 23.10 entitled ‘Theory of Dispersion’, pp.482 ff. The initial point that they make is that as any electromagnetic wave traverses the empty space between molecules, it will have the velocity that it possesses in free space. But your question effectively asks something slightly different, namely, how is it possible for the light wave to be propagated in substantially the same direction as it was initially traveling in?



You point out that atoms effectively scatter and/or re-emit the incoming photons in any direction. But here Jenkins and White make an important point. They demonstrate that “the scattered wavelets traveling out laterally from the [original] beam [direction] have their phases so arranged that there is practically complete destructive interference. But the secondary waves traveling in the same direction as the original beam do not thus cancel out but combine to form sets of waves moving parallel to the original waves” (their emphasis). I think that this deals with the first aspect of your question.



The second aspect of the question essentially asks why the original direction of a light photon is maintained after it has been absorbed by a virtual particle pair and re-emitted upon their annihilation. In other words, why does the emitted photon travel in the same direction as the original photon? In order to easily visualize why, it may be helpful to look at this in a slightly different way which is still nonetheless correct. One may consider a traveling photon to be briefly transformed into a virtual particle pair of the same energy as the original photon. Furthermore, the dynamics of the situation dictate that this particle pair will move forward in the original photon’s direction of travel. However, the virtual particle pair will travel a distance of less than one photon wavelength before they annihilate to create a new photon with characteristics indistinguishable from the old one. This whole process complicates the smooth passage of photons of all energies through space by making the photon travel more slowly. This is explicable since on this approach the photon spends a fraction of its existence as a virtual particle pair which can only travel in the direction of the original photon’s motion at sub-light velocities. Implicit in this whole explanation is the fact that both virtual particles and the new light photon must all move in the same direction as the original photon, which then answers your question.
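As a rough illustrative aside (my own back-of-envelope sketch of the picture just described, not a formula from this correspondence): suppose a photon spends a fraction f of its existence as a sub-light virtual pair moving at some average speed v, and the remainder travelling at a "bare" speed c_0 between encounters (f, v and c_0 are just labels for this sketch). The observed speed of light is then roughly the time-weighted average

c_obs ≈ (1 − f) c_0 + f v,   with v < c_0 and 0 ≤ f < 1,

so the more of its existence a photon spends as a virtual pair (a larger f, as with a stronger ZPE), the slower the observed c. That is the qualitative point of the paragraph above, and it applies in the same way to photons of every energy.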



This explanation also impinges on your final question. This effectively asks why photons of all energies are uniformly affected by the process of absorption by the virtual particles, which thereby ensures that photons of all energies (or wavelengths) travel at the same velocity. If the alternative explanation given above is followed through, the original photon transforms into a virtual particle pair, which then annihilate to give the original photon. On this basis, it immediately becomes apparent that the energy (and wavelength) of the virtual particle pair results directly from the energy of the original photon, and this energy is then conserved in all subsequent interactions.



I trust that this answers your questions. If you have further issues, please do not hesitate to get back to me.





Quartz Clocks

Question: I understand the oscillation of a quartz crystal to be a mechanical effect (piezo-electric) rather than an atomic effect, and so I would assume they would not be affected by the increase in the zero point energy. Is this true?

Setterfield: If the oscillation is entirely mechanical (piezo-electric effect), then it will not be affected by any changes in the ZPE. There are, however, some elements which are mechanical, and others which seem to be atomic (is the behavior of the molecules themselves atomic or mechanical in this case?). However, if we go with the observations, the measurements of the speed of light made using quartz oscillator clocks did in fact show a decrease in the speed of light over time. This contrasts with atomic clocks, whose run rates stay in sync with the speed of light and which therefore cannot show c as changing. Because the quartz clocks are showing the change, this seems to indicate that they are not affected – or at least not as strongly affected – by a changing ZPE. If they were affected in the same way as atomic clocks, they would show no change in the speed of light at all.



Quartz oscillator clocks came in around 1949-1950. The speed of light had, at this point, already been declared constant by Birge; however, it was not until the atomic clock measurements of the early eighties that the speed of light was ‘officially’ declared constant. So there were some measurements taken after Birge, and the decay was still showing up. This was measured using those quartz clocks, so the observational evidence we have from this indicates that the ZPE changes are not affecting them.



This needs more research, obviously, but these are my current feelings about the matter.
YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
.  (OP)

12/08/2005 10:17 AM

Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
Speed of light slowing down after all?
Famous physicist makes headlines

by Carl Wieland

9 August 2002

Headlines in several newspapers around the world have publicized a paper in Nature by a team of scientists (including the famous physicist Paul Davies) who (according to these reports) claim that ‘light has been slowing down since the creation of the universe’.1

In view of the potential significance of the whole ‘light slowing down’ issue to creationists, it is worth reviewing it briefly here.

Well over a decade ago, AiG’s Creation magazine published very supportive articles concerning a theory by South Australian creationist Barry Setterfield, that the speed of light (‘c’) had slowed down or ‘decayed’ progressively since creation.

In one fell swoop, this theory, called ‘c decay’2 (CDK) had the potential to supply two profound answers vitally important for a Biblical worldview.

The distant starlight problem
One was, if stars are really well over 6000 light years away, how could light have had time to travel from them to Earth? Two logically possible answers have serious problems:

God created the starlight on its way: this suffers grievously from the fact that starlight also carries information about distant cosmic events. The created-in-transit theory means that the information would be ‘phony’, recording events which never happened, hence deceptive.

The distances are deceptive: but despite some anomalies in redshift/distance correlations (see Galaxy-Quasar ‘Connection’ Defies Explanation), it’s just not possible for all stars and galaxies to be within a 6000-light-year radius—we would all fry!

But if light were billions of times faster at the beginning, and slowed down in transit, there would be no more problem.

Radiometric dates
Since most nuclear processes are mathematically related to the speed of light, a faster ‘c’ might well mean a faster rate of radioactive decay, thus explaining much of the evidence used to justify the billions of years of geological hypothesizing. In fact, top-flight creationist researchers involved with the RATE (Radioisotopes and the Age of the Earth) project have found powerful evidence of sped-up decay in the past (see their book, Radioisotopes and the Age of the Earth). CDK might offer a mechanism.

CDK—the history of the idea
Barry Setterfield collated data of measurements of c spanning a period of about 300 years. He claimed that rather than fluctuating around both sides of the present value as measurements became more accurate, they had progressively declined from a point significantly higher than today’s value. He proposed that this decline had been exponential in nature, i.e. very rapid early on, gradually easing to stabilize at today´s value for c, just a few decades ago.3

He and Trevor Norman, a mathematician from Flinders University in South Australia, published a monograph4 (still stocked by this ministry for the assistance of potential researchers) outlining this, and answering several arguments raised against the theory. The monograph also showed how, over the past years, the measurements of the value of various constants (e.g. electron mass, Planck’s constant (h)) were varying progressively, if ever so slightly, in a ‘directional’ fashion consistent with the direction predicted by their mathematical linkage with ‘c’.

With such a bombshell, there were, not surprisingly, substantial efforts at scientific assessment and criticism. The critiques were not only from those motivated to undermine Biblical cosmology, but from leading creationist physicists. Criticism (‘iron sharpening iron’ as Proverbs 27:17 puts it) is meant to be a healthy process enhancing the search for truth in science.

The criticisms centered on two issues: the first was the validity of the statistical data itself, particularly the reliability of some of the earlier measurements of c given their large uncertainties; the other was the consequences we should expect to find in the present world if c had declined. This is an immensely complex area; for one thing, when c changes, so do other things, which can become mind-boggling to sort out, even for the experts.

One of the attacks concerned Einstein’s special relativity, E = mc² and the like. (If c was a billion times greater in the past, then E would be a billion billion times greater, so would not a campfire be like an atom bomb, and so on?) Critics at the time used this to mock CDK, but Setterfield answered that rest mass itself is inversely proportional to c², so that energy is still conserved. He also claimed that there is experimental evidence that the charge-to-mass ratio of an electron has been decreasing (supporting his claim that mass has increased as c² has decreased). But as usual, the skeptics, along with ‘progressive creationist’ (long-age) astronomer and ardent ‘big bang’ advocate, Dr Hugh Ross,5 kept repeating this claim as if Setterfield hadn’t thought of this and answered it. Whether one agrees with his answer or not, it was improper to ignore it (or perhaps his critics, lacking any qualifications in physics, didn’t understand it).
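To spell out the conservation point in Setterfield’s answer (a one-line check under his stated assumption, not an independent claim): if rest mass scales as m ∝ 1/c², say m = m_0 c_0²/c² with m_0 and c_0 the present-day values, then

E = m c² = (m_0 c_0²/c²) · c² = m_0 c_0² = constant,

so the energy of a given lump of matter, campfires included, is unchanged however large c may have been.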

Critics of CDK said that accepting it would mean one would have to discard Einstein, despite all the evidence for his theory. Setterfield said (and it seems to me correctly) that all special relativity claims in this matter is that c is constant at any point in time with respect to the observer; it does not involve any magic, canonical value for c. In other words, the actual value of c could change with time, so long as that change was consistent throughout the entire universe.6

Others dismissed CDK by claiming that if c had changed, the fine-structure constant (FSC, symbol α) should be different as measured using light from distant stars than from those nearby, but that this was not so.4 However, Setterfield’s particular theory predicted that the FSC would remain constant.7

A word of caution
But, intriguingly, it now turns out that the fine-structure constant is in fact slightly different in light from distant stars compared to nearby ones. In fact, this is the very reason that physicists of the stature of Davies are now prepared to challenge the assumption that light speed has always been constant. However, besides differing from the prediction of the Setterfield theory, this research by itself does not support a c-decay of the magnitude that Setterfield proposed: the change is billions of times too small. In fact, the newspaper hype surrounding Davies’ theory, and the quotes attributed to him, hardly seem to be justified by the Nature article itself, which is rather speculative. NB: although Setterfield predicted constant α, given the small change and tentative nature of this new discovery, by itself it is not conclusive evidence against the Setterfield theory either. See an earlier AiG response to reports of a change in α, Have fundamental constants changed, and what would it prove?

Unfortunately, despite being urged to continue to answer critics and further develop his theory within the refereed technical creationist literature, Setterfield effectively withdrew from that forum some years ago, though not from individual promotion and development of the idea, e.g. on the Web.

Well known creationist physicist, Dr Russell Humphreys (now with ICR), has long given credit to Setterfield’s challenging hypothesis for stimulating the development of his own cosmology, which seeks to answer the same question about starlight, and which is currently in favour among many creationist astronomers (see How can we see distant stars in a young Universe?). Humphreys says that he tried for over a year to find a way to get CDK to ‘work’ mathematically, but gave up when it seemed to him that so many things were changing in concert that it would be hard to detect a change in c from observations.

It’s also important to note, as we have often warned, that newspaper reports are often very different from the original paper. The actual Nature article, as shown by its accurate title, was about how the theory of black-hole thermodynamics might determine which is correct out of two possible explanations for previous work that claimed that FSC might have increased slightly and slowly over billions of years. The details are summarized in the box below. In conclusion, the authors (who are also prepared to accept that their interpretation of the data may be wrong) still believe in billions of years, and would reject the relatively rapid change in c that Setterfield proposed since they are talking about <0.001% over 6–10 billion years.

To be fair to the journalists, Davies has long been something of a publicity seeker. So he possibly didn’t mind at all that his actually quite non-descript paper was being publicized (it was actually less than a full page in total length in the ‘Brief Communications’ section, and didn’t rate a mention as a feature item), even for something peripheral to the paper.

Other c-decay ideas
Still, it is fascinating to see vindication for at least the possibility that c has changed. Whether this decline (if real) has only just ceased recently, as Setterfield proposed, or happened earlier (perhaps in a ‘one-step’ fashion), or is still going on, is another question.

Physicist Keith Wanser, a young-universe creationist and full Professor of Physics at California State University, Fullerton, told Creation magazine in 1999 that he was open to the idea of changing c (see God and the Electron8). He said:

‘I don’t go along with Barry’s statements on this; he’s well-meaning but in my opinion he’s made a lot of rash assumptions ... and there’s a misunderstanding [of many of the consequences of changing c].’

But Wanser also said:

‘there are other reasons to believe that the speed of light is changing, or has changed in the past, that have nothing to do with the Setterfield theory.’

The interview also quoted a 1999 New Scientist cover story which also proposed the ‘heresy’ of c-decay.9 (More recent New Scientist articles have reported on how it seems to be acceptable to propose c-decay to try to solve another well-known difficulty of the big bang theory, called the horizon problem. That is, the cosmic microwave radiation indicates that space is the same temperature everywhere, indicating a common influence. But no connection between distant regions would be possible, even in the assumed time since the alleged ‘big bang’, because of the ‘horizon’ set by the finite speed of light. As an ad hoc solution to this problem, Alan Guth proposed that the universe once underwent a period of very rapid growth, called ‘inflation’. But now it seems that even this has its own horizon problem. So now some physicists have proposed that the speed of light was much faster in the past, which would allow the ‘horizon’ to be much further away and thus accommodate the universe´s thermal equilibrium.10 Note that these other proposals have c even faster than in the Setterfield concept.)

Whether Setterfield is truly vindicated remains to be seen; the process would be greatly helped by further scientific debate of the actual issues in TJ or the CRSQ. In the absence of such involvement by skilled proponents of the theory, AiG cannot take a strong stand. In fact, in our publications over the last few years, we have tended to strongly favour Humphreys’ relativistic white hole cosmology, though always pointing out, along with Humphreys himself, that it was just one alternative model, and not ‘absolute truth’.

It is clear, though, that the issue is so complex, that one or two pronouncements of ‘certainty’ by a physicist or two, whether creationist or evolutionist, should not be taken as the death knell of the notion or any aspects of it—nor as final proof of it.

The irony of bias
It is truly ironic to look back at the time when some creationists were actively putting forward CDK as a profoundly important hypothesis. The anticreationists, both the anti-theists and their compromising churchian allies, launched their attacks with glee. Skeptics around the world seldom failed to have audiences in fits of laughter at the ‘ridiculous’ notion that what they labeled as a ‘certain cornerstone of modern physics’, the alleged constancy through time of the value of c, was wrong. No matter what comes of his notion as a whole, no matter even whether c has actually changed or not, in that sense at least, thanks to Paul Davies, Setterfield (and those, like ourselves, who supported his pioneering efforts) has already had the last laugh.

The real issue
Christians worried about the ‘starlight travel-time’ issue have seen a number of theories put forward to try to solve it, including CDK. Others include, for instance, Humphreys’ relativistic white-hole cosmology (presented in his Starlight and Time video) and even the two different conventions of calculated v. observed time.11 Which of these is right? Maybe none. I often say to enquirers, after outlining the encouraging advances made by some of these ideas, something like the following:

‘I don’t know for sure how God did it, but I know that I for one would hate to stand in front of the Creator of the Universe at a future point and say:

”Lord, I couldn’t believe your plain words about origins, just because I couldn’t figure out, with my pea-sized intelligence, how you managed to pull off the trick of making a universe that was both very young and very large.”’

I believe we need to understand, as most physicists really do, how immensely little is yet known about such major issues. What if Humphreys is right, for instance, and the answer lies in the general relativistic distortion (by gravity) of time itself in an expanded (by God who ‘stretched out the heavens’ as Scripture says repeatedly) bounded universe? Would not the world have laughed if such notions (as time running differently under different gravity influences, for instance) had first been put forward by modern Bible-believers? They would have been seen as ad hoc inventions, but they have been experimentally tested.

This ‘secular CDK’ announcement, by one of the biggest names in physics, should really be an antidote to the confident arrogance of long-age big-bangers. So should the recent landmark TJ paper by Humphreys showing observationally that we are in fact close to the centre of a bounded universe (download the PDF file, Our galaxy is the centre of the universe, ‘quantized’ redshifts show).

People need to be aware just how abstract, shaky and prone to revision the findings of modern cosmology really are. To quote Prof. Wanser again:

‘The sad thing is that the public is so overawed by these things [big bang and long-age cosmologies], just because there is complex maths involved. They don’t realize how much philosophical speculation and imagination is injected along with the maths—these are really stories that are made up.’12

All in all, it’s an exciting time to be a Genesis creationist. But then, it’s always been an exciting time to take God at His Word.

What was Davies’ paper really about?
The gist of it is:

Already known: the fine structure constant α = 2πe²/(hc), where e is the electronic charge and h is Planck’s Constant. Last year, there was a claim that α is increasing over time [as AiG reported in Have fundamental constants changed, and what would it prove?].

So this increase in α could be due to increasing e or decreasing c (CDK). But as mentioned, this conflicts with Setterfield’s model, which has α invariant with varying c because it is h that varies inversely with c.
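To see the dependence explicitly (standard error propagation on the formula quoted above, not anything specific to either model): taking logarithms of α = 2πe²/(hc) and differentiating gives

δα/α = 2 δe/e − δh/h − δc/c,

so α stays fixed while c falls only if h rises in exact proportion (Setterfield’s assumption, ref. 7), whereas a rise in e (with h and c fixed) or a fall in c (with e and h fixed) would push α up; these are the two options the paper weighs.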

The Second Law of Thermodynamics is in force. The entropy of a black hole increases with the area of its event horizon (provided the standard formula still applies with either varying c or e). Therefore the area cannot decrease unless the black hole’s environment has a corresponding entropy increase.

The key point of this theoretical ‘brief communication’: an increase in e would mean a reduction of a black hole’s area, which would seem to violate the Second Law under the current formula. Increasing e could also lead to an increase of a black hole’s electric charge above a threshold value where the event horizon disappears and we are left with a naked singularity, and this would violate what’s known as the cosmic censorship hypothesis. Davies et al. conclude:

Our arguments, although only suggestive, indicate that theories in which e increases with time are at risk of violating both the second law and the cosmic censorship hypothesis.

But a decrease in c over time would lead to an increase in a black hole’s area, which is in line with the Second Law. So by a process of elimination based on this theory about black hole thermodynamics (not on any new data), a tiny decrease of c is deduced to be the more likely explanation for the tiny increase previously claimed for α over time.
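For reference, the entropy formula behind this argument is the standard Bekenstein-Hawking relation (a textbook result, not something introduced by the Nature paper):

S = k c³ A / (4 G ħ),

where A is the horizon area, k is Boltzmann’s constant, G the gravitational constant and ħ the reduced Planck constant. Holding the standard formula fixed, the second-law requirement that S not decrease constrains how A is allowed to change; for a charged hole the horizon area shrinks as the charge grows, which is why a rising e runs into trouble while a falling c does not.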



Addendum: Nuclear physicist Dr Russell Humphreys comments:
‘The article on the AiG Web site is well balanced. Paul Davies’ Nature article itself falls far short of the hype, which is much ado about nearly nothing. General Relativity has had a variable speed of light ever since 1917. For the past six years, the physics journals have had a steady trickle of variable-c theories, including some by Davies. His latest article is only peripherally about a variable c. So why all the fuss?’

References

1. Davies, P.C.W., Davis, T.M. and Lineweaver, C.H., Black holes constrain varying constants, Nature 418(6898):602–603, 8 August 2002.

2. The word ‘decay’ is used here to describe declining velocity, without necessarily implying any thermodynamic or moral ‘decay’ in that sense of the word.

3. The decay curve chosen to fit the data was c = √[a + e^(kt)(b + dt)], a square root of the critically damped harmonic oscillator equation. A critically damped system is one that reaches equilibrium as fast as possible without any overshoot or oscillation.

4. Norman, T.G. and Setterfield, B., The Atomic Constants, Light and Time, 1990.

5. Ross, H.N., Creation and Time, Navpress, Colorado Springs, pp. 98–99, 1994.

6. Interestingly, Davies thinks that a changing c would have grave consequences for Einstein’s theory, which may be superseded by another theory which encompasses all the observations, including changing c.

7. Setterfield proposed that since energy must be conserved in atomic orbits, then h must be inversely proportional to c. Therefore any constant that contains the product hc with other constants, including α, must also be constant. Norman and Setterfield, Ref. 4, pp. 33–39.

8. Creation 21(4):38–41, September–November 1999.

9. Barrow, J., Is nothing sacred? New Scientist 163(2196):28–32, 24 July 1999. Cf. ‘C’ the difference, Creation 22(1):9, 1999.

10. Adams, S., The Speed of Light, New Scientist 173(2326), Inside Science, p. 4, 19 January 2002.

11. Newton, R., Distant starlight and Genesis: conventions of time measurement, TJ 15(1):80–85, 2001.

12. Ref. 8, p. 41.
YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
ethereal  (OP)

12/08/2005 10:17 AM

Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
I´ll order the book on tape.
YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
.  (OP)

12/08/2005 10:17 AM

Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
Upheaval in Physics:
History of the Light-Speed Debate
by Helen D. Setterfield

--------------------------------------------------------------------------------

[Ed Note: We have been following Barry Setterfield´s research on the speed of light since 1993.1 It is interesting that both evolutionists and creation scientists can be blinded by their own presuppositions...]

When we walk into a dark room, flip a switch and the light is instantly on, it seems that light has no speed but is somehow infinite - instantly there - and that was the majority opinion of scientists and philosophers until September 1676, when Danish astronomer Olaf Roemer announced to the Paris Academie des Sciences that the anomalous behavior of the eclipse times of Jupiter´s inner moon, Io, could be accounted for by a finite speed of light.2 His work and his report split the scientific community in half, involving strong opinions and discussions for the next fifty years. It was Bradley´s independent confirmation of the finite speed of light, published January 1, 1729, which finally ended the opposition.3 The speed of light was finite: incredibly fast, but finite.

The following question was: "Is the speed of light constant?" Interestingly enough, every time it was measured over the next few hundred years, it seemed to be a little slower than before. This could be explained away, as the first measurements were unbelievably rough compared to the technical accuracy later. It was not that simple, though. When the same person did the same test using the same equipment at a later period in time, the speed was slower. Not much, but slower.

These results kicked off a series of lively debates in the scientific community during the first half of the 20th century. Raymond Birge, highly respected chairman of the physics department at the University of California, Berkeley, had, from 1929 on, established himself as an arbiter of the values of atomic constants.4 The speed of light is considered an atomic constant. However, Birge´s recommended values for the speed of light decreased steadily until 1940, when an article written by him, entitled "The General Physical Constants, as of August 1940 with details on the velocity of light only," appeared in Reports on Progress in Physics (Vol. 8, pp. 90-100, 1941). Birge began the article saying: "This paper is being written on request - and at this time on request ... a belief in any significant variability of the constants of nature is fatal to the spirit of science, as science is now understood [emphasis his]."

These words, from this man, for whatever reason he wrote them, shut down the debate on the speed of light. Birge had previously recognized, as had others, that if the speed of light was changing, it was quite necessary that some of the other "constants" were also changing. This was evidently not to be allowed, whether it was true or not, and so the values for the various constants were declared and that was that. Almost. In the October 1975 issue of Scientific American (p. 120), C.L. Strong questioned whether the speed of light might change with time "as science has failed to get a consistently accurate value." It was just a ripple, but the issue had not quite disappeared.

Partly in order to quell any further doubts about the constancy of the speed of light, in October 1983 the speed of light was declared a universal constant of nature, defined as 299,792.458 kilometers per second, which is usually rounded off to the figure more familiar in the West, 186,000 miles per second.
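To check the rounding just quoted (simple unit conversion, using 1 statute mile = 1,609.344 m):

299,792.458 km/s = 299,792,458 m/s ÷ 1,609.344 m per mile ≈ 186,282 miles per second,

which is the figure usually rounded down to 186,000 miles per second.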

Birge´s paper was published in 1941. Just a year later, Barry Setterfield was born in Australia. In 1979 he was 37 years old. That year he received a book from a friend, a book on astronomical anomalies. It was a large book, and near the end of it there was a section on the speed of light, questioning its constancy. Barry was stunned. Nothing he had read or learned in physics or astronomy had even hinted that there was a question regarding the speed of light. It was a constant, wasn´t it? As he read, he learned about the measurements that had been taken years before, and the arguments that had gone on in the scientific literature, and he was fascinated. He figured he could read up on it and wrap up the question in about two weeks; it didn´t quite work out that way.

Within a couple of years, one of the creationist organizations had started publishing some of Barry´s findings. They were still preliminary, but there was so much more to this than he had thought. In the following years his exploration continued, and he read all the literature he could find. His work caught the attention of a senior research physicist at Stanford Research Institute International (SRI), who then asked him to submit a paper regarding his research. It was to be a white paper, or one that was for the purposes of discussion within the Institute.

Barry teamed up with Trevor Norman of Flinders University in Adelaide, and in 1987 Flinders itself published their paper, "Atomic Constants, Light, and Time." Their math department had checked it and approved it and it was published with the Stanford Research Institute logo as well. What happened next was like something out of a badly written novel. Gerald Aardsma, a man at another creationist organization, got wind of the paper and got a copy of it. Having his own ax to grind on the subject of physics, he called the heads of both Flinders and SRI and asked them if they knew that Setterfield and Norman were [gasp] creationists! SRI was undergoing a massive staff change at the time and since the paper had been published by Flinders, they disavowed it and requested their logo be taken off. Flinders University threatened Trevor Norman with his job and informed Barry Setterfield that he was no longer welcome to use any resources there but the library. Aardsma then published a paper criticizing the Norman-Setterfield statistical use of the data. His paper went out under the auspices of a respected creation institution.

Under attack by both evolutionists and creationists for their work, Norman and Setterfield found themselves writing long articles of defense, which appeared in a number of issues of creation journals. In the meantime, Lambert Dolphin, the physicist at Stanford who had originally requested the paper, teamed up with professional statistician Alan Montgomery to take the proverbial fine-tooth comb through the Norman-Setterfield paper to check the statistics used. Their defense of the paper and the statistical use of the data was then published in a scientific journal,5 and Montgomery went on to present a public defense at the 1994 International Creation Conference. Neither defense has ever been refuted in any journal or conference. Interestingly enough, later in 1987, after the Norman-Setterfield paper was published, another paper on light speed appeared, written by a Russian, V. S. Troitskii.6 Troitskii not only postulated that the speed of light had not been constant, but that light speed had originally been about 10^10 times faster than now.

Since then, a multitude of papers on cosmology and the speed of light have shown up in journals and on the web. The theories abound as to what is changing, and in relation to what, and what the possible effects are. There is one person who is continuing to work with the data, however. As the storm around the 1987 report settled down, Barry Setterfield got back to work, investigating the data rather than playing around with pure theory.

Meanwhile, halfway around the world from Australia, in Arizona, a respected astronomer named William Tifft was finding something strange going on with the redshift measurements of light from distant galaxies. It had been presumed that the shift toward the red end of the spectrum of light from these distant galaxies was due to a currently expanding universe, and the measurements should be seen as gradually but smoothly increasing as one went through space. That wasn´t what Tifft was finding. The measurements weren´t smooth. They jumped from one plateau to another. They were quantized, coming in discrete steps with distinct breaks in between them.

When Tifft published his findings,7 astronomers were incredulous and dismissive. In the early 1990s in Scotland, two other astronomers decided to prove him wrong once and for all. Guthrie and Napier collected their own data and studied it. They ended up deciding Tifft was right.8 What was going on? Barry Setterfield read the material and studied the data. The universe could not be expanding if the red shift measurements were quantized. Expansion would not occur in fits and starts. So what did the red shift mean? While most others were simply denying the Tifft findings, Barry took a closer look. And it all started to make sense. The data was showing where the truth of the matter was. While many articles continued to be published regarding theoretical cosmologies, with little regard for much of the data available, Barry was more interested in the data.

Yet, his work is not referenced by any of the others. The Stanford paper is just about forgotten, if it was ever known, by the folks in mainstream physics and astronomy. However, not only are the measurements still there, but the red shift data has added much more information, making it possible to calculate the speed of light back to the first moment of creation. So Barry wrote another paper and submitted it to a standard physics journal in 1999. They did not send it to peer review but returned it immediately, saying it was not a timely subject, was of no current interest, and was not substantial enough. (It was over fifty pages long with about a hundred and fifty references to standard physics papers and texts.) So Barry resubmitted it to an astronomy journal. They sent it out to peer review and the report came back that the paper was really interesting but that it really belonged in a physics journal. So, in 2000, he sent it off to another physics journal. They refused it because they did not like one of the references Barry used: a university text on physics. They also disagreed with the model of the atom that Barry used - the standard Bohr model. In August 2001, the paper was updated and submitted to a European peer-reviewed science journal. The editor has expressed interest. We will see what will happen. In the meantime everything continues: Barry Setterfield is giving presentations in different countries, the mainstream physicists and theorists are continuing to publish all manner of theoretical ideas, and the subject of the speed of light has erupted full force back into the scientific literature.

There is a reason that Barry´s work is not being referenced by mainstream scientists - or even looked at by most. If Barry is right about what the data are indicating, we are living in a very young universe. This inevitable conclusion will never be accepted by standard science. Evolution requires billions of years.

And there is a reason why the major creation organizations are holding his work at an arm´s length as well: they are sinking great amounts of money into trying to prove that radiometric dating procedures are fatally flawed. According to what Barry is seeing, however, they are not basically flawed at all: there is a very good reason why such old dates keep appearing in the test results. The rate of decay of radioactive elements is directly related to the speed of light. When the speed of light was higher, decay rates were faster, and the long ages would be expected to show up. As the speed of light slowed down, so the radioactive decay rates slowed down.
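Purely to illustrate the shape of that claim (a toy numerical sketch under the stated hypothesis, with invented numbers and an invented c(t) history; none of this is Setterfield´s published model), here is what a decay constant tied to c would do to an "apparent" radiometric age:

# Toy sketch only. Hypothetical assumption: the decay constant lambda(t) scales
# in direct proportion to c(t). The c(t) history below is invented for illustration.
import math

HALF_LIFE_TODAY = 1.25e9                     # years (order of potassium-40, for scale)
LAM_TODAY = math.log(2) / HALF_LIFE_TODAY    # today's decay constant, per year

def c_ratio(t_years):
    """Invented history: c starts 1e6 times today's value at t = 0 and relaxes
    exponentially toward 1 with a 3000-year time scale (placeholder numbers)."""
    return 1.0 + (1.0e6 - 1.0) * math.exp(-t_years / 3000.0)

def remaining_fraction(elapsed_years, steps=100_000):
    """Integrate dN/dt = -lambda(t) * N with lambda(t) = LAM_TODAY * c_ratio(t)."""
    dt = elapsed_years / steps
    integral = sum(c_ratio(i * dt) for i in range(steps)) * dt
    return math.exp(-LAM_TODAY * integral)

elapsed = 6000.0                                  # 'dynamical' years since t = 0
frac = remaining_fraction(elapsed)
apparent_age = -math.log(frac) / LAM_TODAY        # age inferred assuming constant lambda
print(f"parent fraction remaining: {frac:.4f}")
print(f"apparent radiometric age:  {apparent_age:.3e} years")

With these made-up inputs, the parent fraction left after 6,000 dynamical years is what a constant present-day decay rate would only produce after a couple of billion years, which is the pattern the article attributes to a formerly higher c; change the assumed c(t) history and the numbers change with it.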

By assuming today´s rate of decay has been uniform, the earth and universe look extremely old. Thus, the evolutionists are happy with the time that this gives for evolution, and the creationists are looking for flaws in the methods used for dating. But if the rates of decay of the different elements have not been the same through time, then that throws both groups off! Here were two clocks: an "atomic clock" which runs according to atomic processes and, possibly different, a "dynamical" clock - the one we use every day - which is governed by gravity, that is, by the rotation and revolution rates of the earth and moon. Could it be that these two "clocks" were not measuring time the same way? A data analysis suggested this was indeed happening. Tom Van Flandern, with a Ph.D. from Yale in astronomy, specializing in celestial mechanics, and for twenty years (1963-1983) Research Astronomer and Chief of the Celestial Mechanics Branch at the U.S. Naval Observatory in Washington D.C., released the results of some tests showing that the rate of ticking of the atomic clock was measurably slowing down when compared with the "dynamical clock."9 (Tom Van Flandern was terminated from his work with that institution shortly thereafter, although his work carries a 1984 publication date.)

In recognizing this verified difference between the two different "clocks," it is important to realize that the entire dating system recognized by geology and science in general, saying that the earth is about 4.5 billion years old, and the universe somewhere around ten billion years older than that, might be thrown into total disarray. The standard science models cannot deal with that. The standard creation models cannot, at this point, deal with the fact that radiometric dating may be, for the most part, telling the truth on the atomic clock. And, meanwhile, the Hubble spacecraft keeps sending back data which keep slipping into Barry Setterfield´s model as though they actually belonged there.

* * *

This article was originally published in the
July 2002 Personal Update NewsJournal.


--------------------------------------------------------------------------------
**NOTES**

--------------------------------------------------------------------------------



1. Personal UPDATE, 3/93, pp. 12-16; 3/95, pp. 10-14; 3/98, pp. 13-14; 1/99, pp. 13-16.
2. I. B. Cohen, "Roemer and the first determination of the velocity of light (1676)," Isis, Vol. 31, pp. 327-379, 1939.
3. J. Bradley, "A letter…", Philosophical Transactions, Vol. 35, No. 406, pp. 637-661, December 1728.
4. R. T. Birge, Reviews of Modern Physics, Vol. 1, January 1929, pp. 1-73. See also: [link to sunsite.berkeley.edu:2020]
5. A. Montgomery and L. Dolphin, Galilean Electrodynamics, Vol. 4, No. 5, pp. 93ff., 1993.
6. V. S. Troitskii, "Physical Constants and the evolution of the Universe", Astrophysics and Space Science, Vol. 139, pp. 389-411, 1987.
7. W. G. Tifft, Astrophysical Journal, 206:38-56, 1976; 211:31-46, 1977; 211:377-391, 1977; 221:449-455, 1978; 221:756-775, 1978; 233:799-808, 1979; 236:70-74, 1980; 257:442-449, 1982; etc.
8. T. Beardsley, Scientific American 267:6 (1992), p. 19; J. Gribbin, New Scientist, 9 July (1994), p. 17; R. Matthews, Science 271 (1996), p. 759.
9. T. C. Van Flandern, in "Precision Measurements and Fundamental Constants II," Taylor and Phillips (eds.), National Bureau of Standards (U.S.) Special Publication 617, pp. 625-627, 1984.



YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
Anonymous Coward
12/08/2005 10:17 AM
Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
I see birds coming out of nowhere; space and time are the same thing, right?
HS  (OP)

12/08/2005 10:17 AM

Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
FHL(C)

What is a VORTAL?
YAHshua the sound of His Name in English, YAH is short form of YHVH,
Bible.PRAYERBOOK.Praisebook DOWNLOADs
[link to www.docdroid.net (secure)]
[link to pdfhost.io (secure)]
[link to www.docdroid.net (secure)]
.
User ID: 1191321
China
12/10/2010 10:53 PM
Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
those spirals that have been happening and now this from another GLP thread/wikileaks, (YHVH made time and is in control of it and unbound by it being eternal and immortal, infinite and omniscient)



Thread: Wikileaks, the US secret bunker, the Gulf of Aden Vortex: Contact made?

Anonymous Coward
User ID: 1156051
Canada
12/9/2010 3:28 AM
Report Abusive Post
Report Copyright Violation
Wikileaks, the US secret bunker, the Gulf of Aden Vortex: Contact made?
Quote

Where is this story in the international media? The combined naval might of twenty-seven countries is concentrated off the Somali coast allegedly to fight the poorly armed pirates who continue to act with apparent impunity. Or is there something far, far more serious?

Once again the Wikileaks cables come into play. And what is revealed is terrifying. According to a report allegedly prepared by Admiral Maksimov of Russia's Northern Fleet, in late 2000, a magnetic vortex was discovered in the area of the Gulf of Aden. Russia, the PR China and the USA joined efforts to study what it was but discovered that it defied logic and the laws of physics.
.
User ID: 1191321
China
12/10/2010 11:29 PM
Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
FHL(C)

What is a VORTAL?
 Quoting: HS

a vortal is a vortex and a portal
Anonymous Coward
User ID: 769657
United States
12/11/2010 01:13 AM
Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
[link to www.google.com]
Anonymous Coward
User ID: 769657
United States
12/11/2010 01:14 AM
Report Abusive Post
Report Copyright Violation
Re: Time vortals and mandelbrot event boundaries and horizons.
 Quoting: Anonymous Coward 769657



I like the cookies better. Mandelbrot is also the German word for Biscotti.





GLP