Quantum entanglement is the name given to the way particles can share information and influence each other regardless of how far apart they are.

For example, an excited electron in certain atoms will spontaneously decay by emitting a pair of polarized photons, one aligned horizontally and the other vertically.  According to quantum mechanics these photons are entangled, and the act of observing one instantly affects the other no matter how far apart they are.

Lecture 1 of Leonard Susskind’s course on Quantum Entanglements

This instantaneous communication between the entangled photons is at the heart of quantum entanglement.  This is the “spooky action at a distance” Einstein believed was theoretically implausible because, according to Relativity, information cannot propagate instantaneously but only at or below the speed of light.

To demonstrate this, in 1935 Einstein co-authored a paper with Podolsky and Rosen intended to show that Quantum Mechanics could not be a complete theory of nature.  The first thing to notice is that Einstein was not trying to disprove Quantum Mechanics.  In fact, he was well aware of its power to predict the outcomes of various experiments.  What he was trying to show was that there must be a “hidden variable” that would allow Quantum Mechanics to become a complete theory of nature.

The argument begins by assuming that there are two systems, A and B (which might be two free particles), whose wave functions are known.  Then, if A and B interact for a short period of time, one can determine the wave function which results after this interaction via the Schrödinger equation or some other Quantum Mechanical equation of motion.  Now, let us assume that A and B move far apart, so far apart that they can no longer interact in any fashion.  In other words, A and B have moved outside of each other’s light cones and therefore are spacelike separated.

With this situation in mind, Einstein asked the question: what happens if one makes a measurement on system A?  Say, for example, one measures the momentum value for system A.  Then, using the conservation of momentum and our knowledge of the system before the interaction, one can infer the momentum of system B.  Thus, by making a momentum measurement of A, one can also measure the momentum of B.  Recall now that A and B are “spacelike” separated, and thus they cannot communicate in any way.  This separation means that B must have had the inferred value of momentum not only in the instant after one makes a measurement at A, but also in the few moments before the measurement was made.  If, on the other hand, it were the case that the measurement at A had somehow caused B to enter into a particular momentum state, then there would need to be a way for A to signal B and tell it that a measurement took place.  However, the two systems cannot communicate in any way!
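A minimal numerical sketch of this inference, assuming a pair prepared with zero total momentum in one dimension (the measured value below is hypothetical, chosen only for illustration):

```python
# Minimal sketch of the EPR momentum inference in one dimension.
# Assumption: the pair is prepared with zero total momentum, so
# conservation gives p_B = p_total - p_A.
p_total = 0.0            # total momentum of the pair (kg*m/s)
p_A = 2.7e-27            # hypothetical momentum measured at A (kg*m/s)

p_B = p_total - p_A      # inferred without touching B
print(f"Inferred momentum of B: {p_B:+.2e} kg*m/s")
```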

If one examines the wave function at the moment just before the measurement at A is made, one finds that there is no certainty as to the momentum of B because the combined system is in a superposition of multiple momentum eigenstates of A and B.  So, even though system B must be in a definite state before the measurement at A takes place, the wave function description of this system cannot tell us what that momentum is!  Therefore, since system B has a definite momentum and since Quantum Mechanics cannot predict this momentum, Quantum Mechanics must be incomplete.

In response to Einstein’s argument about the incompleteness of Quantum Mechanics, John Bell derived a mathematical formula that quantified the correlations one would obtain from measurements made on pairs of particles prepared in a joint superposition of states.  If local realism were correct, the correlation between measurements made on one member of the pair and those made on its partner could not exceed a certain bound, because of each particle’s limited influence.

In other words, he showed there must exist inequalities governing the measurements made on pairs of particles that cannot be violated in any world that preserves both physical reality and separability, because of the limited influence the particles can have on each other when they are “spacelike” separated.
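A short sketch of Bell’s bound in its common CHSH form, assuming the textbook singlet-state correlation E(a, b) = -cos(a - b) and the standard analyzer angles (neither is taken from Bell’s paper directly):

```python
# CHSH sketch: local hidden variable theories bound |S| <= 2, while the
# quantum singlet-state correlation reaches 2*sqrt(2) at these angles.
import math

def corr(a, b):
    """Quantum singlet-state correlation for analyzer angles a and b."""
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2               # one side's two analyzer settings
b1, b2 = math.pi / 4, 3 * math.pi / 4   # the other side's two settings

S = corr(a1, b1) - corr(a1, b2) + corr(a2, b1) + corr(a2, b2)
print(f"|S| = {abs(S):.3f}  (local realism requires |S| <= 2)")  # ~2.828
```

Local hidden variable theories can never push |S| above 2, while the quantum prediction reaches 2√2 ≈ 2.83 at these angles.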

When Bell published his theorem in 1964, the technology to verify or reject it did not exist.  However, in the early 1980s Alain Aspect performed an experiment with polarized photons which showed that the inequalities it contained were violated.

Many believed this provided experimental verification of the concept of quantum entanglement.  Additionally, it meant that science had to accept that either the reality of our physical world or the concept of separability must be given up.

However, this may not be true, because the article “The *reality* of quantum probabilities” (Mar. 31, 2011) showed that the probability functions quantum mechanics associates with the wave function can be understood by assuming they are physically the result of a matter wave moving on a “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.

Very briefly, the article “Why is energy/mass quantized?” (Oct. 4, 2007) showed that one can derive the quantum mechanical properties of energy/mass by extrapolating the laws of classical resonance to a matter wave moving on a “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.

(Louis de Broglie was the first to predict the existence of a continuous form of energy/mass when he theorized that all particles have a wave component.  His theory was confirmed by Davisson and Germer’s discovery of electron diffraction by crystals in 1927.)
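As a concrete check of de Broglie’s relation λ = h/p, one can compute the wavelength of the 54 eV electrons Davisson and Germer used; it comes out at the atomic scale of a crystal lattice, which is why diffraction was observable:

```python
# de Broglie wavelength of the 54 eV electrons used by Davisson and
# Germer: lambda = h / p, with p = sqrt(2 * m * E) (non-relativistic).
import math

h = 6.626e-34     # Planck constant (J s)
m_e = 9.109e-31   # electron mass (kg)
eV = 1.602e-19    # joules per electron-volt

E = 54 * eV                   # beam energy used by Davisson and Germer
p = math.sqrt(2 * m_e * E)    # non-relativistic momentum
lam = h / p                   # de Broglie wavelength
print(f"lambda = {lam * 1e10:.2f} angstroms")  # ~1.67, atomic scale
```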

The “Why is energy/mass quantized?” article showed that the four conditions required for resonance to occur in a classical environment (an object or substance with a natural frequency, a forcing function at that natural frequency, a lack of damping, and the ability of the substance to oscillate spatially) would be met in an environment consisting of a continuous non-quantized field of energy/mass and four *spatial* dimensions.

The existence of four *spatial* dimensions would give a matter wave the ability to oscillate spatially on a “surface” between the third and fourth *spatial* dimensions, thereby fulfilling one of the requirements for classical resonance to occur.

These oscillations would be caused by an event such as the decay of a subatomic particle or the shifting of an electron in an atomic orbital.  This would force space (the substance) to oscillate with the frequency associated with the energy of that event.
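For a sense of scale, the frequency tied to the energy of an event follows the Planck relation f = E/h; the 10.2 eV orbital shift used below is a hypothetical example value for a hydrogen electron:

```python
# Planck relation f = E / h: the frequency associated with the energy
# of an event. Example: a 10.2 eV shift of an electron between
# hydrogen orbitals (2p -> 1s), an assumed illustrative value.
h = 6.626e-34    # Planck constant (J s)
eV = 1.602e-19   # joules per electron-volt

E = 10.2 * eV    # energy of the event
f = E / h        # associated oscillation frequency
print(f"Oscillation frequency: {f:.2e} Hz")  # ~2.47e15 Hz
```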

The oscillations caused by such an event would serve as a forcing function, allowing a resonant system or “structure” to be established in space.

Observations of a three-dimensional environment show the energy associated with a resonant system can only take on the incremental or discrete values associated with a fundamental or a harmonic of the fundamental frequency of its environment.

Similarly, the energy associated with resonant systems in four *spatial* dimensions could only take on the incremental or discrete values associated with a fundamental or a harmonic of the fundamental frequency of their environment.

Therefore, this defines a physical mechanism responsible for the quantization of energy/mass in terms of a matter wave moving on a “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.
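A short sketch of the discrete ladder this resonance argument implies, combining a classical harmonic series f_n = n·f_1 with E = h·f (the fundamental frequency below is an assumed illustrative value, not derived from the articles):

```python
# Discrete energy ladder of a classical resonant system: only harmonics
# of the fundamental are allowed, so energies tied to frequency by
# E = h * f come in discrete steps.
h = 6.626e-34    # Planck constant (J s)
eV = 1.602e-19   # joules per electron-volt
f1 = 2.47e15     # assumed fundamental frequency of the environment (Hz)

for n in range(1, 5):
    E_n = n * h * f1          # energy of the n-th harmonic
    print(f"harmonic {n}: E = {E_n / eV:5.1f} eV")
```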

An earlier article, “Embedded dimensions” (Oct. 4, 2007), showed that one can derive all forms of energy, including that of quantum systems, in terms of a displacement in a “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.

Assuming a particle’s energy is the result of a displacement in four *spatial* dimensions allows one to derive the probability distribution associated with its wave function by extrapolating the laws of three-dimensional environments to a fourth *spatial* dimension.

Classical mechanics tells us that, because of the continuous properties of space, the oscillations the article “Why is energy/mass quantized?” associated with a quantum system would be distributed throughout the entire “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.

This would be analogous to what happens when one vibrates a rod on a continuous rubber diaphragm.  The oscillations caused by the vibrations would be felt over its entire surface, while their magnitudes would be greatest at the point of contact and decrease as one moves away from it.

However, this means that if one extrapolates the mechanics of the rubber diaphragm to a “surface” of a three-dimensional space manifold, one must assume the oscillations associated with each individual quantum system exist everywhere in three-dimensional space.  This also means there would be a non-zero probability the system could be found anywhere in our three-dimensional environment.
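A toy sketch of the diaphragm analogy: an amplitude that peaks at the driving point and decays with distance, normalized into a probability that is largest near the source yet non-zero everywhere (the exponential decay profile is an assumption of the illustration, not taken from the articles):

```python
# Diaphragm analogy: oscillation amplitude peaks at the driving rod and
# decays with distance; squaring and normalizing it gives a probability
# that is largest near the source but non-zero everywhere.
import math

L_decay = 1.0                      # assumed decay length of the amplitude

def amplitude(r):
    """Oscillation amplitude at distance r from the driving rod."""
    return math.exp(-r / L_decay)

rs = [0.1 * i for i in range(200)]            # sampled distances
weights = [amplitude(r) ** 2 for r in rs]     # intensity ~ amplitude^2
total = sum(weights)
probs = [w / total for w in weights]

print(f"P(nearest bin, r=0.0):   {probs[0]:.4f}")
print(f"P(farthest bin, r=19.9): {probs[-1]:.2e}  (tiny but non-zero)")
```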

As mentioned earlier, the article “Why is energy/mass quantized?” showed that a quantum mechanical system is the result of a resonant structure formed on the “surface” of a three-dimensional space manifold with respect to a fourth *spatial* dimension.

Yet Classical Wave Mechanics tells us that resonance would most probably occur on the surface of the rubber diaphragm where the magnitude of the vibrations is greatest, and that the probability would diminish as one moves away from that point.

Similarly, a quantum system would most probably be found where the magnitude of the vibrations in a “surface” of a three-dimensional space manifold is greatest, and the probability would diminish as one moves away from that point.

However, this also means each individual particle in a quantum system has its own wave and probability function, and therefore the total probability of a quantum system being in a given configuration when observed would be equal to the sum of the individual probability functions of each particle in that system.

As mentioned earlier, Alain Aspect verified that Bell’s inequalities were violated by the quantum mechanical measurements made on pairs of polarized photons that were spacelike separated, or in different local realities.

Yet, as just mentioned, the wave or probability function of a quantum system is a summation of the probability functions of all the particles it contains.  Therefore, two particles which originated in the same quantum system and were moving in opposite directions would have identical wave or probability functions even if they were not physically connected.

The measurements Alain Aspect made on the polarized photons that verified that Bell’s inequality was violated involved finding a correlation between the probabilities of each particle being in a given configuration, based on the concepts of quantum mechanics.  When this correlation was found, many assumed the particles must somehow be entangled or physically connected even though they were in different local realities.  In other words, the Newtonian concept of separability would not apply to quantum environments.

However, this may not be true.

According to quantum mechanics, the act of measuring the state of one of a pair of entangled photons instantly affects the measurement of the other no matter how far apart they are.  Yet if it is true, as mentioned earlier, that each particle has a separate but identical wave or probability function as it moves through space, the measurement of the state of one particle would be reflected in the measurement of the other, because those measured states would have the same probability of occurring in each particle.

In other words, the reason Bell’s inequality is violated in quantum systems is not that the particles are physically entangled or connected at the time of measurement, but that their individual wave or probability functions were “entangled”, or identical, at the time of their separation and remained that way as they moved apart.  Therefore, even though the particles are not physically connected, measurements based on their quantum mechanical probability functions would be correlated.
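One way to picture this proposal is a toy “common cause” model in which both photons carry an identical polarization value fixed at the source and each detector responds with the classical Malus-law probability.  This is only an illustrative local sketch of correlation without contact, not a reconstruction of Aspect’s experiment:

```python
# Toy "common cause" sketch: each photon carries an identical hidden
# polarization fixed at the source, and each detector fires with the
# classical Malus-law probability cos^2 of the angle difference.
import math
import random

def agreement(a, b, trials=100_000):
    """Fraction of trials where detectors at angles a and b agree."""
    agree = 0
    for _ in range(trials):
        shared = random.uniform(0.0, math.pi)   # fixed at separation
        p_a = math.cos(a - shared) ** 2         # detection probabilities
        p_b = math.cos(b - shared) ** 2
        if (random.random() < p_a) == (random.random() < p_b):
            agree += 1
    return agree / trials

# Identical probability functions produce correlated outcomes even
# with no contact at measurement time.
print(f"Same-angle agreement: {agreement(0.0, 0.0):.3f}")  # ~0.75, > 0.5
```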

Additionally, quantum entanglement is defined in terms of probability.  Therefore, there would be a non-zero probability that Bell’s inequality will be violated when measuring the influence of one particle on another, because those measurements are based on probabilities.  One could therefore mathematically quantify the scenario proposed above, because the probability of this occurring should mirror the individual quantum mechanical probability function of each particle.

But to say that the correlation between measurements of the quantum characteristics of two particles exists because they are entangled or physically connected is like saying the correlation between the hair color of identical twins exists because the twins have been physically connected throughout their entire lives.

This shows how, by extrapolating the classical laws governing a three-dimensional environment to a fourth *spatial* dimension, one can define a mechanism responsible for the correlation of the quantum mechanical measurements of particles that exist in non-local environments while maintaining the classical concepts of reality and separability.

Later Jeff

Copyright Jeffrey O’Callaghan 2011

 


On page 297 of Sean Carroll’s book “From Eternity to Here” he discusses the difficulty of reconciling gravity with the second law of thermodynamics.

The second law of thermodynamics states that the entropy, or disorder, of an isolated system either remains constant or increases with time, and that no usable energy can be obtained from a high-entropy configuration.

However, observations of the effects gravity has on interstellar environments appear to contradict it.

For example, if we take a quantity of gas concentrated in a small region, it will naturally tend to expand to fill a progressively larger volume of space.  This is an example of how the entropy of physical systems tends to increase, while the reverse of this process does not occur.  In other words, the fact that a dispersed quantity of gas will not spontaneously become concentrated into a smaller region confirms the second law of thermodynamics.
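This tendency can be made quantitative: for an ideal gas freely expanding from V1 to V2, the entropy change is ΔS = nR ln(V2/V1), which is positive whenever V2 > V1 (the amounts below are assumed illustrative values):

```python
# Entropy change of an ideal gas freely expanding from V1 to V2:
# delta_S = n * R * ln(V2 / V1).
import math

R = 8.314            # ideal gas constant (J mol^-1 K^-1)
n = 1.0              # moles of gas (assumed)
V1, V2 = 1.0, 2.0    # volumes before and after expansion (m^3, assumed)

dS = n * R * math.log(V2 / V1)
print(f"delta_S = {dS:+.2f} J/K  (positive, as the second law requires)")
```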

However, in regions of interstellar space where large quantities of gas (of sufficient density) exist, the gas will naturally contract due to the mutual gravitational attraction of its molecules, thereby decreasing its entropy.

Additionally, the observation that the gravitational collapse of high-entropy interstellar gas initiates nuclear reactions in stars that release usable energy appears to violate the second law’s postulate that no usable energy can be obtained from a high-entropy system.
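The energy released is easy to estimate: for a uniform cloud the gravitational binding energy is U = 3GM²/(5R), and the virial theorem says roughly half of it is radiated away during contraction (solar values are assumed here purely for illustration):

```python
# Usable energy released by gravitational contraction. For a uniform
# sphere the binding energy is U = 3*G*M^2 / (5*R); the virial theorem
# says roughly half is radiated away as the cloud contracts.
G = 6.674e-11        # gravitational constant (m^3 kg^-1 s^-2)
M = 1.989e30         # one solar mass (kg)
R = 6.957e8          # one solar radius (m), the assumed final radius

U = 3 * G * M**2 / (5 * R)   # binding energy at the final radius
radiated = U / 2             # virial estimate of the radiated share
print(f"Energy radiated during contraction: ~{radiated:.1e} J")  # ~1e41 J
```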

Some physicists try to rescue it by pointing out that the gravitational collapse of interstellar gas does not occur in a closed environment, and that the heat and usable energy generated by its collapse are offset by the heat radiated out into space, thereby increasing the overall entropy of the universe.

However, that argument cannot be applied to the universe as a whole, because by definition it is a closed system.  This means the heat and usable energy caused by its gravitational collapse could not be offset by energy being radiated to another environment.

This also brings up the question of why the second law of thermodynamics is held in such high esteem by many physicists.

For example, Sir Arthur Stanley Eddington implied it is the most important law in physics when, in “The Nature of the Physical World”, he said: “The second law that entropy always increases holds, I think, the supreme position among the laws of Nature.  If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations.  If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes.  But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.”

However, observations of gravity’s effect on our local environment, when extrapolated to a cosmic scale, can have a profound effect on our understanding of the origins of our universe.

Granted, science has not yet determined with certainty whether the universe will continue to expand or will enter a contraction phase.

However, if it did, it would present a serious problem for the second law of thermodynamics, because the loss of entropy and the heat generated by the total gravitational collapse of our universe could not be radiated to another environment.  Therefore, the radiation pressure caused by this heat and loss of entropy would result in the universe’s re-expansion.  Hence the argument that the heat and usable energy (usable in the sense that it powered its expansion) generated by gravitational collapse is offset by the heat radiated out into space would not apply to the gravitational collapse of the universe.

Even if the universe does not enter a contraction phase, this presents a strong theoretical challenge to the universality of the statement “The second law that entropy always increases holds, I think, the supreme position among the laws of Nature.”

However, this also means that we may be able to quantifiably derive the origins of our present universe, as Robert Dicke suggested in 1964, in terms of the cyclical expansion and contraction of previous ones.

As was shown in the article “Entropy and the Big Crunch” (Sept. 15, 2010), the heat generated by the gravitational collapse of the universe could raise its temperature to a point where all matter would become ionized, while the radiation pressure generated by its increasing temperature would eventually halt its contraction and cause it to enter an expansion phase.

There are many objections to this scenario; however, they are based on applying the second law of thermodynamics to local environments, such as the collapse of interstellar gas to form stars.

For example, many dismiss this idea because they believe the “engine” powering the universe’s cycle of expansions and contractions would eventually slow down and stop, since according to the second law of thermodynamics there is no such thing as a totally efficient engine.

However, that assumption is based on extending the validity of that law to a scale which encompasses the entire universe, which, as was just shown, is invalid when one considers the effects of gravity.

One reason a perfectly efficient engine cannot exist is that energy, usually in the form of heat, escapes from its environment.  However, as mentioned earlier, because the universe is by definition a closed environment, no energy can escape from it; therefore, it would be perfectly efficient with respect to the energy associated with its gravitational contraction and thermodynamic expansion.

Another aspect of the second law of thermodynamics many use to invalidate the concept of a cyclical universe is that it dictates that as the number of cycles increases, the entropy or randomness of the universe’s components and structures must also increase; therefore, after an infinite number of cycles there should be no organized structures, such as galaxies, remaining.

However, this would only be true if the heat generated by the gravitational collapse of the universe was not great enough to completely break matter down to its constituent parts.

As mentioned earlier, the article “Entropy and the Big Crunch” showed the energy associated with the universe’s gravitational collapse would be great enough to completely break matter down into its fundamental components, so each cycle would start at the same point with respect to the randomness, or entropy, of its matter component.

In other words, the matter component of the universe would be reset, or reinitialized, to its previous configuration at the beginning of each cycle, thereby making the randomness of its components, such as galaxies, consistent through each successive cycle.

This scenario gives just as logical and consistent an explanation of the Cosmic Background Radiation as the current Big Bang model, because it assumes the radiation is a result of the universe cooling, due to its expansion, to a point where its matter component has become de-ionized enough to make it transparent to radiation.

However, in the model suggested above, the heat generated by the collapse of the universe would ionize all matter, while the radiation pressure caused by its contraction would eventually result in it entering an expansion phase.  Therefore, the universe would be opaque to radiation until it entered an expansion phase and cooled enough to allow its matter component to become de-ionized.

Therefore, both of these theoretical models make the same predictions as to the origin and the properties of the Cosmic Background Radiation.
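Both models can be checked against the same arithmetic: the observed temperature of the Cosmic Background Radiation follows from the de-ionization temperature of hydrogen and the expansion since last scattering, T_now = T_rec / (1 + z), using standard textbook values:

```python
# CMB temperature today from the de-ionization (recombination)
# temperature and the expansion since last scattering.
T_rec = 3000.0    # temperature at which hydrogen de-ionizes (K)
z = 1090.0        # redshift of the last-scattering surface

T_now = T_rec / (1 + z)
print(f"Predicted CMB temperature today: {T_now:.2f} K")  # ~2.75 K
```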

A cyclic universe like the one Robert Dicke proposed would also answer several of the questions nagging modern science.  It would allow us to understand and predict what came before the Big Bang in terms of a collapse and the subsequent expansion of a previous universe.  It would also eliminate the fine-tuning required to make the present Big Bang model fit modern observational data regarding the abundances of the light elements by allowing one to predict them based on what occurred before its expansion.

For example, the Big Bang model accurately predicts the abundances of the light elements.  However, each prediction requires at least one adjustable parameter unique to that element’s prediction.  When you take away these degrees of freedom, no genuine prediction remains.  The best the Big Bang can claim is consistency with observations, using various ad hoc models to explain the data for each light element.

However, if it is true that our present universe is the result of the collapse and re-expansion of a previous one, one could use the first law of thermodynamics to predict and quantify the properties of the environment in which the lighter elements were formed, based on the energy supplied to it by the momentum of its collapse.  One could then determine whether their observed abundances could be predicted from the properties of that environment.

A cyclical model like the one suggested by Robert Dicke has an advantage over the Big Bang model in that it would allow one to quantify the origins of our present universe by observing the way it is now and using the first law of thermodynamics to extrapolate backwards in time, to determine whether one could predict its re-expansion in its present form.  Additionally, this would give science the ability to quantify the properties of the environment where the light elements were formed based on observations of the present universe instead of on the random selection of different parameters.

Later Jeff

Copyright Jeffrey O’Callaghan 2010

 
