Tuesday, June 14, 2011

SCHRODINGER'S CAT (2)

That the cat is still a topic of conversation some 76 years after Schrödinger's original thought experiment is amazing.

But why?
THOMAS YOUNG
You can put part of the blame on Thomas Young. Thomas Young (13 June 1773 – 10 May 1829) was an English polymath who dabbled in physiology, language, musical harmony and Egyptology. Oh yeah, and also physics.

In Young's own judgment, the most important of his many achievements was establishing the wave theory of light. To do so, he had to overcome the century-old view, expressed in the venerable Isaac Newton's "Opticks", that light is a particle. Isaac Newton, who did many experimental investigations of light, had rejected the wave theory and developed his corpuscular (or particle) theory, according to which light is emitted from a luminous body in the form of tiny particles. Nevertheless, in the early 19th century Young put forth a number of theoretical reasons supporting the wave theory of light, and he developed two enduring demonstrations to support this viewpoint. With the ripple tank he demonstrated the idea of interference in the context of water waves. With the two-slit, or double-slit, experiment, he demonstrated interference in the context of light as a wave.

In a paper entitled "Experiments and Calculations Relative to Physical Optics", presented in 1803, Young describes an experiment in which he placed a narrow card (approx. 1/30th in.) in a beam of light from a single opening in a window and observed the fringes of color in the shadow and to the sides of the card. He observed that placing another card before or after the narrow strip, so as to prevent light from the beam from striking one of its edges, caused the fringes to disappear. This supported the contention that light is composed of waves. That the double-slit experiment can also demonstrate the particle theory was not lost on the physicists of the 1927 Solvay Conference; both must be facets of the natural world. A good lay description of the double-slit experiment is "Dr. Quantum - Double Slit Experiment & Entanglement" by Fred Alan Wolf, or the Cassiopeia Project's "Double Slit Experiment - The Strangeness Of Quantum Mechanics", both of which are usually available on YouTube.
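The fringe pattern itself is easy to reproduce numerically. Below is a minimal sketch, not Young's card-in-a-sunbeam arrangement but the idealized two-slit intensity on a distant screen; the wavelength, slit separation, and screen distance are assumed example values, not anything from Young's paper.

    # A minimal sketch of ideal two-slit interference on a distant screen.
    # Wavelength, slit separation, and screen distance are assumed example values.
    import numpy as np

    wavelength = 550e-9   # green light, metres (assumed)
    d = 0.25e-3           # slit separation, metres (assumed)
    L = 1.0               # slit-to-screen distance, metres (assumed)

    y = np.linspace(-0.01, 0.01, 9)           # positions on the screen, metres
    phase = np.pi * d * y / (wavelength * L)  # half the phase difference between the slits
    intensity = np.cos(phase) ** 2            # relative two-slit intensity (single-slit envelope ignored)

    print("fringe spacing =", wavelength * L / d, "m")   # ~2.2 mm between bright fringes
    for yi, rel in zip(y, intensity):
        print(f"y = {yi:+.4f} m   relative intensity = {rel:.2f}")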



The nature of quantum physics is such that the old cycle of experiment, observe, record, refine no longer provides unambiguous answers. In the double-slit experiment, the act of observation changes the outcome. That something can be a particle and a wave at the same time is difficult to understand intuitively. But the evidence is overwhelming, and it needs to be treated as a natural state of affairs.

Quantum mechanics differs significantly from classical mechanics in its predictions when the scale of observations becomes comparable to the atomic and sub-atomic scale, the so-called quantum realm. Quantum physics has several key facets that simply must be taken at face value.
During a 1961 lecture for undergraduate students at the California Institute of Technology, Richard Feynman said this about the concept of energy:
There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same.
—The Feynman Lectures on Physics

Since 1918 it has been known that the law of conservation of energy is the direct mathematical consequence of the translational symmetry of the quantity conjugate to energy, namely time. That is, energy is conserved because the laws of physics do not distinguish between different instants of time (see Noether's theorem).
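A rough sketch of that connection (not a full statement of Noether's theorem): if the Lagrangian has no explicit time dependence, the energy built from it cannot change.

    % Sketch: time-translation symmetry -> energy conservation.
    % For a Lagrangian L(q, \dot q) with no explicit time dependence,
    % define the energy function (the Hamiltonian in the usual case):
    \[
    E \;=\; \sum_i \dot q_i \,\frac{\partial L}{\partial \dot q_i} \;-\; L .
    \]
    % Using the Euler--Lagrange equations, its total time derivative reduces to
    \[
    \frac{dE}{dt} \;=\; -\,\frac{\partial L}{\partial t} \;=\; 0
    \quad \text{whenever } L \text{ does not depend explicitly on } t,
    \]
    % which is exactly the conservation of energy Feynman describes above.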

So, we put the pieces together and use the old axiom that once the impossible has been discarded, whatever remains, however unlikely, must be considered. In this case, I swear it doesn't seem like it should warrant being stuck in limbo with Schrödinger's cat.

SIDE NOTES:

The quantum realm is inhabited by very, very small particles, bits of matter.  They translate in space and time in quanta, similar to frames on a movie reel.  When they move, they exhibit a wave pattern.  The term 'wave' is poorly defined in this case, and the mathematics implies a repeating pattern of probable locations.

When we do something to observe the location of the particle, it is like petting the cat with a 710J John Deere Backhoe.  Liable to be some damage to the cat, and you won't be able to determine its location or velocity.

In order to explain the interference pattern, you must either: (1) assume that the probability wave is all there is, and that the act of observation brings the particle into reality and collapses the wave; (2) assume that there are hidden variables, 'a unifying theory', or a carrier mechanism that will carry and shape the wave form; (3) assume that each quantum shift spins off a new universe, a global mind, or everyone has their own universe; or (4) go all metaphysical with Doctor Quantum and ring up the Dalai Lama.

Tuesday, June 7, 2011

SCHRODINGER'S CAT

I have worked on this diary entry a long time.  I do not mean that it took me a lot of work time, I mean it took a long time to work.  I originally started the post simply to say that I hated Schrödinger's cat.  Then I found out that I was misguided, because Schrödinger's cat still lives!  Yes, it is true that she is 76 years old, but she did not age while in the box (an unknown quantum state) until she was released in 2000 by Serge Haroche.  After learning Kittish English, she moved to an unknown location in the US and is working as a fashion photographer.  I find now that I have a lot of respect for that cat.
SCHRODINGER'S CAT
However, I still hate the Schrödinger's cat thought experiment. It was a stupid thought (my opinion). Why not a rat? That sounds better: Schrödinger's rat experiment. Why in a box? Why not deep in a cave, where you could not hear the poor tormented thing scratching and meowing as it thirsted to death? Why not have the diabolical device just turn on a lamp inside the box? We would still have to say the lamp was in both an on and an off quantum state until we looked to see for sure. At least you would not be talking about an uncertain probability state with a dead cat stinking up the room. What was wrong with the 'if a tree falls in the forest' thought experiment anyway?

Schrödinger's original purpose was to bring the quantum-level world into the macro-level world as a point of contention with the Copenhagen interpretation of quantum mechanics.  It illustrates what he and others saw as the problems caused by thinking of the wave function as a real entity.  I liked the response added by Einstein: "so if the trigger is hooked to dynamite instead of hydrocyanic acid, can the cat be thought of as both alive and blown to pieces?"  As I was thinking through my thoughts on this thought experiment, I realized I needed to think some more about what caused Erwin to thunk up such a stupid thought.


That the Copenhagen interpretation became a dominant facet of quantum mechanics was the culmination of the 1927 Solvay Conference, which highlights one of the brightest periods of modern physics.
1927 SOLVAY CONFERENCE ATTENDEES
The Copenhagen interpretation of quantum physics developed between 1871 and 1927, due to the contributions of a remarkable collection of minds all working around this same time.  They theorized a new world of discrete quantities of energy, entities which fit neither the classical idea of particles nor the classical idea of waves. These 'quanta' were not continuous in time, not divisible beyond a certain size, and could only be measured and defined as a state of 'probability'.  Physicists were asked to step beyond the world of empirical experiments and pragmatic predictions and accept that the observer became a variable in the equations. According to their interpretation, the act of measurement causes the calculated set of probabilities to "collapse" to the value defined by the measurement. This feature of the mathematics is known as wavefunction collapse. The concept that quantum mechanics does not yield an objective description of microscopic reality - but deals only with probabilities, and that measurement plays an ineradicable role - is the most significant characteristic of the Copenhagen interpretation.
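To make the probability talk concrete, here is a toy sketch of the Born rule and a "collapse" on measurement. The three-state system and its amplitudes are entirely made up for illustration; nothing here comes from the 1927 papers themselves.

    # Toy sketch of the Born rule and "collapse" for a made-up 3-level system.
    import numpy as np

    rng = np.random.default_rng(0)

    # An arbitrary (assumed) superposition, then normalized.
    psi = np.array([1.0 + 0.0j, 1.0j, 2.0 + 0.0j])
    psi = psi / np.linalg.norm(psi)

    probs = np.abs(psi) ** 2          # Born rule: probability of each outcome
    print("outcome probabilities:", probs.round(3), "sum =", probs.sum())

    outcome = rng.choice(len(psi), p=probs)   # the measurement picks one possibility
    collapsed = np.zeros_like(psi)
    collapsed[outcome] = 1.0                  # post-measurement ("collapsed") state
    print("measured outcome:", outcome, "post-measurement state:", collapsed)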

The Newtonian world had become familiar territory where the physicist felt in control, leading to the famous quote, "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement" (often attributed to William Thomson, Lord Kelvin).  However, as pointed out by the same Lord Kelvin, there were "dark clouds" on the horizon even then.  The dark clouds he was alluding to were the unsatisfactory explanations that the physics of the time could give for two phenomena: the Michelson–Morley experiment and black body radiation. In 1871, Ludwig Boltzmann and James Clerk Maxwell formulated the Maxwell–Boltzmann distribution. Boltzmann further identified the logarithmic connection between entropy and probability, which helped usher in the era of modern physics. By stating that the pressure of a gas arises from the force exerted by molecules or atoms impacting on the walls of its container, kinetic theory introduced statistical probability, applied to discrete particles, into physics. This was at a time when most physicists still did not believe in atoms or molecules.  Then came a remarkable group of physicists who developed modern physics as we know it, and who came together at the Solvay Conference in Brussels in 1927.

Max Karl Ernst Ludwig Planck (April 23, 1858 – October 4, 1947) Won the Nobel Prize in Physics - 1918.
PLANCK 1878
The grandfather of quantum physics, Max Planck, was working on black body radiation for the electric power companies in 1894. At first, Planck did not include energy quantization and did not use statistical mechanics, to which he held an aversion. In November 1900, Planck revised this approach, relying on Boltzmann's statistical interpretation of the second law of thermodynamics as a way of gaining a more fundamental understanding of the principles behind his radiation law. Even though Planck disliked the philosophical and physical implications of Boltzmann's approach, his recourse to them was, in his words, "an act of despair ... I was ready to sacrifice any of my previous convictions about physics."  He eventually came up with the Planck postulate, stating that electromagnetic energy could be emitted only in quantized form; in other words, the energy could only be a multiple of an elementary unit E = hν, where h is Planck's constant, also known as Planck's action quantum (introduced already in 1899), and ν is the frequency of the radiation.
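A worked example of the postulate, using an assumed frequency for green light:

    % Planck postulate: energy comes only in multiples of E = h\nu.
    % Example (assumed): green light with \nu \approx 5.4\times10^{14}\,\mathrm{Hz}.
    \[
    E = h\nu = (6.626\times10^{-34}\,\mathrm{J\,s})(5.4\times10^{14}\,\mathrm{s^{-1}})
    \approx 3.6\times10^{-19}\,\mathrm{J} \approx 2.2\,\mathrm{eV},
    \]
    % so a 1-watt green source emits on the order of 10^{18} of these quanta every second,
    % which is why the graininess is invisible at everyday scales.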

Albert Einstein (14 March 1879 – 18 April 1955) Won the Nobel Prize in Physics - 1921.
EINSTEIN 1905
As a young physicist, Einstein was convinced that Newtonian mechanics was no longer enough to reconcile the laws of classical mechanics with the laws of the electromagnetic field. On 30 April 1905, Einstein completed his thesis and was awarded a PhD by the University of Zurich. His dissertation was entitled "A New Determination of Molecular Dimensions".  Einstein published four additional papers in the Annalen der Physik scientific journal that year. 1905 was Einstein's annus mirabilis or "miracle year", and these papers could arguably mark the post-Newtonian age of physics. These four articles contributed substantially to the foundation of modern physics and changed views on space, time, and matter. (1) "On a Heuristic Viewpoint Concerning the Production and Transformation of Light", received March 18 and published June 9, proposed the idea of energy quanta. This idea, motivated by Max Planck's earlier derivation of the law of black body radiation, assumes that luminous energy can be absorbed or emitted only in discrete amounts, called quanta. (2) "On the Motion of Small Particles Suspended in a Stationary Liquid, as Required by the Molecular Kinetic Theory of Heat", received May 11 and published July 18, delineated a stochastic model of Brownian motion. Using the kinetic theory of fluids, this paper cemented the reality of the atom and formalized the value of statistical mechanics. (3) "On the Electrodynamics of Moving Bodies" was received on June 30 and published September 26.  It reconciles Maxwell's equations for electricity and magnetism with the laws of mechanics, by introducing major changes to mechanics close to the speed of light. This later became known as Einstein's special theory of relativity. (4) "Does the Inertia of a Body Depend Upon Its Energy Content?" was received on September 27 and published November 21. In this paper Einstein developed an argument for arguably the most famous equation in the field of physics: E = mc². Einstein considered the mass–energy equivalence equation to be of paramount importance because it showed that a massive particle possesses an energy, the "rest energy", distinct from its classical kinetic and potential energies.
Other important papers include:
  • In 1907 and again in 1911, Einstein developed the first quantum theory of specific heats by generalizing Planck's law. His theory resolved a paradox of 19th-century physics that specific heats were often smaller than could be explained by any classical theory. His work was also the first to show that Planck's quantum mechanical law E=hν was a fundamental law of physics, and not merely special to blackbody radiation.
  • Between 1907 and 1915, Einstein developed the theory of general relativity, a classical field theory of gravitation that provides the cornerstone for modern astrophysics and cosmology. General relativity is based on the surprising idea that time and space dynamically interact with matter and energy, and has been checked experimentally in many ways, confirming its predictions of matter affecting the flow of time, frame dragging, black holes, and gravitational waves.
  • In 1917, Einstein published the idea for the Einstein-Brillouin-Keller method for finding the quantum mechanical version of a classical system. The famous Bohr model of the hydrogen atom is a simple example, but the EBK method also gives accurate predictions for more complicated systems, such as the dinuclear cations H2+ and HeH2+.
  • In 1918, Einstein developed a general theory of the process by which atoms emit and absorb electromagnetic radiation (his A and B coefficients), which is the basis of lasers (stimulated emission) and shaped the development of modern quantum electrodynamics, the best-validated physical theory at present.
  • In 1924, together with Satyendra Nath Bose, Einstein developed the theory of Bose-Einstein statistics and Bose-Einstein condensates, which form the basis for superfluidity, superconductivity, and other phenomena.
  • In 1935, together with Boris Podolsky and Nathan Rosen, Einstein put forward what is now known as the EPR paradox, and argued that the quantum-mechanical wave function must be an incomplete description of the physical world.
Niels Henrik David Bohr (7 October 1885 – 18 November 1962) Won the Nobel Prize in Physics - 1922.
BOHR ~1920
In 1912, Bohr went to work for Ernest Rutherford at Manchester University.  By 1914, experiments by physicists Ernest Rutherford, Henry Moseley, James Franck, and Gustav Hertz had largely established the structure of an atom as a dense nucleus of positive charge surrounded by lower-mass electrons. Working with Rutherford's model, Bohr further postulated that electrons resided in quantized energy states, with the energy determined by the angular momentum of the electron's orbits about the nucleus. The electrons could move between these states, or orbits, by the emission or absorption of photons at specific frequencies. By means of these quantized orbits, he accurately explained the spectral lines of the hydrogen atom. This became a basis for quantum theory. Bohr also conceived the principle of complementarity: that items could be separately analyzed as having several contradictory properties. For example, on the basis of this principle, Bohr concluded that light behaves either as a wave or a stream of particles depending on the experimental framework, two apparently mutually exclusive properties. Bohr continued to be a champion of the Copenhagen interpretation throughout his life.
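A quick sketch of what "accurately explained the spectral lines of hydrogen" means in practice: the Rydberg formula, which falls out of Bohr's quantized orbits, reproduces the visible Balmer lines. The constant below is the standard textbook value.

    # Hydrogen Balmer lines from the Rydberg formula, 1/lambda = R (1/n1^2 - 1/n2^2),
    # which Bohr's quantized orbits reproduce.
    R = 1.0967758e7  # Rydberg constant for hydrogen, 1/m

    def balmer_wavelength_nm(n2, n1=2):
        """Wavelength of the transition n2 -> n1 (Balmer series when n1 = 2)."""
        inv_lambda = R * (1.0 / n1**2 - 1.0 / n2**2)
        return 1e9 / inv_lambda

    for n2 in (3, 4, 5, 6):
        print(f"n = {n2} -> 2 : {balmer_wavelength_nm(n2):6.1f} nm")
    # Expected: ~656 nm (red), ~486 nm, ~434 nm, ~410 nm -- the visible hydrogen lines.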

Werner Karl Heisenberg (5 December 1901 – 1 February 1976) Won the Nobel Prize in Physics - 1932.
HEISENBERG - 1927

On 1 May 1926, Heisenberg began his appointment as university lecturer and assistant to Bohr in Copenhagen. It was in Copenhagen, in 1927, that Heisenberg developed his uncertainty principle, while working on the mathematical foundations of quantum mechanics. On 23 February, Heisenberg first described his theory in a letter to fellow physicist Wolfgang Pauli. Published in 1927, the principle implies that it is impossible to simultaneously measure the present position and "determine" the future momentum of an electron, or any other particle, with an arbitrary degree of accuracy and certainty. This is not a statement about researchers' limited ability to measure one quantity while determining the other. Rather, it is a statement about the laws of physics: a system simply cannot have both members of such a pair of quantities sharply defined at once. The principle states that a minimum exists for the product of the uncertainties in these properties, equal to or greater than one half of the reduced Planck constant ħ (ħ = h/2π).  For me the uncertainty principle is the basis of my life - the more information I acquire about a subject, the less I understand - and this occurs at about the same ratio as Heisenberg's theory.
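A back-of-the-envelope sketch of the principle with assumed numbers: confine an electron to roughly the width of an atom and see what the minimum momentum spread implies.

    # Heisenberg uncertainty, dx * dp >= hbar / 2, with assumed example numbers:
    # an electron confined to about the width of an atom.
    hbar = 1.0546e-34   # reduced Planck constant, J*s
    m_e  = 9.109e-31    # electron mass, kg

    dx = 1e-10                      # position uncertainty ~ 1 angstrom (assumed)
    dp_min = hbar / (2 * dx)        # minimum momentum uncertainty
    dv_min = dp_min / m_e           # corresponding velocity uncertainty

    print(f"dp >= {dp_min:.2e} kg*m/s")   # ~5.3e-25 kg*m/s
    print(f"dv >= {dv_min:.2e} m/s")      # ~6e5 m/s -- hardly a gentle nudge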

Louis-Victor-Pierre-Raymond, 7th duc de Broglie (15 August 1892 – 19 March 1987) Won the Nobel Prize in Physics - 1929.
DE BROGLIE ~ 1929
Broglie's, "Research on the Theory of the Quanta", was published in 1924 and introduced his theory of electron waves. This included the wave-particle duality theory of matter, based on the work of Max Planck and Albert Einstein on light.  This research culminated in the de Broglie hypothesis stating that any moving particle or object had an associated wave. De Broglie thus created a new field in physics, the mécanique ondulatoire, or wave mechanics, uniting the physics of energy (wave) and matter (particle).
The concept of the particle creating the wave was supported by Einstein, confirmed by the electron diffraction experiments of Davisson and Germer, and generalized by the work of Schrödinger (although, unlike Schrödinger, de Broglie considered the wave to be real, not statistical).
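A small sketch of the de Broglie relation λ = h / p with assumed numbers, comparing an electron to a thrown baseball; it shows why matter waves only matter at the quantum scale.

    # de Broglie wavelength, lambda = h / (m * v), for two assumed examples.
    h = 6.626e-34  # Planck constant, J*s

    def de_broglie_wavelength(mass_kg, speed_m_s):
        return h / (mass_kg * speed_m_s)

    electron = de_broglie_wavelength(9.109e-31, 1.0e6)   # electron at 10^6 m/s
    baseball = de_broglie_wavelength(0.145, 40.0)        # 145 g baseball at 40 m/s (assumed)

    print(f"electron: {electron:.2e} m")   # ~7e-10 m, about the size of an atom
    print(f"baseball: {baseball:.2e} m")   # ~1e-34 m, hopelessly unobservable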




Erwin Rudolf Josef Alexander Schrödinger (12 August 1887 – 4 January 1961) Won the Nobel Prize in Physics - 1933.
SCHRODINGER ~ 1927
In January 1926, Schrödinger published in Annalen der Physik the paper "Quantisierung als Eigenwertproblem" (Quantization as an Eigenvalue Problem) on wave mechanics and what is now known as the Schrödinger equation. In this paper he gave a "derivation" of the wave equation for time-independent systems, and showed that it gave the correct energy eigenvalues for the hydrogen-like atom. This paper has been universally celebrated as one of the most important achievements of the twentieth century, and created a revolution in quantum mechanics, and indeed in all of physics and chemistry. A second paper, submitted just four weeks later, solved the quantum harmonic oscillator, the rigid rotor and the diatomic molecule, and gave a new derivation of the Schrödinger equation. A third paper in May showed the equivalence of his approach to that of Heisenberg and gave the treatment of the Stark effect. A fourth paper in this most remarkable series showed how to treat problems in which the system changes with time, as in scattering problems. These papers were the central achievement of his career and were at once recognized as having great significance by the physics community.
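For reference, a sketch of the time-independent equation from that first paper, together with the hydrogen-like energy levels it reproduces (the 13.6 eV figure is the standard textbook value):

    % Time-independent Schr\"odinger equation for a particle of mass m in a potential V:
    \[
    -\frac{\hbar^2}{2m}\,\nabla^2\psi(\mathbf r) + V(\mathbf r)\,\psi(\mathbf r)
      = E\,\psi(\mathbf r),
    \]
    % whose bound-state eigenvalues for a hydrogen-like atom of nuclear charge Z are
    \[
    E_n = -\frac{Z^2}{n^2}\,(13.6\ \mathrm{eV}), \qquad n = 1, 2, 3, \dots
    \]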

The cat came later.

Paul Adrien Maurice Dirac (8 August 1902 – 20 October 1984) Won the Nobel Prize in Physics - 1933.
DIRAC - 1933
Dirac's "Principles of Quantum Mechanics", published in 1930, is a landmark in the history of science. It quickly became one of the standard textbooks on the subject and is still used today. In that book, Dirac incorporated the previous work of Werner Heisenberg on matrix mechanics, and of Erwin Schrödinger on wave mechanics, into a single mathematical formalism that associates measurable quantities to operators acting on the Hilbert space of vectors that describe the state of a physical system. The book also introduced the delta function.  He proposed the Dirac equation as a relativistic equation of motion for the wavefunction of the electron. This work led Dirac to predict the existence of the positron, the electron's antiparticle, which he interpreted in terms of what came to be called the Dirac sea.  Dirac's equation also contributed to explaining the origin of quantum spin as a relativistic phenomenon.

Wolfgang Ernst Pauli (25 April 1900 – 15 December 1958) Won the Nobel Prize in Physics - 1945.
PAULI - 1945
Pauli made many important contributions in his career as a physicist, primarily in the field of quantum mechanics. He seldom published papers, preferring lengthy correspondences with colleagues such as Niels Bohr and Werner Heisenberg, with whom he had close friendships. Many of his ideas and results were never published and appeared only in his letters, which were often copied and circulated by their recipients. Pauli was apparently unconcerned that much of his work thus went uncredited.

Pauli proposed in 1924 a new quantum degree of freedom (or quantum number) with two possible values, in order to resolve inconsistencies between observed molecular spectra and the developing theory of quantum mechanics. He formulated the Pauli exclusion principle, perhaps his most important work, which stated that no two electrons could exist in the same quantum state, identified by four quantum numbers including his new two-valued degree of freedom.  In 1926, shortly after Heisenberg published the matrix theory of modern quantum mechanics, Pauli used it to derive the observed spectrum of the hydrogen atom. This result was important in securing credibility for Heisenberg's theory.

Pauli introduced the 2 × 2 Pauli matrices as a basis of spin operators, thus solving the nonrelativistic theory of spin. This work is sometimes said to have influenced Paul Dirac in his creation of the Dirac equation for the relativistic electron.  In 1930, Pauli considered the problem of beta decay.  In a letter of 4 December to Lise Meitner et al., beginning, "Dear radioactive ladies and gentlemen", he proposed the existence of a hitherto unobserved neutral particle with a small mass, no greater than 1% the mass of a proton, in order to explain the continuous spectrum of beta decay. In 1934, Enrico Fermi incorporated the particle, which he called a neutrino, into his theory of beta decay. The neutrino was first confirmed experimentally in 1956.
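A small numerical sketch of the Pauli matrices and the algebra that makes them spin operators:

    # The 2x2 Pauli matrices and a quick check of their algebra with NumPy.
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    I2 = np.eye(2, dtype=complex)

    # Each squares to the identity, and they multiply cyclically: sigma_x sigma_y = i sigma_z.
    assert np.allclose(sx @ sx, I2)
    assert np.allclose(sx @ sy, 1j * sz)

    # The spin-1/2 operators are S_i = (hbar/2) * sigma_i; their eigenvalues are +-hbar/2.
    print("sigma_z eigenvalues:", np.linalg.eigvalsh(sz))   # [-1. 1.] -> spin down / spin up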

In 1940, he proved the spin-statistics theorem, a critical result of quantum field theory which states that particles with half-integer spin are fermions, while particles with integer spin are bosons.

In 1949, he published a paper on Pauli–Villars regularization. Regularization is the term for techniques which modify infinite mathematical integrals to make them finite during calculations, so that one can identify whether the intrinsically infinite quantities in the theory (mass, charge, wavefunction) form a finite and hence calculable set which can be redefined in terms of their experimental values; this criterion is termed renormalization. Regularization removes infinities from quantum field theories and, importantly, also allows the calculation of higher-order corrections in perturbation theory.

Max Born (11 December 1882 – 5 January 1970) Won the Nobel Prize in Physics - 1954.
BORN ~ 1912
In 1926, Born formulated the now-standard interpretation of the probability density function ψ*ψ in the Schrödinger equation of quantum mechanics. Up until this time, matrices were seldom used by physicists; they were considered to belong to the realm of pure mathematics. Born had used them in his work on the lattice theory of crystals in 1921. While matrices were used in these cases, the algebra of matrices with their multiplication did not enter the picture as it did in the matrix formulation of quantum mechanics. In 1925, Born and Werner Heisenberg formulated the matrix mechanics representation of quantum mechanics. On 9 July, Heisenberg gave Born a paper to review and submit for publication.  In the paper, Heisenberg formulated quantum theory avoiding the concrete but unobservable representations of electron orbits by using parameters such as transition probabilities for quantum jumps, which necessitated using two indexes corresponding to the initial and final states. When Born read the paper, he recognized the formulation as one which could be transcribed and extended to the systematic language of matrices, which he had learned from his study under Jakob Rosanes at Breslau University. Born, with the help of his assistant and former student Pascual Jordan, began immediately to make the transcription and extension, and they submitted their results for publication; the paper was received for publication just 60 days after Heisenberg's paper (Heisenberg won the Nobel Prize for this work in 1932).  Heisenberg wrote a letter to Born in which he said he had been delayed in writing due to a "bad conscience" that he alone had received the Prize "for work done in Göttingen in collaboration — you, Jordan and I." Heisenberg went on to say that Born and Jordan's contribution to quantum mechanics cannot be changed by "a wrong decision from the outside." In 1954, Heisenberg wrote an article honoring Max Planck for his insight in 1900. In the article, Heisenberg credited Born and Jordan with the final mathematical formulation of matrix mechanics and went on to stress how great their contributions were to quantum mechanics, which were not "adequately acknowledged in the public eye."


OK, so back to the cat. One day Mrs. Schrödinger calls her husband at work and says, "Erwin, have you seen the cat?"

The purpose of all of this is to try to understand how physicists reached the point where we are now (and I want to focus on the time element). The people at the 1927 Solvay Conference conceived quantum mechanics, and were (it could be said, still are) the ones who have explained it to the rest of us. These were the ring-leaders. They took away Newton and replaced him with uncertainty. The Copenhagen interpretation (the subjective interpretation) divided the physics world into two major camps (each with many factions). Bohr, Born, and Heisenberg were leaders of the subjective interpretation of quantum mechanics. All versions of the Copenhagen interpretation include at least a formal or methodological version of wave function collapse, in which unobserved eigenvalues are removed from further consideration.  The strictly subjective interpretation is that nothing at the atomic scale is real: a system is completely described by a wave function ψ, representing an observer's subjective knowledge of the system. Its adherents believe that the observer brings reality into being through the act of observing. The Copenhagen interpretation rejects questions like "where was the particle before I measured its position?" as meaningless. The measurement process randomly picks out exactly one of the many possibilities allowed for by the state's wave function, in a manner consistent with the well-defined probabilities that are assigned to each possible state. According to the interpretation, the interaction of an observer or apparatus that is external to the quantum system is the cause of wave function collapse; thus, according to Heisenberg, "reality is in the observations, not in the electron".

The other, dissenting interpretation was that quantum mechanics is not a complete theory (a position championed by Einstein, Schrödinger, and de Broglie).  It is unrealistic to make the tool into something more than a tool; the wavefunction and its probabilities describe only a part of nature - mathematically.  The current usage of "realism" and "completeness" originated in the 1935 paper in which Einstein and others proposed the EPR paradox. In that paper the authors proposed the concepts "element of reality" and the "completeness" of a physical theory. They characterised an "element of reality" as a quantity whose value can be predicted with certainty before measuring or otherwise disturbing it, and defined a "complete physical theory" as one in which every element of physical reality is accounted for by the theory. In a semantic view of interpretation, an interpretation is complete if every element of the interpreting structure is present in the mathematics. Realism is also a property of each of the elements of the maths; an element is real if it corresponds to something in the interpreting structure. For example, in some interpretations of quantum mechanics (such as the many-worlds interpretation) the ket vector associated with the system state is said to correspond to an element of physical reality, while in other interpretations it is not.

The state of physics changed completely within a period of about 15 years.  In 1905, when Einstein published his first papers, Newtonian physics ruled the landscape.  He was finally recognized for his work on light quanta in 1922 (the 1921 Nobel Prize was awarded to him the following year). By the time of the Solvay Conference in 1927, quantum mechanics was nearly fully formed.  We have made tremendous progress in the roughly 80 years since, but we are still divided by the Copenhagen interpretation.  It seems a handful of physicists took it the first 90% of the way, and all the other physicists have been filling in the remaining 10% ever since.

Several of the other interpretations that still have a following are as follows:

  • Incomplete theory (the classification adopted by Einstein; see Grand Unified Theory)
  • The Copenhagen interpretation
  • Many worlds (many-worlds interpretation)
  • Consistent histories
  • Ensemble interpretation, or statistical interpretation
  • de Broglie–Bohm theory
  • Relational quantum mechanics
  • Transactional interpretation
  • Stochastic mechanics (stochastic interpretation)
  • Objective collapse theories
  • von Neumann/Wigner interpretation: consciousness causes the collapse (see Quantum mind/body problem)
  • Many minds (many-minds interpretation)
  • Quantum logic
  • Other interpretations (minority interpretations of quantum mechanics)
  • Time-symmetric theories (retrocausality)
Student:  "What's the meaning of it all?"
Professor: "Shut up and calculate!"

Thursday, June 2, 2011

THE AETHER


The Aether. According to ancient and medieval science, aether, also spelled æther or ether, is the material that fills the region of the universe above the terrestrial sphere. When Isaac Newton was formulating his laws of motion, mechanics, and gravity, he needed a medium for them to operate within. He adopted a modified form of the luminiferous aether, although he did write, "I do not know what this Aether is", but that, "if it consists of particles" they will be "exceedingly smaller than those of Air, or even than those of Light". Once again, Newton may have been well ahead of his time. From about 1600 until the late 1800s, various aether theories held sway over the physics landscape. The ubiquitous aether always had irritating problems; then the 1887 Michelson–Morley experiment really rocked the boat, and we haven't quite recovered yet.

HOW TO EXAMINE THE LUMINIFEROUS AETHER
The concept of a medium to occupy space is still a very attractive one. Otherwise, you would be forced to explain what it is that does occupy empty space: is it a construct? A geometry? Falling back on emptiness, or a lack of matter, fails to explain how the separation came to be in the first place. Einstein favored the construct of a space-time continuum to provide the fabric of space. Mathematically it is a manifold consisting of "events" which are described by a coordinate system. Spacetime can be complicated, but if you just apply a little Dr. Seuss juice it all becomes a little simpler: "From there to here, and here to there, funny things are everywhere." In spacetime, a point is here, or it is there, and the two points are separated by a construction of time and energy (or mass). It is interesting to note that the light-like formulation of the spacetime interval seems to imply that light speed is not a speed, but a resting spot (zero speed).
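That last remark can be made a little more concrete with the spacetime interval itself. A sketch with assumed event coordinates: a light signal always has a "null" (light-like) interval, meaning zero proper time elapses along it, while anything slower does not.

    # Spacetime interval s^2 = -(c*dt)^2 + dx^2 between two events (one space dimension),
    # with assumed example numbers and the (-,+) sign convention.
    c = 299_792_458.0  # speed of light, m/s

    def interval_squared(dt, dx):
        return -(c * dt) ** 2 + dx ** 2

    light_pulse = interval_squared(dt=1.0, dx=c * 1.0)     # light covers exactly c*dt
    slow_probe  = interval_squared(dt=1.0, dx=0.5 * c)     # something at half light speed

    print("light-like interval s^2 =", light_pulse)   # 0.0 -> zero proper time elapses
    print("time-like interval  s^2 =", slow_probe)    # negative with this convention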

HERE TO THERE AND BACK AGAIN

 

Modern theories propose a Higgs field, a sea of elementary particles held in a low-energy state.  I must admit, I don't get how the Higgs boson works, but it sounds cool.  Are the field lines round?  How will they fit together?
FITTING THE HIGGS IN A SMALL ROOM
Another form of the space/time geometry model that I am particularly attracted to is the 'holodeck model'.
ENTROPY (TIME) EVENT HORIZON
This theory holds that what we see around us is merely a projection on the entropy (time) event horizon.  The omniverse is naturally in a very high state of entropy, but something occurred to cause a singularity.  The singularity was completely ordered, with every possible elemental particle in line and entangled.  This was highly unstable and reacted by creating an entropy wave (time in our universe).  The entangled particles' waves interacted and collapsed.  This process was very rapid at the beginning of the universe (the inflationary period), but began to stabilize as the event horizon enlarged and fewer and fewer interactions were required.  Entropy carries us along on a sequential Lumière ride through time.  This cosmological event horizon can be modeled similarly to a black hole's event horizon, i.e., its entropy is proportional to the area of the event horizon divided by the Planck area.  The metric expansion of space is modeled mathematically by the FLRW metric, an exact solution of Einstein's field equations of general relativity, and was observed and measured by Edwin Hubble.
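The "area divided by the Planck area" statement can be sketched numerically. Below it is applied to a solar-mass black hole's horizon as a stand-in, since that is the same area law the paragraph borrows for the cosmological horizon; the constants and the solar-mass example are assumptions for illustration only.

    # Area-law entropy sketch: S / k_B = A / (4 * l_p^2), applied here to a
    # solar-mass black hole's event horizon as an illustration of the scaling.
    import math

    G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c   = 2.998e8        # speed of light, m/s
    l_p = 1.616e-35      # Planck length, m
    M   = 1.989e30       # one solar mass, kg (assumed example)

    r_s = 2 * G * M / c**2            # Schwarzschild radius, ~3 km
    A   = 4 * math.pi * r_s**2        # horizon area, m^2
    S_over_kB = A / (4 * l_p**2)      # dimensionless entropy in units of k_B

    print(f"horizon radius ~ {r_s:.0f} m")
    print(f"entropy / k_B  ~ {S_over_kB:.1e}")   # ~1e77, enormous compared with ordinary matter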

Tuesday, May 10, 2011

MODELS

Models are essential for humans to understand the universe around us. A model seeks to represent empirical objects, phenomena, and physical processes in a logical and objective way. Scientific theories, laws, postulates, and ideas are ways of generating abstract, conceptual, graphical and/or mathematical tools to predict and understand future patterns. A model can be a scaled version of the real thing, a verbal description, a picture, a set of contrasts, or, more importantly these days, a mathematical formalization of physics.

It is really important to understand that all models are inherently false.

Models serve a very important purpose. No one would argue that Newton's Philosophiæ Naturalis Principia Mathematica is not one of the greatest works of human intelligence, but they will tell you that it is fundamentally incorrect. Newton's model of the universe still has value, and is still taught in the universities because of its facility to predict motion and force, and its easy visualization.

When my son was in the eighth grade, he had a science project to build a model of the boron atom. He asked me for my help (BIG mistake). We made a model with each electron on a straight wire at the correct respective n-distance from the nucleus to depict its electron shell energy. Further, we color-coded each electron to correspond to the shell configuration. We then added a simple graph scaling the shell sizes relative to one another.

He received a poor grade because he didn't show the electrons on a ring circling the nucleus (this was 1998, so the knowledge that electrons do not 'circle' the nucleus was only 80 years old, give or take a moron or two). I was incensed, and went in to explain the rationale to the teacher. He told me that Alex received a bad grade because he did not show the electrons on a fixed ring circling the nucleus. I explained the energy scales, configuration patterns, overlap detail, and then he asked me where the rings were shown, and I gave up.
Electrons circling the boron atom nucleus in the Bohr model.




Ask anyone you meet to draw an atom, and the Bohr model of the atom is the most likely picture you will get. It is easy to understand, it predicts the behavior fairly well, and no one can see an atom anyway (these things are very small, probably operate in extra dimensions, and they act funny). The point is, the model of an atom that we all carry in our heads is wrong and misleading. It probably looks more like the following, if you could see it.

Boron Atom Dramatization


First of all, if I scale the electron cloud of a boron atom up to the size of the next graphic, and I show the nucleus and electrons as white dots, this is what you would see:


Atom mass shown in white.

If you actually see any white, clean your screen, it is just dust.
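To put rough numbers on that picture, here is a sketch using assumed, order-of-magnitude textbook radii: scale the atom up to a 10 cm image and see how big the nucleus would be.

    # Rough scale of an atom versus its nucleus, using assumed order-of-magnitude radii.
    atom_radius_m    = 9e-11                       # boron atom, roughly 90 picometres (assumed)
    nucleus_radius_m = 1.2e-15 * 11 ** (1 / 3)     # empirical r = r0 * A^(1/3), A = 11 for boron

    ratio = atom_radius_m / nucleus_radius_m
    print(f"atom radius / nucleus radius ~ {ratio:,.0f}")        # a few tens of thousands

    image_radius_m = 0.05                          # scale the atom to a 10 cm wide picture
    scaled_nucleus = image_radius_m / ratio
    print(f"nucleus in that picture ~ {scaled_nucleus * 1e6:.1f} micrometres")  # a dust speck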

That is why mathematical models are so important. Most things that we want to study now are just too small to see. We are learning about them by their behavior when they interact with other things. Mathematics gives us a tool to predict an outcome, and then to test it by observing experimental behaviors. Einstein said, "As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."
















DECOHERENCE

In quantum mechanics, quantum decoherence (also see dephasing) is how quantum systems interact with their environments to exhibit probabilistically additive behavior. Quantum decoherence gives the appearance of wave function collapse (the reduction of the physical possibilities into a single possibility as seen by an observer) and justifies the framework and intuition of classical physics as an acceptable approximation: decoherence is the mechanism by which the classical limit emerges out of a quantum starting point and it determines the location of the quantum-classical boundary. Decoherence occurs when a system interacts with its environment in a thermodynamically irreversible way. This prevents different elements in the quantum superposition of the system+environment's wavefunction from interfering with each other. Decoherence has been a subject of active research since the 1980s.
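A minimal numerical sketch of the idea for a single qubit (the dephasing time T2 below is an arbitrary assumption): the off-diagonal "interference" terms of the density matrix decay away, leaving what looks like an ordinary classical mixture.

    # Pure dephasing of one qubit: off-diagonal (coherence) terms of the density
    # matrix decay, leaving a classical-looking mixture. T2 is an assumed constant.
    import numpy as np

    rho0 = 0.5 * np.array([[1, 1],
                           [1, 1]], dtype=complex)   # the superposition (|0> + |1>)/sqrt(2)
    T2 = 1.0                                         # assumed dephasing time, arbitrary units

    def dephased(rho, t):
        out = rho.copy()
        decay = np.exp(-t / T2)
        out[0, 1] *= decay          # only the interference terms are damped;
        out[1, 0] *= decay          # the populations on the diagonal are untouched
        return out

    for t in (0.0, 1.0, 5.0):
        print(f"t = {t}:\n{np.round(dephased(rho0, t), 3)}\n")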

Monday, May 9, 2011

TIME DEVELOPMENT ENTROPY

The time development operator in quantum theory is unitary because the Hamiltonian is Hermitian. Consequently, the transition probability matrix is doubly stochastic, which implies the Second Law of Thermodynamics. This derivation is quite general, based on the Shannon entropy, and does not require any assumptions beyond unitarity, which is universally accepted. It is a consequence of the irreversibility, or singular nature, of the general transition matrix.  I do not fully understand this, but it seems to imply that all possible choices in the quantum state must add up to 100%.  The quantum state collapse gives up information through the entropy laws.  The original concepts of decoherence proposed by David Bohm and John Bell were held in little regard for many years, but the advances in quantum computing have caused a resurgence in the study of dephasing.  While Bohm felt that a carrier wave was responsible for the duality of particles, dephasing holds that the particle is real in many states, but releases information (irreversibly) when interaction occurs.
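The one piece of this I can check directly is the "doubly stochastic" claim: for any unitary time-development operator U, the transition probabilities |U_ij|² have rows and columns that each sum to 1. A numerical sketch, using a randomly generated Hermitian "Hamiltonian" as an example:

    # For a unitary time-development operator U = exp(-i H), the transition
    # probabilities P_ij = |U_ij|^2 form a doubly stochastic matrix:
    # every row and every column sums to 1.
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    H = (A + A.conj().T) / 2                 # random Hermitian "Hamiltonian" (assumed example)

    # Build U = exp(-i H) from the eigendecomposition H = V diag(w) V^dagger.
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w)) @ V.conj().T

    P = np.abs(U) ** 2                       # transition probability matrix
    print("row sums:   ", P.sum(axis=1).round(6))   # all 1.0
    print("column sums:", P.sum(axis=0).round(6))   # all 1.0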


Vectors may be functions or frequencies; instead of matrix multiplication, linear transformations may be operators, such as the derivative from calculus. These are only a few of countless examples where eigenvectors and eigenvalues are important.
In this shear graph, the yellow arrow changes direction but the red arrow does not.
Therefore the red arrow is an eigenvector, with eigenvalue 1, since its length is unchanged.


The Bloch sphere is a representation of a qubit,
 the fundamental building block of quantum computers.
The protium (hydrogen-1) atom is an example where both types of spectra, discrete and continuous, appear. The eigenfunctions of the hydrogen atom Hamiltonian are called eigenstates and are grouped into two categories. The bound states of the hydrogen atom correspond to the discrete part of the spectrum (they have a discrete set of eigenvalues that can be computed by the Rydberg formula), while the ionization processes are described by the continuous part (the energy of the collision/ionization is not quantized).  In mathematics, a shear mapping, or transvection, is a particular kind of linear mapping. Its effect leaves fixed all points on one axis, while other points are shifted parallel to that axis by a distance proportional to their perpendicular distance from it. It is notable that shear mappings carry areas into equal areas.
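A concrete sketch of a shear mapping and its eigenvector, mirroring the arrows in the figure above; the specific matrix is an assumed example.

    # A horizontal shear: points slide parallel to the x-axis by an amount
    # proportional to their height. Vectors along the x-axis are eigenvectors
    # with eigenvalue 1; everything else changes direction.
    import numpy as np

    shear = np.array([[1.0, 1.0],
                      [0.0, 1.0]])

    along_axis = np.array([1.0, 0.0])   # lies on the invariant axis
    off_axis   = np.array([0.0, 1.0])   # does not

    print(shear @ along_axis)   # [1. 0.]  -> unchanged: eigenvector, eigenvalue 1
    print(shear @ off_axis)     # [1. 1.]  -> direction changes: not an eigenvector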
Folding Space.
 A simple cube being folded and multiplied


The concept of multiple spatial eigenvector values coexisting in the wave function is not new.  Some of the quantum information computing work being done is also focusing on the analysis of time eigenvalues and eigenvectors.
Superposition, quantum algorithms (fixed sequences of quantum logic gates), particle entanglement, and quantum decoherence are all important aspects being studied.  The goal in quantum computing is to get the quantum state to release information.  The theory of decoherence has made a huge comeback in mainstream thinking.  Decoherence is now seen less as a complete collapse of the probability wave (or the disappearance of a carrier wave), and more as the release of information to our timeline through a process of entropic information leak.  The probability wave never fully collapses, but must make decisions based on interactions in order to remain in concert with the laws of physics (a partial collapse).

The Protium atom orbital shells.

Saturday, May 7, 2011

MANIFOLDS

I would really like to draw a picture of the fabric of space, but the math needed to model space-time is too complicated.  We can think in terms of the three space dimensions, but when we start to go into higher dimensions the models just do not translate in our brains.

Enter the mathematicians.

Euclid of Alexandria wrote the "Elements" somewhere around 300 BC.  The fact that you can find Euclid's work repeated in any textbook of geometry and algebra attests to its remarkable impact on the sciences.  In some ways, you can also see a beauty in the progress of Euclid's mathematical formulations.  Euclid probably took many of the concepts and proofs from other mathematicians of the time, but he was the first to put all of that knowledge together into writings that survive.  Euclid started with five main postulates upon which he built his mathematical model of the world.


  1. Any two points determine a unique line containing them.
  2. Any line segment may be extended.
  3. Given a point P and a distance r, there is a circle with center P and radius r.
  4. All right angles are equal. 
  5. That, if a straight line falling on two straight lines makes the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, meet on that side on which are the angles less than the two right angles.
Euclidean geometry was the only game in town for a couple of thousand years, but that was not for want of trying. Many mathematicians took a shot at postulate number five, which did not follow directly from the previous four. Euclid saved the fifth postulate for last for a reason. He obviously had problems with it himself, and by definition the postulate has some esoteric flaws. Take perspective: when we view a railroad track disappearing into the distance, we understand that the tracks are parallel - but they appear to converge to a point. Can we model visual perspective as a geometry of its own? The answer is yes, but for centuries Euclid's geometry held its own. This dominant view was overturned by the revolutionary discovery of non-Euclidean geometry, primarily through the works of Johann Carl Friedrich Gauss (who never published his theory), János Bolyai, and Nikolai Ivanovich Lobachevsky, who demonstrated that ordinary Euclidean space is only one possibility for the development of geometry, and of Jules Henri Poincaré, who was looking for the math to back up his theories of relativity.

Hyperbolic space is a geometrical space analogous to Euclidean space, but one in which Euclid's parallel postulate is no longer assumed to hold. Mathematicians would have been content to just play around with the math, but along came a German mathematician, Georg Friedrich Bernhard Riemann, who was not content with the status quo. Riemann came up with a new geometry using "tensors". He also found a way to relate his new geometry to four-dimensional analysis.  His broad vision of the subject was expressed in his 1854 habilitation lecture "Über die Hypothesen, welche der Geometrie zu Grunde liegen" (On the hypotheses on which geometry is based), published only after his death. Riemann's new idea of space proved crucial in Einstein's general relativity theory and in Riemannian geometry, a mainstay of modern geometric analysis.

Then in 1896 Pieter Zeeman and Hendrik Antoon Lorentz used post-Euclidean geometry to accurately model a part of the natural world, magnetic spectral lines, and the door was open. Lorentz went on to work with a young physicist, Albert Einstein, on a new theory of relativity. By 1906 it was noted by Poincaré that, by using an imaginary time coordinate √−1 ct, the Lorentz transformation can be regarded as a rotation in a four-dimensional Euclidean space, with imaginary time as the fourth dimension, a "manifold space".


The concept of manifolds is important because you can model complicated structures in terms of the relatively well-understood properties of simpler spaces. For example, a manifold is typically endowed with a differentiable structure that allows one to do calculus and a Riemannian metric that allows one to measure distances and angles. Symplectic manifolds serve as the phase spaces in the Hamiltonian formalism (Sir William Rowan Hamilton) of classical mechanics, while four-dimensional Lorentzian manifolds model space-time in general relativity.

Using the Lorentz transformation, you can mathematically model how two observers' varying measurements of space and time can be converted into each other's frames of reference.
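A small sketch of that conversion for one spatial dimension, using an assumed relative speed of 0.6c and an assumed example event; the boost mixes t and x but leaves the spacetime interval unchanged.

    # Lorentz transformation (one spatial dimension) for a frame moving at v = 0.6c.
    import math

    c = 299_792_458.0
    v = 0.6 * c
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)    # = 1.25 for v = 0.6c

    def boost(t, x):
        """Convert an event's (t, x) into the moving observer's coordinates."""
        t_prime = gamma * (t - v * x / c**2)
        x_prime = gamma * (x - v * t)
        return t_prime, x_prime

    t, x = 1.0, 1.0e8                      # assumed example event: 1 s, 100,000 km
    tp, xp = boost(t, x)
    print(f"t' = {tp:.6f} s, x' = {xp:.3e} m")

    # The spacetime interval is the same in both frames (up to rounding):
    print(f"interval in lab frame:    {-(c * t)**2 + x**2:.6e}")
    print(f"interval in moving frame: {-(c * tp)**2 + xp**2:.6e}")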


This idea was elaborated by Hermann Minkowski who used it to restate the Maxwell equations in four dimensions showing directly their invariance under Lorentz transformation. He further reformulated in four dimensions the then-recent theory of special relativity of Einstein. From this he concluded that time and space should be treated equally and so arose his concept of events taking place in a unified four-dimensional space-time continuum. In a further development he gave an alternative formulation of this idea which did not use the imaginary time coordinate but represented the four variables (x, y, z, t) of space and time in coordinate form in a four dimensional affine space. Points in this space were regarded as events in space-time. The spacetime interval between two events in Minkowski Space is either space-like, light-like ('null') or time-like, creating Nutches for Nitches to fill.




