Wednesday, October 7, 2009

Quantum estimation

Quantum estimation/metrology is a research field closely tied to quantum information theory. Everything started with the Heisenberg uncertainty principle, when we realized that measurements can in general be non-commutative, in contrast with the classical world where everything is commutative, i.e. the order of the measurements is irrelevant. The consequence of this non-commutativity is that two given observables may be incompatible, meaning they cannot both be measured simultaneously with arbitrary precision.

The next step was taken by Helstrom in the 1970s, who introduced the Symmetric Logarithmic Derivative (SLD) operator to obtain the quantum Fisher information. Later, in the 1980s, Wootters came up with the refined and elegant concept of quantum statistical distinguishability, which was tied to the work of Helstrom by Braunstein and Caves (1994) in their famous paper Statistical distance and the geometry of quantum states. This paper is remarkable because it also established the connection with the completely independent work of Uhlmann on the Bures metric, which arose from the generalization of the Berry phase to mixed states.
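For reference, the standard relations behind these ideas (in the usual notation, with $\rho_\theta$ the state, $L_\theta$ the SLD and $\nu$ the number of measurement repetitions) are

$$\partial_\theta \rho_\theta = \tfrac{1}{2}\left(L_\theta \rho_\theta + \rho_\theta L_\theta\right), \qquad F_Q(\theta) = \mathrm{Tr}\!\left[\rho_\theta L_\theta^2\right], \qquad \mathrm{Var}(\hat{\theta}) \ge \frac{1}{\nu\, F_Q(\theta)},$$

and the Braunstein-Caves connection to the Bures metric reads $d_B^2(\rho_\theta,\rho_{\theta+d\theta}) = \tfrac{1}{4} F_Q(\theta)\, d\theta^2$.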

More recently, we have witnessed many important developments in single-parameter quantum estimation, with one unexpected twist. We now know that quantum mechanics, and in particular quantum entanglement, has the potential to beat classical estimation methods in the quest for higher precision. For example, you can see my older post about NOON states and quantum interferometry.
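To make the advantage concrete: with $N$ independent probes, the phase uncertainty in interferometry is limited by the shot-noise (standard quantum) limit, while maximally entangled NOON states can in principle reach the Heisenberg limit,

$$\Delta\phi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}, \qquad \Delta\phi_{\mathrm{Heisenberg}} \sim \frac{1}{N}.$$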

However, this new knowledge cannot be applied in the real world until we learn how to generate a considerable number of entangled states in a reliable way.

You can also see my Wikipedia article about the Bures metric.

Tuesday, October 6, 2009

Learning from being wrong

Being wrong seems to be something to avoid, but it may be the price to pay if we want to improve or optimize something. This is the general philosophy behind Genetic/Evolutionary algorithms and even statistical sampling.

In Genetic/Evolutionary algorithms, the key ingredient is the introduction of mutation. In most cases a mutation will be destructive, but occasionally a mutation will be beneficial. We improve by discarding the destructive mutations while keeping the beneficial ones.
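As a minimal sketch of this discard/keep logic (a toy (1+1)-style evolutionary loop on an illustrative bit-string fitness, not any particular production implementation):

import random

def mutate(bits, rate=0.05):
    """Flip each bit with small probability: the 'mutation' step."""
    return [b ^ (random.random() < rate) for b in bits]

def fitness(bits):
    """Toy fitness: count of ones (the classic OneMax problem)."""
    return sum(bits)

def evolve(n_bits=50, generations=500):
    parent = [random.randint(0, 1) for _ in range(n_bits)]
    for _ in range(generations):
        child = mutate(parent)
        # Most mutations are destructive and get discarded;
        # the occasional beneficial one is kept.
        if fitness(child) >= fitness(parent):
            parent = child
    return parent

print(fitness(evolve()))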

It was Thomas Bäck from Leiden University who told us how Toyota promotes mutation in its production system in order to make more efficient cars.

In statistical sampling, the objective is to explore the space of a probability distribution (explore the opportunities). Here we have the Metropolis (and Metropolis-Hastings) algorithm, where one is forced to be wrong sometimes, accepting unfavorable moves, in order to explore the whole probability distribution.
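A minimal sketch of the Metropolis rule, here sampling a standard Gaussian chosen purely for illustration:

import math
import random

def target(x):
    """Unnormalized target density: a standard Gaussian."""
    return math.exp(-0.5 * x * x)

def metropolis(n_samples=10000, step=1.0):
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)
        # Always accept moves to higher density; accept 'worse' moves
        # with probability target(proposal)/target(x).  Being wrong
        # sometimes is what lets the chain explore the whole distribution.
        if random.random() < min(1.0, target(proposal) / target(x)):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis()
print(sum(samples) / len(samples))  # should be close to 0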

Ben Schumacher, the author of the quantum noiseless coding theorem, has something to say about this:
Ben Schumacher on "Being wrong"

Disclaimer: The only way to profit from being wrong (mutating) is to be ready to correct ourselves as fast as possible.

Sunday, October 4, 2009

A scientific approach to science education

In 2006, when I was in Windsor, I had the opportunity to meet Carl Wieman, winner of the Nobel Prize in Physics in 2001 for his experimental work on the production of Bose-Einstein condensates. However, what caught my attention even more was his research and discoveries about science education. He showed us, with experimental data, how our current approach is far from being the most efficient way to teach science. Even worse, teaching science as we do it today sometimes leaves students with even more misconceptions!

More about his research in education can be found HERE

Saturday, October 3, 2009

Quantum/Classical transition

The nature of the quantum/classical transition is, in my opinion, the most important fundamental question of physics. There are many approaches and many insights that have consequences for the continuing debate on the interpretation of quantum mechanics and its mysterious features, such as the collapse of the wave function. Some of the approaches and related topics, not necessarily independent of each other, are:
  1. The Ehrenfest theorem (my favorite; see the sketch after this list).
  2. The Koopman-von Neumann equation for the classical wave function.
  3. The Wigner function for a quantum formulation in phase space, together with Moyal brackets and Weyl quantization [1].
  4. Feynman Path integrals and decoherence.
  5. The classical spinor formalism, as in Electrodynamics: A Modern Geometric Approach by William Eric Baylis.
  6. The limit to the Thomas-Fermi model studied by Elliott H. Lieb.
  7. The analogy of classical statistical mechanics with quantum mechanics, inspired by the Wigner function, due to Blokhintsev. Another researcher, A. O. Bolivar, has pursued this idea further, but it seems that he was unaware of the previous work by Blokhintsev.
  8. Related to the previous approach is the work by Amir Caldeira and Anthony J. Leggett, who proposed a quantum dissipation model.
  9. The very intriguing derivation of the Schrödinger equation from classical mechanics with complexified Brownian motion by Edward Nelson.
  10. The p-mechanics formalism, which exploits the representations of the Heisenberg group. This method is also closely related to Weyl quantization.
  11. Geometric quantization, another method inspired by Weyl quantization, which aims to maintain a coordinate-free procedure based on differential geometry.
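For item 1, the Ehrenfest theorem says that expectation values obey equations of classical form,

$$\frac{d\langle x\rangle}{dt} = \frac{\langle p\rangle}{m}, \qquad \frac{d\langle p\rangle}{dt} = \left\langle -\frac{\partial V}{\partial x}\right\rangle,$$

which reduce to Newton's equations only when $\langle \partial V/\partial x \rangle \approx \partial V(\langle x\rangle)/\partial x$, i.e. for wave packets that stay narrow compared with the scale over which the force varies. This is one precise way of stating where the classical description emerges.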


[1] Zachos, C., Fairlie, D., and Curtright, T., Quantum Mechanics in Phase Space: An Overview with Selected Papers, World Scientific, 2005.

Friday, October 2, 2009

The creation of the universe in 6 days

And God created the universe in 6 days, but in order to give us a nice sky full of beautiful shiny stars, he decided to simulate the light as though it were coming from thousands, millions, and billions of years in the past. This was necessary because light, fast as it is, still takes years to reach us from even the nearest stars, so in 6 days there would be no chance for us to see anything, not even the light from our closest star apart from our own Sun. We would not be able to see the Milky Way as we see it today, not even in 6 thousand years.

However, God did his job so well that there is no way for anybody to find any hint of a crack in the simulation. Unfortunately, this implies that humans will never be able to prove that the universe was created in 6 days, no matter how hard they try.

Shannon's noisy-channel coding theorem

This is an amazing theorem that completely defies intuition and lies at the core of classical information theory. One way to defeat the noise is to encode the original information with added redundancy. The price to pay is a reduction in the efficiency of transmission, also called the transmission rate. Intuition says that as the communication error goes to zero, through redundant coding, the transmission rate should go to zero as well. NOT TRUE! Shannon actually showed that reliable communication is possible at any rate below a fixed positive bound, the channel capacity, even as the communication error goes to zero.
http://en.wikipedia.org/wiki/Noisy-channel_coding_theorem
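Concretely, the theorem says that any rate $R$ below the channel capacity

$$C = \max_{p(x)} I(X;Y)$$

is achievable with vanishing error probability. For example, the binary symmetric channel with crossover probability $p$ has $C = 1 - H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$, which stays strictly positive for every $p \neq 1/2$.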

Princeton Plasma Physics Laboratory

The director of the Princeton Plasma Physics Laboratory, Stewart Prager, gave an overview presentation today about the research carried out in the institution. They design and construct tokamaks for nuclear fusion and collaborate with other institutions and projects such as ITER. After seeing the steady progress in this research field, I am much more optimistic about the feasibility of one day having a commercial power plant based on nuclear fusion. Something else that caught my attention was the importance of lithium as a coating material for the inner surfaces of the plasma chamber.