Monday, November 7, 2011

Thermodynamical entropy and Shannon information

The moment I read about information theory, it seemed evident to me that thermodynamical entropy is completely equivalent to Shannon information: the same quantity, just measured in different units, with the Boltzmann constant as the conversion factor (see the sketch after the list below). I was shocked to learn that this issue is still under debate today. Two books share the idea that thermodynamical entropy is fundamentally a measure of information:
  • A Farewell To Entropy: Statistical Thermodynamics based on Information. By Arieh Ben-Naim
  • E.T. Jaynes: Papers on Probability, Statistics and Statistical Physics. By R.D. Rosenkrantz (Editor)
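To make the "different units" claim concrete (the notation below is mine, not taken from these books): for a probability distribution p_i over the microstates of a system, the Gibbs entropy of statistical mechanics and the Shannon entropy are the same sum up to a constant factor,

    S = -k_B \sum_i p_i \ln p_i          (Gibbs entropy)
    H = -\sum_i p_i \log_2 p_i           (Shannon entropy, in bits)
    S = (k_B \ln 2)\, H

so converting between them is only a change of units, with the Boltzmann constant (times ln 2) as the conversion factor.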
The main point of contention is that Shannon information is fundamentally conserved (under reversible dynamics), while thermodynamical entropy can describe irreversible processes in which it is not conserved. More about this topic can be found at



Tuesday, November 1, 2011

PyNewtonCUDA

Denys Bondar and I have just released PyNewtonCUDA to the public:

https://code.google.com/p/py-newton-cuda/

This program propagates a large number of classical particles on the GPU. The code is written in Python with PyCUDA, which makes it possible to write high-level GPU code while keeping very high performance; a minimal sketch of the idea is shown below.
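As an illustration of the approach (this is a small sketch of my own, not code from the PyNewtonCUDA repository; the harmonic force, the kernel name, and all parameters are invented for the example), a velocity-Verlet step for N independent particles can be written as a short CUDA kernel and launched from Python with PyCUDA:

# Minimal sketch, not the actual PyNewtonCUDA code: propagate N classical
# particles under a harmonic force F = -k x (mass = 1) with velocity Verlet.
import numpy as np
import pycuda.autoinit                      # creates a CUDA context on the default GPU
import pycuda.gpuarray as gpuarray
from pycuda.compiler import SourceModule

kernel = SourceModule("""
__global__ void verlet_step(float *x, float *v, float dt, float k, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float a = -k * x[i];                 // acceleration at time t
        x[i] += v[i] * dt + 0.5f * a * dt * dt;
        float a_new = -k * x[i];             // acceleration at time t + dt
        v[i] += 0.5f * (a + a_new) * dt;
    }
}
""")
verlet_step = kernel.get_function("verlet_step")

n, dt, k = 1000000, 0.001, 1.0
x = gpuarray.to_gpu(np.random.randn(n).astype(np.float32))   # initial positions
v = gpuarray.to_gpu(np.zeros(n, dtype=np.float32))           # initial velocities

block = (256, 1, 1)
grid = ((n + block[0] - 1) // block[0], 1)
for _ in range(1000):                        # propagate 1000 time steps
    verlet_step(x, v, np.float32(dt), np.float32(k), np.int32(n),
                block=block, grid=grid)

print(x.get()[:5])                           # copy a few final positions to the host

Each particle is handled by one GPU thread, which is what makes propagating millions of particles cheap even from high-level Python code.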


You can download the code with the following command:

svn checkout https://py-newton-cuda.googlecode.com/svn/trunk PyNewtonCUDA