# Contents

Why Are The Laws of Thermodynamics “Supreme Among the Laws of Nature”?

The second law of thermodynamics seems to be fundamental to physics, and a weak form of it can even be proven, although on cosmological scales it remains a mystery how the Universe ended up in the exquisitely low-entropy state of the Big Bang.

Information is Physical: Landauer’s Principle and Information Soaking Capacity of Physical Systems

In the physical World, information cannot simply be thought of abstractly, as is often (usefully and validly) done in pure information theory. Physical-World information must be written in some kind of “ink”, whose nature I explore here.
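As a rough numerical illustration of Landauer's bound (a sketch of the standard result, not a calculation from the article itself), erasing one bit at temperature $T$ must dissipate at least $k_B T \ln 2$ of heat; at room temperature that is a few zeptojoules:

```python
import math

# Boltzmann constant (J/K) and an assumed room temperature (K)
k_B = 1.380649e-23
T = 300.0

# Landauer's bound: minimum heat dissipated to erase one bit of information
E_min = k_B * T * math.log(2)
print(f"{E_min:.3e} J per bit erased")  # roughly 2.9e-21 J
```

Tiny as it is, this bound is what ties abstract bits to the physical “ink” they are written in.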

How Does a Gas of Particles with Uniform Speed Reach Thermodynamic Equilibrium?

When a gas “irreversibly” doubles its volume, what really happens to its entropy? Its information-theoretic entropy cannot change if the gas is truly sundered from the rest of the World. Here I describe some practicalities that mean the entropies – information-theoretic and experimental – will become equal again.
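For reference, the textbook thermodynamic (experimentally measured) entropy jump when an ideal gas of $N$ particles freely expands to double its volume is:

```latex
\Delta S_{\text{exp}} \;=\; N k_B \ln\frac{V_2}{V_1} \;=\; N k_B \ln 2
```

The puzzle the article addresses is how this nonzero measured change squares with the fine-grained entropy of a perfectly isolated gas being conserved.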

The Kolmogorov Complexity and Shannon Entropy of a System and How They Are Related

These two measures of a system’s information-theoretic entropy are subtly different, but often very alike. Here I explain them.
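One quick way to feel the difference (a sketch of my own, using compressed length as a crude, computable stand-in for the uncomputable Kolmogorov complexity): two strings can have the same per-character Shannon entropy yet wildly different complexity.

```python
import math
import random
import zlib
from collections import Counter

def shannon_bits(s: str) -> float:
    """Empirical Shannon entropy of the character frequencies, in bits per character."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_bytes(s: str) -> int:
    """zlib output length: a rough upper bound standing in for Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

ordered = "ab" * 500                                            # perfectly regular string
random.seed(0)
scrambled = "".join(random.choice("ab") for _ in range(1000))   # patternless string

# Both strings use 'a' and 'b' about equally, so their per-character Shannon
# entropies are both close to 1 bit -- yet the regular string compresses to
# almost nothing while the scrambled one does not.
print(shannon_bits(ordered), shannon_bits(scrambled))
print(compressed_bytes(ordered), compressed_bytes(scrambled))
```

The function names here are my own; the point is only that Shannon entropy sees symbol statistics while Kolmogorov complexity sees structure.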

What Might Lie Beyond Quantum Computing: Can We Overcome The Church-Turing Thesis?

I describe what technology overcoming the Church-Turing Thesis might look like, and I look at a proof of the existence of uncomputable functions.
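The counting argument behind such a proof can be stated in one line: programs are finite strings, so there are only countably many of them, while the functions from $\mathbb{N}$ to $\{0,1\}$ form an uncountable set:

```latex
\left|\{\text{programs}\}\right| \;=\; \aleph_0
\;<\; 2^{\aleph_0} \;=\; \left|\{\, f : \mathbb{N} \to \{0,1\} \,\}\right|
```

So almost all such functions have no program that computes them.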

The Statistical Nature of the Second Law of Thermodynamics

Can the second law of thermodynamics be “proven”. Here I discuss the law’s nature and how the Loschmidt paradox thwarts attempts to prove a strong form of the law.

Intuitive Understanding of the Clausius Definition of Entropy

The definition $\delta S = \frac{\delta Q}{T}$ is actually a definition of temperature. Here I show how a thermodynamic temperature was first defined. Temperature is roughly proportional to the mean particle energy; whilst this is exactly true for ideal gases, it is not the full picture.
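In modern statistical-mechanics notation the same idea is usually written with temperature as the quantity defined by the entropy–energy relationship, with the ideal gas as the special case where it tracks mean particle energy exactly:

```latex
\frac{1}{T} \;=\; \left(\frac{\partial S}{\partial U}\right)_{V,\,N},
\qquad
\text{monatomic ideal gas:}\quad U = \tfrac{3}{2}\, N k_B T .
```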

How is the Information in a Continuous Variable Limited?

A continuous variable can seemingly encode $\aleph_0$ bits. Why instead does a continuous variable encode only a finite amount of information?
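A back-of-envelope version of the noise argument, with made-up numbers of my own choosing: once a readout has a noise floor, only finitely many levels of the continuous variable can be told apart.

```python
import math

# Hypothetical measurement: a voltage spanning [0, 1] V, read through
# an assumed noise floor of about 1 microvolt.
dynamic_range = 1.0   # volts
noise_floor = 1e-6    # volts

# Only about (range / noise) levels are distinguishable, so the variable
# carries roughly log2(range / noise) bits -- finite, not aleph-null.
usable_bits = math.log2(dynamic_range / noise_floor)
print(f"{usable_bits:.1f} bits")  # about 19.9 bits
```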

Free Energies: What Does a Physical Chemist Mean when He/She Talks of Needing Work to Throw Excess Entropy Out of a Reaction?

The seemingly hand-wavy statement that a reaction’s free energy is the total energy let slip by the reaction less the amount of work we need to do to “throw the excess entropy of the reactants relative to the products out of the system” actually makes a great deal of sense. I explore exactly what it means.