The stochastic thermodynamics of computation
David H Wolpert
Journal of Physics A: Mathematical and Theoretical, Volume 52, Number 19
Special issue: Shannon’s Information Theory 70 years on

(…) In this paper I review some of this recent work on the ‘stochastic thermodynamics of computation’. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems concerning how to design computers to have minimal thermodynamic costs. They also allow us to begin combining computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.