Description
The term entropy was originally introduced by the physicist Rudolf
Clausius as a quantity that describes the ability of a physical system
to change its state in a thermodynamic process. At least since the
pioneering work of the mathematician Claude Shannon, however, entropy
has also become a central concept in information theory. How are these
two interpretations related? What exactly is entropy, and how can we use
it to understand thermodynamic quantities such as temperature? This
lecture aims to give an introductory overview of these questions at the
interface between computer science and physics. One focus will be the numerical
estimation of entropy through sampling. Because entropy is a nonlinear
function of the underlying probability distribution, estimating it from
finite samples incurs systematic errors (bias); these errors can,
however, be substantially reduced using suitable mathematical methods.
The lecture also addresses how such correction procedures can be
implemented algorithmically.
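
As an illustration of the bias problem and one representative correction,
the following minimal Python sketch compares the naive plug-in entropy
estimator with the classic Miller-Madow correction, which adds
(m - 1)/(2N) to the plug-in estimate, where m is the number of observed
symbols and N the sample size. (The lecture does not state which
correction procedures it covers; Miller-Madow is assumed here purely for
illustration.)

import numpy as np

def plugin_entropy(counts):
    """Naive (plug-in) Shannon entropy estimate in nats from bin counts."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plug-in estimate plus the Miller-Madow bias correction (m - 1)/(2N)."""
    n = counts.sum()
    m = np.count_nonzero(counts)          # number of observed symbols
    return plugin_entropy(counts) + (m - 1) / (2 * n)

# Demo: sample from a uniform distribution over k symbols, whose true
# entropy is log(k), and average each estimator over many repetitions.
rng = np.random.default_rng(0)
k = 50
true_h = np.log(k)
for n in (100, 1000, 10000):
    plug, mm = [], []
    for _ in range(200):
        counts = np.bincount(rng.integers(k, size=n), minlength=k)
        plug.append(plugin_entropy(counts))
        mm.append(miller_madow_entropy(counts))
    print(f"N={n:6d}  true={true_h:.3f}  "
          f"plug-in={np.mean(plug):.3f}  Miller-Madow={np.mean(mm):.3f}")

The plug-in estimate systematically underestimates the true entropy, most
visibly at small sample sizes, while the corrected estimate removes most
of this bias.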