In this lecture, we revisit the fundamentals of stochastic modeling at the heart of statistical mechanics (microstates, macrostates, multiplicity, and the second law of thermodynamics). We use those fundamentals to motivate the formula for Boltzmann entropy (in particular, why it involves a logarithm) and then discuss how systems with relatively low entropy tend to have high "free energy," that is, a high capacity to do work on another system. Viewed at the level of the whole system, the work done by the low-entropy subsystem ends up generating a great deal of additional entropy, so the entropy of the whole system increases. This motivates our discussion of the emergence of "dissipative structures" (like life itself) wherever free energy is abundant, and we use a ball-in-a-basin conceptual model to justify this.

We then pivot to econophysics, a combination of economics and physics, and consider how wealth distributions can be treated as macrostates: with total wealth held fixed, a perfectly equal distribution has the lowest entropy, while an exponentially distributed one has the highest. References to information theory are peppered throughout, but we do not get to a specific discussion of Shannon entropy and its relationship to Boltzmann entropy.
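To see why entropy "wants" to be a logarithm of multiplicity, consider two independent subsystems: their microstate counts multiply, so only a logarithm turns entropy into an additive quantity. A minimal sketch (not from the lecture; the coin-flip macrostates and unit choice k_B = 1 are illustrative assumptions):

```python
import math

def multiplicity(n_coins: int, n_heads: int) -> int:
    """Number of microstates (coin arrangements) for the macrostate
    'n_heads heads out of n_coins coins': the binomial coefficient."""
    return math.comb(n_coins, n_heads)

def boltzmann_entropy(omega: int) -> float:
    """Boltzmann entropy S = ln(Omega), in units where k_B = 1."""
    return math.log(omega)

# Two independent subsystems: microstate counts multiply...
omega_a = multiplicity(10, 5)   # 252 ways
omega_b = multiplicity(8, 4)    # 70 ways
omega_total = omega_a * omega_b

# ...so the logarithm makes entropy additive: S_total = S_a + S_b.
s_total = boltzmann_entropy(omega_total)
s_sum = boltzmann_entropy(omega_a) + boltzmann_entropy(omega_b)
assert abs(s_total - s_sum) < 1e-12
```

Any other function of multiplicity (say, Omega itself or Omega squared) would fail this additivity check, which is the essence of the argument for the logarithm.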
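The econophysics claim can be checked numerically with a standard kinetic wealth-exchange model (a sketch under assumed parameters, not a model presented in the lecture): start everyone with equal wealth, then repeatedly pick two agents at random, pool their wealth, and split the pool at a uniformly random fraction. Total wealth is conserved, yet the population drifts from the equal (low-entropy) macrostate toward an approximately exponential (high-entropy) distribution:

```python
import random

def exchange_model(n_agents: int = 1000, n_steps: int = 200_000, seed: int = 0):
    """Kinetic wealth-exchange model: random pairwise pooling and
    splitting of wealth. Conserves total wealth; the stationary
    distribution is approximately exponential (Boltzmann-Gibbs)."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents  # equal wealth: the low-entropy macrostate
    for _ in range(n_steps):
        i = rng.randrange(n_agents)
        j = rng.randrange(n_agents)
        if i == j:
            continue
        pool = wealth[i] + wealth[j]
        split = rng.random()
        wealth[i], wealth[j] = split * pool, (1.0 - split) * pool
    return wealth

wealth = exchange_model()

# Total wealth is conserved (up to floating-point rounding)...
total = sum(wealth)

# ...but inequality emerges: for an exponential distribution the
# majority of agents end up below the mean, versus none initially.
frac_below_mean = sum(w < 1.0 for w in wealth) / len(wealth)
```

For an exact exponential distribution the fraction below the mean would be 1 - 1/e (about 0.63), so seeing well over half the agents below the mean is the signature of the high-entropy macrostate the lecture describes.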
Whiteboard notes for this lecture can be found at: https://www.dropbox.com/s/qyvcf975wjs3qc8/SOS220-LectureE2-2023-02-16-Dissipative_Structures_Econophysics_and_Information.pdf?dl=0