In 1988 Benatti first developed an agent-based economic model with the very peculiar characteristic that it included no representation of matter or energy, and no prices. Instead it merely represented agents and the money (capital) that was randomly exchanged between them. In 2000 Drăgulescu and Yakovenko developed a set of similar models, unaware of Benatti’s previous work. These are now collectively called the BDY models, or capital exchange models. Variations on these models reveal some striking connections to very fundamental dynamics studied in physics, thermodynamics, and statistical mechanics. In this presentation I will explore the history of capital exchange models, explain one of Yakovenko’s BDY models, present a definition of entropy for such models, describe one variation of a BDY model, and briefly mention the remarkable connection to the work of J. Willard Gibbs and Gavin Crooks.
Thank you for your presentation - I especially like the employment of entropy in research problems covered by the social sciences (I am a sociologist, not a physicist). I have a question regarding the equation of the CFT at 11:30:
I understand it intuitively: if the probability of a transition from A to B is higher than that of a transition from B to A, entropy increases (delta S is positive). OK, that is in line with the 2nd law of thermodynamics - in an isolated system, entropy increases. And now the question itself:
How could we use this theorem? What is it useful for? It seems to me to be just a different formulation of the 2nd law… so, for example, we might predict an increase of entropy in a system if we know both transition probabilities?
Thank you, Francesco, for your question.
I will give you a short answer, and a longer answer.
I, personally, am excited about this equation, even though I do not yet know where it will ultimately take me. I am excited not just because it agrees with a very important modern restatement of the second law of thermodynamics (the CFT), but because my own mathematical development of the equation is only very loosely tied to thermodynamics. It potentially has broad applicability to any circumstance in which a histogram of some sufficiently well-conserved measurement is involved. This includes energy (obviously, as found in thermodynamics), genetics (as found in biology, the field bridged into by Jeremy England’s paper), and economics and money (as shown in capital exchange models). More generally still, it implies an “arrow of time” for any social or economic process in which there is a ratio of asymmetric probabilities:
- think of buying an apple, and selling it back to the seller again;
- think of divorcing, and marrying the same person again;
- think of cleaning a floor, then intentionally throwing the dirt back on the floor.
Each of those actions has a high probability of occurrence in one direction, and a low probability of occurrence in the other direction. The reasons why we would not undo these events are both varied and obvious. The fact that some kind of socially-determined entropy is rising is not obvious. But what IS obvious is that these events have very little to do with thermodynamic entropy. They have everything to do with probable behaviour of social beings.
For a more detailed answer, keep reading.
My derivation of it is simple, and starts with a paper by Victor Yakovenko.
Victor M. Yakovenko (2010) “Statistical mechanics approach to the probability distribution of money”, Department of Physics, University of Maryland, College Park, Maryland 20742-4111, USA
Victor Yakovenko is an accomplished physicist, and his presentation is full of arguments that call on a deep knowledge of physics. But there is a snippet of information within it which does not need that deep knowledge. It only requires an understanding of high-school level combinatorial mathematics.
On page 2 of the cited paper, in reference to his equation (3), we find this definition of entropy:
This quantity is given by the combinatorial formula in terms of factorials, W = N! / (N₁! N₂! N₃! …). The logarithm of the multiplicity is called the entropy, S = ln W.
The formula he calls the multiplicity (W) is the well-known multinomial coefficient of combinatorial mathematics. As a high-school teacher of mathematics, I have used it often in reference to the ways you could construct a given histogram. In my study of agent-based models I had been looking for a way to calculate entropy of a histogram, and this is the key.
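This entropy of a histogram can be computed directly from the bin counts. A minimal Python sketch (the function name `histogram_entropy` is mine; `lgamma` computes log-factorials without overflowing on large N):

```python
import math

def histogram_entropy(counts):
    """S = ln W, where W = N! / (n1! n2! ...) is the multinomial
    coefficient -- the number of ways to build this histogram."""
    n_total = sum(counts)
    # ln(k!) = lgamma(k + 1), so ln W is a difference of lgamma terms.
    return math.lgamma(n_total + 1) - sum(math.lgamma(c + 1) for c in counts)

# A flat histogram can be built in many more ways than a peaked one,
# so its entropy is higher.
flat = [5, 5, 5, 5]        # 20 agents spread evenly over 4 bins
peaked = [17, 1, 1, 1]     # 20 agents concentrated in one bin
print(histogram_entropy(flat) > histogram_entropy(peaked))  # True
```

Note that the counts, not the bin labels, determine the entropy: the measure only cares how the N items are distributed among the bins.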
Yakovenko then demonstrates the use of this formula in his own calculation of the entropy of his capital exchange models.
Then the following series of events led to the equation in my presentation:
- I modified one of Yakovenko’s capital exchange models to be doubly-bounded;
- I developed an equation for the probability of transformation from one state to another;
- I noticed a peculiar property of these transformations, which I called an asymmetric ratio of probabilities (AROP): essentially, the ratio P(A→B)/P(B→A);
- I noticed the cited paper by Jeremy England that contained the same AROP;
- He cited a paper by Crooks that contained the same AROP.
- I returned to the equation S = ln W and discovered that the definition of entropy cited by Yakovenko is mathematically equivalent to the definition ΔS = ln[ P(A→B) / P(B→A) ].
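The first step above can be illustrated in simulation. The exact update rule of my doubly-bounded variant is not spelled out in this reply, so the sketch below uses an assumed placeholder rule: one unit of capital passes between a randomly chosen pair of agents, and any move that would push an agent below the lower bound or above the upper bound is rejected. All names here are mine, not Yakovenko’s:

```python
import math
import random
from collections import Counter

def entropy(money):
    """S = ln W for the histogram of agents over money levels."""
    counts = Counter(money)
    return math.lgamma(len(money) + 1) - sum(
        math.lgamma(c + 1) for c in counts.values())

def step(money, lower, upper, rng):
    """One exchange: move one unit of capital from agent j to agent i,
    rejecting moves that violate the bounds (an assumed rule; the
    doubly-bounded model in the talk may differ in detail)."""
    i, j = rng.sample(range(len(money)), 2)
    if money[j] - 1 >= lower and money[i] + 1 <= upper:
        money[j] -= 1
        money[i] += 1

rng = random.Random(42)
N, lower, upper = 200, 0, 20
money = [10] * N            # every agent starts with identical capital
s0 = entropy(money)         # exactly 0: one way to build this histogram
for _ in range(20_000):
    step(money, lower, upper, rng)
s1 = entropy(money)
print(f"S before: {s0:.2f}, S after: {s1:.2f}")  # entropy rises
```

Total capital is conserved by every accepted move, which is what makes the histogram-to-histogram transition probabilities well defined.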
Suppose we then decide to step out of thermodynamics, and simply view S = ln W as a new kind of measure of any well-formed histogram. The question then is: when does entropy calculated in this way rise? It rises when we attach probabilities to the changes of state from one histogram to another, and those probabilities differ in the two directions. Normally, these probabilities can only be calculated if the contents of the two histograms are conserved during the transformation.
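A small worked check of the equivalence, for two histograms related by moving one agent between bins. Note the standard assumption doing the work here (not proven in this sketch): if every microstate of a macrostate is equally probable and the microscopic moves are symmetric, then P(A→B)/P(B→A) = W_B/W_A, so ΔS = ln W_B − ln W_A = ln[ P(A→B)/P(B→A) ]:

```python
import math

def log_multiplicity(counts):
    """ln W, where W = N! / (n1! n2! ...)."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

# Two macrostates differing by one agent moving from bin 0 to bin 1.
A = [6, 2, 2]   # W_A = 10! / (6! 2! 2!) = 1260
B = [5, 3, 2]   # W_B = 10! / (5! 3! 2!) = 2520

delta_S = log_multiplicity(B) - log_multiplicity(A)
# For a single-agent move, W_B / W_A reduces to n_A[0] / (n_A[1] + 1)
# = 6 / 3 = 2, so delta_S should equal ln 2.
print(delta_S, math.log(2))
```

The cancellation in the last comment is the attraction of the ratio form: the full factorials never need to be evaluated, only the counts in the bins that actually change.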
If you want to see my diary notes on the topic, I would be happy to email them to anyone who is interested.