
Yordan Arango

Some time ago I started reading the book "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress", by Steven Pinker. It is a data-driven defense of the Enlightenment and its ideals (reason, science, humanism, and progress), showing all the benefits they have brought us. I really enjoyed the book until the author reached a chapter in which he expressed his optimism about the environmental dilemma we are facing as humankind. According to his argument, science should be the solution, something I agree with. However, I think science is not a panacea, contrary to the author's opinion. Regarding climate change, the Paris Agreement stated in 2015 that we needed to cut emissions by roughly 50% by 2030 to stay below 1.5\(^\circ\)C. Science had already issued similar warnings decades before the 2015 Paris Agreement. Now, in 2024, we are touching the 1.5\(^\circ\)C threshold, 6 years before the deadline by which we were supposed to have cut our emissions in half. Is that because science failed? No! Science did what science was supposed to do; it is because science was not listened to. Now is the time for decisions, and they will be difficult ones. It's the time for politics.

I decided not to continue reading Pinker's book, because that's my character. I had hoped such a reputed writer would take a more engaged role in this catastrophic dilemma. However, he decided to shun his ethical responsibility, something quite common in our leaders. And, worst of all, he took that position because "our living standards (humankind's) justify it". Really disappointing, and I hope you understand why. But this post is not about Pinker's position on this dilemma; it is about a chapter of his book I definitely enjoyed and from which I realized many things. Among them, entropy, or why everything can go wrong.

Despite hating Pinker's position on climate change, I have to be honest and accept that his chapter about entropy introduced me to a new way of understanding many things around me. I am someone who loves new ways of seeing our reality (of course, ways that make sense; not astrology, homeopathy, or those kinds of myths). That's probably why I love metaphors, allegories, proverbs, sayings, and parables. I know you have heard about entropy, and it is even possible you know one of its many definitions. One of the best known states that entropy is a measure of disorder: the more disorder, the more entropy. In fact, there are many definitions of entropy, some from statistical mechanics, some from thermodynamics, and others from information theory. This post is a drifting reflection, not about entropy's definitions, but about its implications for different topics. It's a way to discuss something that, later, I am gonna call an entropic logic.

The monkey typewriter and the intuition behind entropy

One of the reasons why I was deeply engaged by Pinker's explanation of entropy is that it was an intuitive and simple abstraction. I will use the same example to grasp the intuition behind entropy. Imagine a monkey in front of a typewriter that starts tapping at the keys. What would you expect from that experiment? Would you expect a poem? Would you expect a diary? Or at least a count of how many lice it ate in the last week? All of us know that what the monkey writes is just random letters: chaos, disorder, no information, nothing important to read, and something really entropic.

The previous result highlights one of entropy's core ideas: it tends to increase. For the sake of simplicity, suppose the monkey is only allowed to type strings of 40 randomly chosen letters and the experiment is carried out billions of times (just suppose it is possible). Certainly, a meaningful phrase can be written with 40 letters. But we can all agree that, although some meaningful phrases may be shaped by chance, the vast majority of the billions of tries will say absolutely nothing. This analogy shows why entropy tends to increase. It is not because there is some divine law saying everything has to be disordered. It is just statistics. There are simply more possibilities for disorder. Given the huge number of meaningless 40-character phrases (relative to meaningful ones), it is very likely that a phrase selected by chance will be a meaningless one. Even if, in a very rare case, the monkey writes a phrase that has meaning, we would expect the next phrase to be a nonsensical one, in accordance with entropy's tendency to increase.
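We can make this counting argument concrete with a quick simulation (a sketch of my own, not from Pinker's book; the 27-symbol alphabet and the target word are arbitrary choices):

```python
import random
import string

ALPHABET = string.ascii_lowercase + " "  # 27 symbols the monkey can hit
PHRASE_LEN = 40

# Probability of producing one specific 40-character phrase by chance:
p_exact = (1 / len(ALPHABET)) ** PHRASE_LEN
print(f"P(one specific 40-char phrase) = {p_exact:.2e}")  # astronomically small

# To actually observe hits, simulate with a much shorter target word:
random.seed(42)
target = "cat"
trials = 200_000
hits = sum(
    "".join(random.choice(ALPHABET) for _ in range(len(target))) == target
    for _ in range(trials)
)
print(f"random '{target}'s in {trials} tries: {hits} "
      f"(theory predicts about {trials / len(ALPHABET)**len(target):.0f})")
```

Meaningful strings exist, but they are so outnumbered that chance almost never lands on them; that imbalance is all the "law" of increasing entropy needs.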

The concept of entropy was coined by the Prussian physicist Rudolf Clausius. His conceptualization of entropy was embedded in thermodynamics, the discipline of physics that studies phenomena related to matter, heat, energy, and phase transitions, among others. His work is best remembered for the establishment of the second law of thermodynamics, which we already know: entropy tends to increase. However, this important axiom didn't explain why; it just stated a macrophysical fact. More specifically, it was a direct consequence of the empirical observation of how heat is transferred from a warmer body to a cooler one.

But the principle was the same. Let me show you. Some years after Clausius stated his famous second law of thermodynamics, the physicist Ludwig Boltzmann built on that earlier work in an area of physics that would later be called statistical mechanics. He introduced the H-theorem, which explained the reasoning behind Clausius' concept of entropy. And guess what: a typewriting monkey. I mean, the principle was the same as in the monkey parable. In short, he explained that entropy in a gas tends to increase because the vast majority of possible configurations the gas molecules can adopt are chaotic. Thus, if the gas is not disturbed by an external agent, the likelihood that it adopts an organized structure, for example with hydrogens separated from nitrogens, is very low; or, equivalently: the gas will most likely be found in a messy, high-entropy setting.

Boltzmann's grave. Notice the epitaph on the tombstone: \(S = k \cdot \log(W)\). It states the entropy, \(S\), as a function of the number of microstates, \(W\), a gas can acquire. Photo taken from hydrogen.wsu.edu
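To see the epitaph's formula at work, here is a toy calculation of my own (not from the original post): \(N\) gas particles in a box divided into two halves, where \(W(k)\) counts the microstates that put exactly \(k\) particles in the left half (Boltzmann's constant set to 1):

```python
from math import comb, log

N = 100  # particles in a box divided into two halves

# W(k): number of microstates with exactly k particles in the left half.
# Boltzmann entropy with k_B = 1: S = log(W).
for k in (0, 10, 25, 50):
    W = comb(N, k)
    print(f"k = {k:>2}: W = {W:.3e}, S = {log(W):.2f}")
```

The even split (k = 50) commands overwhelmingly more microstates than any ordered arrangement, which is why the gas is virtually always found disordered.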

Entropy and Information

Talking about entropy is like talking about basic statistics and probabilities, which is why the name of the new area of study Boltzmann founded (statistical mechanics) makes sense. The probabilistic nature of the concept of entropy makes it easy to extrapolate to other fields, even outside physics. Information theory, in the field of mathematics, is one of those areas: there, entropy is closely related to information, and the theory tries to answer questions about information redundancy, information storage, communications, etc.

These words are closely linked. Imagine your neighbor comes to you on a pleasant Sunday while you are reading a nice sci-fi book on the balcony. He talks to you about his week and routine job, and grumbles about his tyrant boss. You are a very friendly neighbor and naturally listen to what he is saying. However, you think: boring! The same as the previous week; no new information. Your neighbor is an economist working at the central bank of your country. At some point, he changes the topic and begins to speak about some projections the bank has made for the construction industry over the next two years. You ask your neighbor to repeat the numbers. Then, you quickly compute some projections with those numbers for your architecture company: eureka! If those numbers, provided by your neighbor, are accurate, they could luckily make your company grow in ways you never expected.

What was the probability that someone would come to you with that kind of comment about the economy? At the very least, we can agree that a comment about your neighbor's boring job was much more likely. At the end of the day, he made one every Sunday while you were relaxing on the balcony. You see? We are talking about probabilities again. The first comment, about your neighbor's job, was certainly expected, very likely; but it was not informative, nothing new. Conversely, the second comment, about the economy, was more surprising and maybe, because of that, more interesting, more informative.

Surprise. Information. Uncertainty. What I hope is clear here is that the less entropic an event is, the rarer it is and the more information it gives when it appears; and vice versa. Given the parallel between order and organization on one side and entropy on the other, we can also say: the more organized (less entropic) an event is, the more information it provides. Have you seen those movie scenes where an agent is investigating a homicide and the crime scene is a total mess? Well, that crime scene is a high-entropy event, and it does not provide much information about the murderer. However, once the investigator arranges all the clues, new and valuable information arises to resolve the case. That depicts an entropic logic for thinking about world issues.
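Information theory makes this intuition precise through self-information (surprisal): an event with probability \(p\) carries \(-\log_2 p\) bits. A minimal sketch, with probabilities I invented for the neighbor story:

```python
from math import log2

def surprisal_bits(p: float) -> float:
    """Self-information of an event with probability p, in bits."""
    return -log2(p)

# The weekly grumbling about the boss: almost certain, almost no information.
print(round(surprisal_bits(0.95), 3))  # 0.074 bits
# A rare tip about the construction industry: improbable, highly informative.
print(round(surprisal_bits(0.01), 3))  # 6.644 bits
```

The rarer the event, the more bits it delivers, exactly the balcony scene in numbers.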

In fact, life can be thought of as one of those low-entropy events with much information in it. Think about it for a while. Humanity has spent perhaps decades, and billions of dollars on satellites, research, and space missions, trying to find life outside Earth. And what we have obtained is conspiracy theories and some TV series and movies mining our curiosity for low-entropy/highly informative facts, like Men in Black and The X-Files. Although some calculations in astrobiology suggest high probabilities for the existence of life on planets other than Earth, it is also true that life is a rare event in the vastness of the universe. Have you ever wondered why? Yes, we know it is because of entropy. But, as in Boltzmann's theories in statistical mechanics, the tendency toward larger entropy in the universe is just a general symptom of the randomness of atomic movement in space. Be honest. Was it to be expected that some atoms would join together to form the first DNA chain, and then shape the first archaeal cell? Definitely not. Given all the possible arrangements of atoms in the universe, it was extremely unlikely to get one that would give rise to life on Earth.

Entropy as an analogy for the level of disorder of a desk. Image taken from phdcomics.com

Buildings, technology, knowledge, philosophy, science, life, democracy, arts... All of them are organized entities with a lot of information. Don't you believe it? Just consider that none of them is created by chance. There is something else. Let's return to the example of life trying to find life. Imagine we land on a planet some light-years from Earth. We find nothing more than water, which increases the possibility of life, and some other metallic elements. Information? Nothing relevant; the same as the other thousand planets we have visited. But on the \(1002^{nd}\) planet we visit, we find some structures sticking out of the water, with some symmetries, some objects flying in a straight line, maybe some lights, and also some characters on large screens. Those things have not been seen on the other 1001 planets, and they are not there by chance; that kind of organization is not random. That is new information, probably telling us about a new type of life. We would certainly want to figure out the intricacies of this civilization's power management, its energy sources, the meanings of its symbols, its new technology, and its history. But that would only be possible through the information contained in the organization that all their advancements represent.

Life: an error in the matrix?

The second law of thermodynamics states that entropy tends to increase. If that is true, why does life exist? Certainly, life is a low-probability event, a low-entropy one. When we stated the second law of thermodynamics, we didn't state it completely: entropy tends to increase in an isolated system. Isolated implies that nothing alters or interacts with the system. Consider that the atoms on Earth are not in an isolated system. For millions of years, Earth has been constantly impacted by meteors and comets, bringing water and metallic materials. More than that, the energy coming from the sun makes Earth not an isolated system, allowing for low-entropy systems like life. As if that weren't enough, the fact that living organisms must consume something to extract energy is another way in which life is not an isolated system.

That could be one explanation of why life, a low-entropy state of matter, is possible on Earth. The other explanation is that life is actually increasing its entropy. The evidence: aging, death. Remember, the second law doesn't forbid low-entropy states in the system; it prevents entropy from decreasing. You might think, "Okay, we are aging, so we are increasing our entropy. Now, how do you deal with the fact that the Earth, or life, is not an isolated system?". The second law just states a fact about the entire system. I mean, it does not forbid a decrease of entropy in one part of the system, as long as another part compensates for that "violation" of the law. In other words, the other part can increase its entropy not just to balance the reduction, but also to increase the total entropy of the system. You say: "Yes, but that applies to an isolated system, and the Earth is not one". True, but what if we now consider the universe as the system of interest? Nothing enters or exits it because, precisely, everything is contained within it, everything happens within it. So, it does not matter whether Earth is increasing or decreasing its entropy; we would assume the entire universe is increasing its entropy.

I bet you know some of the following expressions: "Anything that can go wrong, will go wrong" (Murphy's law), "No such thing as a free lunch", "All debts must be paid", "It's easier to destroy than to build", etc. All of these are actually expressions of the second law of thermodynamics, in which an undesired event is anticipated because a higher-entropy state is expected. All of this points to one consequence of entropic thinking: much of what we try to do is a battle against entropy.

Difference: the key against entropy

Battling against entropy is actually a defense of heterogeneity. There is a certain intuition that leads us to think homogeneity is analogous to order, and that it is what we should seek as humankind or even as life. That doesn't make much sense. Think, for example, of the sellers of paint jars. Does it make sense to sell white and black paint in the same jar? Of course not, because in the jar everything would mix into gray. In fact, it is really useful to keep them separated, not just to paint a wall black or white, but to get different tones of other colors like blue, red, or green.

What entropy seeks is to grow, to reach a state of disorder, homogeneity, and no differences. But life blooms precisely in difference. Many of our sources of energy are based on a gradient, on a difference. For example, hydraulic energy needs a height difference between the water surface and the turbine intake to convert potential energy into electrical energy. Water on the ground turns into clouds because of a gradient, a difference, in air density. Migrations of different species are driven by gradients, be it in temperature, food, or water availability.
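As a back-of-the-envelope sketch (with flow and head numbers I made up), the power a hydroelectric plant extracts is proportional to the height difference \(h\): \(P = \eta \, \rho \, g \, Q \, h\). No gradient, no power:

```python
RHO = 1000.0  # density of water, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydro_power_watts(flow_m3_s: float, head_m: float, efficiency: float = 0.9) -> float:
    """Electrical power extracted from water falling through a height difference."""
    return efficiency * RHO * G * flow_m3_s * head_m

print(hydro_power_watts(2.0, 0.0))   # flat water: 0.0, nothing to extract
print(hydro_power_watts(2.0, 50.0))  # a 50 m head: roughly 0.9 MW
```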

Gradients are everywhere, producing movement, life. This metaphor also applies in a cultural sense. If wars, like those happening now in Palestine or Ukraine, are understood as high-entropy events, then differences between our cultures should be understood as opportunities to create order, to reduce entropy, and finally to create peace and sustain life. Many countries have understood this. In a podcast by Diana Uribe, a famous Colombian historian, she tells the story of a meeting held in the context of the peace agreement between the FARC guerrilla group and the Colombian state, to which she was invited along with various personalities from different countries. At that meeting she spoke with a Norwegian diplomat, whose country is a current guarantor of the peace agreement, and asked him why Scandinavian countries were so peaceful compared to others. The diplomat answered that in the past they had been a warlike and quarrelsome culture, but they had understood that war left nothing but desolation. Although they likely don't frame it as an equivalence between war and high entropy, we can. Now, Scandinavian countries are developed, peaceful, and culturally advanced.

Classical cultures also understood how differences could create great civilizations that would be remembered for millennia. Greece was indeed a mix of various city-states in different regions of the Aegean Sea. Each region had different myths, but they decided to build a culture based on something that united them: the language. Certainly, the language was a catalyst for a strong nation, but what pushed Greece toward being the dominant culture of its time was the recognition of all the different ways of viewing the world within its territory. Something similar happened in the Roman Empire. In its case, each province was allowed to keep its own religion. While Rome worshipped Jupiter, Egypt did the same with Osiris, the Dacians with Zalmoxis, and, later on, Christians were allowed to worship Jesus. Although this is not an apology for empires, the preponderant role played by openness to diversity in building important cultures is undeniable. Today, we consider ourselves descendants of Greece and Rome, especially regarding topics like philosophy, language, government, science, and technology, among others.

Entropy everywhere

Entropic thinking is something I have tried to train since reading Pinker's book. Many of the examples presented here are just reminders of entropy's footprint. But the truth is that entropy can be everywhere. Entropy is in your messy room; if not, why didn't your t-shirt fold itself when you threw it? Entropy is also in the rush hour and in an accident on the road. It is in the mess you have to clean every day when you arrive home from work. It is in broken eggs. It is in the rust on cars. It is everywhere.

I want to give you a final example of how entropic thinking can be applied to an ordinary topic. Have you ever wondered why mustaches are not popular nowadays, even though 30 or 40 years ago they were so trendy? Or how we transitioned from linking tattooed people with dangerous people, convicts, or gangsters to simply considering them normal people? I know you know the why and the how: culture, market, globalization, and alienation. I always hated that phrase, "I like it, so what; respect my tastes". I always had the intuition that something was wrong with it, that it was a kind of bonus card to play in situations where you have no other argument to wield. I felt that tastes are some kind of societal construct, and honestly, I have tried to respect "tastes", but out of a sense of political correctness, not because such a thing as "tastes" exists.

If you don't yet believe that culture, market, and globalization alienate you, entropy might convince you. Let's start with probabilities, the way entropy works. What is the probability that thousands of people have the same taste? I mean, is it possible that thousands of people have the same taste about, for example, how to wear clothes? What is the probability that thousands, or maybe millions, of people like listening to the same song? What is the probability that, suddenly, all the people in a country wake up liking cycling and athletics (running, for a more aesthetic term)? It's possible these questions sound weird, in the best of cases. It's even possible you had never asked yourself such questions. Because, I have to say, we have grown used to having our tastes imposed on us.

Now, just imagine that culture, market, capitalism, the internet, and globalization don't exist. For the sake of simplicity, suppose God lets you be born alone on a planet distant from Earth. Somehow, like Adam and Eve, you manage to survive alone. At some point, God decides to show you all the singers in Earth's history and lets you choose your favorite one. What would be the probability that you choose Taylor Swift, Bad Bunny, Feid, or Karol G (all of them in the 2023 top-10 ranking on Spotify)? Extremely low. In fact, if we accept that tastes are personal choices, it's possible the music you like is not even on the list God showed you. Now, imagine those millions of people are also left on planets distant from Earth, but different from yours, and God asks them to choose their preferred artist. Would they choose the same artist? I am sure that, given the huge amount of possibilities, it would be pretty difficult to find coincidences.
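A quick toy simulation (all numbers invented) shows what truly independent tastes would look like: a million people each picking uniformly among ten thousand artists. No artist ever gathers more than a few hundred fans, nothing like the tens of millions of listeners that real rankings concentrate on a handful of stars:

```python
import random
from collections import Counter

random.seed(0)  # reproducible run

M = 10_000          # candidate artists (illustrative)
PEOPLE = 1_000_000  # independent listeners (illustrative)

# Each person independently and uniformly picks one favorite artist.
counts = Counter(random.randrange(M) for _ in range(PEOPLE))

print("expected fans per artist:", PEOPLE // M)
print("most popular artist's fans:", max(counts.values()))  # a few hundred at most
```

The massive coincidences we observe in real life are therefore a signature of shared, imposed influences, not of independent choice.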

The previous example reveals that tastes can be understood through an entropic logic. The key is to consider how many possible choices are available. If each person truly had their own taste, wouldn't it be suspicious that so many coincide? I am not saying we can't have real tastes. But I think only very genuine and original people can have them. So, please, if you are not a genuine or original person, and to check it you only have to observe how many people look or behave like you, don't come to me saying, "respect my tastes".

Final remarks

This blog entry just aimed to train you in what I have called an "entropic logic". As you saw, an entropic logic is just probabilistic reasoning. But what distinguishes an entropic logic from usual probabilistic reasoning is that with entropy we are always talking about a really tiny probability of something, be it the probability of the existence of life, the probability that your room is tidy, the probability that your tastes coincide with your neighbor's, or whatever.

If entropy in the universe is always growing, when was the moment of least entropy? Will there be a moment of maximum entropy, or will it keep growing and growing forever? According to entropy, what is going to happen to life? I encourage you to think about these questions or investigate them by watching YouTube videos or reading scientific papers or books about entropy. There is a lot on the web, not only about these questions but also about other fascinating topics related to entropy that I am sure will blow your mind.