What is entropy?

We explain what entropy is and what negative entropy is, with some examples of this measure of a system's disorder.

  1. What is entropy?

In physics, we speak of entropy (usually symbolized with the letter S) to refer to the degree of equilibrium of a thermodynamic system, or more precisely, to its tendency toward disorder, measured through its entropy variation. When a positive entropy variation occurs, the components of a system become more disordered than when a negative entropy variation occurs.

Entropy is a key concept for the Second Law of Thermodynamics, which states that “the amount of entropy in the universe tends to increase over time.” In other words: given a sufficient period of time, systems will tend toward disorder. This tendency is greater the closer the system is to equilibrium: the greater the equilibrium, the greater the entropy.

It could also be said that entropy measures the portion of a system's internal energy that is not available to do work, but that nonetheless exists and accumulates in the system. That is, energy that can no longer be harnessed.

Thus, when a system passes from an initial state to a final state through an isothermal process (one at constant temperature), its entropy change equals the amount of heat the system exchanges with its environment, divided by its absolute temperature. This is expressed in the following equation:

ΔS = S2 − S1 = Q(1→2) / T

This shows that only entropy variations of a system can be calculated, not absolute values. The only point where entropy is zero is at absolute zero (0 K, or −273.15 °C).
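The equation above can be sketched numerically. As an illustrative example (the function name and scenario are our own, not from the article), consider melting ice at 0 °C, an isothermal process; the latent heat value used is approximate:

```python
# Entropy change of an isothermal process: ΔS = Q / T.
# Illustrative scenario (assumed): melting 1 kg of ice at 0 °C.
LATENT_HEAT_FUSION = 334_000  # J/kg, approximate latent heat of fusion of water
T_MELT = 273.15               # K, melting point of ice (0 °C)

def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Entropy change for heat Q exchanged at constant absolute temperature T."""
    return heat_joules / temperature_kelvin

mass_kg = 1.0
q = mass_kg * LATENT_HEAT_FUSION     # heat absorbed by the ice, in joules
delta_s = entropy_change(q, T_MELT)  # in J/K
print(f"ΔS ≈ {delta_s:.1f} J/K")     # positive: melting increases disorder
```

The result, roughly 1.2 kJ/K, is positive, as expected for a process that takes an ordered solid to a more disordered liquid.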

  2. Negative entropy

Negative entropy, syntropy, or negentropy is a mechanism by which a system keeps its inevitable levels of entropy, that is, of decay, stable by exporting part of that entropy to other related subsystems.

That is, a system can decrease its levels of uncertainty as long as it modifies its own structure.

This concept was developed by physicist Erwin Schrödinger in 1943 and was later taken up by various scholars.

  3. Entropy Examples

Physics predicts the end of the universe as entropy accumulates over time.

Some everyday examples of entropy are:

  • The breaking of a plate . If we understand the plate as an ordered, balanced system with a high potential for entropy increase, we will see that its shattering into pieces is a natural, irreversible event that does not happen spontaneously in the opposite direction.
  • Radioactive decay . In this process, which is also irreversible, unstable atoms with a high entropic charge decay into much more stable versions of themselves (becoming different elements). To do so, they release large amounts of energy into their surroundings, which is what we call radiation.
  • Aging and death . The gradual increase of entropy in the system that is the human body is an inevitable reality of our existence. Eventually, entropy reaches its maximum possible level and the body fails: through its own wear, through disease, through accumulated errors. With these events comes death, the end of our system.
  • The end of the universe . Contemporary physics contemplates an end of the universe in which entropy accumulates over time, disordering its operation until movement ceases: the so-called heat death of the universe.
