# What is Entropy?

S. Mithra

Entropy describes the tendency of systems to go from a state of higher organization to a state of lower organization on a molecular level. In your day-to-day life, you intuitively understand how entropy works whenever you pour sugar into your coffee or melt an ice cube in a glass. Entropy can show up in the space into which a substance spreads, in its phase change from solid to liquid to gas, or in its position. In physics, entropy is a mathematical measurement of a change from greater to lesser potential energy, related to the second law of thermodynamics.

Entropy comes from a Greek word meaning "transformation." This definition gives us insight into why things seemingly transform for no reason. Systems can only maintain organization on a molecular level as long as energy is added. For example, water will boil only as long as you hold a pan over flames. You're adding heat, a form of kinetic energy, to speed up the molecules in the water. If the heat source is removed, we can all guess that the water will gradually cool to about room temperature. This is due to entropy: the water molecules tend to give up their accumulated energy as heat and end up at a lower potential energy.

Temperature isn't the only transformation involved in entropy. The changes always involve moving from disequilibrium to equilibrium, consistent with decreasing order. For instance, molecules always spread out to uniformly fill a container. When we drip food coloring into a clear glass of water, even if we don't stir it, the concentrated drop will gradually spread out until every part of the water has the same density of color.
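The one-way spreading of the food coloring can be imitated with a toy simulation. The sketch below (not from the article; the function name and parameters are illustrative) lets many particles random-walk from a single starting point and measures how far, on average, they have wandered. The average distance only grows; the "drop" never spontaneously re-concentrates.

```python
import random

def spread(steps, walkers=1000, seed=0):
    """Let many particles random-walk from the origin and
    return their average distance from the starting point."""
    rng = random.Random(seed)
    positions = [0] * walkers
    for _ in range(steps):
        # Each particle takes one step left or right at random.
        positions = [p + rng.choice((-1, 1)) for p in positions]
    return sum(abs(p) for p in positions) / walkers

# The longer we wait, the more spread out the particles are:
print(spread(10) < spread(100) < spread(1000))  # → True
```

This is only a cartoon of diffusion, but it captures the statistical point: there are vastly more spread-out arrangements than concentrated ones, so random motion almost always carries the system toward uniformity.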

Another type of entropy that has to do with visible motion (as opposed to the invisible motion of heat) involves gravity. Unless we put energy into a system, like an arm and a ball, by holding up an object, it falls toward the ground. An elevated position has higher potential energy. It gets converted into kinetic energy of motion as the object falls. The object always ends up with the position of lowest possible potential energy, such as resting against the floor.
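The falling-object example can be checked with a line of standard physics. In a minimal sketch (the function and variable names are mine, not the article's), the potential energy m·g·h stored by lifting an object equals the kinetic energy (1/2)·m·v² it has just before impact, ignoring air drag:

```python
import math

def impact_speed(height_m, g=9.81):
    """Speed (m/s) of an object dropped from rest, ignoring air drag."""
    return math.sqrt(2 * g * height_m)

m, h = 0.5, 2.0                       # a 0.5 kg ball held 2 m up
potential = m * 9.81 * h              # energy stored by lifting it (J)
kinetic = 0.5 * m * impact_speed(h) ** 2
print(round(potential, 3), round(kinetic, 3))  # → 9.81 9.81
```

The two numbers match because the fall merely converts stored potential energy into energy of motion; none is created or destroyed along the way.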

In more technical terms, entropy is a specific value that measures how much energy is released in a system when it settles into the lowest potential energy. Entropy assesses the amount of disorder, understood as a change in heat, from an earlier point to a later point in time. This must happen in a "closed" system, where no energy leaks in or out. Theoretically, that can be measured, but practically it is very difficult to create an absolutely closed scenario. In the food coloring example above, some of the solution might be evaporating, a separate process from the uniform distribution of the solute.
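The "change in heat" idea has a textbook formula behind it: for heat absorbed reversibly at a constant temperature, the entropy change is ΔS = Q/T. A minimal sketch (using the standard latent heat of fusion of ice, about 334 kJ/kg at 273.15 K; the function name is mine):

```python
def entropy_change(heat_joules, temp_kelvin):
    """Delta S = Q / T for heat absorbed reversibly at constant temperature."""
    return heat_joules / temp_kelvin

# Melting 1 kg of ice at 0°C (273.15 K) absorbs roughly 334 kJ,
# so the water's entropy rises by about 1223 J/K.
latent_heat = 334_000  # J per kg, approximate heat of fusion of ice
delta_s = entropy_change(latent_heat, 273.15)
print(round(delta_s))  # → 1223
```

Note the units: entropy carries joules per kelvin, which is why temperature sits in the denominator; the same quantity of heat produces a larger entropy change when delivered at a lower temperature.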

## Discussion Comments

anon114831

No, the opposite. Higher entropy means less chaos (the energy in the system has dissipated; we are close to the heat death of the universe, close to equilibrium). Low entropy, i.e., when we add more energy to the system, means it gets more chaotic.

From an information theory point of view, "entropy" is how much information the message contains. Higher entropy means the message contains more information, and vice versa.

For example, let's say the sender sends a message with lots of information (high entropy from the sender's point of view). Now the receiver gets the message. Does this make things clearer (the message had high entropy to the receiver) or more ambiguous (the message had low entropy to the receiver)?
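The information-theory entropy this commenter mentions has a precise form: Shannon entropy, the average information per symbol in bits. A minimal sketch (the function name is illustrative) computed from symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * -math.log2(n / total) for n in counts.values())

# A repetitive message carries no information per symbol;
# a more varied message carries more.
print(shannon_entropy("aaaaaaaa"))  # → 0.0
print(shannon_entropy("abababab"))  # → 1.0
print(shannon_entropy("abcdabcd"))  # → 2.0
```

The parallel with thermodynamics is that both quantities count how many distinct arrangements are consistent with what you observe: a uniform, fully-mixed state and a maximally unpredictable message both sit at maximum entropy.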

anon106838

could you do a definition from the information theory point of view? that would be cool.

anon90378

If you think of a closed system as points with different pressures, they will eventually diffuse into an equilibrium with a single pressure over all points. This is the same as going from having potential energy to having none! Thus the system will see no motion, and you won't be able to provoke any unless you add more energy.

In the context of the enormous universe, the same forces acting upon it will eventually cancel out any differences in density between the various forms of energy. The end state would be the hypothetical heat death of the universe.

"Entropy is a mathematical measurement of a change from greater to lesser potential energy"

is a very good definition. Check the potential energy link if you still don't get entropy.

anon83176

Is this really that hard to understand? I'm in seventh grade doing a project on thermodynamics.

I have a good explanation: when Japan put a crapload of energy (bombs) into Pearl Harbor, how come the energy didn't clean up the site instead of blowing it to bits? Entropy.

anon76203

does it mean that it's dying?

anon61939

Does entropy occur in a human body? I'm confused. -Callie

anon54168

Yes, that is exactly right.

anon16807

If something has high entropy, does it mean that it is more "chaotic"? -Violet