
Monday, March 4, 2024

Cycles of Time - A Summary Review Part 2

    If you read the first post on this topic, you’ll recall that the book is divided into three parts. In this publication, we cover some of Part 1. We highlight its most interesting aspect, discuss the main points, and provide additional insight using extra material as reference. This is more than just a review; it is my attempt to understand the subject of the book and hopefully create a helpful guide for potential readers.

    Part 1 is titled “The Second Law and its underlying mystery”. The book starts by exploring the concept of entropy, aiming to present it in the most meaningful way possible by explaining its application to our understanding of the universe. The chapter is composed of six subchapters. In this post I will discuss sections 1.1, 1.2, and 1.3, which I have read so far.

1.1  The relentless march of randomness

    The main subject discussed in this subchapter is related to the field of thermodynamics. A good way of defining thermodynamics is “a branch of physics which deals with the energy and work of a system.” This discipline is only concerned with large-scale observations. It is principally based on a set of four laws that deal with temperature, equilibrium, work, heat, energy conservation, and entropy; the latter is the focus of this chapter.

    Penrose starts by discussing the notion of what a physics law is and how the second law of thermodynamics differs from the rest. The second law of thermodynamics (abbreviated as 2nd law) can be stated as follows:

  •     “Any spontaneously occurring process will always lead to an escalation in the entropy of the system.” In simple words, the law says that in an isolated system, entropy will never decrease over time.

    While most laws are represented as equalities, the 2nd law is an inequality. It states that the entropy of an isolated system is greater at later times than it was at earlier times. To grasp the idea presented here, one must have a clear understanding of what a system is. In science, a system refers to a group of interacting elements that act according to a set of rules to form a unified whole. In physics, it refers to a collection of objects chosen to make thinking about a problem more convenient. Systems are idealized scenarios where we choose to acknowledge what is relevant and treat everything else as background. In an isolated system, entropy is always greater as time evolves.

    The next term we must understand is entropy, which is what the next subsections spend a good deal of time explaining. One description of entropy is disorder or randomness. If you consider a bedroom as a system, the entropy value is determined by how messy or disorganized the room is. Organizing your room takes a lot of time and energy, but for it to get messy takes no effort at all. It is as if its preferred state were to be messy. That is where entropy is greater.

    Now, the equations of motion that we know as Newton’s laws have a nice property: they are time reversible. You can determine the initial state of a system if you know where it ended, and vice versa. With entropy, the case is different. Consider an egg that drops from a table to the floor, breaks, and spills everywhere (this is the example used in the book). If you roll the film backwards, you will see a spilled egg reassembling itself. While this is allowed by the equations of motion, it is not what we see with entropy. Entropy is better understood from a probabilistic standpoint that indicates the likelihood that such an event happens. As it turns out, it is very unlikely, which is why we do not see eggs magically reassembling in our everyday lives. Instead, we see eggs dropping and breaking, because that is the state entropy favors. “The actual definition of the entropy of a system at any moment is, however, symmetrical with regard to the direction of time,” so whichever direction the future may be, that is where entropy is heading and increasing.
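    To get a feel for just how lopsided these probabilities are, here is a small back-of-the-envelope sketch (my own, not from the book): the chance that every molecule of a gas spontaneously gathers in one half of its box — a sudden entropy drop — falls off as (1/2)^N.

```python
import math

def log10_probability(n_molecules):
    """log10 of the chance that every molecule independently happens to
    sit in the left half of the box at once: (1/2)^n_molecules."""
    return n_molecules * math.log10(0.5)

# Already hopeless for 100 molecules; a real gas has ~10^23 of them.
print(log10_probability(100))  # about -30.1, i.e. odds of 1 in 10^30
```

Nothing in the dynamics forbids the reassembling egg; it is simply so improbable that we never witness it.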

    Another puzzling thought I found is how the 2nd law is not a deduction from the dynamical laws. I spent a good deal of time asking around and searching for information on how this works. Basically, the fact that the dynamical laws are time reversible does not mean that the 2nd law works the same way (big thanks to everyone on physics.stackexchange.com for their comments). So, if we travel back in time, we cannot guarantee that entropy is decreasing as a product of reversibility. An example I found of entropy reduction is a freezer: water has a higher entropy as a liquid than as a solid, so in a freezer we can say the water loses entropy as it becomes solid. This does not violate the 2nd law, because the freezer is not an isolated system; it pumps heat, and with it entropy, out into the surrounding room. Nor does it mean that the water is traveling backward in time, because time is still moving forward. So, entropy is much deeper than just randomness or disorder.

1.2  Entropy, as state counting

    How do we assign a numerical value to this “randomness”? In 1.2, we see an example of mixing red paint (r) with blue paint (b) to give an idea of how this can be quantified.

    If you think about the paint as small balls arranged in a 3×3 grid, you can make predictions about how the color distribution will look based on the red/blue ratio of paint balls in the grid. Counting the different possibilities, a mixture of red and blue paint balls will look redder if r/b > 1, bluer if r/b < 1, and purple if the ratio is very close to 1 (say 0.999 ≤ r/b ≤ 1.001).
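    The counting itself is easy to play with. A minimal sketch (mine, not the book’s): for a 9-cell grid, the number of distinct arrangements of r red and b blue balls is the binomial coefficient C(r+b, r), and the mixed macrostates dwarf the single-colour ones.

```python
import math

def arrangements(red, blue):
    """Distinct ways to place `red` red and `blue` blue balls in a
    grid of red + blue cells: the binomial coefficient C(red+blue, red)."""
    return math.comb(red + blue, red)

print(arrangements(9, 0))  # 1   -> only one all-red arrangement
print(arrangements(5, 4))  # 126 -> mixed states are far more numerous
```

This is exactly why a stirred mixture “prefers” to look purple: there are simply more ways to be mixed than to be separated.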

    This simplistic example does not do justice to reality for a variety of reasons. For starters, we imagine balls of paint as perfectly spherical when they may differ in size. In the example given by the book, the paint balls are also described as enclosed in a cube to complete the grid. Moreover, the number of paint balls in the grid does not come close to what we can expect. Even if we choose an example with a population of 10^8, reality involves a number of arrangements closer to 10^235,700,000,000,000,000,000,000,000. So, our toy population is meaningless, as the number of particles we must account for is vastly different.

    In cases like this, where numbers can get completely absurd, finding patterns of behavior in data can be cumbersome. For such cases, using logarithms is the most adequate path to take. For entropy measurements, logarithms are more appropriate because their properties make calculations much simpler. As stated in the book, “we want the entropy of a system to be what we get by simply adding the entropies of the individual parts,” and this is something that logarithms accomplish, since log(ab) = log a + log b.
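    A quick numerical check of that additivity (my own sketch, with made-up multiplicities): because log(ab) = log a + log b, the entropy of two independent subsystems combined is the sum of their individual entropies.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy(multiplicity):
    # Boltzmann-style entropy from a state count (or phase-space volume)
    return k * math.log(multiplicity)

V1, V2 = 1e30, 1e45                 # made-up multiplicities of two subsystems
joint = entropy(V1 * V2)            # entropy of the combined system
summed = entropy(V1) + entropy(V2)  # sum of the parts
print(math.isclose(joint, summed))  # True: logs turn products into sums
```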

    Finally, Penrose introduces a concept that will be explored in more detail in 1.3: configuration space. A simple definition of configuration space is a space defined by generalized coordinates. Generalized coordinates are a set of parameters that represent the state of a system. Because they are generalized, they are not dependent on any particular coordinate system. If we were to describe the position of a pendulum using the angle relative to the vertical, we wouldn’t need the conventional x and y, which are exclusive to Cartesian coordinates. The configuration space contains all possible configurations (states) that the generalized coordinates can take in the system. If this is confusing, it’s because the concept tries to break away from coordinate dependencies and create descriptions that hold true for any system.
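    The pendulum makes this concrete. A tiny sketch (mine, not from the book): the single generalized coordinate θ is the whole configuration space, and the Cartesian x, y can always be recovered from it when needed.

```python
import math

# A pendulum's configuration space is one-dimensional: the single
# generalized coordinate theta (angle from the vertical) fixes its state.
def cartesian(theta, length=1.0):
    """Recover the Cartesian description from the generalized coordinate.
    (Pivot at the origin, y measured downward.)"""
    return (length * math.sin(theta), length * math.cos(theta))

print(cartesian(0.0))          # hanging straight down -> (0.0, 1.0)
print(cartesian(math.pi / 2))  # horizontal -> (1.0, ~0.0)
```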

1.3  Phase space, and Boltzmann’s definition of entropy

    When I first started reading this chapter, my first question was, “What is phase space?” It’s a term I’ve heard before but never studied, so I took this opportunity to do a little investigation and see what I could find about it.

    Phase space is a mathematical concept used in dynamical systems theory and control theory. It is a space in which all possible “states” are represented, with each possible state corresponding to one unique point in the phase space. This very theoretical explanation can be summarized in the idea that phase space provides a comprehensive view of all possible states a system can be in and of its evolution over time. If we consider a gas composed of many molecules, each component of every molecule’s position and momentum requires a separate dimension in the phase space, so a single monatomic molecule needs a 6-dimensional phase space (x, y, z, px, py, pz), and a gas of N such molecules needs 6N dimensions.
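    The dimension count generalizes directly; a one-line sketch (my own) for a monatomic gas:

```python
def phase_space_dim(n_molecules):
    """Each monatomic molecule contributes 3 position coordinates
    (x, y, z) and 3 momenta (px, py, pz): 6 dimensions apiece."""
    return 6 * n_molecules

print(phase_space_dim(1))      # 6
print(phase_space_dim(10**3))  # 6000 -- and a real gas has ~10^23 molecules
```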

    I encountered the concept of phase space for the first time in a classical mechanics class when we studied Hamiltonian mechanics (if you are not familiar with it, I recommend taking a quick look, as it is very useful). The beauty of this framework lies in its deterministic character: if we know the state of our system at one time, we can determine the state at any other time. The dynamical evolution traces a curve in phase space that must be unique and reversible. Most importantly, phase-space volume can be treated as dimensionless (just a unitless number), which is a key point in Boltzmann’s definition of entropy, since that definition is built on volumes in phase space.
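    That determinism and reversibility can be seen in a toy integration (my own sketch; the symplectic-Euler update rule is my choice, not the book’s): evolve a unit-mass harmonic oscillator forward through phase space, then apply the exact inverse update, and you land back on the initial point.

```python
def step(q, p, dt):
    # Symplectic Euler for H = (p**2 + q**2) / 2
    p = p - dt * q
    q = q + dt * p
    return q, p

def step_back(q, p, dt):
    # Exact algebraic inverse of `step`: undo the two updates in reverse order
    q = q - dt * p
    p = p + dt * q
    return q, p

q, p = 1.0, 0.0
for _ in range(1000):
    q, p = step(q, p, 0.01)
for _ in range(1000):
    q, p = step_back(q, p, 0.01)
print(q, p)  # back at the initial state (1.0, 0.0), up to float rounding
```

Knowing the state at any one moment pins down the whole curve, forward and backward.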

    I encountered more technical definitions that somewhat take the reader away from the point of the chapter, as it spends a considerable amount of time explaining what coarse graining is. I understand that Penrose wants the reader to grasp the basics on which the Boltzmann definition of entropy rests, but for an average reader, and even for some in the field like me, this becomes quite boring and hard to follow. So, basically, the equation proposed by Boltzmann for the measurement of entropy is as follows:

    S = k log V
    The constant k, also known as kB, is called the Boltzmann constant, and it has a value of 1.3805…×10^-23 Joules/Kelvin. We use base-10 logarithms because of their properties and because we will be dealing with very large numbers, but the natural log can be substituted as well without any issues. Finally, we have V, which is sometimes written as W or Ω, representing the volume of the coarse-graining region in phase space, or the multiplicity of the microstates of a particular macrostate. In my opinion, this approach is much more digestible for general audiences than going through a bunch of cryptic mathematical language. As a reader, I can immediately understand what I need to know to apply this formula.
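    On the choice of base: swapping log10 for the natural log (or vice versa) only rescales the constant in front, since log10 V = ln V / ln 10. A quick check (my own sketch, with a made-up volume):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K (pairs with the natural log)
V = 1e100         # made-up coarse-graining volume

s_natural = k * math.log(V)                    # S = k ln V
s_base10 = (k * math.log(10)) * math.log10(V)  # same S, rescaled constant
print(math.isclose(s_natural, s_base10))       # True
```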

    The book goes into more detail on how to deal with the volume and the coarse graining, but the final point of the chapter is that if the external and internal degrees of freedom are completely independent from each other, then we can calculate the entropy of each part individually and add them together, thanks to the logarithmic properties.

    S = k log(VW) = k log V + k log W
    Where V is the internal coarse graining of the phase space P and W is the external coarse graining of the phase space X that creates the product space G = P×X.    

  ...

My apologies for the delay in the publication. I thought I had published this a long time ago, but apparently I did not. Hopefully I will be back to regular posting every week.

 If you like this, please share and comment. It will let me know that you want to see more of this, and it will also help me grow.
