
Entropy Change for Ideal Gas

The entropy change for an ideal gas measures how the disorder, or randomness, of the system changes as it undergoes a process. When an ideal gas expands, its entropy increases: the molecules become more dispersed and the system more disordered. Conversely, when an ideal gas is compressed, its entropy decreases: the molecules are confined to a smaller volume and the system becomes more ordered.
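As a concrete illustration of the expansion case (a minimal sketch of the standard textbook formula, not taken from the excerpts below): for n moles of an ideal gas changing volume reversibly and isothermally from V1 to V2, the entropy change is ΔS = nR ln(V2/V1), positive for an expansion and negative for a compression.

    import math

    R = 8.314  # gas constant, J K^-1 mol^-1

    def entropy_change_isothermal(n_mol, v_initial, v_final):
        """Entropy change (J/K) for n moles of an ideal gas undergoing a
        reversible isothermal volume change: dS = n R ln(V2/V1)."""
        return n_mol * R * math.log(v_final / v_initial)

    # Doubling the volume of 1 mol isothermally:
    print(entropy_change_isothermal(1.0, 1.0, 2.0))  # ~ +5.76 J/K, entropy increases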


7 Key excerpts on "Entropy Change for Ideal Gas"

  • BIOS Instant Notes in Physical Chemistry
    • Gavin Whittaker, Andy Mount, Matthew Heal (Authors)
    • 2000 (Publication Date)
    • Taylor & Francis (Publisher)
    Fig. 1. Work done by an expanding gas under reversible and non-reversible conditions.

    Thermodynamic definition of entropy

    Entropy is a thermodynamic property of a system. It is denoted S and, like the enthalpy and internal energy, it is a state function. In thermodynamic expressions, entropy is defined in terms of changes in entropy rather than its absolute value. For any process in any system, under isothermal conditions, the change in entropy, dS, is defined as:

    dS = dq_rev/T
    The system entropy change for an irreversible process is unchanged compared with that for a reversible process, as entropy is a state function. The entropy change of the surroundings is always −dq/T. Thus the total entropy change is zero for a reversible process and greater than zero for an irreversible process. This is the second law of thermodynamics (see Topic B5).
    It is possible to measure system entropy changes by measuring the heat capacity, C, as a function of temperature. If heat is added reversibly to a system, dq_rev = C dT and dS = C dT/T, and the entropy change is then given by:

    ΔS = ∫_{T1}^{T2} (C/T) dT

    The area under a plot of C/T against T gives a direct measure of the entropy change in a system (see Fig. 2).
    Fig. 2. Calculation of entropy changes from heat capacity data. The entropy change between T1 and T2 is equal to the shaded area under the curve.
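    The "area under a plot of C/T against T" construction translates directly into numerical integration of tabulated heat-capacity data. A minimal sketch (the C(T) data below are invented purely for illustration):

        def entropy_from_heat_capacity(temps, heat_caps):
            """Trapezoidal integration of dS = C dT / T over tabulated (T, C) data."""
            total = 0.0
            for i in range(len(temps) - 1):
                t1, t2 = temps[i], temps[i + 1]
                c1, c2 = heat_caps[i], heat_caps[i + 1]
                total += 0.5 * (c1 / t1 + c2 / t2) * (t2 - t1)  # one trapezoid under C/T
            return total

        # Hypothetical heat-capacity table: T in K, C in J/K.
        temps = [250.0 + i for i in range(101)]              # 250 K .. 350 K
        caps  = [75.0 + 0.02 * (t - 250.0) for t in temps]   # made-up, slowly rising C(T)

        print(entropy_from_heat_capacity(temps, caps))       # ~ 25.6 J/K for this fake data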
    For a phase change at constant pressure, q_rev is equal to ΔH_phase change. In the case of fusion, for example, ΔS_fus = ΔH_fus/T. In the fusion of 1 mole of mercury at 234 K, for example, ΔH_fus = 2333 J, and so ΔS = 2333/234 ≈ 9.97 J K⁻¹. All phase changes may be treated similarly. The entropy change of vaporization, ΔS_vap = ΔH_vap/T, is notable for being dominated by the large absolute entropy of the gas phase. This is very similar for most materials, and gives rise to Trouton's Rule, which states that ΔS_vap is approximately equal to 85 J K⁻¹ mol⁻¹.
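    A quick numeric check of the fusion example, plus Trouton's rule run in reverse to estimate an enthalpy of vaporization (the boiling point below is illustrative, roughly that of benzene):

        # Mercury fusion: dS_fus = dH_fus / T_fus, per the excerpt above.
        dH_fus = 2333.0            # J per mole
        T_fus = 234.0              # K
        print(dH_fus / T_fus)      # ~ 9.97 J/K per mole

        # Trouton's rule: dS_vap ~ 85 J/(K mol) for many liquids, so dH_vap ~ 85 * T_boil.
        T_boil = 353.0             # K, illustrative value (roughly benzene)
        print(85.0 * T_boil)       # ~ 3.0e4 J/mol, i.e. ~30 kJ/mol estimated dH_vap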
  • Biomolecular Thermodynamics: From Theory to Application
    • Douglas Barrick (Author)
    • 2017 (Publication Date)
    • CRC Press (Publisher)
    From Equation 4.13, we can calculate the entropy change as

    ΔS = S_f − S_i = ∫_i^f dS = ∫_i^f dq/T = ∫_i^f (C_p/T) dT    (4.14)
    If the heat capacity is independent of temperature, the integral is easily solved:

    ΔS = C_p ∫_i^f dT/T = C_p ln(T_f/T_i)    (4.15)
    For a monatomic ideal gas (where C_p is independent of T), this can be written as

    ΔS = (5/2) n R ln(T_f/T_i)    (4.16)
    Figure 4.9A compares the temperature dependence of the entropy change to that of the heat flow. Whereas heat flow increases linearly with T_f for the constant pressure expansion, entropy increases logarithmically. As anticipated above, at low temperature, a unit of heat flow produces a large increase in entropy, whereas at higher temperatures, the same unit of heat flow produces a small increase in entropy.
    This logarithmic temperature dependence of the entropy connects to a useful heuristic classical view of the entropy as a measure of the energy in a system that is unavailable to do work (i.e., energy that is not "free"). The thermal energy in a hot system (e.g., a unit of heat flow) is more available for work than in a cold system. In terms of the Carnot cycle (Equation 4.3), efficiency (and thus, work) is maximized when the reservoir delivering the heat is at a high temperature, and the reservoir receiving the wasted heat is at a low temperature. Figure 4.9B compares q and ΔS as a function of volume, which is perhaps a more appropriate coordinate for expansion than temperature. For an ideal gas, the entropy change is (Problem 4.13)

    ΔS = (5/2) n R ln(V_f/V_i)    (4.17)

    since for a constant pressure process T_f/T_i = V_f/V_i.
    For a constant pressure expansion, entropy increases with volume, but the increase is nonlinear (as with temperature, Figure 4.9A).
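    Equations 4.15 and 4.16 are easy to sanity-check numerically. A minimal sketch (function and variable names are my own, not Barrick's):

        import math

        R = 8.314  # gas constant, J K^-1 mol^-1

        def dS_constant_cp(cp, t_i, t_f):
            """Equation 4.15: dS = C_p ln(T_f/T_i), valid when C_p is independent of T."""
            return cp * math.log(t_f / t_i)

        def dS_monatomic_ideal_gas(n_mol, t_i, t_f):
            """Equation 4.16: dS = (5/2) n R ln(T_f/T_i) for a monatomic ideal gas at constant P."""
            return dS_constant_cp(2.5 * n_mol * R, t_i, t_f)

        # Heating 1 mol from 300 K to 600 K at constant pressure:
        print(dS_monatomic_ideal_gas(1.0, 300.0, 600.0))  # ~ 14.4 J/K
        # Note the logarithm: doubling T again (600 K -> 1200 K) adds the same ~14.4 J/K,
        # even though the heat flow q = C_p dT is twice as large for the second doubling.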
  • Philosophy of Chemistry
    • Dov M. Gabbay, Paul Thagard, John Woods (Authors)
    • 2011 (Publication Date)
    • North Holland (Publisher)
    Entropy in Chemistry
    Robert J. Deltete

    1. Introduction

    Contemporary textbooks in physical chemistry and chemical thermodynamics regularly refer to the importance of the concept of entropy in describing the course of chemical reactions and the conditions for chemical equilibrium (e.g., [Winn, 1995, p. 63]). This was not always the case. In fact, for the most part, it was quite the opposite for a long time, enough so that two recent authors could subtitle a paper "the tortuous entry of entropy into chemistry" [Kragh and Weininger, 1996]. In this essay, I begin in Section II with a brief description of the entry of entropy into physics through the work of Rudolf Clausius. I then sketch, in Sections III and IV, the productive use to which the concept was put in the work of Josiah Willard Gibbs and Max Planck, before turning in Section V to the reasons that most chemists did not follow Gibbs and Planck. Section VI offers some speculations on how resistance to entropy on the part of chemists was gradually overcome.

    2. Clausius on Entropy

    The essential step leading to the concept of entropy was taken by Clausius in 1850, when he argued that two laws are needed to reconcile Carnot's principle about the motive power of heat with the law of energy transformation and conservation. Efforts to understand the second of the two laws finally led him in 1865 to his most concise and ultimately most fruitful analytical formulation. In effect, two basic quantities, internal energy and entropy, are defined by the two laws of thermodynamics. The internal energy U is that function of the state of the system whose differential is given by the equation expressing the first law,
    dU = đQ + đW    (1)

    where đQ and đW are, respectively, the heat added to the system and the external work done on the system in an infinitesimal process. For a simple fluid, the work is given by the equation

    đW = −p dV
  • Einstein's Fridge: The Science of Fire, Ice and the Universe

    Replace the open doors in the house with heat engines. Heat flows through them as it moves from hotter to colder rooms. In each engine, some heat is turned into work – perhaps it pumps water out of a mine. The rest disperses. Eventually, the rooms’ temperatures equalise. Once the house reaches this state of maximum entropy, the engines will stop working. The heat in the house will no longer be of any use.
    Increasing entropy is thus a measure of the decreasing usefulness of heat.
    All this can seem fanciful. But the multichambered house is a way of understanding any system in which heat disperses. It is a simulacrum for the modern world. We release concentrated heat from fossil fuels, atomic nuclei, sunshine, geothermal sources, or wind. As it flows, we turn some into work that enables our homes, factories, and transport.

    Using engines to exploit low entropy

    Life, too, runs on this principle. Plants live by dispersing solar energy, animals by dissipating calories from food. ΔS ≥ 0 rules us all.
    In 1865, Clausius revisited the two laws of thermodynamics that he had first stated in his paper of fifteen years prior. He updated them by employing the word energy instead of Kraft, and he added his own coinage, entropy. The laws state:

    1. The energy of the universe is constant.
    2. The entropy of the universe tends to a maximum.
    (Universe means any system that’s closed or sealed off. But because the universe we live in has nothing beyond it, it is true that its energy cannot change and its entropy tends to rise. More intuitively, the second law can be stated: the entropy of any closed system tends to increase.)
    These two lapidary statements are a testament to the human intellect and imagination. They are a scientific milestone every bit as significant as Newton’s laws of motion, which were published two centuries earlier.
    Since 1865, when Clausius published these principles, they have been at the forefront of physics, helping humans to better understand everything from atoms to living cells to black holes.
  • Fundamentals of Engineering Thermodynamics
    • V. Babu (Author)
    • 2019 (Publication Date)
    • CRC Press (Publisher)
    CHAPTER 9

    ENTROPY

    In the previous chapter, the focus was primarily on cyclic processes and their efficiencies. In this chapter, the focus is on individual processes with the objective of assessing their performance. As mentioned earlier, real world processes are irreversible and they cause a degradation in the performance of not only the cycle, but the individual processes that comprise the cycle as well. In this chapter, a metric is developed for assessing the amount of this degradation. This metric utilizes the change in a new property called entropy. By evaluating the change in entropy of an ideal process and the actual process, a comparison of their performance can be carried out and means to improve the performance of the latter may be proposed. Calculation of the change of entropy for a system as well as the rate of change of entropy in a control volume are discussed in detail. A very fundamental law involving entropy, namely, the principle of increase of entropy is also given. This is regarded universally as a profound statement on the evolution of the universe and everything within. Hence, this forms the starting point for most non-engineering thermodynamic treatises.

    9.1    Clausius inequality

    Consider an internally reversible process A-B shown in Fig. 9.1 (left) that is executed by a system. The same is shown in Fig. 9.2 as well. It is desired to replace this process by a sequence of processes, namely, A-1, 1-2 and 2-B (Fig. 9.1, right). The first and the last processes in this sequence are reversible adiabatic processes and are unique, since only one reversible adiabat each passes through state points A and B. The middle process in this sequence is a reversible isothermal process and is not unique, since state points 1 and 2 are, as yet, not known. We determine this isotherm by requiring that the net work and heat interaction for process A-B and A-1-2-B be the same. This is illustrated graphically in Figs. 9.1 and 9.2.
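    The end point of this construction is the standard statement of the Clausius inequality: for any cycle executed by a system, ∮ δQ/T ≤ 0, with equality holding when the cycle is internally reversible. Entropy is then defined as the property whose change between two states is dS = (δQ/T) evaluated along an internally reversible path.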
  • Quest For A Unified Theory
    • Wolfgang Hofkirchner (Author)
    • 2013 (Publication Date)
    • Routledge (Publisher)
    dS − đq/T = σ    (3)
    σ can never be negative. The positivity of σ expresses the unidirectionality of spontaneous changes; σ is the "time arrow". Entropy production is a measure of changes: when nothing happens, σ is zero, and σ > 0 is a sign that something happened.
    The great success of the entropy approach is classical thermodynamics, a theory describing systems in equilibrium or undergoing reversible processes, particularly applicable to isolated systems (systems enclosed by isolating walls). In isolated systems the equilibrium state is characterized by the entropy maximum. When S = S₀ there is no room for further changes; the system is "dead". Non-equilibrium thermodynamics describes the processes themselves, and provides tools to calculate non-equilibrium entropy changes.
    There were attempts to derive thermodynamics on an informational basis. In a wonderful paper, "Information and thermodynamics", Rothstein [52] described the connection. He argued that entropy is the missing information. Rothstein said that, from an informational viewpoint, a quantity of heat is energy transferred in a manner which has eluded mechanical description, about which information is lacking in terms of mechanical categories, and that entropy can be interpreted as a measure of missing information relative to some standard state. Rothstein used this relation to give thermodynamics an informational formulation. "The basic laws of thermodynamics can be stated as:
     
    a. The conservation of energy.
    b. The existence of modes of energy transfer incapable of mechanical description.
    c. The third law is true by definition; for a perfectly ordered state at absolute zero there is no missing information."
    2.3 Statistical Entropy
    In 1872 Boltzmann defined entropy in terms of possible microstates:

    S = k log W    (4)

    where W
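    A small sketch of Equation (4) on a toy system (the two-state counting below is my own illustration; Boltzmann's log is the natural logarithm):

        import math

        k_B = 1.380649e-23  # Boltzmann constant, J/K

        def boltzmann_entropy(W):
            """Equation (4): S = k log W, with W the number of microstates."""
            return k_B * math.log(W)

        # Toy macrostate: N two-state particles with exactly n in the "up" state.
        # W = N! / (n! (N - n)!) microstates realize that macrostate.
        N, n = 100, 50
        W = math.comb(N, n)
        print(W)                      # ~ 1.01e29 microstates
        print(boltzmann_entropy(W))   # ~ 9.2e-22 J/K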
  • Elements of Gas Dynamics
    Article 1.8, we could make a similar statement about the temperature discontinuity. Such a measure of irreversibility is possible but not very useful since it is not general enough. We are trying to find a general criterion such that these simple conditions come out as special cases.
    This may be found in a new variable of state which is called S, the entropy. S and T are both concepts that are particular to thermodynamics. In fact, one can introduce S formally as follows: E, the internal energy, has the character of a potential energy; for a reversible change of state of an adiabatically enclosed system, we have (e.g., Eq. 1.27)

    p = −(∂E/∂V)

    i.e., the pressure is equal to the derivative of the internal energy with respect to the volume, or the force is equal to the derivative of E with respect to displacement. We can ask whether a similar relation can be found for the temperature T. Hence we try to write E as a function of two variables such that the partial derivatives are p and T, respectively; this variable we call S. Thus, let us write tentatively,

    E = E(S, V)    (1.37)

    such that

    (∂E/∂S)_V = T,   (∂E/∂V)_S = −p

    and hence

    dE = T dS − p dV.
    The first law in the form (1.8a) yields

    dQ = dE + p dV = T dS

    and hence we identify:

    dQ_rev = T dS.

    Hence T dS is equal to the element of heat added in a reversible process. Integrating, we have

    S_B − S_A = ∫_A^B dQ_rev/T    (1.38)

    where the integral is to be evaluated for a reversible process leading from a state A to a state B.
    S was defined as a variable of state in Eq. 1.37. Thus to relate it to dQ the system has to be in thermodynamic equilibrium, and hence we have to assume a reversible process. We now demonstrate that S as defined by Eq. 1.37 or Eq. 1.38 has the properties that we expect. For this purpose, we shall work out Eq. 1.38 in detail for a simple irreversible process, e.g., the mixing of a calorically perfect gas of initially nonuniform temperature (the second example of Article 1.8). We shall consider the process through which we return the system reversibly to the initial condition: state B consists of a gas with mass (M1 + M2) at a temperature T_B. We intend to return the system to state A, where the mass M1 has the temperature T1 and the mass M2 the temperature T2. For simplicity, choose M1 = M2 = M