Entropy

From Wikipedia, the free encyclopedia

The constant of proportionality k depends on what units are chosen to measure S. When SI units are chosen, we have k = k_B = Boltzmann's constant = 1.38066×10^−23 J K^−1. If units of bits are chosen, then k = 1/ln(2), so that S = −∑_i p_i log2 p_i.

Entropy is central to the second law of thermodynamics. The second law, in conjunction with the fundamental thermodynamic relation, places limits on a system's ability to do useful work.[3][4] The second law can also be used to predict whether a physical process will proceed spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy.

The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").[5]

Definitions and descriptions

[Figure: Ice melting is a common example of "entropy increasing",[1] described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.[2]]

In science, the term "entropy" is generally interpreted in three distinct but semi-related ways: from a macroscopic viewpoint (classical thermodynamics), a microscopic viewpoint (statistical thermodynamics), and an information viewpoint (information theory). The statistical definition of entropy (see below) is the fundamental definition, because the other two can be mathematically derived from it, but not vice versa. All properties of entropy (including the second law of thermodynamics) follow from this definition.

Entropy is a concept applied across physics, information theory, mathematics and other branches of science and engineering. The following definition is shared across all these fields:

Microscopic definition of entropy (statistical mechanics)

In statistical thermodynamics, entropy is defined as

    S = −k ∑_i p_i ln p_i

where S is the conventional symbol for entropy, the sum runs over all microstates consistent with the given macrostate, p_i is the probability of the ith microstate, and k is the constant of proportionality discussed above. For almost all practical purposes, this can be taken as the fundamental definition of entropy, since all other formulas for S can be mathematically derived from it, but not vice versa. (In some rare and recondite situations, a generalization of this formula may be needed to account for quantum coherence effects, but in any situation where a classical notion of probability makes sense, S = −k ∑_i p_i ln p_i is the entropy.)

In Boltzmann's 1896 Lectures on Gas Theory, he showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.

If we now restrict attention to a microcanonical system, i.e. a system where all accessible microstates have the same probability, then as a corollary of the definition of entropy it is easy to show that

    S = k ln Ω

where Ω is the number of microstates corresponding to the observed thermodynamic macrostate (an "accessible" microstate is one with nonzero probability, in contrast to "inaccessible" microstates, which all have zero probability). As previously stated, k depends on the choice of units.

In essence, the most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system.[6] This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model. The interpretative model has a central role in determining entropy.
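The statistical definition above can be evaluated directly. The following sketch (in Python; the microstate probabilities are invented for illustration and are not from the article) computes S = −k ∑_i p_i ln p_i in joules per kelvin and in bits, and checks the microcanonical special case S = k ln Ω:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K (current SI value; the article quotes 1.38066e-23)

    def gibbs_entropy(probs, k=k_B):
        """S = -k * sum_i p_i ln p_i over the microstate probabilities p_i."""
        return -k * sum(p * math.log(p) for p in probs if p > 0.0)

    # A toy macrostate with four accessible microstates (illustrative numbers).
    p = [0.4, 0.3, 0.2, 0.1]
    print(gibbs_entropy(p))                   # thermodynamic units, J/K
    print(gibbs_entropy(p, k=1/math.log(2)))  # k = 1/ln(2) gives the entropy in bits

    # Microcanonical special case: all Omega accessible microstates equally
    # probable, so S reduces to k ln(Omega).
    omega = 4
    uniform = [1.0 / omega] * omega
    assert abs(gibbs_entropy(uniform) - k_B * math.log(omega)) < 1e-30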
In 1877, Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics, according to Erwin Schrödinger, has been to determine the distribution of a given amount of energy E over N identical systems. In general, entropy can be defined for any Markov process with reversible dynamics and the detailed balance property.

Statistical mechanics explains entropy as the amount of uncertainty (or "mixedupness", in the phrase of Gibbs) which remains about a system after its observable macroscopic properties have been taken into account. For a given set of macroscopic variables, like temperature and volume, the entropy measures the degree to which the probability of the system is spread out over different possible quantum states. The more states available to the system with appreciable probability, the greater the entropy. More specifically, entropy is a logarithmic measure of the density of states.

The qualifier "for a given set of macroscopic variables" has very deep implications: if two observers use different sets of macroscopic variables, then they will observe different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment; otherwise one might see decreasing entropy![7]

Macroscopic viewpoint (classical thermodynamics)

[Sidebar: Conjugate variables of thermodynamics – pressure and volume; (stress) and (strain); temperature and entropy; chemical potential and particle number.]

In a thermodynamic system, a "universe" consisting of "surroundings" and "systems" and made up of quantities of matter, its pressure differences, density differences, and temperature differences all tend to equalize over time, because the equilibrium state has higher probability (more possible combinations of microstates) than any other; see statistical mechanics. In the ice melting example, the difference in temperature between a warm room (the surroundings) and a cold glass of ice and water (the system, and not part of the room) begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water.
[Figure: A thermodynamic system]

Over time the temperature of the glass and its contents and the temperature of the room become equal. The entropy of the room has decreased as some of its energy has been dispersed to the ice and water. However, as calculated in the example, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system, such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the "universe" of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work; the entropy increase will be entirely due to the mixing of the different substances.[8]

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE and its entropy falls by ΔS, a quantity at least T_R ΔS of that energy must be given up to the system's surroundings as unusable heat (T_R is the temperature of the system's external surroundings). Otherwise the process will not go forward.

In 1862, Clausius stated what he called the "theorem respecting the equivalence-values of the transformations", or what is now known as the second law of thermodynamics, as such:

    The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing.

Quantitatively, Clausius states the mathematical expression for this theorem as follows. Let δq be an element of the heat given up by the body to any reservoir of heat during its own changes, heat which it may absorb from a reservoir being here reckoned as negative, and T the absolute temperature of the body at the moment of giving up this heat; then the equation

    ∮ δq/T = 0

must be true for every reversible cyclical process, and the relation

    ∮ δq/T ≥ 0

must hold good for every cyclical process which is in any way possible. This is the essential formulation of the second law and one of the original forms of the concept of entropy. It can be seen that the dimensions of entropy are energy divided by temperature, which is the same as the dimensions of Boltzmann's constant (k_B) and heat capacity. The SI unit of entropy is the joule per kelvin (J K^−1).

In this manner, the quantity ΔS is utilized as a type of internal energy, which accounts for the effects of irreversibility, in the energy balance equation for any given system. In the Gibbs free energy equation ΔG = ΔH − TΔS, for example, a formula commonly utilized to determine whether chemical reactions will occur, the energy related to entropy changes, TΔS, is subtracted from the "total" system energy, ΔH, to give the "free" energy, ΔG, of the system, as during a chemical process or when a system changes state.
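Clausius's cyclic relations above admit a rough numerical check, using his sign convention (heat given up by the working body counted as positive, heat absorbed as negative): for a reversible Carnot cycle the sum of δq/T over the cycle vanishes, while an irreversible cycle, which rejects extra heat, gives a positive sum. The figures below are illustrative, not from the text:

    # Each leg: (heat given up by the working body in J, absorbed counted
    # negative; reservoir temperature in K) -- Clausius's sign convention.
    def clausius_sum(legs):
        return sum(q / T for q, T in legs)

    T_hot, T_cold = 500.0, 300.0
    Q_in = 1000.0                      # heat absorbed from the hot reservoir

    # Reversible Carnot cycle: Q_out/Q_in = T_cold/T_hot, so the sum vanishes.
    rev = [(-Q_in, T_hot), (Q_in * T_cold / T_hot, T_cold)]
    print(clausius_sum(rev))           # 0.0

    # An irreversible cycle rejects extra heat, and the sum is positive.
    irrev = [(-Q_in, T_hot), (700.0, T_cold)]
    print(clausius_sum(irrev))         # ~ +0.33 > 0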
In systems held at constant temperature, the change in entropy, ΔS, is given by the equation

    ΔS = Q/T

where Q is the amount of heat absorbed by the system in an isothermal and reversible process in which the system goes from one state to another, and T is the absolute temperature at which the process is occurring.[9] If the temperature of the system is not constant, then the relationship becomes a differential equation:

    dS = δQ/T

Then the total change in entropy for a transformation is:

    ΔS = ∫ δQ/T

This thermodynamic approach to calculating the entropy is subject to several narrow restrictions, which must be respected. In contrast, the fundamental statistical definition of entropy applies to any system, including systems far from equilibrium and experiments where "heat" and "temperature" are undefinable. In situations where the thermodynamic approach is valid, it can be shown to be consistent with the fundamental statistical definition. In any case, the statistical definition of entropy remains the fundamental definition, from which all other definitions and all properties of entropy can be derived.

Correspondence

The statistical definition of entropy matches the thermodynamic formula for calculating entropy, because adding heat to a system, which increases its classical thermodynamic entropy, also increases the system's thermal fluctuations, giving an increased lack of information about the exact microscopic state of the system, i.e. an increased statistical mechanical entropy. The thermodynamic approach to entropy is less general, because it only applies to systems where energy and temperature are well defined. In contrast, the statistical notion of entropy applies to all of thermodynamics as well as to other systems, such as cryptography, data compression and pattern recognition, where energy and temperature may be irrelevant or undefinable.

Entropy versus heat and temperature

Loosely speaking, when a system's energy is divided into its "useful" energy (energy that can be used, for example, to push a piston) and its "useless" energy (energy that cannot be used to do external work), then entropy can be used to estimate the "useless", "stray", or "lost" energy, which depends on the entropy of the system and the absolute temperature of the surroundings. As the "useful" and "useless" energy both depend on the surroundings, neither one is a function of the state of the system, and both can be quite tricky to quantify. This stands in contrast to the system's Gibbs free energy, Helmholtz free energy, entropy, and temperature, all of which are well-defined functions of state. The Gibbs and Helmholtz free energies depend on the temperature of the system (not the surroundings), and do not purport to measure the "useful" energy.

When heat is added to a system at high temperature, the increase in entropy is small. When heat is added to a system at low temperature, the increase in entropy is great. This can be quantified as follows: in thermal systems, changes in the entropy can be ascertained by observing the temperature while observing changes in energy. This is restricted to situations where thermal conduction is the only form of energy transfer (in contrast to frictional heating and other dissipative processes), and it is further restricted to systems at or near thermal equilibrium.
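The two formulas above can be illustrated with the ice-water example; the latent heat and specific heat figures below are standard textbook values, not taken from this article. ΔS = Q/T covers the isothermal melting step, and ΔS = ∫ δQ/T = m c ln(T2/T1) covers the subsequent warming of the melt water:

    import math

    def delta_S_isothermal(Q, T):
        """dS = Q/T for a reversible transfer of heat Q at constant temperature T."""
        return Q / T

    def delta_S_heating(m, c, T1, T2):
        """dS = integral of dQ_rev/T with dQ_rev = m c dT, giving m c ln(T2/T1)."""
        return m * c * math.log(T2 / T1)

    # Melting 0.1 kg of ice at 273.15 K (latent heat of fusion ~ 334 kJ/kg) ...
    print(delta_S_isothermal(0.1 * 334e3, 273.15))        # ~122 J/K
    # ... then warming the melt water to 298.15 K (c ~ 4186 J/(kg K)).
    print(delta_S_heating(0.1, 4186.0, 273.15, 298.15))   # ~37 J/K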
History

[Figure: Rudolf Clausius, originator of the concept of entropy]

The first law of thermodynamics, formalized through the heat-friction experiments of James Joule in 1843, deals with the concept of energy, which is conserved in all processes; the first law, however, lacks the ability to quantify the effects of friction and dissipation.

The concept of entropy began with the work of French mathematician Lazare Carnot, who in his 1803 paper Fundamental Principles of Equilibrium and Movement proposed that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, in which he set forth the view that in all heat-engines, whenever "caloric" (what is now known as heat) falls through a temperature difference, work or motive power can be produced from the actions of the "fall of caloric" between a hot and a cold body. This was an early insight into the second law of thermodynamics.

Carnot based his views of heat partially on the early 18th-century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford, who showed in 1789 that heat could be created by friction, as when cannon bores are machined.[10] Accordingly, Carnot reasoned that if the body of the working substance, such as a body of steam, is brought back to its original state (temperature and pressure) at the end of a complete engine cycle, "no change occurs in the condition of the working body". This latter comment was amended in his footnotes, and it was this comment that led to the development of entropy.

In the 1850s and 1860s, German physicist Rudolf Clausius gravely objected to this latter supposition, i.e. that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction.[11] Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[11] This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.

Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

Consequences and applications

The second law

The second law of thermodynamics states that the total entropy of any system cannot decrease except insofar as entropy flows outward across the boundary of the system. As a corollary, in an isolated system the entropy cannot decrease (the second law places no restrictions on the increase of entropy). By implication, the entropy of the whole universe, assumed to be an isolated system, cannot decrease; it is always increasing, because there are processes that produce entropy from scratch, and the second law dictates that these increases cannot be undone elsewhere.

Two important consequences are that heat cannot of itself pass from a colder to a hotter body: i.e., it is impossible to transfer heat from a cold to a hot reservoir without at the same time converting a certain amount of work to heat. It is also impossible for any device that can operate on a cycle to receive heat from a single reservoir and produce a net amount of work; it can only get useful work out of the heat if heat is at the same time transferred from a hot to a cold reservoir.
This means that there is no possibility of a "perpetual motion" machine. It also follows that a reduction in the increase of entropy in a specified process, such as a chemical reaction, means that the process is energetically more efficient.

In general, according to the second law, the entropy of a system that is not isolated may decrease. An air conditioner, for example, cools the air in a room, thus reducing the entropy of the air. The heat involved in operating the air conditioner, however, always makes a bigger contribution to the entropy of the environment than the decrease in the entropy of the air. Thus the total entropy of the room and the environment increases, in agreement with the second law.

The arrow of time

Entropy is the only quantity in the physical sciences that seems to imply a particular direction for time, sometimes called an arrow of time. As we go "forward" in time, the second law of thermodynamics states that the entropy of an isolated system tends to increase or remain the same; it will not decrease. Hence, from one perspective, entropy measurement can be thought of as a kind of clock.

Entropy in chemical thermodynamics

Thermodynamic entropy is central in chemical thermodynamics, enabling changes to be quantified and the outcome of reactions predicted. The second law of thermodynamics states that entropy in the combination of a system and its surroundings (or in an isolated system by itself) increases during all spontaneous chemical and physical processes. Spontaneity in chemistry means "by itself, or without any outside influence", and has nothing to do with speed.

The Clausius equation δq_rev/T = ΔS introduces the measurement of entropy change, ΔS. Entropy change describes the direction and quantifies the magnitude of simple changes such as heat transfer between systems, which always proceeds spontaneously from hotter to cooler.[12] Thus, when a mole of substance at 0 K is warmed by its surroundings to 298 K, the sum of the incremental values of q_rev/T constitutes each element's or compound's standard molar entropy, a fundamental physical property and an indicator of the amount of energy stored by a substance at 298 K.[13][14] Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture.[15]

Entropy is equally essential in predicting the extent of complex chemical reactions, i.e. whether a process will go as written or proceed in the opposite direction. For such applications, ΔS must be incorporated in an expression that includes both the system and its surroundings, ΔS_universe = ΔS_surroundings + ΔS_system. This expression becomes, via some steps, the Gibbs free energy equation for reactants and products in the system: ΔG [the Gibbs free energy change of the system] = ΔH [the enthalpy change] − TΔS [the entropy change].[13]
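A small sketch of how ΔG = ΔH − TΔS is used in practice. The enthalpy and entropy of fusion of ice below are typical textbook values (not from this article); with them, melting comes out non-spontaneous below 273 K and spontaneous above it:

    def gibbs_free_energy_change(dH, dS, T):
        """dG = dH - T*dS; a negative value signals a spontaneous process."""
        return dH - T * dS

    # Illustrative textbook values for melting one mole of ice:
    dH_fus = 6010.0   # J/mol
    dS_fus = 22.0     # J/(mol K), roughly dH_fus / 273.15

    for T in (263.15, 273.15, 298.15):
        dG = gibbs_free_energy_change(dH_fus, dS_fus, T)
        print(f"T = {T:.2f} K: dG = {dG:+.0f} J/mol ->",
              "spontaneous" if dG < 0 else "not spontaneous (dG ~ 0 means equilibrium)")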
The fundamental thermodynamic relation

The entropy of a system depends on its internal energy and on external variables, such as the volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy to changes in the entropy and the external variables. This relation is known as the fundamental thermodynamic relation. If the volume is the only external variable, this relation is:

    dE = T dS − P dV

Since the internal energy is fixed when one specifies the entropy and the volume, this relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium, and then the entropy, pressure and temperature may not exist).

The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.

Entropy balance equation for open systems

In chemical engineering, the principles of thermodynamics are commonly applied to "open systems", i.e. those in which heat, work, and mass flow across the system boundary. In a system in which there are flows of both heat (Q̇) and work, i.e. shaft work and P(dV/dt) (pressure-volume work), across the system boundaries, the heat flow, but not the work flow, causes a change in the entropy of the system. This rate of entropy change is Q̇/T, where T is the absolute thermodynamic temperature of the system at the point of the heat flow. If, in addition, there are mass flows across the system boundaries, the total entropy of the system will also change due to this convected flow.

During steady-state continuous operation, an entropy balance applied to an open system accounts for system entropy changes related to heat flow and mass flow across the system boundary. To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. Using this generic balance equation, with respect to the rate of change with time of the extensive quantity entropy S, the entropy balance equation for an open thermodynamic system is:[16]

    dS/dt = ∑_k Ṁ_k Ŝ_k + Q̇/T + Ṡ_gen

where ∑_k Ṁ_k Ŝ_k is the net rate of entropy flow due to the flows of mass into and out of the system (with Ŝ the entropy per unit mass), Q̇/T is the rate of entropy flow due to the flow of heat across the system boundary, and Ṡ_gen is the rate of internal generation of entropy within the system. Note also that if there are multiple heat flows, the term Q̇/T is to be replaced by ∑_j Q̇_j/T_j, where Q̇_j is the heat flow and T_j is the temperature at the jth heat flow port into the system.
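A minimal sketch of this balance, assuming a hypothetical steady-state unit with one inlet, one outlet and a single heat leak (all numbers invented for illustration). At steady state dS/dt = 0, so the balance can be solved for the internal generation term, which must come out non-negative:

    def entropy_balance_rate(mass_flows, heat_flows, S_gen):
        """dS/dt = sum_k Mdot_k * Shat_k + sum_j Qdot_j / T_j + Sdot_gen.

        mass_flows: (mass flow rate in kg/s, specific entropy in J/(kg K))
                    pairs, positive into the system;
        heat_flows: (heat flow in W, boundary temperature in K) pairs;
        S_gen:      internal entropy generation rate, W/K (never negative).
        """
        convected = sum(mdot * s for mdot, s in mass_flows)
        conducted = sum(Q / T for Q, T in heat_flows)
        return convected + conducted + S_gen

    # Hypothetical mixer: one inlet, one outlet, 50 kW lost at a 320 K boundary.
    flows = [(2.0, 1500.0), (-2.0, 1520.0)]   # kg/s, J/(kg K)
    heats = [(-50e3, 320.0)]                  # W, K

    # Steady state (dS/dt = 0) lets one back out the generation term:
    S_gen = -entropy_balance_rate(flows, heats, 0.0)
    print(S_gen)  # ~196 W/K: >= 0, as required for a physically possible process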
Entropy in quantum mechanics (von Neumann entropy)

In quantum statistical mechanics, the concept of entropy was developed by John von Neumann and is generally referred to as "von Neumann entropy", namely

    S = −k Tr(ρ ln ρ)

where ρ is the density matrix and Tr is the trace operator. This upholds the correspondence principle, because in the classical limit, i.e. whenever the classical notion of probability applies, this expression is equivalent to the familiar classical definition of entropy, S = −k ∑_i p_i ln p_i.

Von Neumann established a rigorous mathematical framework for quantum mechanics with his work Mathematische Grundlagen der Quantenmechanik. He provided in this work a theory of measurement, where the usual notion of wave collapse is described as an irreversible process (the so-called von Neumann or projective measurement). Using this concept, in conjunction with the density matrix, he extended the classical concept of entropy into the quantum domain.

It is well known that a Shannon-based definition of information entropy leads in the classical case to the Boltzmann entropy. It is tempting to regard the von Neumann entropy as the corresponding quantum mechanical definition, but the latter is problematic from the quantum information point of view. Consequently, Stotland, Pomeransky, Bachmat and Cohen have introduced a new definition of entropy that reflects the inherent uncertainty of quantum mechanical states. This definition allows one to distinguish between the minimum uncertainty entropy of pure states and the excess statistical entropy of mixtures.[17]
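The von Neumann formula is straightforward to evaluate numerically through the eigenvalues λ_i of ρ, since Tr(ρ ln ρ) = ∑_i λ_i ln λ_i. A sketch for a single qubit (the cutoff used to discard zero eigenvalues is an implementation choice, not part of the definition):

    import numpy as np

    k_B = 1.380649e-23  # J/K; pass k=1 instead for entropy in nats

    def von_neumann_entropy(rho, k=k_B):
        """S = -k Tr(rho ln rho), evaluated through the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]          # 0 ln 0 -> 0 by convention
        return -k * float(np.sum(evals * np.log(evals)))

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])     # a pure state
    mixed = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed qubit

    print(von_neumann_entropy(pure))    # 0: no uncertainty about a pure state
    print(von_neumann_entropy(mixed))   # k_B ln 2, the classical two-state value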
Approaches to understanding entropy

Order and disorder

Entropy has often been loosely associated with the amount of order, disorder, and/or chaos in a thermodynamic system. The traditional qualitative description of entropy is that it refers to changes in the status quo of the system and is a measure of "molecular disorder" and the amount of wasted energy in a dynamical energy transformation from one state or form to another.[18] In this direction, a number of authors have in recent years derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.[19][20][21][22] One of the simpler entropy order/disorder formulas is that derived in 1984 by thermodynamic physicist Peter Landsberg, based on a combination of thermodynamics and information theory arguments. Landsberg argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of "disorder" in the system is given by the following expression:[21][22]

    Disorder = C_D/C_I

Similarly, the total amount of "order" in the system is given by:

    Order = 1 − C_O/C_I

in which C_D is the "disorder" capacity of the system, which is the entropy of the parts contained in the permitted ensemble, C_I is the "information" capacity of the system, an expression similar to Shannon's channel capacity, and C_O is the "order" capacity of the system.[20]

Energy dispersal

The concept of entropy can be described qualitatively as a measure of energy dispersal at a specific temperature.[23] Similar terms have been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels. Ambiguities in the terms disorder and chaos, which usually have meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students.[24] As the second law of thermodynamics shows, in an isolated system internal portions at different temperatures will tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids ambiguous terms and describes such spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the first law of thermodynamics.[25] Physical chemist Peter Atkins, for example, who previously wrote of dispersal leading to a disordered state, now writes that "spontaneous changes are always accompanied by a dispersal of energy", and has discarded "disorder" as a description.[12][26]

Ice melting example

The illustration for this article is a classic example in which entropy increases in a small "universe", a thermodynamic system consisting of the "surroundings" (the warm room) and the "system" (glass, ice, cold water). In this universe, some heat energy δQ from the warmer room surroundings (at 298 K, or 25 °C) will spread out to the cooler system of ice and water at its constant temperature T of 273 K (0 °C), the melting temperature of ice. The entropy of the system will change by the amount dS = δQ/T, in this example δQ/273 K. (The heat δQ for this process is the energy required to change water from the solid state to the liquid state: the enthalpy of fusion, i.e. the ΔH for ice fusion.) The entropy of the surroundings will change by the amount dS = −δQ/298 K. So in this example, the entropy of the system increases, whereas the entropy of the surroundings decreases.

It is important to realize that the decrease in the entropy of the surrounding room is less than the increase in the entropy of the ice and water: the room temperature of 298 K is larger than 273 K, and therefore the ratio (entropy change) of δQ/298 K for the surroundings is smaller than the ratio (entropy change) of δQ/273 K for the ice+water system. To find the entropy change of our "universe", we add up the entropy changes for its constituents: the surrounding room and the ice+water. The total entropy change is positive; this is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.

As the temperature of the cool water rises to that of the room and the room cools further imperceptibly, the sum of the δQ/T over the continuous range, at many increments, from the initially cool to the finally warm water can be found by calculus. The entire miniature "universe", i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that "universe" than when the glass of ice water was introduced and became a "system" within it.

Notice that the system will reach a point where the room, the glass and the contents of the glass will be at the same temperature. In this situation, nothing else can happen: although heat does exist in the room (in fact, the amount of heat is the same as at the beginning, since it is a closed system), it is now unable to do useful work, as there are no more heat transfers. Unless an external event intervenes (thus breaking the definition of a closed system), the room is destined to remain in the same condition for all eternity.
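The bookkeeping of this example is short enough to compute directly. The amount of heat transferred below is arbitrary, but the two temperatures are the ones used in the text:

    # Entropy bookkeeping for the ice-melting "universe" described above.
    dQ = 1000.0        # J of heat leaving the room and entering the ice water
    T_system = 273.0   # K, melting ice
    T_room = 298.0     # K, the warm surroundings

    dS_system = dQ / T_system    # +3.66 J/K
    dS_room = -dQ / T_room       # -3.36 J/K
    dS_universe = dS_system + dS_room
    print(dS_universe)           # +0.31 J/K: positive, as the second law demands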
Following the same reasoning, but considering the whole universe as our "room", we reach a similar conclusion: at a certain point in the distant future, the whole universe will be a uniform, isothermal and inert body of matter, in which there will be no available energy to do work. This condition is known as the "heat death of the Universe".

Entropy and life

For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, asserted in the 1944 book What is Life? by physicist Erwin Schrödinger, served as a further stimulus to this research. Recent writings have used the concept of Gibbs free energy to elaborate on this issue. In the 1982 textbook Principles of Biochemistry by American biochemist Albert Lehninger, for example, it is argued that the "order" produced within cells as they grow and divide is more than compensated for by the "disorder" they create in their surroundings in the course of growth and division. In short, according to Lehninger, "living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."[27]

Evolution-related definitions:

• Negentropy – a shorthand colloquial phrase for negative entropy.[28]
• Ectropy – a measure of the tendency of a dynamical system to do useful work and grow more organized.[18]
• Syntropy – a tendency towards order and symmetrical combinations and designs of ever more advantageous and orderly patterns.
• Extropy – a metaphorical term defining the extent of a living or organizational system's intelligence, functional order, vitality, energy, life, experience, and capacity and drive for improvement and growth.
• Ecological entropy – a measure of biodiversity in the study of biological ecology.

In a study titled "Natural selection for least action", published in the Proceedings of the Royal Society A, Ville Kaila and Arto Annila of the University of Helsinki describe how the second law of thermodynamics can be written as an equation of motion to describe evolution, showing how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. In this view, evolution explores possible paths to level differences in energy densities and so increase entropy most rapidly. Thus, an organism serves as an energy transfer mechanism, and beneficial mutations allow successive organisms to transfer more energy within their environment.[29]

Other relations

Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, thermoeconomics, and evolution.[20][31][32]

Entropy and information theory

In information theory, entropy is the measure of the amount of information that is missing before reception, and is sometimes referred to as Shannon entropy.[33] Shannon entropy is a broad and general concept which finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message.
The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities p_i:

    H = −∑_i p_i log p_i

where the base of the logarithm sets the units. In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of how much information was in the message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.

The question of the link between information entropy and thermodynamic entropy is a hotly debated topic. Some authors argue that there is a link between the two,[34][35][36] while others argue that they have absolutely nothing to do with each other.[37] The expressions for the two entropies are very similar. The information entropy H for equal probabilities p_i = p is:

    H = −K ln p

where K is a constant which determines the units of entropy. For example, if the units are bits, then K = 1/ln(2). The thermodynamic entropy S, from a statistical mechanical point of view, was first expressed by Boltzmann:

    S = −k ln p

where p is the probability of a system being in a particular microstate, given that it is in a particular macrostate, and k is Boltzmann's constant. It can be seen that one may think of the thermodynamic entropy as Boltzmann's constant, divided by ln(2), times the number of yes/no questions that must be asked in order to determine the microstate of the system, given that we know the macrostate.
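This correspondence can be made concrete in a few lines. For eight equally probable alternatives the Shannon entropy is three bits (three yes/no questions), and multiplying the number of bits by k_B ln 2 converts it to thermodynamic units:

    import math

    k_B = 1.380649e-23  # J/K

    def shannon_bits(probs):
        """H = -sum p_i log2 p_i, the missing information in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0.0)

    # 8 equally likely microstates: 3 yes/no questions pin down the state.
    H = shannon_bits([1/8] * 8)
    print(H)                       # 3.0 bits

    # The same uncertainty expressed as thermodynamic entropy:
    # S = k_B ln(2) * (number of bits).
    print(k_B * math.log(2) * H)   # ~2.87e-23 J/K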
The link between thermodynamic and information entropy was developed in a series of papers by Edwin Jaynes beginning in 1957.[38] There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors, like Tom Schneider, argue for dropping the word entropy for the H function of information theory and using Shannon's other term, "uncertainty", instead.[39]

Entropy and cosmology

As a finite universe may be considered an isolated system, it may be subject to the second law of thermodynamics, so that its total entropy is constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source. If the universe can be considered to have generally increasing entropy, then, as Roger Penrose has pointed out, gravity plays an important role in the increase, because gravity causes dispersed matter to accumulate into stars, which collapse eventually into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. Hawking has, however, recently changed his stance on this aspect.

The role of entropy in cosmology remains a controversial subject. Recent work has cast some doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, so the entropy density is decreasing with time. This results in an "entropy gap", pushing the system further away from equilibrium. Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.[30]

Miscellaneous definitions

• Entropy unit – a non-SI unit of thermodynamic entropy, usually denoted "e.u." and equal to one calorie per kelvin per mole, or 4.184 joules per kelvin per mole.[51]
• Gibbs entropy – the usual statistical mechanical entropy of a thermodynamic system.
• Boltzmann entropy – a type of Gibbs entropy, which neglects internal statistical correlations in the overall particle distribution.
• Tsallis entropy – a generalization of the standard Boltzmann-Gibbs entropy.
• Standard molar entropy – the entropy content of one mole of substance, under conditions of standard temperature and pressure.
• Black hole entropy – the entropy carried by a black hole, which is proportional to the surface area of the black hole's event horizon.[52]
• Residual entropy – the entropy present after a substance is cooled arbitrarily close to absolute zero.
• Entropy of mixing – the change in the entropy when two different chemical substances or components are mixed.
• Loop entropy – the entropy lost upon bringing together two residues of a polymer within a prescribed distance.
• Conformational entropy – the entropy associated with the physical arrangement of a polymer chain that assumes a compact or globular state in solution.
• Entropic force – a microscopic force or reaction tendency related to system organization changes, molecular frictional considerations, and statistical variations.
• Free entropy – an entropic thermodynamic potential analogous to the free energy.
• Entropic explosion – an explosion in which the reactants undergo a large change in volume without releasing a large amount of heat.
• Entropy change – a change in entropy dS between two equilibrium states is given by the heat transferred dQ_rev divided by the absolute temperature T of the system in this interval.[53]
• Sackur-Tetrode entropy – the entropy of a monatomic classical ideal gas determined via quantum considerations.

Standard textbook definitions

The following is a list of definitions of entropy from a collection of textbooks. Note that textbook definitions are not always the most helpful definitions, but they are an important aspect of the culture surrounding the concept of entropy.
• energy broken down in irretrievable heat.[40]
• Boltzmann's constant times the logarithm of a multiplicity, where the multiplicity of a macrostate is the number of microstates that correspond to the macrostate.[41]
• "In words, entropy is just the logarithm of the number of ways of arranging things in the system (times the Boltzmann's constant)."[42]
• a non-conserved thermodynamic state function, measured in terms of the number of microstates a system can assume, which corresponds to a degradation in usable energy.[43]
• a direct measure of the randomness of a system.[44]
• a measure of energy dispersal at a specific temperature.[45]
• a measure of the partial loss of the ability of a system to perform work due to the effects of irreversibility.[46]
• an index of the tendency of a system towards spontaneous change.[47]
• a measure of the unavailability of a system's energy to do work; also a measure of disorder; the higher the entropy the greater the disorder.[48]
• a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level.[49]
• a measure of disorder in the universe or of the availability of the energy in a system to do work.[50]

Other mathematical definitions

• Kolmogorov-Sinai entropy – a mathematical type of entropy in dynamical systems related to measures of partitions.
• Topological entropy – a way of defining entropy in an iterated function map in ergodic theory.
• Relative entropy (Kullback-Leibler divergence) – a natural distance measure from a "true" probability distribution P to an arbitrary probability distribution Q.
• Rényi entropy – a generalized entropy measure for fractal systems.
• Volume entropy – a Riemannian invariant measuring the exponential rate of volume growth.

Sociological definitions

The concept of entropy has also entered the domain of sociology, generally as a metaphor for chaos, disorder or the dissipation of energy, rather than as a direct measure of thermodynamic or information entropy:

• Entropology – the study or discussion of entropy, or the name sometimes given to thermodynamics without differential equations.[9][54]
• Psychological entropy – the distribution of energy in the psyche, which tends to seek equilibrium or balance among all the structures of the psyche.[55]
• Economic entropy – a semi-quantitative measure of the irrevocable dissipation and degradation of natural materials and available energy with respect to economic activity.[56][57]
• Social entropy – a measure of social system structure, having both theoretical and statistical interpretations, i.e. society (macrosocietal variables) measured in terms of how the individual functions in society (microsocietal variables); also related to social equilibrium.[58]
• Corporate entropy – energy waste as red tape and business team inefficiency, i.e. energy lost to waste.[59] (This definition is comparable to von Clausewitz's concept of friction in war.)

Quotes

Claude Shannon, on naming the quantity of information theory:

    "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Willard Gibbs:

    "Any method involving the notion of entropy, the very existence of which depends on the second law of thermodynamics, will doubtless seem to many far-fetched, and may repel beginners as obscure and difficult of comprehension."
See also

• Autocatalytic reactions and order creation
• Brownian ratchet
• Chaos theory
• Clausius-Duhem inequality
• Configuration entropy
• Departure function
• Enthalpy
• Entropy rate
• Geometrical frustration
• Introduction to entropy
• Maxwell's demon
• Multiplicity function
• Stirling's formula
• Thermodynamic databases for pure substances
• Thermodynamic potential

References

[1] Note: In complex systems of molecules, such as at the critical point of water or when salt is added to an ice-water mixture, entropy can either increase or decrease depending on system parameters, such as temperature and pressure. For example, if the spontaneous crystallization of a supercooled liquid takes place under adiabatic conditions, the entropy of the resulting crystal will be greater than that of the supercooled liquid (Denbigh, K. (1982). The Principles of Chemical Equilibrium, 4th Ed.). In general, however, when ice melts, the entropy of the two adjoined systems, i.e. the adjacent hot and cold bodies, when thought of as one "universe", increases. Further tutorials: Ice-melting – JCE example; Ice-melting and Entropy Change – example; Ice-melting and Entropy Change – discussions.
[2] Clausius, Rudolf (1862). Communicated to the Naturforschende Gesellschaft of Zurich, January 27, 1862; published in the Vierteljahrschrift of this Society, vol. vii, p. 48; in Poggendorff's Annalen, May 1862, vol. cxvi, p. 73; in the Philosophical Magazine, S. 4, vol. xxiv, pp. 81, 201; and in the Journal des Mathématiques of Paris, S. 2, vol. vii, p. 209.
[3] Daintith, John (2005). Oxford Dictionary of Physics. Oxford University Press. ISBN 0-19-280628-9.
[4] More explicitly, an energy T_R S is not available to do useful work, where T_R is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.
[5] "Entropy". Online Etymology Dictionary. http://www.etymonline.com/index.php?term=entropy. Retrieved on 2008-08-05.
[6] EntropyOrderParametersComplexity.pdf
[7] Jaynes, E. T. (1992). "The Gibbs Paradox". In Maximum Entropy and Bayesian Methods; Smith, C. R.; Erickson, G. J.; Neudorfer, P. O., Eds.; Kluwer Academic: Dordrecht, pp. 1-22.
[8] See, e.g., Notes for a "Conversation About Entropy" for a brief discussion of both thermodynamic and "configurational" ("positional") entropy in chemistry.
[9] ^ Perrot, Pierre (1998). A to Z of Thermodynamics. Oxford University Press. ISBN 0-19-856552-6.
[10] McCulloch, Richard S. (1876). Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, etc. D. Van Nostrand.
[11] ^ Clausius, Rudolf (1850). On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff's Annalen der Physik, LXXIX (Dover Reprint). ISBN 0-486-59065-8.
[12] ^ Atkins, Peter; De Paula, Julio (2006). Physical Chemistry, 8th edition. Oxford University Press. ISBN 0-19-870072-5.
[13] ^ Moore, J. W.; Stanitski, C. L.; Jurs, P. C. (2005). Chemistry, The Molecular Science. Brooks Cole. ISBN 0-534-42201-2.
[14] Jungermann, A. H. (2006). "Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property". Journal of Chemical Education 83: 1686-1694.
[15] Levine, I. N. (2002). Physical Chemistry, 5th edition. McGraw-Hill. ISBN 0-07-231808-2.
[16] Sandler, Stanley I. (1989). Chemical and Engineering Thermodynamics. John Wiley & Sons. ISBN 0-471-83050-X.
[17] "The information entropy of quantum mechanical states". Europhysics Letters 67, 700 (2004).
[18] ^ Haddad, Wassim M.; Chellaboina, VijaySekhar; Nersesov, Sergey G. (2005). Thermodynamics - A Dynamical Systems Approach. Princeton University Press. ISBN 0-691-12327-6.
[19] Callen, Herbert B. (2001). Thermodynamics and an Introduction to Thermostatistics, 2nd Ed. John Wiley and Sons. ISBN 0-471-86256-8.
[20] ^ Brooks, Daniel R.; Wiley, E. O. (1988). Entropy as Evolution – Towards a Unified Theory of Biology. University of Chicago Press. ISBN 0-226-07574-5.
[21] ^ Landsberg, P. T. (1984). "Is Equilibrium always an Entropy Maximum?" J. Stat. Physics 35: 159-69.
[22] ^ Landsberg, P. T. (1984). "Can Entropy and 'Order' Increase Together?" Physics Letters 102A: 171-173.
[23] Frank L. Lambert, A Student's Approach to the Second Law and Entropy.
[24] Carson, E. M. and J. R. Watson (Department of Educational and Professional Studies, Kings College, London), "Undergraduate students' understandings of entropy and Gibbs free energy", University Chemistry Education - 2002 Papers, Royal Society of Chemistry.
[25] Frank L. Lambert, "Disorder—A Cracked Crutch for Supporting Entropy Discussions", Journal of Chemical Education 79: 187 (Feb 2002).
[26] Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.
[27] Lehninger, Albert (1993). Principles of Biochemistry, 2nd Ed. Worth Publishers. ISBN 0-87901-711-2.
[28] Schrödinger, Erwin (1944). What is Life – the Physical Aspect of the Living Cell. Cambridge University Press. ISBN 0-521-42708-8.
[29] Lisa Zyga (2008-08-11). "Evolution as Described by the Second Law of Thermodynamics". Physorg.com. http://www.physorg.com/news137679868.html. Retrieved on 2008-08-14.
[30] Stenger, Victor J. (2007). God: The Failed Hypothesis. Prometheus Books. ISBN 1-59102-481-1.
[31] Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 981-238-399-9.
[32] Yockey, Hubert P. (2005). Information Theory, Evolution, and the Origin of Life. Cambridge University Press. ISBN 0-521-80293-8.
[33] Balian, Roger (2003). "Entropy – Protean Concept" (PDF). Poincaré Seminar 2: 119-45.
[34] Brillouin, Leon (1956). Science and Information Theory. ISBN 0-486-43918-6.
[35] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process. Harvard University Press. ISBN 0-674-25781-2.
[36] Chen, Jing (2005). The Physical Foundation of Economics - an Analytical Thermodynamic Theory. World Scientific. ISBN 981-256-323-7.
[37] Lin, Shu-Kun (1999). "Diversity and Entropy". Entropy (Journal) 1[1]: 1-3.
[38] Edwin T. Jaynes - Bibliography.
[39] Schneider, Tom, DELILA system (Deoxyribonucleic acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, FCRDC Bldg. 469, Rm 144, P.O. Box B, Frederick, MD 21702-1201, USA.
[40] de Rosnay, Joel (1979). The Macroscope – a New World View (written by an M.I.T.-trained biochemist). Harper & Row, Publishers. ISBN 0-06-011029-5.
[41] Baierlein, Ralph (2003). Thermal Physics. Cambridge University Press. ISBN 0-521-65838-1.
[42] Schroeder, Daniel R. (2000). Thermal Physics. New York: Addison Wesley Longman. ISBN 0-201-38027-7.
[43] McGraw-Hill Concise Encyclopedia of Chemistry, 2004.
[44] Chang, Raymond (1998). Chemistry, 6th Ed. New York: McGraw Hill. ISBN 0-07-115221-0.
[45] Atkins, Peter; De Paula, Julio (2006). Physical Chemistry, 8th edition. Oxford University Press. ISBN 0-19-870072-5.
[46] Cutnell, John D.; Johnson, Kenneth J. (1998). Physics, 4th edition. John Wiley and Sons, Inc. ISBN 0-471-19113-2.
[47] Haynie, Donald T. (2001). Biological Thermodynamics. Cambridge University Press. ISBN 0-521-79165-0.
[48] Oxford Dictionary of Science, 2005.
[49] Barnes & Noble's Essential Dictionary of Science, 2004.
[50] Gribbin's Encyclopedia of Particle Physics, 2000.
[51] "Entropy unit". IUPAC Gold Book. www.iupac.org/goldbook/E02151.pdf.
[52] von Baeyer, Hans Christian (2003). Information - the New Language of Science. Harvard University Press. ISBN 0-674-01387-5.
[53] Serway, Raymond A. (1992). Physics for Scientists and Engineers. Saunders Golden Sunburst Series. ISBN 0-03-096026-6.
[54] Example: "Entropology, not anthropology, should be the word for the discipline that devotes itself to the study of the process of disintegration in its most evolved forms." (In A World on the Wane, London, 1961, p. 397; John Russell's translation of Tristes Tropiques by Claude Lévi-Strauss.)
[55] Hall, Calvin S.; Nordby, Vernon J. (1999). A Primer of Jungian Psychology. New York: Meridian. ISBN 0-452-01186-8.
[56] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process. Harvard University Press. ISBN 0-674-25781-2.
[57] Burley, Peter; Foster, John (1994). Economics and Thermodynamics – New Perspectives on Economic Analysis. Kluwer Academic Publishers. ISBN 0-7923-9446-1.
[58] Bailey, Kenneth D. (1990). Social Entropy Theory. State University of New York Press. ISBN 0-7914....
[59] DeMarco, Tom; Lister, Timothy (1999). Peopleware: Productive Projects and Teams, 2nd Ed. Dorset House Publishing Co. ISBN 0-932633-43-9.

• P. Pluch, Quantum Probability Theory, PhD Thesis, University of Klagenfurt (2006).

Further reading

1. Ben-Naim, Arieh (2007). Entropy Demystified. World Scientific. ISBN 981-270-055-2.
2. Dugdale, J. S. (1996). Entropy and its Physical Meaning, 2nd Ed. Taylor and Francis (UK); CRC (US). ISBN 0-7484-0569-0.
3. Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 0-486-60361-X.
4. Kroemer, Herbert; Kittel, Charles (1980). Thermal Physics, 2nd Ed. W. H. Freeman Company. ISBN 0-7167-1088-9.
5. Penrose, Roger (2005). The Road to Reality: A Complete Guide to the Laws of the Universe. ISBN 0-679-45443-8.
6. Reif, F. (1965). Fundamentals of Statistical and Thermal Physics. McGraw-Hill. ISBN 0-07-051800-9.
7. Goldstein, Martin; Inge, F. (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0-674-75325-9.
8. von Baeyer, Hans Christian (1998). Maxwell's Demon: Why Warmth Disperses and Time Passes. Random House. ISBN 0-679-43342-2.
9. Entropy for beginners

External links

• Entropy - A Basic Understanding: a primer on entropy from a chemical perspective
• Interactive Shockwave Animation on Entropy
• Max Jammer (1973). Dictionary of the History of Ideas: Entropy
• Frank L. Lambert; entropysite.com – links to articles including simple introductions to entropy for chemistry students and for general readers
• Thermodynamics - a chapter from an online textbook
• Entropy on Project PHYSNET
• Entropy Journal - a free journal on entropy

Retrieved from "http://en.wikipedia.org/wiki/Entropy"