Ambiguity in the definition of entropy



The entropy $S$ of a system is defined as $$S = k\ln\Omega.$$ What precisely is $\Omega$? It refers to "the number of microstates" of the system, but is this the number of all accessible microstates or just the number of microstates corresponding to the system's current macrostate? Or is it something else that eludes me?










statistical-mechanics entropy definition






asked Apr 3 at 0:46 by PiKindOfGuy, edited Apr 3 at 16:35 by Qmechanic

  • Uh, ambiguity is the definition of entropy. – Hot Licks, Apr 4 at 12:06


3 Answers



















Entropy is a property of a macrostate, not a system. So $\Omega$ is the number of microstates that correspond to the macrostate in question.

Putting aside quantization, it might appear that there are an infinite number of microstates, and thus that the entropy is infinite, but for any level of resolution the number is finite. Changing the level of resolution simply multiplies the number of microstates by a constant factor. Since it is almost always the change in entropy, not the absolute entropy, that matters, and since we take the log of $\Omega$, it doesn't matter that $\Omega$ is ambiguous up to a constant multiplicative factor (equivalently, that $S$ is ambiguous up to an additive constant): that constant cancels when we take $\mathrm{d}S$. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.






answered Apr 3 at 1:22 by Acccumulation, edited Apr 3 at 15:35
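As an illustration of the answer above (that $\Omega$ counts the microstates of a single macrostate, and that a constant resolution factor only shifts $S$ by an additive constant that cancels in entropy differences), here is a minimal sketch in Python; the coin toy model and the resolution factor are illustrative assumptions, not taken from the answer.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def omega(N, n_heads):
        # number of microstates belonging to the macrostate "n_heads heads out of N coins"
        return math.comb(N, n_heads)

    def entropy(N, n_heads, resolution=1):
        # S = k ln(Omega); 'resolution' mimics counting microstates more or less finely
        return k_B * math.log(resolution * omega(N, n_heads))

    N = 100
    # entropy change for the macrostate going from 10 heads to 50 heads,
    # computed at two different resolutions
    dS_coarse = entropy(N, 50) - entropy(N, 10)
    dS_fine = entropy(N, 50, resolution=1e6) - entropy(N, 10, resolution=1e6)
    print(dS_coarse, dS_fine)  # equal up to rounding: the resolution factor cancels in dS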












  • +1; you may add for completeness that $\Omega$ is really the volume occupied in phase space by all the possible microstates corresponding to the given macrostate. I always found this formulation clearer, since both position and momentum are continuous variables and a straight count of microstates would otherwise be infinite. – Run like hell, Apr 3 at 14:59
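For reference, the standard microcanonical expression the comment above alludes to (a textbook formula quoted for completeness, not taken from the comment) is
$$\Omega(E) = \frac{1}{N!\,h^{3N}}\int_{E \le H(q,p) \le E+\Delta E} \mathrm{d}^{3N}q\,\mathrm{d}^{3N}p,$$
where $h$ sets the size of a phase-space cell and $\Delta E$ the thickness of the energy shell; rescaling either one multiplies $\Omega$ by a constant, which is exactly the ambiguity discussed in the answer.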






  • @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear. – Acccumulation, Apr 3 at 15:31










  • +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot. – M. Winter, Apr 4 at 8:19










  • @M. Winter, in physics the macrostate "in question" is the state in which the system exists, or more accurately the one in which it spends most of its time, so we (or the system) must choose the macrostate with the highest statistical weight, that is, the highest $\Omega$. Analogy: the extremum of a function (entropy) is a property of the function (system), not the value at one particular argument. – Aleksey Druggist, Apr 4 at 10:32




















Entropy is a logarithmic measure of the number of microscopic states corresponding to some specific macroscopically observable state, not to the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.

Indeed, the driving principle of entropy in modern stat-mech is that we have some uncertainty about the underlying microscopic state of the system, and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly "uniformly" means here, but an "obvious" choice seems to reproduce certain nice features, like metals having specific heats that look like $3R$, where $R$ is the gas constant; a result that I want to say is due to Einstein, but I am not 100% sure.)

As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our approach to equilibrium is a process of macrostates getting larger and larger.






answered Apr 3 at 1:23 by CR Drost, edited Apr 3 at 14:55
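A minimal numerical sketch of the claim above that the equilibrium macrostate is the one containing the most microstates; the two-halves-of-a-box toy model is an illustrative assumption, not taken from the answer.

    import math

    N = 1000  # particles, each independently in the left or right half of a box

    def omega(n_left):
        # microstate count of the macrostate "n_left particles in the left half"
        return math.comb(N, n_left)

    total_microstates = 2 ** N
    # fraction of all microstates whose macrostate lies within 5% of the even split
    near_even = sum(omega(n) for n in range(450, 551)) / total_microstates
    print(near_even)  # roughly 0.998: a microstate chosen at random almost surely looks equilibrated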





















Entropy is a matter of perspective.

You pick a way to describe a system at large scales. This effectively subdivides the system's possible configurations into macrostates, or "macroscopic states".

Each of these macroscopic states corresponds to a number of "microstates": different configurations of the system that are clumped together into one macrostate.

If, for each macrostate, you take the log of the number of microstates in it, the principle of entropy is that, whatever macrostate the system is in, it will almost certainly move towards macrostates with a higher value.

Now, a system can move to a lower entropy value only by increasing the entropy of another system; this basically consists of merging the two systems into one and applying the first rule.

The numbers of microstates multiply when systems are combined: if we have two systems A and B with macrostates $A_0$ and $B_0$ containing 7 and 10 microstates apiece, the combined system A+B in macrostate $A_0 + B_0$ has $7 \times 10 = 70$ microstates.

Taking the log of the number of microstates simply lets us use addition instead of multiplication: the entropies $\log(7)$ and $\log(10)$ add to $\log(7) + \log(10) = \log(7 \times 10)$.

Any function with the property $f(ab) = f(a) + f(b)$ would do just as well, which is why we don't care what the base of the logarithm is.

The fun part is that this applies regardless of how you clump the microstates into macrostates, so long as you do the clumping before the experiment. So we pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choices of macrostates don't actually help us: the vast majority of the configuration space of any system is completely useless chaos; only a ridiculously small fraction of it is going to be "useful", and no matter how we label it, that region contains very few microstates.






answered Apr 3 at 14:17 by Yakk
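A quick numeric check of the additivity argument above, using the answer's microstate counts of 7 and 10 (a minimal sketch, assuming natural logarithms):

    import math

    omega_A, omega_B = 7, 10        # microstates of macrostates A_0 and B_0
    omega_AB = omega_A * omega_B    # microstate counts multiply for the combined macrostate

    S_A, S_B = math.log(omega_A), math.log(omega_B)
    S_AB = math.log(omega_AB)

    # entropies add because the logarithm turns products into sums
    assert math.isclose(S_AB, S_A + S_B)
    print(S_A, S_B, S_AB)  # 1.9459..., 2.3026..., 4.2485...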












