
ENTROPY

Submitted by Rajdeep P Rajput on

I want to know what exactly entropy means. I know the definition and the formula, but I still don't get the physical significance. When we say temperature, we can understand and feel it. But in the case of entropy, each time I read about it I don't get it. Can anybody explain its significance in an easy way, so that I can understand what exactly entropy means?

Thank you, Li Han, for your kind words.

Dear Rajdeep, on the page pointed out by Li Han, you may wish to focus on two short sections:

  • Isolated system
  • Temperature

It was a revelation to me that entropy is a relatively simple concept. What really needs to be understood is temperature.

Tue, 11/10/2009 - 13:01 Permalink

By far the best discussion of entropy that I have ever read is the one in Fermi's "Thermodynamics", based on an analysis of the Carnot cycle. Highly recommended.
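As a small taste of that analysis (my own sketch, not taken from Fermi's book): a reversible Carnot engine running between reservoirs at temperatures T_hot and T_cold satisfies Q_hot/T_hot = Q_cold/T_cold, i.e. zero net entropy change over the cycle, which directly gives the efficiency η = 1 − T_cold/T_hot.

```python
def carnot(T_hot, T_cold, Q_hot):
    """Reversible Carnot cycle between two reservoirs (temperatures in kelvin).

    The entropy drawn from the hot reservoir equals the entropy dumped
    into the cold one, so Q_cold = Q_hot * T_cold / T_hot.
    """
    Q_cold = Q_hot * T_cold / T_hot            # heat rejected to the cold reservoir
    efficiency = 1.0 - T_cold / T_hot          # work fraction: (Q_hot - Q_cold) / Q_hot
    dS_net = Q_cold / T_cold - Q_hot / T_hot   # zero for a reversible cycle
    return efficiency, dS_net

eff, dS = carnot(T_hot=600.0, T_cold=300.0, Q_hot=1000.0)
print(eff)  # → 0.5
print(dS)   # → 0.0
```

Any real (irreversible) engine rejects more heat than this, so its dS_net is positive and its efficiency falls below the Carnot bound.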

Thu, 11/12/2009 - 23:48 Permalink

Thanks a lot, guys, for your replies and help. I know it's very late to reply, but late is better than never. I was not using iMechanica in the meantime and forgot about it. But I think I should start using it like I use my FB account.

Wed, 08/20/2014 - 22:40 Permalink

Hello Rajdeep, there are several kinds of entropy. In thermodynamics, it is the quantity of heat exchanged per unit temperature. In information theory, it is the quantity of information produced by a source (such as a data stream, measured in bits), computed from the appearance probabilities of the source's random symbols; it was introduced by Shannon, building on Boltzmann's H-theorem, and is known as a measure of uncertainty of appearance. One subtlety in the information-theoretic case is that conditional entropy is cumulative over conditions and does not follow the same rules as Bayes' theorem.
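To make the information-theoretic definition concrete (a minimal sketch of my own, not from the posts above), Shannon's entropy H = −Σ pᵢ log₂ pᵢ can be computed from a probability distribution like this:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty per toss.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))       # → 0.0
```

A uniform distribution over 2ⁿ symbols gives exactly n bits, the maximum possible; any bias toward some symbols lowers the entropy, reflecting reduced uncertainty.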

Mohammed Lamine

Sun, 08/24/2014 - 21:05 Permalink