Submitted 26 July 1995 to the journal Symmetry: Culture and Science, Editor G. Darvas, Hungary. No reply was received.
Entropy as a Measure of Symmetry
Milan Kunz
Jurkovičova 13,
63800 Brno, Czech Republic
In science it is not permitted to speculate about what would have happened if, but the late development of the theory of symmetry has led to great confusion in modern science. Its consequences spread from mathematics through physics, biology and the social sciences to philosophy.
Changing one word, such a small correction, could have been enough, and J. von Neumann could not have said to Shannon [1]: "You should call it entropy, for two reasons. In the first place, your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage."
The word which could have removed this uncertainty is the word symmetry. It should have been used by Boltzmann [2] when he explained his idea of what the entropy of an ideal gas is. He had another problem, too: his proof needed a quantum hypothesis, 30 years ahead of Planck. He had no faith in his own thoughts and abandoned them for conventional mathematics. Instead of the term "symmetry", he used the noncommittal word "probability". Since this probability is determined by the symmetry, he was correct, but the basic idea of his proof of the H theorem was not recognized and remained obscure (Kac [3]: "a demonstration").
Orbits in multidimensional spaces
S. Weinberg [4], in his lecture on the importance of mathematics, mentioned the case of the Ramanujan-Hardy formula determining the number of partitions of the number n into n parts [5]. Hardy thought that it would never have physical applications, yet this part of number theory became important in the theory of elementary particles. It has been completely forgotten that even before Hardy the partitions had a more practical use: they were the basis of Boltzmann's proof.
Boltzmann gave an example of 7 particles partitioning 7 quanta of energy. All possible cases can be written in a table in which the upper indices count particles with the same energy (so 0^6 means six particles with zero energy):
largest part |   1   |    2    |     3     |      4      |      5      |    6    |  7
      7      | 7 0^6 |         |           |             |             |         |
      6      |       | 6 1 0^5 |           |             |             |         |
      5      |       | 5 2 0^5 | 5 1^2 0^4 |             |             |         |
      4      |       | 4 3 0^5 | 4 2 1 0^4 | 4 1^3 0^3   |             |         |
      3      |       |         | 3^2 1 0^4 | 3 2 1^2 0^3 | 3 1^4 0^2   |         |
      3      |       |         | 3 2^2 0^4 |             |             |         |
      2      |       |         |           | 2^3 1 0^3   | 2^2 1^3 0^2 | 2 1^5 0 |
      1      |       |         |           |             |             |         | 1^7
(columns: number of particles with nonzero energy)
The table is a diagram of the cross-section of the seven-dimensional space by the plane orthogonal to the diagonal unit vector I. The columns represent consecutively vertices, points on lines, points on two-dimensional bodies (surfaces in three dimensions), and so on. In the rows, the partitions are arranged according to the size of the largest part. Each partition of the number seven represents one orbit (Boltzmann used the term "complexion"), and so the partitions count the orbits. All points lying on such an orbit have the same length; they are obtained by permutations of the partition vector, and therefore the orbits are spherical. At the beginning, all energy is concentrated in one particle. After a big bang, the particle in which all the energy of the system is concentrated collides in its flight with other particles and the energy dissipates. The system goes spontaneously onto the orbit with the largest symmetry, determined by the group of permutations. The number of points on each orbit is given by the polynomial (multinomial) coefficient for n permutations, and the function H_n is just the logarithm of this number:
H_n = \ln\!\left( n! \Big/ \prod_k n_k! \right), \qquad (1)

where n_k is the number of particles having the same energy k.
There is no doubt that the factorials of large numbers are approximated quite satisfactorily by the Stirling formula. A problem is that the system must be very large; it must contain a huge number of particles to obtain sufficiently many simultaneous collisions to keep it on one orbit [6]. A question is whether the function H_n really corresponds to the entropy. We will see later that additional terms must be added for real gases, but the function H_n itself is defined without any uncertainty as a measure of symmetry.
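The orbits of Boltzmann's example can be enumerated directly and eq. (1) evaluated for each of them. The following Python sketch is only an illustration (the enumeration and the helper orbit_size are ad hoc, not part of any established apparatus); it lists the fifteen orbits of the table above and their sizes:

```python
from itertools import combinations_with_replacement
from math import factorial, log
from collections import Counter

def orbit_size(partition, n):
    """Number of points n!/prod(n_k!) on the orbit of a partition
    written with n non-negative parts (n_k counts equal parts)."""
    counts = Counter(partition)            # n_k: how many particles share energy k
    denom = 1
    for c in counts.values():
        denom *= factorial(c)
    return factorial(n) // denom

n = 7   # particles
E = 7   # quanta of energy

# all partitions of E into at most n positive parts, padded with zeros to length n
partitions = [p + (0,) * (n - len(p))
              for k in range(1, n + 1)
              for p in combinations_with_replacement(range(E, 0, -1), k)
              if sum(p) == E]

total = 0
for p in sorted(partitions, reverse=True):
    size = orbit_size(p, n)
    total += size
    print(p, "orbit size", size, "H_n =", round(log(size), 3))

print("points on the plane:", total)   # 1716 = C(13, 6), all compositions of 7 into 7 parts
```

The most populated orbit, 3 2 1^2 0^3, carries 420 of the 1716 points, while the extreme orbits 7 0^6 and 1^7 carry only 7 and 1 points, respectively.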
Information entropy
Shannon [7] built his theory of communication using axioms, which need not be explained here. Nevertheless, he used the binary logarithm, and then the function H_m has a quite definite interpretation [8].
To index a set of m objects by a regular code (symbols 0 and 1), we need m \log_2 m digits, for example 000, 001, 010, 011, 100, 101, 110, 111 for 8 items. If these objects are classified into n groups with the index j, say aaaabbcd, we need only \sum_j m_j \log_2 m_j digits, in our example 10 digits: a00, a01, a10, a11, b0, b1, c, d. The difference (24 - 10), divided by m = 8, is a measure of the information we have about the set. The fractions m_j/m, obtained after manipulations with the formula, are again interpreted as probabilities p_j.
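The arithmetic of this example is easy to reproduce; a short Python sketch (the variable names are mine, chosen only for illustration):

```python
from math import log2

s = "aaaabbcd"
m = len(s)                                       # 8 objects
freq = {c: s.count(c) for c in sorted(set(s))}   # m_j = 4, 2, 1, 1

full_index   = m * log2(m)                                  # m log2 m = 24 digits
within_group = sum(mj * log2(mj) for mj in freq.values())   # sum_j m_j log2 m_j = 10 digits

print(full_index, within_group)             # 24.0 10.0
print((full_index - within_group) / m)      # 1.75 bits per object
```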
There remains the question whether and how the two functions H are related. This question has been answered in different ways, but always wrongly.
We have shown that Boltzmann connected the function H_n with permutations of the partition vector. For messages, two kinds of permutations are possible: either permutations of the order of symbols in a string, e.g. when aaaabbcd permutes to babacada, or substitutions, when e.g. ddddccba is obtained.
This can be done by a formal mathematical operation if we write a string of symbols (here the permuted string babacada) as a naive matrix [9,10] N:
      a  b  c  d
      0  1  0  0
      1  0  0  0
      0  1  0  0
      1  0  0  0
      0  0  1  0
      1  0  0  0
      0  0  0  1
      1  0  0  0
Both symmetry operations are then performed by multiplying the matrix N by unit permutation matrices P from the left and from the right. The number of strings leading to each point on the partition orbit is determined by the polynomial (multinomial) coefficient for m permutations

m! \Big/ \prod_j m_j! = m! \Big/ \prod_k (k!)^{n_k},

where n_k is the number of symbols occurring exactly k times.
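Both operations can be tried out on the naive matrix above; in the following sketch the particular permutation matrices are arbitrary choices, used only for illustration:

```python
import numpy as np

symbols = "abcd"
string  = "babacada"

# naive matrix N: one row per position, a single 1 in the column of its symbol
N = np.zeros((len(string), len(symbols)), dtype=int)
for i, ch in enumerate(string):
    N[i, symbols.index(ch)] = 1

def perm_matrix(perm):
    """Unit permutation matrix with a 1 in row i, column perm[i]."""
    P = np.zeros((len(perm), len(perm)), dtype=int)
    for i, j in enumerate(perm):
        P[i, j] = 1
    return P

P_left  = perm_matrix([7, 6, 5, 4, 3, 2, 1, 0])   # reverses the order of positions
P_right = perm_matrix([3, 2, 1, 0])               # substitutes a<->d, b<->c

def to_string(M):
    return "".join(symbols[j] for j in M.argmax(axis=1))

print(to_string(P_left @ N))    # adacabab : the symbols reordered within the string
print(to_string(N @ P_right))   # cdcdbdad : the symbols substituted
```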
The function H_m can again be just the logarithm of this number, since natural and binary logarithms differ only by a constant factor. The function H_m measures the number of messages which can be formed from a given set of symbols. But for each string the Boltzmann function H_n is defined too, and therefore the total number of strings of length m going onto the n-dimensional plane is

\sum \left( n! \Big/ \prod_k n_k! \right) \left( m! \Big/ \prod_k (k!)^{n_k} \right) = n^m,

the sum being taken over all partitions (orbits).
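For the example n = 4, m = 8 the identity can be checked by summing over all partitions; a sketch (the helper multinomial is an ad hoc illustration):

```python
from itertools import combinations_with_replacement
from math import factorial
from collections import Counter

def multinomial(total, parts):
    result = factorial(total)
    for p in parts:
        result //= factorial(p)
    return result

n, m = 4, 8          # 4 symbols, strings of length 8

total = 0
for k in range(1, n + 1):
    for p in combinations_with_replacement(range(m, 0, -1), k):
        if sum(p) != m:
            continue                                  # keep only partitions of m
        freqs = p + (0,) * (n - len(p))               # frequencies m_j, padded with zeros
        n_k = Counter(freqs)                          # n_k: symbols sharing a frequency
        orbit_points = multinomial(n, n_k.values())   # n!/prod n_k!
        strings_per_point = multinomial(m, freqs)     # m!/prod m_j!
        total += orbit_points * strings_per_point

print(total, n ** m)    # both 65536
```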
The logarithm of a product is a sum, and therefore both measures H are additive. In our example, the vector of frequencies m_j is (4, 2, 1, 1) and n_0 = 0, n_1 = 2, n_2 = 1, n_3 = 0, n_4 = 1, and so on. Since 0! = 1, the zero frequencies can be neglected and

H_n = \ln\!\left( 4! \big/ (2!\, 1!^2) \right) = \ln 12.
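Both measures can be evaluated for the example string (H_m follows from the multinomial coefficient given above):

```python
from math import factorial, log

# frequencies m_j of aaaabbcd: a = 4, b = 2, c = 1, d = 1
H_m = log(factorial(8) // (factorial(4) * factorial(2)))   # ln(840)

# the corresponding n_k: two symbols occur once, one twice, one four times
H_n = log(factorial(4) // factorial(2))                    # ln(12)

print(round(H_n, 3), round(H_m, 3))   # 2.485 6.733
```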
The existence of two measures of symmetry can explain the observed redundancy of natural languages. It is true that if all symbols are used equally often, the greatest number of messages can be formulated. But many of them would be alike. It is easier to explain this difficulty by speaking about words instead of symbols: in a message without redundancy there would be no key words, and we could not recognize what is being spoken about.
Symmetry of graphs
In ideal gases, all energy is concentrated in a point. In real gases the energy is dispersed, but only within a molecule; its quanta cannot move freely through the whole matrix as symbols do in a naive matrix. The ideal gas can be formally considered as a quadratic form P^T N^T N P, where P^T is the transposed permutation matrix for n permutations and N^T N is a diagonal matrix. Its molecules are represented as points on the vector axes. A real molecule forms a blot in the matrix representing the system.
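For the naive matrix of the string above, the product N^T N is indeed diagonal, with the symbol frequencies on the diagonal; a brief check (the construction of N is the same as before):

```python
import numpy as np

symbols = "abcd"
string  = "babacada"

N = np.zeros((len(string), len(symbols)), dtype=int)
for i, ch in enumerate(string):
    N[i, symbols.index(ch)] = 1

print(N.T @ N)
# [[4 0 0 0]
#  [0 2 0 0]
#  [0 0 1 0]
#  [0 0 0 1]]   -- the frequencies (4, 2, 1, 1) on the diagonal
```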
Molecules are isomorphic with graphs mapping their structures. A graph is described by an incidence matrix. This matrix is either a difference of two naive matrices (oriented graph) or their sum (unoriented graph). The symmetry of graphs is determined similarly as for naive matrices, by multiplying the incidence matrix by unit permutation matrices from the left and from the right. This leads to the wreath products of the cyclic groups [11] and to rather complicated formulas transforming the permutation group S_n into graph groups.
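The construction of an incidence matrix from two naive matrices can be shown on a hypothetical three-vertex path graph with edges 1->2 and 2->3 (the example graph is chosen only for illustration):

```python
import numpy as np

n_vertices = 3
edges = [(0, 1), (1, 2)]          # oriented edges, tail -> head (0-based)

# two naive matrices: one marks the tail, the other the head of each edge
tails = np.zeros((len(edges), n_vertices), dtype=int)
heads = np.zeros((len(edges), n_vertices), dtype=int)
for row, (t, h) in enumerate(edges):
    tails[row, t] = 1
    heads[row, h] = 1

print(heads - tails)   # incidence matrix of the oriented graph
# [[-1  1  0]
#  [ 0 -1  1]]
print(heads + tails)   # incidence matrix of the unoriented graph
# [[1 1 0]
#  [0 1 1]]
```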
Here, however, the relation of entropy to symmetry has been stated by many authors many times before, unfortunately only in specialized journals [12,13].
Moreover, there exists the entropy of mixing. Consider now that the string aaaabbcd represents 8 molecules of 4 different kinds (another embodiment of this kind of entropy is the sorting of hot and cold molecules inside a system). The entropy of such a mixture depends on the mixing of the molecules inside the system. If the original arrangement permutes to babacada, its entropy must change. Neither H_n nor H_m measures this effect [14,15].
Discussion
If we interpret the spontaneous growth of entropy as the spontaneous growth of symmetry in the Universe, then we do not need the term negentropy for living organisms [16]. They merely have a greater number of symmetry elements of higher order, a greater complexity, than non-living things. The increase of symmetry is a spontaneous process. Elementary particles form atoms, atoms form molecules, molecules form structures such as crystals or living cells. Living cells assemble into organisms, organisms into societies. In each step, new symmetry elements are added to the old ones [17,18].
On the molecular level an integrating factor exists: the temperature. The physical entropy is a logarithmic measure of the amount of energy needed to increase the temperature.
Outside physics, we can calculate the functions H on many levels, but we do not know whether an integrating factor exists there. The apparent disorder is only unrecognized symmetry.
References
1. M. Tribus, E. C. McIrvine, Energy and
Information, Scientific American, 225 (1971), 3, 179.
2. L. Boltzmann, Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung, Wiener Berichte 76 (1877), 373.
3. M. Kac, in J. Mehra, Ed., The Physicist's Conception of Nature, Reidel, Dordrecht, 1973, p. 560.
4. S. Weinberg, Mathematics, the unifying
thread in Science, Notices AMS, 33 (1986), 716.
5. G. E. Andrews, The Theory of Partitions,
Addison-Wesley Publ. Comp., Reading, MA, 1976.
6. M. Kunz, How to distinguish distinguishability: Physical and combinatorial definitions, Physics Letters A, 135 (1989) 421-424.
7. C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 27 (1948), 379, 623.
8. M. Kunz, Entropies and information indices of star forests, Coll. Czech. Chem. Commun., 51 (1986) 1856-1863.
9. M. Kunz, Information processing in
linear vector space, Information Processing and Management, 20 (1984)
519-524.
10. M. Kunz, About metrics of bibliometrics, J. Chem. Inform. Comput. Sci., 33 (1993) 193-196.
11. F. Harary, E. M. Palmer, Graphical
Enumeration, Academic Press, New York, 1973.
12. M. Gordon, W. B. Temple, Chemical Combinatorics. Part I. Chemical
Kinetics, Graph Theory and Combinatorial Entropy, J. Chem. Soc (A), Inorg.
Phys. Theor., 1970, 729.
13. R. M. Dannenfelser, N. Surendran, S. H.
Yalkowsky, SAR, QSAR Environ. Res., 1 (1993), 273.
14. M. Kunz, Time Spectra of Patent
Information, Scientometrics, 11 (1987) 163.
15. M. Kunz, A note about the negentropy principle, MATCH, 23 (1988) 3.
16. E. Schroedinger, What Is Life?,
Cambridge University Press, Cambridge, 1944.
17. J. Tonnelat, Conformity of Evolution
towards Complexity from Thermodynamic Conclusions, Arch. Int. Physiol.
Biochim. 94 (1986) C11.
18. M. W. Evans, Three Principles of Group
Theoretical Mechanics, Phys. Lett. A, 134 (1989) 409.