
The butterfly effect 3/5

Irreversibility, memory and entropy


The butterfly effect: further investigation! Before continuing your reading, there is still time to explore the history of chaos theory and to broaden your perspective by looking at chaotic systems through the lens of interdependence. Ready? Then keep that information in mind: we will now put it into perspective with the notions of irreversibility and entropy.


The irreversibility of phenomena

Course of time, arrow of time

There is no consensus in physics on the concept of time. Nor in philosophy, for that matter. The subject probably deserves a full article of its own; for lack of time, I will settle for a shortcut here! In an attempt to shed light on the mystery of time and to lay the foundations for this article, I borrow from the French physicist and philosopher of science Etienne Klein the distinction between the course of time and the arrow of time.


« To put it in one sentence, the course of time ensures the continuity of the world — it prevents the world from disappearing — while the arrow of time produces stories whose epilogue is never the same as the beginning. »
[1]

« Let's say that the course of time will prevent you, in the future, from being younger than you are now, and the arrow of time will prevent you, in the future, from looking like the child you once were. » [2]

 

Etienne Klein's words are easy to grasp at our scale: we experience the irreversibility of phenomena. Experience continually teaches us that we cannot go back in time. You cannot put the eggs that went into the omelette back in their box.

What about the microscopic level? The answer is far from obvious, because things are not really comparable. At this scale, in fact, we cannot have a direct experience of time, or of anything else; we can only observe phenomena. And our observations lead to a completely different interpretation.


Experiment vs. observation


All the equations of microscopic physics, such as the Schrödinger equation, are reversible in time. This means that whatever they allow to be done, they also allow to be undone. Particle collisions can thus be performed in the laboratory… and so can the reverse collisions. But let us be clear: it is the phenomenon that is reversible, not time!

Let us note all the same that a particle collision, even a reversed one, does not mean much to us, to say the least, because we never experience it. Therefore, whether time at the microscopic level flows one way or the other, or whether time and phenomena are independent of each other, it remains for us only an observation unrelated to our experience. But are we sure of what we are looking at? We will come back to that.

If there is no arrow of time for reversible phenomena, then how can we explain the emergence of this arrow at the macroscopic level? In other words, how can macroscopic irreversibility be explained from microscopic physical laws that are all reversible?


« It is not excluded that the course of time and the arrow of time are ultimately the result of one and the same, deeper reality, that they are both products of underlying phenomena that a « new physics » may bring to light (…). »
[3]


It would seem that Nassim Haramein's « new physics » reveals such an underlying phenomenon. First of all, for him, time is a human concept, not a feature of the universe: what characterizes the universe is memory.


Memory rather than time

An underlying phenomenon…

 

« Without memory, there is no time. If you can't remember the previous moment, you have no sense of time. » [4]


In his unified field theory (the connected universe), Nassim Haramein shows that the universe encodes information on the surface of space-time. This encoding is done so meticulously and so consecutively that it gives us the familiar feeling of « time passing ». Thus is ensured « the continuity of the world » of which Etienne Klein speaks.

The information stored by the universe forms a regular progression that we interpret as the passage of time, with its beginnings and its epilogues. Beginnings and endings that are never the same, because their respective space-memory coordinates [5] can never be identical. The only way for us to distinguish an epilogue from a beginning is to remember the beginning. And what differentiates a beginning from an epilogue is the learning that takes place between the two [6]. Each system learns about itself and progresses. Since no system is isolated in the connected universe, each one also participates in the learning of the others. Thus, at any scale, no system can ever return to the space-memory coordinates it once occupied, and in that sense there is an arrow of time. An arrow of time that, so to speak, crosses scales.

Therefore, while the reverse collision in the laboratory does not take us back in time, it is not incompatible with an arrow of time. The particles are in fact continuously informing themselves, each time encoding information that little by little accumulates to form their memory. In the same way, we may believe that the trajectory of a classical oscillator, such as a pendulum, passes through the same points again and again. In reality, each time the pendulum swings by, these points are located at different spatio-temporal coordinates [7].


… for an emergent property

Moral: the storage of information both ensures the continuity of the world and makes the epilogues always different from the beginnings. In other words, it explains both the course and the arrow of time, and thus the irreversibility of phenomena.

Our age and appearance will no more change with the notion of memory than with the notion of time. But this storage offers an interesting possibility: reconnecting to an age or a physical state known in the past. Such a possibility exists because of the holographic nature of the universe [8], which makes all information available at every point in space. From there, it becomes possible to modify the information encoded at precise coordinates, and thus our feeling of that information in the present and for the future.

Finally, the dynamics that construct the timeline of the universe logically follow the dynamics that underlie the universe. These dynamics constantly make, unmake and remake the material world. Through this feedback between the vacuum and matter, they create the illusion of movement, and therefore of the time and irreversibility that we experience. Time then becomes an emergent property [9] of the dynamics of the universe.

If, for the physicist Ilya Prigogine, « there is an arrow of time at the macroscopic level, but the microscopic level creates the illusion that there is none » [10], for Nassim Haramein there is an arrow of time, from the microscopic to the macroscopic scale, because the universe stores information.



Before going further with this new physics, let us return for a moment to the standard theories, which consider irreversibility in relation to the concept of entropy:


« Entropy is the essential element introduced by thermodynamics, the science of irreversible, i.e. time-oriented, processes. »
[11]


Complexity rather than chaos

The concept of entropy

What is entropy?

Entropy is defined for an isolated system, i.e. a portion of space that has no interaction with the outside. Here I approach the concept from the angle of the notion of heat [12].

Heat refers to a flow of energy between two systems. It is a transfer of thermal agitation [13] which, as its definition suggests… is messy! Collisions between particles create an agitation that propagates in all directions, with heat transfer always taking place from the hotter system to the colder one.

An example? You place a cup of hot tea in a cold, closed room. The tea and the room form an isolated system, which evolves as follows:

  • In its initial state, the tea, because of its high temperature, has the greatest thermal agitation.
  • Gradually, and irreversibly, the tea transmits its thermal agitation to the room. Its temperature therefore decreases, while that of the room increases, until the temperatures of the tea and the room are finally homogenized (see the numerical sketch below).
  • The system will never spontaneously return to its initial state. For the tea to warm up again, work (an input of energy) would have to be provided. Over a cycle that brings the system back to its initial state, the heat received (Q) and the work received (W) compensate each other: W + Q = 0, i.e. W = -Q (the work provided ends up as heat given off to the outside).
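
To make the tea-and-room example concrete, here is a minimal numerical sketch of the first principle at work. The heat capacities and temperatures are illustrative assumptions, not values from the article; the point is simply that the energy lost by the tea equals the energy gained by the room, and that the shared final temperature lies between the two initial ones.

```python
# Minimal sketch: hot tea equilibrating with a cold, isolated room.
# All numerical values below are illustrative assumptions.

C_TEA = 800.0        # heat capacity of the tea, in J/K (assumed)
C_ROOM = 200_000.0   # heat capacity of the room, in J/K (assumed)
T_TEA = 80.0         # initial tea temperature, in degrees C (assumed)
T_ROOM = 10.0        # initial room temperature, in degrees C (assumed)

# First principle: heat lost by the tea = heat gained by the room,
# so the equilibrium temperature is the capacity-weighted average.
T_EQ = (C_TEA * T_TEA + C_ROOM * T_ROOM) / (C_TEA + C_ROOM)

Q_tea = C_TEA * (T_EQ - T_TEA)     # negative: the tea gives off heat
Q_room = C_ROOM * (T_EQ - T_ROOM)  # positive: the room receives it

print(f"Equilibrium temperature: {T_EQ:.2f} C")
print(f"Heat lost by the tea:    {Q_tea:+.1f} J")
print(f"Heat gained by the room: {Q_room:+.1f} J (sum = {Q_tea + Q_room:.1f} J)")
```

With these assumed values the room barely warms up, so the common equilibrium temperature sits just above the room's initial temperature; the direction of the flow, from hot to cold, is what never reverses spontaneously.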


What does entropy measure?

Entropy measures the tendency of energy to disperse. It quantifies two things:

  • the degree of dispersion of energy (in all its forms: thermal, chemical, electrical) among the particles of a system,
  • and the degree of distribution of these particles in all directions throughout the accessible volume.


Entropy is a macroscopic phenomenon that only makes sense when a large number of particles is involved, a sine qua non condition for the appearance of irreversibility. Over time, entropy can only increase: there are more ways to disperse energy than to concentrate it. You have probably noticed that it is very easy to scramble a Rubik's Cube, but there is only one way to put it in order (even if there are several methods for doing so)!
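
To give a feel for the claim that there are more ways to disperse energy than to concentrate it, here is a minimal counting sketch in the spirit of Boltzmann's relation S = k·ln W, where W is the number of microscopic arrangements. The particle and quanta counts are arbitrary illustrative choices, not figures from the article.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(quanta: int, particles: int) -> int:
    """Number of ways to distribute indistinguishable energy quanta
    among distinguishable particles (Einstein-solid counting)."""
    return comb(quanta + particles - 1, quanta)

N, q = 100, 100  # illustrative choice: 100 particles sharing 100 energy quanta

concentrated = 1              # all quanta piled on one chosen particle: a single arrangement
dispersed = microstates(q, N) # quanta free to spread over all N particles

print(f"Arrangements, concentrated: {concentrated}")
print(f"Arrangements, dispersed:    {float(dispersed):.3e}")
print(f"Entropy of the dispersed case, k_B * ln(W): {k_B * log(dispersed):.2e} J/K")
```

The dispersed configuration corresponds to an astronomically larger number of arrangements, which is why, left to itself, a system drifts towards dispersed energy and never back.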

Any isolated system eventually reaches the state of maximum entropy, in which energy is uniformly distributed. This state is also known as « thermodynamic equilibrium ».


Thermodynamics

A pendulum story

Thermal energy is one of the various forms that energy can take in the universe. There are also electrical energy, chemical energy and mechanical energy. The latter itself comes in two forms: potential energy (related to altitude) and kinetic energy (related to speed).

Thermodynamics, which appeared in the nineteenth century, is historically the science of heat. Because thermodynamicists demonstrated the irreversible transformation of kinetic energy into thermal energy, called « entropy production », it can also be defined as the science of large systems at equilibrium, or the science of irreversibility.

Potential energy, kinetic energy, entropy… are you lost? Stay with me, I'll get my pendulum and explain! If I place the mass of the pendulum in the high position, it has maximum potential energy, because of its altitude, and zero kinetic energy, since it is motionless. Careful, I'm letting go! When it passes through its lowest point, the mass has zero potential energy. On the other hand, its speed, and thus its kinetic energy, are at a maximum. As the mass moves up on the other side, it loses speed and, at the same time, kinetic energy. But as it gains height, its potential energy increases.

As the pendulum swings, it is braked by air resistance. It then loses kinetic energy; more precisely, its kinetic energy is transformed into thermal energy: entropy increases.
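
Here is a minimal simulation sketch of that energy bookkeeping for a damped pendulum. The length, mass and air-resistance coefficient are assumed illustrative values, not figures from the article; the point is only that mechanical energy (potential + kinetic) decreases while the sum of mechanical energy and dissipated heat stays essentially constant.

```python
import math

# Illustrative parameters (assumed)
g, L, m = 9.81, 1.0, 1.0     # gravity (m/s^2), pendulum length (m), mass (kg)
b = 0.05                     # air-resistance (damping) coefficient, kg*m^2/s
dt, steps = 0.001, 20_000    # time step (s) and number of steps (20 s in total)

theta, omega = math.radians(60), 0.0  # released from 60 degrees, at rest
dissipated = 0.0                      # mechanical energy converted to heat so far

def mechanical_energy(theta, omega):
    potential = m * g * L * (1 - math.cos(theta))  # height above the lowest point
    kinetic = 0.5 * m * (L * omega) ** 2
    return potential + kinetic

E0 = mechanical_energy(theta, omega)
for _ in range(steps):
    # Damped pendulum: m*L^2 * d(omega)/dt = -m*g*L*sin(theta) - b*omega
    alpha = -(g / L) * math.sin(theta) - (b / (m * L**2)) * omega
    dissipated += b * omega**2 * dt   # power lost to air resistance, integrated over dt
    omega += alpha * dt               # semi-implicit Euler step
    theta += omega * dt

E = mechanical_energy(theta, omega)
print(f"Initial mechanical energy: {E0:.4f} J")
print(f"Final mechanical energy:   {E:.4f} J")
print(f"Dissipated as heat:        {dissipated:.4f} J")
print(f"Mechanical + dissipated:   {E + dissipated:.4f} J (approximately constant)")
```

The kinetic and potential energies keep trading places, but a little of the mechanical total leaks into disordered thermal agitation at every swing: that leak is the entropy production described above.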


Quality energy… or not…

What one must understand is that not all forms of energy are equivalent. Thermal energy is much less « useful » than the others, in the sense that it can never be fully transformed into work, whereas the opposite conversion is possible.

Thermodynamics thus teaches us the following two principles:

1st principle: in an isolated system, the total amount of energy, including thermal energy, is conserved.

In the universe, energy can neither be produced from nothing nor destroyed. It can only change form, or be transferred from one system to another. In the case of the pendulum, we therefore have: mechanical energy + thermal energy = constant.

2nd principle: even though the amount of energy is conserved, this does not mean that the system is stable, for the quality of the energy deteriorates: more and more of it is dispersed as unusable thermal energy.

I'll take my pendulum back and explain! While the mass is in motion, it is subjected to air resistance. This causes a microscopic, disordered agitation of the atoms that make up the mass, and a transfer of this thermal agitation from the system to the outside environment. The mass loses speed; its kinetic energy is transformed into thermal energy, which is dispersed and becomes unusable.

So there is still just as much energy in the system, but of lesser quality: entropy increases until the system reaches thermodynamic equilibrium (the state of maximum entropy). It can also be said that the second principle establishes the irreversibility of phenomena.
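
As a worked illustration of the second principle, consider a quantity of heat Q flowing from a hot body at temperature T_hot to a cold one at T_cold (the numbers below are illustrative, not taken from the article). The entropy lost by the hot body is smaller than the entropy gained by the cold one, so the total entropy can only increase:

```latex
\Delta S_{\text{total}}
  = -\frac{Q}{T_{\text{hot}}} + \frac{Q}{T_{\text{cold}}}
  = Q\left(\frac{1}{T_{\text{cold}}} - \frac{1}{T_{\text{hot}}}\right) > 0
  \quad \text{since } T_{\text{hot}} > T_{\text{cold}}.
% Illustrative numbers: Q = 100 J, T_hot = 350 K, T_cold = 280 K
\Delta S_{\text{total}}
  = 100\,\mathrm{J}\left(\frac{1}{280\,\mathrm{K}} - \frac{1}{350\,\mathrm{K}}\right)
  \approx +0.07\ \mathrm{J/K}.
```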


Between order and disorder

According to thermodynamics, entropy can therefore only increase in the universe, which standard physics regards as an isolated system. So be it. Yet we have witnessed an irreversible increase in complexity since the « Big Bang ». The universe has indeed evolved from a « plasma soup » close to thermal equilibrium to the formation of galaxies, planets and even human beings: in other words, highly ordered structures.

It would seem, then, that order does not contradict the general tendency of the universe towards disorder. Does this mean that organization has a cost, and that disorder is the price to pay for the organization of the universe? From a thermodynamic point of view, this is not so strange: one can indeed conceive of the appearance of ordered structures, as long as disorder develops simultaneously.

So everything would be for the best in the best of all isolated systems… if dissipative structures did not exist!


Dissipative structures


Dissipative structures are open systems. Far from thermodynamic equilibrium, they are the seat of spontaneous organization. The physicist and chemist Ilya Prigogine gave them this name in 1969.

Associating the terms structure and dissipation is like associating order and chaos: a bit paradoxical, isn't it? Not if we look at the second principle of thermodynamics from a new angle: no longer the one in which irreversibility leads the system to the state of maximum entropy, but the one in which irreversibility becomes a source of coherence and self-organization.

Unlike chaotic systems, which depend only on their initial conditions, dissipative structures are conditioned by permanent disturbances, or fluctuations.


« Our world presents persistent interactions (…) Classical mechanics considers isolated movements, whereas irreversibility only makes sense when we consider particles immersed in an environment where interactions are persistent. »
[14]


In this sense, an aneurysm is more a dissipative structure than a chaotic system [15].

The division of the large vortices that make it up into smaller ones allows a transfer of energy from large to small scales. We speak of energy cascades, which cause strong energy dissipation and thus lead to an increase in entropy.

This state of non-equilibrium is nevertheless stabilized by the energy that the vortex system draws from its environment. More precisely, it accumulates energy by resonance [16], and thus compensates for entropy. In the end, it feeds itself and maintains the organization of its structure… only up to a point, however. In the case of an aneurysm, the natural evolution is towards an inevitable increase in its caliber, and every aneurysm is eventually threatened with rupture: the higher the blood pressure, the larger the radius of the aneurysm, the greater the tension on the wall, and the less the wall can resist…
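
The runaway described here can be related to the thin-walled form of Laplace's law, which links the tension in a wall to the pressure and the radius of the sac it encloses. This is a simplified illustration under the assumption of a thin-walled, roughly spherical sac; the article itself does not state the formula.

```latex
T = \frac{P\,r}{2}
\qquad
\sigma = \frac{P\,r}{2\,e}
% T: wall tension, P: blood pressure, r: radius of the sac,
% sigma: wall stress, e: wall thickness
```

At a given pressure, a larger radius means a higher wall tension; and since dilation also tends to thin the wall, the stress grows even faster. This is the positive feedback that eventually threatens every aneurysm with rupture.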


Entropy and negentropy, a great love story


Before Ilya Prigogine, Erwin Schrödinger [17] had already pointed out the physical possibility of « negative entropy » processes. Schrödinger wanted to mark the difference between physical thermodynamic processes and the processes of life. Along these lines, the French mathematician and physicist Léon Brillouin coined the term « negentropy » in 1956 to replace the expression negative entropy [18].

Thermodynamics teaches us that the tendency of energy is to go from order to disorder. Experience shows, however, that irreversibility is rather a source of coherence in the universe, through dissipative structures, and admittedly at the cost of a large amount of energy, because the increase of order inside a structure leads to an increase of disorder outside, and vice versa.

Rather than being opposed, entropy and negentropy are in fact complementary. They are part of the same dynamic, that of self-organization, the one that builds order out of disorder.

As we will see in the next article, Gravity, entropy and self-organization, the difference between physical thermodynamic processes and vital processes is not as clear-cut as it seems. The notions of feedback and resonance, highlighted by Nassim Haramein and essential to the constitution of a hierarchical level of organization, are valid for all processes!



Key points

  • Time is a human concept and not a characteristic of the universe. What characterizes the universe is memory.

  • The storage of information explains both the course and the arrow of time, and thus the irreversibility of phenomena. Time is an emergent property of the dynamics of the universe.

  • Entropy measures the tendency of energy to disperse. It is a macroscopic phenomenon that only makes sense when a large number of particles is involved, a sine qua non condition for the appearance of irreversibility. Entropy can only increase in an isolated system.

  • In dissipative structures, irreversibility does not lead the system to the state of maximum entropy, but becomes a source of coherence and self-organization, albeit at the cost of a large amount of energy.




Notes and references


The irreversibility of phenomena

[1] KLEIN Etienne, Faut-il distinguer cours du temps et flèche du temps ? [Is it necessary to distinguish between the course of time and the arrow of time?], p. 9 (in French)
[2] KLEIN Etienne, Le temps, son cours et sa flèche [Time, its course and its arrow], L'université de tous les savoirs, lecture no. 188, 6 July 2000 (in French)
[3] KLEIN Etienne, Faut-il distinguer cours du temps et flèche du temps ?, op. cit., p. 10


Memory rather than time

[4] HARAMEIN Nassim (20 June 2015), L'univers connecté [The connected universe]


[5] Since we are talking about memory rather than time, it is more accurate to speak of space-memory rather than space-time.
[6] The notion of feedback was discussed in particular in the section Chaotic systems in the connected universe of article 2.
[7] See the helical dynamics of the solar system: according to Nassim Haramein's model, the planets and stars never pass through the same coordinates, because they move in a spiral around the Sun.
[8] The holographic principle will be developed in the article Gravity, entropy and self-organization. You can also read the article The holographic universe: the underlying unit.
[9] « An emergent property is a characteristic that is unpredictable (or at least invisible) at the local level and that appears at the global level. It results from the collective activity of the system's constituents. » ZWIRN Hervé, Les systèmes complexes [Complex systems], Paris, Éditions Odile Jacob, 2006, p. 35
[10] PRIGOGINE Ilya, quoted by Etienne KLEIN, Faut-il distinguer cours du temps et flèche du temps ?, op. cit., p. 11
[11] PRIGOGINE Ilya, La fin des certitudes [The end of certainties], Paris, Éditions Odile Jacob, 1996

Complexity rather than chaos

[12] We will see in the next article, Gravity, entropy and self-organization, that entropy can also be approached from an information perspective.
[13] Thermal agitation, the microscopic agitation of molecules and atoms, is measured by temperature.
[14] PRIGOGINE Ilya, La fin des certitudes, op. cit., p. 133
[15] An aneurysm is a dilation of the wall of an artery that creates a pocket inside which the blood changes its behavior. See also the section on the aneurysm as a chaotic system in article 2.
[16] The concept of resonance has already been discussed in article 1, Chaotic systems, and will also be detailed in article 4, Gravity, entropy and self-organization.
[17] In his book What is Life?, London: Cambridge University Press, 1944
[18] The term « syntropy », proposed by L. Fantappiè in 1944, is sometimes used, but it was not adopted.
