
Systems Research and Behavioral Science
Syst. Res. 28, 353–368 (2011)
Published online 2 March 2011 in Wiley Online Library (wileyonlinelibrary.com) DOI: 10.1002/sres.1084

■ Research Paper

On the Entropy of Social Systems: A Revision of the Concepts of Entropy and Energy in the Social Context†

Thomas Mavrofides¹*, Achilleas Kameas², Dimitris Papageorgiou¹ and Antonios Los¹

¹ University of the Aegean, Mytilene, Greece
² Hellenic Open University, Patras, Greece

* Correspondence to: Thomas Mavrofides, University of the Aegean, Terpandrou 5 Str., 81100 Mytilene, Greece. E-mail: [email protected]
† This article was published online on 2 March 2011. An error was subsequently identified in the ordering of author first and surnames and some additional minor amendments have been made to the text. This notice is included in the online and print versions to indicate that both have been corrected 11 March 2011.


The main problem that systems theory tries to solve is the problem of complexity. The notion of complexity is very often correlated to variables such as entropy and energy, organization and disorganization and others that eventually converge to the common ground of comprehensibility or incomprehensibility. In this paper, we try to shed light on the notions of entropy and energy as they should be conceived in the theoretical framework of social sciences. We analyse the different meanings of entropy, dealing with the social systems as if they were information management systems (following Niklas Luhmann), and we also present an initial approach as to the meaning of energy for those systems, bringing Pierre Bourdieu’s theory of symbolic capital into a Luhmannian context. Copyright © 2011 John Wiley & Sons, Ltd.

Keywords entropy; energy; autopoiesis; social systems

INTRODUCTION

The term entropy is very often found in works about societal complexity. The theoretical framework that tackles the concepts of complexity as an emergent phenomenon is of course that of systems theory (Wiener, 1961; Bertalanffy, 1968). But the meaning of entropy in the social context is often not clear, and it results in obscure arguments about the factors that continuously reproduce complexity. In particular, references to entropy usually imply disorganization and quite often leave aside the fact that disorganization (or disorder) can justifiably be considered as constraints imposed on an observer by his language, that is, his system of distinctions (Foerster, 2003, p. 280). Theoretical frameworks such as the system entropy theory (SET) (Bailey, 1990, 1997a, 1997b, 2006a, 2006b) have been developed and refined to measure entropy as an indicator of the internal state of social systems, namely, their disorder as a temporal variable.

Received 23 March 2010; Accepted 28 January 2011


More specifically, entropy is generally considered as a measure of the ability to predict the next state of a system. If the next state is highly predictable, then entropy is considered to be low and vice versa; consequently, a system that presents low entropy is considered to be organized and, by deduction, desirable. Therefore, predictability seems to be the keyword when it comes to organization and when references to entropy appear (Wiener, 1961; Arnopoulos, 2001). If this is the case, then the univocal use of the thermodynamic meaning of entropy in the social sciences context could contingently lead to all kinds of misunderstandings (Bateson, 2000, pp. 458–459). Entropy has at least two distinct scientific meanings and also has its relevant counterparts: energy and certainty.

In this paper, we try to shed light on the two different meanings of entropy and draw clear distinctions as to the contexts that those meanings pertain to. That may supply contemporary systems theory with new perspectives, which could bring forth a new conception of the importance of otherness for social systems.

ENTROPY IN THE THERMODYNAMICS CONTEXT

First, let us try to clarify the older (historically speaking) meaning of entropy, that is, the entropy of thermodynamics. There are two ways to consider and measure entropy: (i) as a measure of the unavailable energy in a closed thermodynamic system; and (ii) as a measure of the disorder of a closed thermodynamic system. The first measure is associated with the conversion of heat energy to mechanical energy. The second is associated with the probabilities of the occurrence of a particular molecular arrangement in a gas.

To recall that concept, let us use an example. Suppose we have an adiabatic envelope (a completely insulated chamber). We have a source of energy, say a lighter, inside that envelope, and the envelope itself is full of gas. If, by any means, we use the energy contained in our source to heat the gas (i.e. we just light up the lighter), after some time, we will end up in a state where the gas will have the same temperature: every molecule will have absorbed the same amount of energy (more or less).


That will result in an unpredictable (and faster than before) movement of the molecules of the gas in any possible direction; no prediction about their movement can be made, and therefore, no certainty is possible at the micro level. Our energy source will be exhausted, and so will be our ability to probabilistically predict their next position. We usually call this situation chaotic. But at this point, keep in mind that there is no such thing as a perfectly isolated envelope (Popper, 1957, p. 151).

Without any further investigation, we can note some interesting aspects of our experiment:

(1) The procedure took time to complete. The dissipative energy rose and our energy source was exhausted. That is, the amount of available energy dropped to zero and the amount of entropy rose to its maximum. We can say that, as soon as the convection started, the gradual increase of entropy could be used as a timer, ticking the moments to the end; inversely, we could use the decrease of energy in our energy source as a measure of time.

(2) During our experiment, we were able to predict the course of the molecules of the gas with a certain degree of statistical certainty. The colder molecules were going down, and the warmer ones were going up, forming a current of hot gas. That was work in progress, an intentional change. We conceive of work as the process that ensures intentional changes in a context: so, there was no work before the experiment, and there cannot be any work after the end of it. There was no certainty before we started using our energy source, and there is no certainty after it was exhausted.

(3) The amount of energy contained in our adiabatic envelope is constant; no energy is lost (because of the first law, the conservation of energy). But we no longer have a form of energy that we can use within that envelope to produce work (because of the second law). [Correction made here after initial online publication.] That is, whenever we talk of work, we refer implicitly to the available (i.e. useful or organized) energy. If we now try to collect the energy from the molecules back to our original source, that would mean a production of work,1 and there is no energy—at least not in an appropriate form—to use so as to complete that task.

(4) Consequently, our original source of energy was in an appropriate form (so as to produce work).

So, we used our source until it was exhausted, and we ended up with a total inability to do anything else. Before our experiment, there was potential; during the experiment, there was statistical certainty; and at the end, we have concurrently total certainty (for we are sure there is nothing more we can do) and total uncertainty (as to the trajectories of the molecules of the gas). And we are stuck. We can make no decisions because there are no options to select from. We reached a dead end.

But before we go on, what was it that we called a ‘dead end’? Clearly, it is the state at which there is no potential, that is, no alternatives to select from. To put it differently, there are no distinctions to draw; the state (the final conditions) is given, and there is nothing we can do to select another. Up to this point, we can conclude that entropy in the domain of thermodynamics measures useless energy (and not lack of energy) and indirectly reflects uncertainty, and also, that maximum entropy signifies complete inability to select a successive state.
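The statistical reading of this conclusion can be made concrete with a minimal sketch (our own illustration; the microstate counts and probabilities are hypothetical). Using the Gibbs formula S = −k_B ∑ p_i ln p_i, entropy reaches its maximum, k_B ln n, exactly when every microstate is equiprobable, that is, when no successive state can be predicted:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln(p_i)) over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

n = 1000  # hypothetical number of distinguishable microstates

# Before equilibrium: energy is still concentrated, a few microstates
# dominate, and the next state is statistically predictable.
concentrated = [0.9] + [0.1 / (n - 1)] * (n - 1)

# At equilibrium: all microstates are equiprobable (p = 1/n) and entropy
# is at its maximum, k_B * ln(n); no successive state can be preferred.
uniform = [1 / n] * n

print(gibbs_entropy(concentrated) / K_B)  # ~1.02 (in units of k_B)
print(gibbs_entropy(uniform) / K_B)       # ln(1000) ~ 6.91, the maximum
```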


ENTROPY IN THE COMMUNICATIONAL CONTEXT

Let us try now to examine the meaning of entropy in the context of information-exchanging systems.2

Based on the work published by Shannon (1948), we wish to concentrate on the properties and characteristics of discrete channels. This preference occurs because of the fact that communication is triggered by ‘a sequence of choices from a finite set of elementary symbols’ (Shannon, 1948, p. 3), that is, a natural language, spoken or otherwise,3 and, to put it more generally, a sequence of discrete selections or states, as is the case in the domain of cybernetics (Ashby, 1957).

1 And it would also imply the existence of a perpetual machine (Popper, 1957, p. 152).
2 The reader at this point should keep in mind that the term data is often preferred over the term information, because usually, the latter is understood to carry interpretative connotations.
3 We chose to note that communication is ‘triggered’ rather than ‘constituted’ from a set of symbols, for reasons that hopefully will become clear in the next pages of this article.


Also, following Luhmann (1986, 1995), we consider social systems to be communications systems, that is, systems that are constituted through communications (communicative selections), so examining entropy in the communicational context is more appropriate and, as we intend to prove, more plausible.

Shannon points out that each symbol in the sequence of a message depends on certain probabilities, varying according to the previous symbols already transmitted, that is, what he calls a ‘residue of influence’ (Shannon, 1948, p. 8). Therefore, he suggests that we could perceive of a discrete source as a stochastic process, a process where each successive selection is dependent on the previous ones.4 Thus, entropy in Shannon’s approach is defined as a measure of the probability of the next symbol to appear in the message sequence, and therefore, entropy in the communicational context refers to a generalization of Boltzmann’s statistical entropy. Consequently, entropy refers to the variation of uncertainty during the transmission of a message; in Shannon’s own words, ‘Quantities of the form H = −∑ p_i log p_i … play a central role in information theory as measures of information, choice and uncertainty’ (Shannon, 1948, p. 11). It is of utmost importance for our analysis that Shannon refers to informational entropy as a measure of ‘information, choice and uncertainty’, for it is exactly the communicative selections (choices) of systems that produce and reproduce those specific concepts (‘information, choice and uncertainty’).5

4 Those processes are also known as Markov processes.
5 Informational entropy is often referred to as negentropy or negative entropy. One possible explanation for this, adopted by numerous authors (Ho, 2010), is the minus sign that Shannon put before his formula. The only reason for that seems to be that the logarithms of probabilities always yield a negative number, because a probability is either 1 (i.e. certainty, and log 1 = 0) or less than 1. Schrödinger (1944, p. 26) also used the term negative entropy, but stating explicitly that he refers to free energy. On the same page, Schrödinger reforms Boltzmann’s equation as −(entropy) = k log(1/D) [correction made here after initial online publication], thus introducing a negative sign; apparently, for the same reasons, Shannon did the same four years later. The term is also used by Bertalanffy (1968, p. 42) as ‘information’ and by Wiener (1961, p. 11), with information being the ‘negative’ of entropy. In our discussion here, we chose not to use the term negentropy in order to stay close to Shannon’s original approach and avoid introducing possible ambiguities.
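To make Shannon’s formula concrete, here is a minimal sketch (our own illustration; the symbol probabilities are hypothetical) computing H for a few discrete sources:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum(p_i * log(p_i)); in bits for base 2. Terms with p = 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A four-symbol source: the more uneven the probabilities, the lower the
# entropy, i.e. the less uncertainty (or 'surprise') the source carries.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits, the maximum for 4 symbols
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits
print(shannon_entropy([1.0]))                     # 0.0 bits: complete certainty
```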


Shneider (2010, p. 3), following M. Tribus, suggests that uncertainty could also be called ‘surprisal’. That is, if there is a set of M available symbols,6 and, at a certain point of the message sequence, a symbol u, which has a probability of appearance P_i that approaches 0, eventually appears, the receiver will be surprised. That means that the receiver has expectations, and those are constituted during and because of the communication and are defined (or bounded) by the receiver’s conception of the communicational content and context (e.g. a language, or more generally, a culture). This leads eventually to a circular procedure: communication produces expectations, which in turn reproduce communication. And this recursive process stabilizes certain bilateral expectations that, so to speak, define an intersubjective space that makes communication possible (Luhmann, 1995). Exactly this is what Shannon defines as ‘redundancy’. Redundancy is defined as ‘One minus the relative entropy’ (Shannon, 1948, p. 14). But how can we conceive of the notion of redundancy? Shannon’s definition is strictly mathematical. To exemplify the notions of entropy and redundancy, let us try a simple example. Suppose you toss a (supposedly) fair dice. You can say ‘I know that the outcome will be in the sample space [1, 2, 3, 4, 5, 6], and additionally, I know that the possible outcomes are equiprobable with a probability equal to 1/6’. How can you know that? The answer is, because of prior experience: you know what a fair dice is, you know what tossing is, and there are no extra variables in your experiment, and therefore, your argument will always be valid. Your knowledge constructs a context that we call redundancy, that is, an informational framework about what is going to happen next or is happening already, built from what you already know. In this particular example, entropy (the measure of uncertainty) drops to zero and of course (according to Shannon) redundancy equals 1 (100%).7

6 Those symbols could also represent the states of a system, in this case, those of the source of the message.
7 If you try to predict the outcome of a single toss, the entropy is at its maximum. In Shannon’s own words, ‘Thus, only when we are certain of the outcome does H … “(entropy)” … vanish. Otherwise H is positive … H is a maximum and equal to log n when all pi are equal (i.e. 1/n). This is also intuitively the most uncertain situation’ (Shannon, 1948, p. 11). So Shannon explicitly equates maximum uncertainty with maximum entropy.


Again, you have expectations, and these arise because of redundancy. But suppose now you toss the dice 100 times, and surprisingly, 90 of those times you get a ‘6’ and different outcomes the rest of the times; redundancy starts to drop (or certainty starts to collapse), surprisal steps in and entropy rises accordingly as you go past 100 tosses trying to verify that the outcomes are equiprobable, but every next set of tosses verifies that something is wrong. Eventually, you realize that this is not a fair dice, and that using it gives a probability of 0.9 to ‘6’ and perhaps 0.02 to each of the rest of the numbers. A new variable steps into your environment, namely, the notion of a ‘crooked dice’. Now, you have new information, albeit about that certain dice, and that happened because, for a while, the entropy had risen. Your knowledge has changed, that is, you experienced a change (Maturana and Varela, 1980, pp. 11–12) of your own state. ‘One can speak of change only in relation to structures. Events cannot change, because there is no duration between their emergence and their passing away … Only structures keep what can be continued (and therefore changed) relatively constant. Despite the irreversibility of events, structures guarantee a certain reversibility of relationships. On the level of expectations, … a system can learn, can dissolve what has been established, and can adapt to external or internal changes’ (Luhmann, 1995, p. 345). And also, ‘By information we mean an event that selects system states’ (Luhmann, 1995, p. 67). The event that Luhmann refers to was, in our case, the abnormal ‘behavior’ of the dice. Of course, after a while, when certainty about the specific dice will have risen high, you would stop tossing it, exactly because entropy will start to drop again: the event ‘… retains its meaning in repetition but loses its value as information … The information is not lost, although it disappears as an event. It has changed the state of the system and has thereby left behind a structural effect; the system then reacts to and with these structural effects’. Redundancy then can be conceived as ‘[a] surplus of informational possibilities …’ (Shannon, 1948, p. 172), that is, a way to reduce entropy, which of course is in the same vein as Shannon’s original idea.
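The figures of the dice example can be verified with a short sketch (our own illustration, using the hypothetical probabilities above). It computes the entropy of the fair and the crooked dice, Shannon’s redundancy as one minus the relative entropy H/H_max, and the surprisal −log2 p of a rare outcome:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = [1 / 6] * 6
crooked = [0.9] + [0.02] * 5   # '6' with probability 0.9, the rest 0.02 each

h_max = math.log2(6)           # log n: entropy of the equiprobable (fair) case
for name, probs in [("fair", fair), ("crooked", crooked)]:
    h = entropy(probs)
    print(name, round(h, 3), "bits; redundancy =", round(1 - h / h_max, 3))
# fair:    2.585 bits; redundancy = 0.0
# crooked: 0.701 bits; redundancy ~ 0.729 -> expectations dominate

# Surprisal of a single outcome: a non-'6' from the crooked dice is a rare,
# highly informative event compared with any outcome of the fair dice.
print(round(-math.log2(0.02), 2), "bits vs", round(-math.log2(1 / 6), 2), "bits")
```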


So entropy measures the ‘surprisal’ or—perhaps more specifically—the importance of an event8 for the receiver; that is to say, it is the measure of deviations from his expectations, and the measure of surprisal of the receiver is correlated to the selections of the source. But what is the relation of surprisal to information, and of information to energy or entropy?

THE RELATION OF INFORMATION TO UNCERTAINTY

As we saw, entropy measures uncertainty, albeit indirectly in the thermodynamic context and in a straightforward manner in the theory of information. Therefore, entropy is also a reverse9 measure for information. But what is information? From Shannon’s viewpoint, information is the measure of the reduction of statistical entropy. Before a transmission (or an event), there is an uncertainty H(x) for the receiver as to the next symbol to be transmitted. After the transmission (of symbol y), the uncertainty is reduced by R10:

R = H(x) − Hy(x)   (1)

To be sure, the symbol R stands for the rate of actual transmission (Shannon, 1948, p. 20) and clearly shows that information is transmitted if and only if H(x) > Hy(x) > 0. If Hy(x) = 0, then y is independent of x, and there is no information. Exactly the same applies if H(x) = Hy(x), for in the latter case, there is no transmission at all (R = 0). Shannon states clearly that Hy(x) is the conditional entropy, that is, the possibility of occurrence of the event y after the event x. It follows that if Hy(x) = 1, that is, if the receiver is 100% sure that y will occur and it actually occurs, then the information produced is 0 [for H = −p_i log(p_i), which is 0 when p_i = 1]. Therefore, in the latter case, it is irrelevant from an informational point of view whether y is transmitted at all.

8 Maturana and Varela use the term perturbation to denote an event that causes a structural change: ‘… we can view these perturbing independent events as inputs, and the changes of the machine that compensate these perturbations as outputs’ (1980, p. 82). Thus, the term perturbation could be considered an approximate synonym of the term event as used by Niklas Luhmann when he refers to structural changes, that is, changes to the perception of the environment (see above Luhmann’s reference on the correlation of event to information and structural change). Luhmann himself, though, did seem to prefer the term event.
9 Thus the term ‘negative entropy’.
10 Shannon denotes the conditional probability as y(x), but in other texts, it is denoted as (y|x), so formula (1) could equivalently be written as R = H(x) − H(y|x).


Certain conclusions can be drawn from those remarks:

(1) The rate of transmission of a signal is independent of the rate of transmission of information. That is, the transmission of a signal alone does not necessarily pertain to the transmission of information, and consequently, transmission does not guarantee communication. That is to say that the rate of information transmission is always lower than the rate of transmission of the signal.

(2) If a state (or symbol) that is absolutely expected eventually presents itself, then the uncertainty is not reduced.

(3) States that have a low probability of appearance produce a large quantity of information. The lower the probability, the higher the communicational value of the symbol transmitted (or of the next state presented).

(4) Only unpredictable sequences of states produce information and therefore constitute communication. If certainty is not disrupted, information is not produced.

(5) Because the receiver defines the informational value of the symbol y as the conditional probability of y occurring after x, it is the receiver who connects the states together as successive instances of the same phenomenon: namely, communication. To put it differently, the receiver attributes meaning to y according to x, perceiving them as successive phenomena in a deterministic fashion. It must be clear here that the deterministic aspect of that procedure is also constructed by the receiver. If the phenomena turn out to be independent, then no information is produced; that is to say that no meaning can connect them [and Hy(x) is indefinite because P(y|x) = 0].
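Formula (1) and conclusions (1) to (3) admit a compact numerical illustration (our own sketch; the joint distribution of a noisy binary channel is hypothetical): the channel below emits one binary symbol per use, yet the rate of actual transmission R stays well below 1 bit, and it drops to 0 when x and y are independent:

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y): rows = source symbol x,
# columns = received symbol y (a noisy binary channel).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal p(x)
py = [sum(col) for col in zip(*joint)]  # marginal p(y)

# Equivocation Hy(x): the uncertainty about x that remains once y is received.
hy_x = -sum(pxy * math.log2(pxy / py[j])
            for row in joint
            for j, pxy in enumerate(row) if pxy > 0)

R = H(px) - hy_x  # formula (1): rate of actual transmission
print(round(H(px), 3), round(hy_x, 3), round(R, 3))  # 1.0, 0.722, 0.278

# With independent x and y, e.g. joint = [[0.25, 0.25], [0.25, 0.25]],
# H(px) = Hy(x) = 1 and R = 0: signals flow, but no information.
```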

It is of utmost importance to be clear at this point that what we are examining is an observing system, to be sure a self-referential system that observes its environment, and our discussion moves around the uncertainty the observing system experiences when interacting with an entropic environment.



Now, at the core of the theoretical framework of systems theory lies Bateson’s famous definition of information: ‘Of this infinitude [of differences], we select a very limited number, which become information. In fact, what we mean by information—the elementary unit of information—is a difference, which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy. The pathways are ready to be triggered. We may even say that the question is already implicit in them.’ (Bateson, 2000, p. 459). The whole paragraph is included here because, as we will see, it says somewhat more than an abstract definition of the meaning of information. This reference, when examined within its original systemic theoretical context, describes the relation of a system to its environment. Let us try to elicit certain conclusions from it:

(1) Only differences (i.e. changes through time, or different aspects of concurrent phenomena) produce information. Continua do not produce information; therefore, a steady environment does not contribute to information.

(2) Not every difference counts as information.
(3) The types of sensors a system has (i.e. inputs), along with their pathways, predefine what the system can conceive as information, and as a consequence, the system is inevitably bounded into a specific abstraction of its environment. That is, the environment of the system is an abstraction of its contingent environment. We deduce that if the system finds a way to develop new sensors, its environment expands.

(4) The system consumes energy to collect information, for the ‘pathways are … provided with energy’ (Bateson, 2000, p. 459); that is, the system uses energy to conceive of its environment. The system not only needs energy to trigger its outputs but also needs it to read its inputs. It follows that the system must be open to energy sources from its environment, or else the second law of thermodynamics applies and entropy tends to its maximum.


(5) The system is an observer.
(6) The observer conceives of his environment by posing questions, namely, only those questions he is able to form.

(7) The transformation of surprisal into information takes place within the observing system only. The environment cannot define the production of meaning, because the environment cannot define the observing system’s inputs (let alone its internal organization).

A PRIMER ON AUTOPOIESIS

In order to achieve an integration of the aforementioned concepts, we need to recall the theory of autopoiesis (Maturana and Varela, 1980) and use it as a tier, so to speak, which can help bring those views (theories) together.

The term autopoiesis (Greek, auto: self, poiesis: creation) was coined by the Chilean biologists Humberto Maturana and Francisco Varela in an endeavour to define rigorously the characteristics of living systems. They came to the conclusion that a system can be considered as living if and only if that system continuously recreates itself; they called it an ‘autopoietic system’. By definition, ‘An autopoietic machine is a machine organized (defined as a unity) as a network of processes of production (transformation and destruction) of components that produce the components which: (i) through their interactions and transformations continuously regenerate and realize the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in the space in which they (the components) exist by specifying the topological domain of its realization as such a network’ (Maturana and Varela, 1980, p. 72). [Correction made here after initial online publication.] The consequences of this definition are paramount. In short, ‘autopoietic machines are autonomous’ and ‘subordinate all changes to the maintenance of their own organization’ (Maturana and Varela, 1980, p. 80); they have individuality, they always function as unities and their autopoietic network is strictly internal and circular: the living machine regenerates itself, and therefore, autopoiesis triggers autopoiesis in a circular process.


That is not to say that living systems remain unchangeable; on the contrary, it is exactly the change—a continuous process of becoming—that guarantees the continuation of the living system as such in a nontrivial environment. Therefore, autopoiesis is a continuous process of becoming that conserves being. We need to emphasize here that in no way is autopoiesis governed by the living system’s environment; autopoiesis remains autonomous, or else there is no autopoiesis and the living system disintegrates. Putting it another way, we can say that the system changes in a circular homeostatic procedure, which is triggered but in no way defined by the environment.

This is surely more than adaptation; the system reacts to its environment by reconstitution of its boundaries. Thus, it continuously manifests its difference from its environment, and in doing so, the system internalizes this distinction as its basic mode of existence (Luhmann, 1995).

Another important aspect of the theory of autopoiesis is that of the system’s autocatalysis; the living system, because of its autopoietic procedure, destroys its very own components and creates new ones; at any given moment, the only thing that is really important is the ability of the components to participate supportively in the autopoietic cycle, whatever other characteristics they may hold (which of course could signify their own autonomy). Autopoietic machines—or living systems—are in fact relation-static systems rather than homeostatic (Maturana and Varela, 1980). And it is only through continuous catalysis/recreation that the system achieves the continuous reconstitution of itself; namely, through the confirmation and manifestation of its individuality, continuously identifying itself as such, it achieves a unity—in an ontological sense—in time and space: ‘Pure ostension plus identification conveys, with the help of some induction, spatiotemporal spread’ (Quine, 1980, p. 68).

But those remarks bring forth a great problem if one intends to use the autopoietic paradigm in the sociological context; the idea of a ‘super-process’ that destroys the components of social systems is obviously not appealing—in fact, it is grotesque. For if one conceives of social systems as networks of people (e.g. actors), then one would boldly refuse an autopoietic procedure that—for instance—would ‘work for the public interest’, disregarding the individuals (or worse).

Thankfully, Niklas Luhmann paved the way to a new theoretical apparatus that brings the notion of autopoiesis into sociology and evades the problem of systemic autocatalysis; he came up with the notion that humans (psychic systems) are not in fact part of social systems but of their environment. Apparently, that poses a paradox: how can there be a society without people? But there were never psychic systems in the society, replies Luhmann. Social systems are constituted from communications and function by the production of meaning. That is, psychic systems do not communicate with each other directly (as if their nervous systems were interacting directly) but through the social system, and in doing so, they reproduce it; every communicative action is inherently social and vice versa: there cannot be any communication outside the social system. The ‘components’ of the social systems are exactly those communicative actions. It is now understandable how the social systems can catalyse their components; a stable repetitive communicative action does not contribute to communication (for in that case, R = 0). Consequently, in an ongoing communicative context, the content consists of communicative actions that must be catalysed in order to give place to the next ones so that communication can produce communication, and every previous step paves the way for the next one. And communicative actions must also be connected to each other with meaning (for Hy(x) must be greater than zero). Therefore, Luhmann concludes that social systems owe their coherence to meaning (Luhmann, 1986, 1995).

THE IMPORTANCE OF ENTROPY FOR SOCIAL SYSTEMS

At this point, we wish to clarify the meaning of certain terms together with their counterparts, namely, certainty/uncertainty, organization/disorganization and action/inaction. Usually, for reasons that are widely considered obvious, certainty correlates to organization and uncertainty to disorganization. Consequently, predictability (i.e. a clear answer to ‘what if …’) is considered a sine qua non for any work, that is, to take action. This is so trivial that it led to several misunderstandings and gave birth to a paradox: if—as we saw—only uncertainty can contribute to information, are we to deduce that information and action belong to competitive—and perhaps mutually exclusive—realms? This is definitely absurd because it leads to the conclusion that only uninformed systems take action. Another path we could take is to suppose that information and action are independent of each other, but everyone knows that this is never the case, for an action is an endeavour to change a state of affairs, and so, we must accept that an action is an informed selection.

Let us try to inspect the matter more closely. Following Bateson (2002, p. 95), we can theorize two systems that function in interdependence: a system that opens a gate or relay, that is, a network of triggers, and a system whose energy flows through that gate when it is opened. It follows that only uncertainty triggers action, that is, only a difference can bring forth an endeavour for change. But living systems—as autopoietic systems—need to take continuous action to avoid disintegration. And the only source of differences that can count as information must be outside their own boundaries; that is, the only reason for the living system to act lies in its environment—and this includes any operation and especially autopoiesis. Furthermore, that environment must be unpredictable, at least to a certain degree. The autopoietic system is in need of information, and therefore surprisal, to continuously trigger its own self-creation: ‘Every event, every action appears with a minimal feature of surprise, namely, as different from what preceded it … Uncertainty is and remains a condition of structure. Structure would cease were all uncertainty to be eradicated, because structure’s function is to make autopoietic reproduction possible despite unpredictability’ (Luhmann, 1995, p. 288). Again, this is not adaptation, for it is the system that transforms the event into information—not the environment. The autopoietic system is closed with respect to meaning production; therefore, the autopoietic system ‘adapts’ to its environment only by (and because it can retain) its individuality as autopoietic organization. And that organization (autopoiesis) initially emerges because its environment is unpredictable.

This leads to a spectacular inference: uncertainty need not entail disorganization. On the contrary, uncertainty fires up a living system’s self-reconstructing processes (i.e. autopoiesis) and even more: if we take into account the notion of internal differentiation (Luhmann, 1992, 2002; Willke, 1996), we conclude that uncertainty actually triggers the emergence of new systems, for as a living system tries to compensate every new experience, it can—contingently—create subsystems as solutions to its environment’s unpredictability, according to Ashby’s Law of Requisite Variety (Ashby, 1957, pp. 206–207); or, because of structural coupling (Maturana and Varela, 1980, pp. 107–111), a number of systems can form a new entity of autopoietic nature, just because of their repetitive (and continuously changing) interactions, as each one of them beholds the others as its own environment.

Therefore, uncertainty can attract a living system to higher complexity: the system absorbs uncertainty from its environment and transforms it into organization, increasing its internal complexity. It follows that the living system evolves because of its environment’s entropy. To put it another way, systems in general are solutions to the problem of complexity, because a surplus of complexity fires up the basal operation of distinction that precedes and entails organization—environmental complexity motivates the self-referential systems: ‘There can be no distinction without motive, and there can be no motive unless contents are seen to differ in value’ (Spencer Brown, 2008, p. 1). And the self-referential systems are both solutions and problems, for their continuation (the preservation of their individuality) becomes their main problem, and also, being autopoietic, they become complex for the other self-referential systems in their environment.
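The reference to Ashby’s Law of Requisite Variety can also be read numerically (a toy sketch of the standard logarithmic form of the law; the state counts are hypothetical): the uncertainty a regulator leaves unabsorbed is at best the variety of the disturbances minus its own variety, which is why unpredictable environments select for internally more complex systems:

```python
import math

def residual_uncertainty(disturbance_states, regulator_states):
    """Ashby's law in log form: the best-case outcome variety (in bits) is
    log2(D) - log2(R), floored at zero."""
    return max(0.0, math.log2(disturbance_states) - math.log2(regulator_states))

print(residual_uncertainty(64, 4))   # 4.0 bits of environmental variety left over
print(residual_uncertainty(64, 64))  # 0.0 bits: requisite variety attained
```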

If, on the other hand, the environment becomes highly predictable (i.e. statistical entropy tends to zero), the living system, as observer, faces a twofold problem. First, a number of its subsystems become obsolete and probably dispensable.


Those systems may be autopoietic themselves, and that could lead to a contradiction between the interests of a suprasystem and its subsystems and therefore to an operational conflict. This phenomenon is not uncommon—one has only to remember the problems that many military industries in the USA faced when the former USSR eventually collapsed. Second, the whole system itself may become obsolete. This could force the system to seek a new role, a new niche in its wider environment, thus attracting the system to high instability and jeopardizing its own existence.

As Karl Popper and Konrad Lorenz point out, ‘the real moment of freedom relies on uncertainty’ (Popper, 2003, p. 31), and ‘our willingness to take risks is connected to the pursuance of the best possible lifeworld (…). The absence of problems can cause a deadlock’ (Popper, 2003, p. 35). Those matters may be considered trivial, but the conception of uncertainty that has unfolded here is not, and it calls for a total reconsideration of the role of uncertainty, entropy and disorganization.

But before we get to that, we face another problem: are we to assume that uncertainty is an attractive condition? We know that a lot of theoretical concepts try to tackle this particular problem—theories like double contingency, management theories, governance problems, etc., just to name a few. On the other hand, we have already shown here that uncertainty is not ‘bad’; in fact, it is uncertainty that triggers creativity as well as creation. However, it is evident that at the end of the day, everyone expects a stabilized environment and tries to avoid surprises, and in general, everyone tries to trivialize his lifeworld. We use caller IDs on our cell phones, alarm systems in our homes and GPS navigation systems in our cars so as to avoid a wrong turn. And people before us, in the Middle Ages or antiquity, used to have great walls around their cities so as to keep the unexpected out. So is it not a problem when our experience (let alone common sense and the history of humanity) contradicts the theoretical framework we just presented? The answer is that the reason we developed such artefacts (alarms, GPS devices, architecture and even theories in general) is the presence of uncertainty. Furthermore, the reason we developed systems that develop other systems is to deal with uncertainty. And the same holds true for structured symbolic systems such as science, theories, technology, metatheories and even the very language we are using, that is, the systems that we develop through communication, and which exist only in it, as networks of regulative norms trying to reduce complexity.

Of course, statistical entropy, uncertainty and complexity are not substantial entities like, say, a human or a chair. They are just explanatory principles, but nothing more than that. They are terms that describe certain aspects of a relation; in particular, they are used to express the inability of an observer to establish a causal explanation for the phenomena he or she observes, or to conceive of them as a unity. To assume that ‘this is a complicated situation’ is in fact to express our own inability (which could be temporary, but nevertheless our own) to classify our own experience, that is, to produce meaning. So the role of the observer and that of the context is crucial here. For instance, a network of interrelated behaviors in the social domain could be identified as ‘disorganized’ when, in fact, it could be considered a different pattern of organization (and thus organization) beyond the organizational patterns that the observer is able to comprehend; or a financial crisis could be signified as a destructive phenomenon for a country or a company, whereas in a wider context, one could speak of a ‘wide economic system self-regulation’. The importance of the level of observation and the different conceptions it entails has been extensively described elsewhere and in different scientific contexts (Joslyn, 1990; Luhmann, 1992, 1995; Foerster, 2003; Spencer Brown, 2008), and we do not need to go into details here.

As early as 1980, Humberto Maturana and Francisco Varela concluded that ‘… compensation of deformation keeps the autopoietic system in the autopoietic space’ (Maturana and Varela, 1980, p. 93), thus underlining the importance of deformation for living systems, which can be attributed to a nontrivial environment. It follows that in life and its matters, we face a twofold situation: systems formation and emergent forms of organization try to oppose complexity, and if they are successful in a temporal dimension, their own operation as autopoietic entities triggers the emergence of complexity again, which in turn signifies the need for new complexity-absorbing complex formations, that is, systems. We conclude then that:

(1) Systems are solutions. No solution can have a substantial meaning unless there is a problem (or a class of problems) defined.

(2) Systems are problems for their environments. Every time a system emerges, its environment faces a rise in complexity and reacts by upgrading its own complexity.

(3) Autopoietic systems are problems for themselves. Their continuation demands their continuous circular self-destruction and re-establishment; but in order to do so, they need to conceive of their environment as a problem pool, using it as a guide so as to select their next state.

(4) Autopoietic systems are self-organizing systems, and self-organization is the manifestation of their autopoiesis.

(5) Therefore, complexity creates complexity, and precisely this is evolution. On the other hand, trivialization stops the evolutionary process, forcing systems into disintegration. Systems that manage to trivialize their environment by narrowing it or controlling it excessively (e.g. the case of dictatorships) are triggering their own collapse.

In a nutshell, the raison d’être of the systemic phenomenon (i.e. organization) is statistical entropy. That is why ideas such as double contingency, regulation (as opposed to deregulation) or even organized knowledge have proved fruitful and effective throughout history. Those are manifestations of autopoiesis (and not concrete unchangeable structures) that emerge and constantly change as a result of the continuous transformation of living systems. We should note, though, that we cannot overemphasize the importance of time: systems take time to compensate the events they perceive, although it is reasonable to assume that this time decreases as their internal complexity grows, for higher complexity implies higher redundancy and therefore more available internal solutions to external or internal problems, conflicts and contradictions.

Thus, statistical entropy in social systems theory is the reason for creation rather than destruction; quite aptly, Niklas Luhmann notes: ‘If such a system [i.e. autopoietic] did not have an environment, it would have to invent it as a horizon of its hetero-referentiality’ (Luhmann, 1986, p. 176). We could say that a system that conceives of its environment as an entropic one reacts (if it can) with higher self-organization. And so it evolves—otherwise evolution halts.

WHAT ABOUT ENERGY?

Of course, as we already saw, entropy has another meaning, the one that pertains to thermodynamics. In order to use the notion of thermodynamic entropy, though, it is necessary to include the notion of energy in social systems. It is obvious that if one wants to speak legitimately about ‘thermodynamic entropy’ pertaining to social systems, then one cannot avoid the need to refer to the ‘energy’ of social systems; otherwise, any use of thermodynamic entropy is meaningless. But what could be considered as energy in this context? Clearly, we can no longer confuse energy with information, as is sometimes the case in the humanities. Information is correlated to energy, and energy is needed to gather information, but it does not follow that energy and information are the same variable.

Therefore, we are in need of a parallel theoretical framework, interrelated with and coherent to the one we have already unfolded, to see how we are to perceive of the energy of social systems. Admittedly, this is not going to be easy; to exempt a notion from a whole theoretical apparatus (as we have tried to do up to this point), and then rehabilitate it in a different manner, is a demanding task that we expect to take time and, hopefully, a lot of debate. So here, we will only present some preliminary thoughts, hoping to initiate a wider discussion on the topic of social energy.

Georgescu-Roegen (1986) has already introduced an approach to the problem of thermodynamic entropy in the economic system, but he strictly refuted that his theory implied the conception of economic capital as a form of energy (Gowdy and Mesner, 1998, p. 140). This of course poses an obvious paradox: how can one talk of thermodynamics and entropy and leave the notion of energy out of the discussion? This is to make a distinction, indicate one side of it and disregard the other.

In order to talk about energy, we need to recall that in thermodynamic terms, it denotes potentiality; to be sure, we are talking about what we already denoted as an ‘appropriate form’ of energy, that is, energy in a form suitable to produce work. The nature of that work is contingent, and (the measure of) energy symbolizes the potentiality that is available to whoever controls the energy resource(s). Self-referential systems temporalize their experience, living always in the present and designing a future so as to know how to act ‘here-and-now’ (Foerster, 1971). That is, those systems develop expectations of the (their) future, conceiving here-and-now as subject to change, that is, a temporal situation (Luhmann, 1995). It follows that self-referential systems make decisions (i.e. they make selections) based on the knowledge they have gathered and organized (i.e. in their past) in an endless endeavour to trivialize their future. And, for all those operations, systems are triggered by information and they consume energy. Thus, the work we are talking about (i.e. the meaning of energy) is change; self-referential systems, being coupled to their environment, trigger mutual and recursive changes between the ‘other’ and the ‘self’, with ‘self’ (the internal process of meaning construction) being their only criterion.

Now, it is very common to say that some social systems ‘exert power and authority’, or sometimes ‘apply force’ over other social systems or on their environment in a more general sense, while attaining certain ends. In the social context, the term power is often used interchangeably with the term force (which of course is not the case in physics), and this is how we will use it here. But what is the meaning of the word ‘force’ that we so easily use? According to Arnopoulos (2001, p. 20), ‘… force is a central concept because it serves to produce a change of state …’ But, in order to use force, one needs energy at one’s disposal; thus, we deduce that in order to gain the ability to change a state of affairs, one needs energy to apply force (or ‘exert power’): ‘As Parson’s social action theory emphasizes, energy plays a crucial role in society … Since social action requires energy, active societies can only be those with an excess of energy’ (Arnopoulos, 2005, p. 31).

But where does energy stem from? Or, put differently, what are the energy resources of social systems? Pierre Bourdieu (1989, p. 17) writes, ‘… these fundamental powers are economic capital (in its different forms), cultural capital, social capital, and symbolic capital, which is the form that the various species of capital assume when they are perceived and recognized as legitimate’. What exactly are those forms of capital? Bourdieu notes, ‘Depending on the field in which it functions, and at the cost of the more or less expensive transformations which are the precondition for its efficacy in the field in question, capital can present itself in three fundamental guises: as economic capital, which is immediately and directly convertible into money and may be institutionalized in the form of property rights; as cultural capital, which is convertible, in certain conditions, into economic capital and may be institutionalized in the form of educational qualifications; and as social capital, made up of social obligations (“connections”), which is convertible, in certain conditions, into economic capital and may be institutionalized in the form of nobility’ (Bourdieu, 1986, p. 242).

How could we bring those views together in terms of systems theory? We could recall that autopoietic systems try to ‘increase the number of choices’ (Foerster, 2003, p. 227) at their disposal, conceiving the multiplicity of this contingency as the extent of their freedom (Willke, 1996). Thus, in order for those systems to gain and retain freedom, energy is needed. And that which secures the ability of social (or psychic) systems to perform an act of change could be capital, in its widest sense: social influence for instance, or political power, or money and property (economic capital), or knowledge (cultural capital) and numerous other notions that all seem to converge on one thing, exactly that which Pierre Bourdieu defines as symbolic capital. Therefore, we propose to conceive of systemic energy as the symbolic capital that a self-referential system possesses. This assumption, though, needs further clarification.

SYMBOLIC CAPITAL AS SOCIAL ENERGY

Bourdieu suggests we conceive of capital as a ‘vis insita, a force inscribed in objective or subjective structures, but (…) also a lex insita, the principle underlying the immanent regularities of the social world’ (Bourdieu, 1986, p. 241). Here we encounter the first point of deviation from Bourdieu’s theory, because vis insita implies an immanent potentiality in the structures, which remains unexplained as to its causal nature (in a structuralist context). Of course, one could ascribe vis insita to the structure’s own efforts to accumulate symbolic capital in whatever form that may be, but still, one premise holds true—energy is not work per se; it always has to be manifested in a wider context, that is, in the environment of any structure. One could object to that claim, pointing out that—in certain cases—the structure may accumulate energy to compensate its own internal problems and that this has nothing to do with its externalities, but such an assumption would overlook the importance of the environment as a pool of potential energy sources and a horizon of potential events, let alone that it might lead to the chimaera of a perpetual machine. Moreover, that would theorize the structure as a system, attracting us back to systems theory.

On the contrary, considering symbolic capital strictly as lex insita brings it right to the centre of our theoretical framework. Energy is contingency, that is to say, potential to produce work. And because a social system is manifested purely by communication, what we mean by work is communicative actions. Those actions, whatever they might be and however interpreted by the environment (psychic systems or other social systems), can only be asserted as externalities of the communicating system. It goes without saying that a wide horizon of available systemic externalities signifies a higher ability of the system to reconstitute its boundaries in reaction to its environment’s uncertainty. And this is what activates a system to accumulate energy: the fact that double contingency is always present as the knowledge of the never-ending existence of the ‘other’ and of his (her or ‘its’) systemic autopoietic nature, and, thus, as a potential threat to systemic identity. This is to say that the self-referential system is constantly aware of the contingency of its environment, and so, it constantly accumulates and expends energy so as to reform it, aiming at the ‘best possible lifeworld’ (Popper, 2003).

Thus, entropy may have one more meaning, analogous to that of thermodynamics: the lack of symbolic capital—the lack of an appropriate form of energy. It follows that high thermodynamic entropy signifies the inability of a system to compensate disturbances, and so (i) the system might disintegrate, or (ii) the system might select (if that is feasible) a scenario of isolation, trying to reduce the information it receives, and thus narrowing its horizon of meaning.

ECONOMIC CAPITAL OR MEANING?

Bourdieu, as we already saw, defines three distinct types of symbolic capital, namely, economic capital, cultural capital and social capital. Although the potentiality of economic capital needs no further explanation, we encounter here a second point of deviation from his conception; Bourdieu (1986, p. 265) claims that ‘… every type of capital is reducible in the last analysis to economic capital …’, a conception that he fails to recognize as such, characterizing it as a ‘… brutal fact …’. This factualization of a conception, an objectification of a theoretical apparatus, disregards the problems posed by the contingency of meaning, reduces meaning to a trivial variable and is therefore totally incompatible with contemporary systems theory, especially in the field of self-referential systems. It is the blind spot of structuralism that has been exposed numerous times in the works of system theorists (e.g. Foerster, 1984, 2002, 2003; Luhmann, 1990, 1995, 2002; Checkland, 1999; Bateson, 2000, 2002; Heylighen and Joslyn, 2001) and by many scholars from a wider context (e.g. Wittgenstein, 1978; Popper, 2003; Heidegger, 2006; Spencer Brown, 2008), and there is no need to go into detail here about the problems and contradictions it entails. In fact, as analysed by Niklas Luhmann (2002, pp. 187–193), it is the blind spot of modernism, interestingly enough, a spot indicated also by Bourdieu himself in his later works (Bourdieu, 2005a, 2005b).

introduced here as an opposing argument; shouldthe system not protect its own existence beforeeverything else? And if so, is that not a brutalproblem that can only be solved through econ-omy? What about people who are starving todeath? Suchquestions disregard thewhole conceptof social systems altogether; what we are dealingwith here is distinct systems, namely, psychic andsocial systems that emerge together through andbecause of communication, and the integraloperation of communication is meaning construc-tion. Undoubtedly, biological organisms are aprerequisite, and their survival is important. Butin no way their mere existence guarantees theformation of highly complex autopoietic structuressuch as the psychic or the social systems. Andfurthermore, the biological death of individualsdoes not entail the disintegration of social systems(not even in biology, let alone human societies). It isthe meaning that stands out as a sine qua non insocial systems, not survival; the reconstruction oftheir identity as sameness, that is, a condensation ofthe temporal manifestations of the system as a self‐referencing, always present, unity.This deviation from survival to meaning as the

main criterion of systems’ self‐governance hasprofound consequences. Only by detaching thenotion of symbolic capital from the economicaldeduction can we explain the social phenomenathat otherwise pose paradoxes. The Christianmartyrs for instance, or the political prisoners inbrutal authoritarian regimes that suffer torturesand face capital sentence, are typical examples ofpsychic systems that choose meaning oversurvival; in a strict economic sense that selectionis irrational, to choose not‐being over beingcannot be explained unless we consider meaningas the basic variable of systemic reconstructionrather than survival.One can present numerous similar examples (cf.

Willke, 1996) offering empirical proof about theimportance of meaning rather than economical

Copyright © 2011 John Wiley & Sons, Ltd.

On the Entropy of Social Systems

profit or survival. At this point, one may alsorecall that Max Weber (2006, pp. 55–68) actuallysuggests to conceive of the production processesin the capitalistic context, as circular meaningreconstitution processes; the economical profit takesthe place of meaning in theWeberian analysis ratherthan that of a financial telos: the meaning of profitis to reproduce the ability to produce profit—a clearcase of an autopoietic system.

Thus, we suggest conceiving of economic capital as a distinct form of symbolic capital rather than as an underlying nexus connecting the other forms of capital. Therefore, we can assume that the 'connecting pattern' between the different forms of capital is meaning.

From this point on, several options open up for our investigation—and of course the field is so extensive that it is impossible to cover it here. But we can try to examine the other two forms of capital suggested by Bourdieu, namely, cultural and social capital.

We already mentioned that information per se is useless—unless it leads to an informed selection, that is, an act of change. But the same holds true for energy (or just power, for that matter). Accumulated energy is meaningless; only a horizon of contingent selections can turn a surplus of energy into something meaningful. Thus, although information and energy are distinct variables, only their coexistence can grant them substance—a mutual influence between information and energy.

The advantage of economic capital lies in its flexibility; as a symbol of credit, it can be exchanged easily and in different contexts, providing its owner with many different options. However, it is self-evident that there are still numerous prospects that cannot come into existence just by exchanging money for them, namely, those situations that depend on nonrational conceptions, such as 'fatherland' or 'love'. Even knowledge is not something that can be acquired simply by paying for it—it requires personal effort (and no one can do it for you, even if you pay him). For instance, it would be nice to pay the price and instantly learn German, but this cannot be done. You have to pay and also invest your own time and personal effort. To put it clearly, learning German can give you many options, some of which could grant you access to new sources of economic capital. But economic capital alone cannot help you with that. Thus, the flexibility of economic capital is bounded.

And this is the case also with cultural and social capital. They can be transformed into other forms, but they are flexible only in certain contexts. For instance, consider the head of the Catholic Church (or any other religious leader, for that matter). He stands in a position that is charged with a surplus of authority ('Divine authority' that supposedly stems right from God). That is, he can legitimately exert his power directly onto the Catholic Church system, in any aspect; but just there, his unconditional authority ends. He can change any state of affairs directly, but outside the boundaries of the system he belongs to, his power diminishes dramatically; it is obvious that if it were otherwise, then, for instance, abortion would be banned (in fact, unthinkable) throughout the world. So the Pope has social capital, a surplus of energy, but even he is bound to transform it into action only within a certain context.

To use a Bourdieu-ish expression, the 'predisposition' of the environment is what guarantees the ability to transform capital into other forms. Furthermore, the same predisposition is what turns capital into energy. The point is that every form of capital can offer potentiality if, and only if, it is manifested in a proper environment. It follows that context is the crucial factor that gives substance to capital: only in a proper context can capital be enfolded with meaning and thus be turned into energy. A proper context is precisely the type of environment (i.e. another system) that has the appropriate sensors to be able to conceive of the capital as such. This is not to say that there are social systems where the notion of symbolic capital does not apply. It merely means that not every form of capital is useful in every context, not even economic capital, although it is designed with flexibility in mind.

CAN SYSTEMIC ENERGY BE CALCULATED?

This is a very wide topic indeed. The problem of finding a way to calculate a system's energy can be roughly analysed as the problem of calculating the impact, that is, the degree of change a system's selection will have on its own environment. Already, the first problem with this approach should be clear: to attribute a change in the environment to a system's selection would mean to attribute the phenomenon to a single cause, and thus one gets back to a strict causal explanation and, inevitably, to an oversimplification of the world. In the case of autopoietic systems, the problem of meaning slips in, creating a situation that renders quantitative measurements problematic. Consider a network of autopoietic systems A, B, C, etc. It is nearly impossible to predict the reaction of C if A selects a new state—for example, a new mode of operation. System A may select a small change that could drive C to a profound one, because the latter might want to compensate for an event in its relation to B caused by A's change; in that case, one observer could credit system A with high energy levels, but another could attribute that energy to B or to the relation between B and C. It is self-evident, then, that one cannot have an 'objective'—so to speak—method to measure the potentiality of an autopoietic system.
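A toy numerical sketch may make the attribution problem tangible. The following is purely our illustration, not an established model: the update rule and all coefficients are arbitrary assumptions, chosen only to show that the same downstream change can be credited to different sources.

# Hypothetical toy model: three coupled 'systems' A, B and C.
# The coupling rule and the coefficients are invented for illustration.
def step(a, b, c):
    new_b = b + 0.5 * a              # B compensates for A's new state
    new_c = c + 10.0 * (new_b - b)   # C reacts strongly to B's adjustment
    return a, new_b, new_c

a, b, c = 0.0, 1.0, 1.0
a += 0.1                             # A selects a small change
a, b, c = step(a, b, c)
print(b - 1.0, c - 1.0)              # B shifts by ~0.05, C by ~0.5

C's profound shift can be accounted to A's small selection, to B's compensation, or to the B-C coupling; the model itself offers no privileged attribution.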

On the other hand, it is logical to assume that very big and influential systems, such as powerful nation states, 'alpha' cities in the globalized world (cf. Sassen, 2007; Mavrofides and Papageorgiou, 2009; GaWC, 2010) or some NGOs (Willke, 2007), have a greater symbolic capital than others. So it might be possible to obtain a loose indication of those systems' potentiality (i.e. energy) with methods similar to those proposed by SET (Bailey, 1990), but nothing more than that. In general, very big (and ultracomplex) systems could be candidates for such measurements because of their systemic inertia, which is caused by the excessive control they exert over their environment and by their own sophisticated internal differentiation, itself the result of prolonged interaction with that environment. But those measurements are in fact an effort to predict the influence of the system measured, rather than a measure of a change caused by that system per se. And it is widely accepted that such predictions always rely on statistical data and result in probabilities. For instance, what amount of energy would one have attributed to Lehman Brothers Holdings Inc. in 2005? And what would it be now (2011)? The point is that a system's symbolic capital is a variable that cannot be accurately calculated; worse than that, it cannot be predicted, and it is always accounted for (if at all) in a temporal dimension.

In a few words, to search for a way to accurately calculate the energy of a social system is to search for a way to predict the future. From an epistemological point of view, this could be considered meaningless (Popper, 2003).
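For concreteness, indicators in the spirit of SET are typically entropy measures computed over the distribution of a population across category states (Bailey, 1990). A minimal sketch of such an index follows; the categories and counts are invented placeholders, not empirical data, and the result should be read as the loose indication discussed above, not as a measurement of systemic energy.

from math import log2

def shannon_entropy(counts):
    # Statistical entropy H = -sum(p_i * log2(p_i)) in bits,
    # computed from raw counts per category.
    total = sum(counts)
    probabilities = [c / total for c in counts if c > 0]
    return -sum(p * log2(p) for p in probabilities)

# Hypothetical distribution of a system's members across four
# arbitrary category states (placeholder values).
distribution = [40, 30, 20, 10]
print(shannon_entropy(distribution))  # about 1.85 bits; the maximum
                                      # for four categories is log2(4) = 2

Such an index tracks internal differentiation at a given moment; by the argument above, it neither measures nor predicts the system's energy.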

CONCLUSIONS AND FUTURE WORK

In this paper, we tried to draw clear distinctions between the different meanings of entropy, namely, between thermodynamic and statistical entropy. We argued that when one refers to entropy as uncertainty in a societal context, one refers specifically to statistical entropy, and that this should be made explicit so as to avoid confusion with the notion of energy. On the other hand, there is a place for the concept of energy in social systems theory, because those systems need to undertake action every time they make a selection. Therefore, the notion of thermodynamic entropy also has a place in the theoretical apparatus of social systems, although in a somewhat different framework, one which brings Pierre Bourdieu's theory of symbolic capital into the theoretical context of meaning developed by Niklas Luhmann.

Many questions still remain open, but one is prominent among them: does the second law of thermodynamics apply to social systems or not? We ask this because we can observe that a small amount of money and a clue (information) about a company's new product can multiply that amount in the stock market. The increase of capital in that case pertains to investors who spend their money hoping to get greater symbolic capital (stocks) in return, and sometimes they turn out to be right. But where can this end? Does it mean that, in the long run, the total amount of available symbolic capital is growing irreversibly? Our final remark would be that although the coexistence of psychic systems does not guarantee the emergence of social systems, humans are an a priori condition of every social phenomenon. Therefore, in the last analysis, we get down to the problem of natural resources and their growing scarcity, which simply means that the second law still remains valid and therefore raises new questions as to the governing institutions.

ACKNOWLEDGEMENT

Thanks to Evangelia Dimaraki, Ed.D., Department of Cultural Technology and Communication, University of the Aegean, for her useful remarks during the revision of this paper.

REFERENCES

Arnopoulos P. 2001. Sociophysics and sociocybernetics:an essay on the natural roots and limits ofpolitical control. In Sociocybernetics: Complexity,Autopoiesis and Observation of Social Systems,Geyer F, van der Zouwen J. (eds.). GreenwoodPress: Westport, CT; 17–40.

Arnopoulos P. 2005. Sociophysics: Cosmos and Chaos inNature and Culture. Nova Science Publishers Inc:Hauppauge, New York.

Ashby WR. 1957. An Introduction to Cybernetics. Chapman & Hall Ltd: London.

Bailey KD. 1990. Social entropy theory: an overview.Systems Practice 3(4): 365–382.

Bailey KD. 1997a. The autopoiesis of social systems:assessing Luhmann’s theory of self‐reference. SystemsResearch and Behavioral Science 14(2): 83–100.

Bailey KD. 1997b. System entropy analysis. Kybernetes26(6/7): 674–688.

Bailey KD. 2006a. Sociocybernetics and social entropytheory. Kybernetes 35(3/4): 375–384.

Bailey KD. 2006b. Living systems theory and socialentropy theory. Systems Research and BehavioralScience 23: 291–300.

Bateson G. 2000. Steps to an Ecology of Mind. Universityof Chicago Press: Chicago.

Bateson G. 2002. Mind and Nature: A Necessary Unity. Hampton Press: Cresskill, NJ.

Bourdieu P. 1986. The forms of capital. In Handbook ofTheory and Research for the Sociology of Education,Richardson JG. (ed.). Greenwood Press: New York/Westport/London; 241–258.

Bourdieu P. 1989. Social space and symbolic power.Sociological Theory 7(1): 14–25.

Bourdieu P. 2005a (2001). Science de la science et réflexivité, Cours du Collège de France 2000–2001 (Greek edition), Th. Paradellis (trans). Patakis: Athens.

Bourdieu P. 2005b (2004). Esquisse pour une auto-analyse (Greek edition), Efi Giannopoulou (trans). Patakis: Athens.

Checkland P. 1999. Systems Thinking, Systems Practice. John Wiley and Sons Ltd: Chichester, West Sussex.

GaWC. 2010. Globalization and World Cities (GaWC) Study Group and Network, accessed 16/03/2010, link: http://www.lboro.ac.uk/gawc/citylist.html

Georgescu‐Roegen N. 1986. The entropy law and theeconomic process in retrospect. Eastern EconomicJournal XII(1): 3–25.

Gowdy J, Mesner S. 1998. The evolution of Georgescu‐Roegen’s bioeconomics. Review of Social EconomyLVI(2): 136–156.

Heidegger M. 2006 (1926). Being and Time. John Macquarrie, Edward Robinson (trans). Blackwell Publishing: London.

Heylighen F, Joslyn C. 2001. Cybernetics and second‐order cybernetics. In Encyclopedia of Physical Science& Technology, Meyers RA. (ed.). Academic Press:New York; 155–170.

Ho M-W. 2010. What is (Schrödinger's) Negentropy?, link: http://www.ratical.org/co-globalize/MaeWanHo/negentr.pdf

Joslyn C. 1990. On the semantics of entropy measures of emergent phenomena. Cybernetics and Systems 22(6): 631–640.

Luhmann N. 1986. The autopoiesis of social systems. InSociocybernetic Paradoxes. Geyer F, van der Zouwen J.(eds.). Sage: London; 172–192.

Luhmann N. 1990. Essays on Self‐Reference. ColumbiaUniversity Press: New York.

Luhmann N. 1992. Operational closure and structural coupling: the differentiation of the legal system. Cardozo Law Review 13: 1419–1441.

Luhmann N. 1995. Social Systems. Stanford UniversityPress: CA.

Luhmann N. 2002. Theories of Distinction. StanfordUniversity Press: CA.

Maturana HR, Varela FJ. 1980. Autopoiesis andCognition: The Realization of the Living. ReidelPublishing Company: Boston.

Mavrofides T, Papageorgiou D. 2009. The participationof a region in the global network: integrationor exclusion as consequences of the use of ICT,conference proceedings: 2nd Greco‐Russian Social &Scientific Forum, 14–18 June, St. Petersburg,Russia.

Popper K. 1957. Irreversibility; or entropy since 1905.The British Journal for the Philosophy of Science 8(30):151–155.

Popper K. 2003 (2002). Alle Menschen sind Philosophen (Greek edition), Michalis Papanikolaou (trans). Melani editions: Athens.

Quine WvO. 1980. From a Logical Point of View. Harvard University Press: Cambridge.

Sassen S. 2007. A Sociology of Globalization. W.W. Norton & Company, Inc.: New York.

Schrödinger E. 1944. What is Life?, accessed 10-10-2010, link: http://witsend.cc/stuff/books/Schrodinger-What-is-Life.pdf

Shannon CE. 1948. A mathematical theory of communication. Bell System Technical Journal 27(July and October): 379–423 and 623–656.

Schneider TD. 2010. Information Theory Primer, accessed 08/02/2010, link: http://www.ccrnp.ncifcrf.gov/~toms/papers/primer/primer.pdf

Spencer Brown G. 2008. Laws of Form. BohmeierVerlag: Leipzig.

Weber M. 2006. The Protestant Ethic and the Spirit ofCapitalism (Greek edition). Gutenberg: Athens.

Wiener N. 1961. Cybernetics: Or Control and Communi-cation in the Animal and the Machine. MIT Press—John Wiley & Sons, Inc.: New York, London.

Willke H. 1996 (1993). Systemtheorie: eine Einführung in die Grundprobleme der Theorie sozialer Systeme (Greek edition), Nikolaos Livos (trans). Kritiki.

Willke H. 2007. Smart Governance—Governing the GlobalKnowledge Society. Campus Verlag: Frankfurt.

Wittgenstein L. 1978. Tractatus Logico-Philosophicus (Greek edition), Thanassis Kitsopoulos (trans). Papazisis Editions: Athens.

von Bertalanffy L. 1968. General System Theory—Foundations, Development, Applications. Braziller:New York.

von Foerster H. 1971. Perception of the Future and the Future of Perception, accessed 10/04/2006, link: http://grace.evergreen.edu/~arunc/texts/cybernetics/heinz/perception/perception.html

von Foerster H. 1984. Observing Systems. IntersystemsPublications: Seaside California.

von Foerster H. 2003. Understanding Understanding: Essays on Cybernetics and Cognition. Springer-Verlag: New York.

von Foerster H, Poerksen B. 2002. Understanding Systems: Conversations on Epistemology and Ethics. Carl-Auer-Systeme Verlag: Heidelberg.
