Information & Causality
October 20, 2011
We are living in the …
no, not the yellow submarine (any more); today many are convinced that we live in an information society, or even a knowledge society. And almost everybody is happy to pretend that the main tool or technology for achieving this particular developmental state of society is the digital computer.
Everybody uses computers all day long, not only as desktops, laptops and pads, but also in cell phones, cars or houses. A contemporary middle-class car contains more than 100 specialized computers. Cell phones are no longer bare phones; their evolution points towards more and more general-purpose capabilities. It seems pretty obvious that we are processing information all the time.
As a philosophical concept, information is an interesting subject. Firstly, of course, due to its relevance for epistemology. A theory of knowledge, even in its restricted form as a theory of purposeful behavior, should be able to provide some tools for dealing with the concept of “information.” Secondly, however, and from the other side so-to-speak, information has a communal aspect, mainly related to an extended form of media philosophy. Vilém Flusser, for instance, coined the notion of getting “in-form-ed,” to be understood as becoming shapeful through external influences. Then there are aspects of power, organization, or also of uncertainty. Finally, still regarding philosophical aspects, information is a very important concept for understanding the generation and dynamics of symbols.
We can easily recognize that a clear picture of the concept of information is highly relevant for any investigation of epistemic machines. The prognosis would be that any inappropriate, e.g. reductionist or overly formalistic, version of the concept of information makes it impossible to understand epistemic acts, or even to create an entity that is able to grow its own capability for epistemic acts. As a concept, “information” would then be restricted to the purely technical domains of deterministic machines and automata, without any applicability to human life more generally. This consequence is often not recognized in works about information, causing serious worries on the side of those who are indeed interested in understanding human life better. We will discuss this in some detail later on.
In our approach we will address the problematics of information mainly from three perspectives. First, we will take a brief look at the history of the concept. Second, we will draw on the philosophy of language, namely the notion of the language game. Media-philosophical aspects we will, unfortunately, not consider explicitly. As the third step, we will present our synthesis.
Since we have to focus on our interests here, we will not examine the role that the concept of information plays in science, despite the fact that it has been influential for our thoughts.
In our investigation we will try to avoid ontological stances, which would lead to proposals about what information “is.” Such a stance, which assigns ontology the primal position in an investigation, neglects the fact that finally there is always language. Saab and Riss  provide an almost funny example of nonsense springing directly from such “ontologization.” In their recent article “Information as Ontologization” they start:
While traditional conceptions assume a static nature of information, expressed by the equation information=data+meaning, we have argued that this understanding is based on an ontologization of an entwined process of sense making and meaning making. This process starts from the recognition of a pattern that is interpreted in a way that influences our behavior.
It is ridiculous to take the formula “information=data+meaning” as a serious proposal, then to proceed with the claim that meaning can be “made,” only to end with a claim that either follows behaviorism or is empty. Notably, they do not provide any idea of how such interpretation could work. They conclude:
The definition of data-information is a simplification that abstracts it from its process character. We hope that we have shown that the phenomenological nature of ontologization necessitates that we consider data, information, and knowledge in such a way that none of the three can exist independently of the others.
At first sight this sounds quite reasonable. Their motivation is certainly justified. The problematic issue is the particular reference to phenomenology. Formulated around 1900, phenomenology sticks with the separateness of objects from each other and of objects from subjects. Relations are only secondary in its perspective. This has been criticized, and not only by Heidegger. Not only are relations put into the second rank; phenomenology implicitly also disregards time and becoming as relevant aspects. A third problem is its forgetfulness regarding language. Phenomenology is uncritical of language; thus a phenomenologist believes in a highly problematic triad: (i) direct access to the world, (ii) direct influence of other minds, (iii) logic as a real entity within the world of objects.
Claiming an “existence” for knowledge is simply nonsense (e.g. due to (ii) above). Unfortunately it cannot be recognized as such by the means provided by Logical Positivism, or by the attitude popularized by Carnap, Stegmüller, Moulines, among others. More generally, we may say that any attempt to stretch a formal definition of information so as to reach out into real life, regardless of how this is accomplished, is ultimately equivalent to the claim that information “exists” as a thing.
In our perspective, and ultimately leading to better founded results, instead of any kind of “ontologization” it is much more appropriate to ask how we could speak about “information.” This will also enable us to address the differences between disciplinary discourses. On that road we will take up Capurro’s suggestion to be aware of the question “What differences does it make if we use one or the other theory or concept of information?”  To put this issue briefly, we propose that conceiving information under the primacy of interpretation is more appropriate than conceiving it as a “thing.” Yet we do not support the stance that investigates the concept of information in a dualistic mode, as either objective or subjective, as Zaliwski elaborates in . Even though Zaliwski, among other results, proposes (p.88)
“[…] information is created (in the detectors) by generalization of physical interactions coming from the environment.”
which we might endorse (were he not naturalizing information here), he remains in a naive stance with respect to language. We will see that, and how, especially formalistic (and similarly physical) approaches tend to “forget” the practice of language as an ultimate instance and non-circumventable plane of reference. There are some impressive examples of this neglect.
The result of our investigations will reveal that it is useful to conceive information as a compound concept. This will be helpful in all discourses to avoid contradictions and problematic shortfalls. It also supports a more appropriate way to quantify information. A second aspect of our results will suggest that it is not feasible to talk about information without directly referring also to causality. This bears some consequences not only for the theory of measurement, but also for any pragmatics about truth, and so about understanding. This second point also uncovers some possibilities to relate the concept of information to those of a (generalized) evolution and complexity.
Given these results it will become even more clear that the understanding achieved by this investigation is mandatory for any theory about epistemic machines.
A last word before we start: In a single essay it is simply impossible to do justice to all of the authors that contributed valuable bits to the evolution of the concept of information. A large number of “definitions” have been contributed, and, as in natural evolution, lots of them got almost extinct, surviving merely in their descendants. Already in 1983, more than 700 “definitions” had been counted . Here, we are not interested in a general survey of the concept of information. For that, you may refer to Nafria’s compilation , or other compendia. Our main objections—to our knowledge—hold for all of them. Thus, we think that there is still some kind of missing component in the debate about information, being relevant for philosophy as well as for science, and hence for everyday life.
In 1978, Rafael Capurro  published a great work about the history of the concept of “information,” to which I refer here. His compilation reaches back into classical antiquity. In Roman culture, to inform meant to educate someone, i.e. quite literally to shape one’s mind, to give a form. In the 14th century it designated advice, or instruction. Only a little later the meaning of “communicated knowledge” appeared. A next important step in the evolution of the concept came around 1800, when society discovered and realized for the first time what we call “population.” Famous for his use of “information” is Napoleon, who could be rated the first commander-in-chief to take full advantage of organizing the flow of information . As a correlate, “well-organized” secret services appeared in France and in the principality of Baden in south-western Germany. The industrial counterpart of organizing information appeared in the U.S.A. in the context of the census, i.e. administrative interests. Hollerith founded the company “Computing Tabulating Recording Corporation” (CTR), which was later renamed IBM. Roughly at the same time, scientifically interested engineers started to transfer the label “information” to the area of technical transmission of signals. Hartley (The Transmission of Information, 1928 ), Shannon , Weaver, and Wiener all may be assigned to this lineage. Not least, World War II was decided due to the successful efforts in Bletchley Park, employing a large calculating machine (the first of its kind and size). Defective encoding and fast decoding decided the war.
The historical ties to secret services and communications engineering installed a focus quite different from the more holistic perspective on information. While Hartley clearly refuted any relevance of semantics, Shannon was less explicit; his theory is not a theory about information, but just about the dynamics of the amount of information, i.e. the quantitative description of changes in that amount. Quite early, Warren Weaver  expressed his then-unrealizable desire to unify the different aspects of information. He distinguished the following “levels” as problematic in communication:
– technical, e.g. regarding accuracy,
– semantic, e.g. regarding “the interpretation of meaning” (Weaver),
– influential, i.e. concerned with the success of conveyed information.
Weaver clearly states (p.47):
“One might be inclined to think that the technical problems involve only the engineering details of good design of a communication system, while the semantic and the effectiveness problems contain most if not all of the philosophical content of the general problem of communication. To see that this is not the case, we must now examine some important recent work in the mathematical theory of communication.”
Weaver calls the problematics of influence also that of “effectiveness.” We will return to this notion later.
The decisive turn seems to be due to Wiener’s “Cybernetics,” where he adopted Shannon’s mathematical formulation but extended the claim to transmission processes in entities of any kind, including organisms and their social organization. Brillouin finally denied any possible interpretation of the concept of information that would include semantics. This series of adoptions, together with the initial success of the Shannon theory in various sciences, fixed the style of most theoretical considerations about “information” until today. We will discuss the work of Barwise, Seligman and Sommaruga in the next section.
Only comparatively few people criticized the dominant and purely syntactic approach. Bar-Hillel and Carnap proposed a concept of semantic information . Unfortunately, they founded their approach on Carnap’s work on scientific language, i.e. itself on the basis of formal logic, which of course does not allow one to overcome the purely syntactic level. Carnap was convinced all along that his approach to a scientific language was correct. In fact, it is not. Irrespective of the domain, a formal language in or about that domain will never be expressive enough to satisfy the Lebenswelt. It is simply not a language, but just a code.
Recently, Floridi published a series of works aiming at a more complete concept of information, including the semantic part, yet without sacrificing the possibility of quantifying information, and keeping philosophical seriousness intact. Ultimately, however, his approach rests on the same assumptions as any of the other formal theories.
Basic Assumptions in the Formal Account
Any formal-logic conception of information is grounded in a set of specific assumptions. These assumptions represent a particular theory about meaning. Sometimes, as in the case of Seligman’s or Floridi’s work, the formulated claim to incorporate semantics distracts from these assumptions.
Unfortunately or not, the claim to incorporate semantics into a theory of information that is then built upon a formal-logical, arithmetic or algorithmic approach induces the necessity to specify the theory of meaning explicitly. This, however, is rarely accomplished, if at all.
The basic assumptions of all the formal and/or quantitative theories about information are:
(1) Meaning preexists as an extension of objects.
(2) Meaning can be transferred.
(3) Information is a thing or a process outside of those entities that interact.
We will see that all these assumptions are deeply inappropriate, as they all ultimately deny the need for interpretation, or, if that is accepted, the need for a historically contingent and constrained entity (“subject”) as an interpreting instance. As a correlate, these theories also implicitly refute the stance that is called the “Linguistic Turn.”
Information: the Formal Account
There have been numerous attempts to provide a formalization of the concept of information that would be more general than Shannon’s original arrangement. What Shannon did was to describe the decrease of average uncertainty on the side of a (technical) receiver while it receives a stream of symbols from a completely defined alphabet. This effect, or better, what is transferred beyond the mere symbols, is obviously related to information. Remarkably, in a Shannon system there is no account whatsoever of interpretation. It is simply not necessary, since the Shannon system is a fully determinate system. It is also obvious that any violation of the assumptions therein would render the theory inapplicable. Alternatively, the subject of investigation could be reduced (modeled) in such a way as to fit the necessary assumptions.
Shannon information is related to the question of algorithmic complexity . This is not quite a surprise: recently it has been demonstrated that the so-called Kolmogorov complexity is equivalent to Shannon’s measure of decreasing uncertainty. The Kolmogorov complexity of a string of symbols can be described as the shortest set of rules (e.g. a computer program relative to a given machine architecture) capable of creating that same sequence of symbols.
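The relation can be made concrete with a small sketch (our own illustration, not drawn from any of the works cited here): Shannon’s H measures the average per-symbol uncertainty of a receiver, while compressed length is a common practical stand-in for the uncomputable Kolmogorov complexity.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(stream):
    """Average uncertainty per received symbol, in bits (Shannon's H)."""
    counts = Counter(stream)
    n = len(stream)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

uniform = "ab" * 500              # both symbols equiprobable
biased = "a" * 900 + "b" * 100    # 'a' far more likely

print(shannon_entropy(uniform))   # 1.0 bit per symbol
print(shannon_entropy(biased))    # ~0.47 bits: less average uncertainty

# Kolmogorov complexity (the shortest generating program) is uncomputable;
# compressed length is a rough practical proxy. A stream produced by a
# short rule compresses far better than an irregular one.
random.seed(0)
irregular = "".join(random.choice("ab") for _ in range(1000))
assert len(zlib.compress(uniform.encode())) < len(zlib.compress(irregular.encode()))
```

Note that the two streams above are chosen for illustration only; the general equivalence result mentioned in the text concerns the asymptotic behavior of the measures, not any single string.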
K-complexity does not have much to do with the concept of complexity that is used to characterize self-organizing systems and their emergent properties. One reason is that K-complexity operates purely in symbolic space, where precision does not appear at all. As  has argued, it is quite reasonable to include precision as a resource in the description of computational complexity as well. For instance, the Mandelbrot set discloses its complexity only with increasing precision.
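A hedged sketch of precision as a resource: the escape-time iteration that draws the Mandelbrot set, with an option to round intermediate values, simulating a machine of limited precision. The boundary point used below is merely illustrative, not a canonical example from the cited argument.

```python
def escape_time(c, max_iter=200, ndigits=None):
    """Iterations of z -> z*z + c before |z| exceeds 2.
    If ndigits is given, intermediate values are rounded, crudely
    simulating a computation carried out at limited precision."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if ndigits is not None:
            z = complex(round(z.real, ndigits), round(z.imag, ndigits))
        if abs(z) > 2:
            return n
    return max_iter

# Deep inside the set, respectively far outside it, any precision suffices:
print(escape_time(0j))       # 200: never escapes
print(escape_time(3 + 0j))   # 0: escapes immediately

# Near the boundary, the verdict may depend on the working precision;
# the intricate structure only shows up as precision increases.
c = -0.7453 + 0.1127j
print(escape_time(c), escape_time(c, ndigits=3))
```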
Recently, Giovanni Sommaruga arrived at a remarkable result in his survey of formal theories of information. He writes (p.264) :
“An examination of the possibility to make out a reductionist approach in the theoretical study of information yields a negative result: There is no such reductionist way. Thus, the answer to the initial question (Q) [How many informal theoretical concepts of information are there?] is: several. The answer to question (Q*) [How many formal theoretical concepts of information are there?] is: at least several.”
This is as interesting as it is unsuitable for any attempt to provide such a formal theory of one’s subject. There are just a few alternatives here, since the language game of “formal theory” prohibits such a multiplicity in itself:
– There is exactly one formal theory.
– There is no such formal theory.
– Something is wrong with the formalization approach, e.g. it isn’t possible at all, or not possible using known tools.
Well, it tells quite a lot if there is indeed the claim that “multiple” formalizations should be possible. What could be the reason for this phenomenon? Obviously, there is some kind of mapping which is not bijective. In turn this means that on the side of the “starting point” of these theories there is some additional influence on the structural level. As long as this influence is not recognized, the resulting gap is filled more or less arbitrarily. Quite naturally, this leads to different theories, which are often, though not necessarily, incommensurable. Of course, there is a reason for this unsuitable and self-contradictory multiplicity: meaning in the sense in which Wittgenstein understood it, and the inevitability of interpretation, which in turn refutes any attempt at “pure” formalization a priori.
Yet we will deal with a positive answer to this challenge a bit later. For now, we proceed with the approach proposed by Barwise and Seligman , and Seligman . Note that Seligman is explicitly heading for “[…] a Unified Theory of Information and Communication” . We use the work of Seligman as a kind of placeholder for other formal theories, since it is one of the more advanced examples.
Barwise and Seligman, and later Seligman in , conceive of information as factual semantic content and try to ground information in environmental information. Seligman  provides compressed details about the concept. It is unique among the concepts in the literature (except ours) in that it includes interpretation as an essential part of the information process. Seligman provides detailed formal specifications (at least partially based on category theory) for the concepts of classification, infomorphism, and channel.
Classification is defined as a relation between tokens and types. A signal, consisting of tokens, needs to be classified in order to qualify as information. Infomorphisms relate two classifications, while the channel transfers the tokens from which classifications are then built. This all sounds quite reasonable, since it refers to the empirical character of a being as a necessary component of a theory of information.
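These definitions can be sketched with finite toy classifications (our own minimal reading of the formal apparatus; the particular sets and maps below are invented for illustration). A classification pairs tokens with types; an infomorphism between classifications A and B is a pair of maps, one on types (A to B) and one on tokens (B to A), satisfying the condition checked in the function.

```python
# A classification: a set of tokens, a set of types, and a relation
# "token t is of type a" (here: a set of (token, type) pairs).
A = {
    "tokens": {0, 1, 2, 3},
    "types": {"even", "odd"},
    "rel": {(0, "even"), (2, "even"), (1, "odd"), (3, "odd")},
}
B = {
    "tokens": {"a", "b"},
    "types": {"E", "O"},
    "rel": {("a", "E"), ("b", "O")},
}

def is_infomorphism(A, B, f_types, f_tokens):
    """Check the defining condition for the pair of maps
    f_types: typ(A) -> typ(B) and f_tokens: tok(B) -> tok(A):
    f_tokens(b) is of type alpha in A  iff  b is of type f_types(alpha) in B."""
    return all(
        ((f_tokens[b], alpha) in A["rel"]) == ((b, f_types[alpha]) in B["rel"])
        for b in B["tokens"]
        for alpha in A["types"]
    )

# A valid infomorphism: B classifies coarse proxies of A's tokens.
assert is_infomorphism(A, B, {"even": "E", "odd": "O"}, {"a": 0, "b": 1})
# Swapping the token map breaks the condition.
assert not is_infomorphism(A, B, {"even": "E", "odd": "O"}, {"a": 1, "b": 0})
```

Note the contravariance: types are mapped from A to B, tokens from B back to A; this is what lets a “distal” classification carry information about a “proximal” one.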
There are, however, still our basic objections, even as they are hidden due to their elaborated concept. (…)
Information: Ongoing Synthesis
As a simple matter of fact, we may accept the enormous heterogeneity regarding information as a concept in language, for whatever reason. We do not believe, in contrast to , that classifying this heterogeneity is ultimately a fruitful approach, beyond providing an overview. A next step must be taken, without falling back into constraints imposed by disciplinary perspectives. What we are looking for is a level of description which allows us to ask: what are the invariances regarding its usage? The appropriate level has to be quite abstract, of course, as there are philosophical questions involved. We thus surely will not ask about receivers or receptors, differences or channels.
Instead, we will employ two different approaches, which we hope to recombine in the end. The first strand rests on the notion of the language game. The second takes the heterogeneity and deliberately interprets it as a consequence of a particular conceptual space. Again, the dimensions we invoke will be abstract ones, and they are not trivial.
Language (Games) and Information
Our basic suggestion is to conceive of information as a language game. Well, everything is a language game, of course; that sounds pretty trivial. The concept of the language game, however, abolishes logification, formalization and naturalization. The question about ontology is “like a disease,” as Wittgenstein would have said. The meaning of a word or a concept cannot be defined; it is given by—or also in—its usage. Usage means that we use words as “arrows” for pointing towards another concept, which in turn needs to be interpreted. Meaning in general thus denotes the condition for the possibility of pointing to. A particular meaning shows up only in a particular interpretation. Less trivial, hence, is the fact that there is nevertheless no investigation of the concept of information to date that would have taken this perspective.
Playing language games is a part of normal life and of the various Forms of Life (“Lebenswelten”) implicitly arranged around it, or embedding it. The context most frequently related to information today is “communication” and computer technology.
Computers and their usage as “information storage devices” offer a quite peculiar opportunity: we can arrange and re-arrange bits of data in an almost completely reversible manner. Simulation, scenario analysis, any kind of planning, even financial accounting are predominantly structured by the reversibility offered by the medium “computer.” Nowadays even communication has entered a prespecific phase, losing some properties previously regarded as crucial. We call it Twitter.
In that respect, “information” could be conceived as a language game about the potential to take differential action. From this perspective, it would not be the determination of a difference in the real world, but instead the main agency in establishing differences. To take differential action means to actualize a chain of causal physical, and above all irreversible, events. Thus information would be regarded as potential irreversibility with alternatives attached. Causality, on the other hand, is well known (and acknowledged as an instance) for being the representative of irreversibility.
There are many examples pointing to a mutual exclusion of the language games of information and causality. Consider networks, for example. If you excite a network of rubber bands, it eventually resonates. If you send a lot of energy into it, it vibrates more strongly. Throw a small stone onto such an excited network, and it may immediately be destroyed. What is the cause? With regard to networks, there are no discernible causes. As in this example, memory plays a role here, that is, in other words, information. Philosophically spoken, the grammar suitable for talking about networks and the grammar suitable for talking about causality are not commensurable. The Staatssicherheitsdienst in the former German “Democratic” Republic never understood that, and contemporary politicians, dramatically enough, do not understand it either with regard to the so-called international monetary system.
Obviously we are faced with a duality. This duality is not completely unknown; more correctly, it is well known from quantum physics. In quantum physics there seem to prevail strange paradoxes, about which Feynman thought that they forbid any deeper understanding of the quantum world . Well, I guess the paradoxes could be resolved, yet not into a classical version characterized by linearity and the separation of observer and observed.
Given such a duality, it is not really surprising that one can find weighty and frequently cited books about causality not even mentioning information a single time (e.g. [18, 19]). In quantum physics (QP), however, one can find the attempt to treat causality and information simultaneously. In terms of QP it is said that the wave function (given by Schrödinger’s equation) collapses. Note that Schrödinger’s equation is not about waves of energy, but rather about waves of probability. This decoherence of the probability wave marks the transition from the world of information (probability) into the world of causality (matter).
A central concept in QP is measurement. The way the measurement is performed determines the quality of the results. The photon either shows up as a particle (matter) or as a wave (energy). The question then is how to arrange measurement around information and causality.
The key insight is that measurement is involved in the transition between information and causality, in either direction. If we measure once, we create a symmetry break, we create the event, we become effective in the sense of the fourth Aristotelian cause: the purpose, the causa finalis. We force the “world” to fit into (the structure of) our measurement device, into the structure of our theory.
If we, in contrast, measure repeatedly, we are effectively doing a sampling. We expect reproducibility, i.e. some material basis. Yet, by sampling it, we create an informational representation of it. This process we usually call “modeling.” On this side we expect replicability (see Radder  on this distinction). The interesting thing about the number of repeats here is that it “induces” objectness in a seemingly retrograde manner. Of course, that’s just an illusion. Before we performed the measurement, we could not say anything about the external world. It neither makes sense to “densify” the relational aspects of the world into separated objects before measurement and the implied interpretation (as phenomenology and realism do), nor does it make sense to conceptualize “everything” of the external world as immaterial. Before measurement it is without form, potentia and energeia, if you will. Subsequent to measurement we “see” either particles and objects, or information, probabilities and networks.
The obviously distinguishing element is the number of measurements we perform. In fact, we not only can never say in advance how often we are going to perform a measurement; we also can’t repeat a particular measurement exactly in the actual physical world, where irreversibility rules. In other words, we always perform a singular measurement and simultaneously a repeated measurement. It is more or less a willful decision which of them we would like to emphasize. Yet the consequences are quite dramatic.
The following figure summarizes what has been said so far:
Figure 1. Linkage between the language games of information and causality. It is appropriate to introduce distinguishing markers for information and causality, according to their position in the sequence of actions. Note that the elements of measurement and information are included also in concepts like interpretation, filtering, action, or interaction. According to this scheme, causality is implied by decohering measurement. For further details see text above.
Symbols: I-information = invoked information, E-information = evoked information, S-causality = causality accessed through sampling, D‑causality = causality constructed by decoherence. The dashed lines indicate pseudo-transitions, i.e. subsuming labelling which actually hides the structure of the empirical interaction with/in the world.
Of course, measurement is not only mandatory, it is even a transcending category, at least for any entity that is linked in any arbitrary manner into the physical world. It is the condition that expresses the primacy of interpretation.
Invoked information (I-information) and causality accessed through sampling (S-causality) can’t be conceived as “objectifiable” entities. They are given only as a result of indirect inference; they are implied by the fact that we perform a measurement. Neither I-information nor S-causality can be said “to be out there”; they “are” just two complementary modes of speaking about the externalizable world, as back-propagated a prioris.
Measurement is also the only means of establishing relations between entities. It is at least closely related, if not equal, to Leibniz’s notion of perception; Leibniz was the first to recognize that perception requires activity. Recently, Peschard has renewed the significance of activity for the whole range of the episteme, resulting in a theory of knowledge as a practice, as enaction .
In order to perform a measurement, a device has to be built, and the device has to invest energy in order to become able to interpret. That’s not only true for physical devices; it holds also for abstract ones, that is, for any method that is used to transform something. If we take a look at how nature (and technology) builds sensors and their embedding, we see that sensors first need a potential that is then actualized into different forms. In an engineered sensor we often find an electrical field with the electric current as actualization, while in animals’ sensors we find a membrane potential (also an electrical field) or a random baseline spike activity that is transformed into a spike pattern. In both cases the sensor has to be active before measurement. In other words, the activity of the measuring device is modulated, rather than the device simply transferring a signal in passive mode.
The particular transcendental quality of measurement is overlooked by empiricism and by contemporary philosophy of science. Our view of the role of a generalized notion of measurement (and of the role of information in it) is clearly different from the traditional one in philosophy of science, say, for instance, van Fraassen’s, who recognizes only the lower part of the chart displayed above. Van Fraassen , whose concept of measurement and information we use as a means of contrast here, brings in subjectivity, intentionality, intensionality and indexicality as part of measurement, which according to him has to be arranged by agents. Here our position clearly differs, since for us observation and “interaction” do not necessarily contain the properties just cited. In fact, he fails to distinguish modeling from decoherence, and modeling from measurement. He writes (p.179):
“A measurement is a physical interaction, set up by agents, in a way that allows them to gather information. The outcome of a measurement provides a representation of the entity (object, event, process) measured, by displaying values of some physical parameters that-according to the theory governing this context-characterize that object.”
In contrast to that, we think that measurement does not “allow one to gather” information. It is much more appropriate to conceive of it as being identical with that gathering, under the conditions of a dualistic principal setup of performance and convention. According to van Fraassen, information could be defined independently of measurement, much as mushrooms can be defined independently of the gatherer collecting them for a soup. Of course, mushrooms and information are drastically different with respect to their relation to “being gathered.” Information does not “exist” outside of measurement. If I saved two files, one containing “0-0-0-0-0-0-0-1”, the other containing “0-0-0-0-0-1-0-1”, as intended representations of the numerical values “1” and “5”, respectively, could I really claim that these formatted collections of graphical (re-)presentations are identical to “information” before anybody (such as an alien, a Neanderthal, or an individual of the species Homo erectus) had interpreted them? Although it is clear that this is a bizarre and nonsensical attitude, most people believe in exactly such an ontology of information. It is also clear, then, that measurement need not be set up by “cognitive” agents; any material instance is “measuring” all the time.
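The point of the two files can be put in a few lines of Python (our own illustration): the very same inscribed marks denote different numbers under different, equally arbitrary reading conventions. The number is not in the marks; it arrives only with the interpretation.

```python
# The same inscribed bit pattern, prior to any interpretive convention:
marks = "00000101"

# Under the "unsigned binary, most significant bit first" convention:
as_msb = int(marks, 2)        # 5

# Under a different (equally arbitrary) convention, least significant
# bit first, the very same marks denote a different number:
as_lsb = int(marks[::-1], 2)  # 160

print(as_msb, as_lsb)
```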
The diagram shown above also demonstrates the particular relationship between modeling, causality and belief structures. Remarkably, and fortunately enough, we do not need the concept of knowledge here. Knowledge is far too complex to be an adequate category on the level of measurement. We thus reject, for instance, Donald MacKay’s suggestion to link information to an increase in knowledge on the receiver’s side (besides the fact that the category of receivers is not a reasonable one for discussing knowledge and signs). He writes:
“Suppose we begin by asking ourselves what we mean by information. Roughly speaking, we say that we have gained information when we know something now that we didn’t know before; when ‘what we know’ has changed.” (p. 10)
Nevertheless, he is pointing to an important issue, which we will deal with in the next section: interpretation.
A last word about measurement. As is commonly known, in quantum physics the separation between observer and observed breaks down. There is nothing mysterious about that; it is simply a matter of scale. If the scale of the measured and the scale of the measurement device become sufficiently similar, either with regard to resolution, or with regard to the set of properties of the scale, then aspects of self-referentiality are introduced into the whole setting. If the measurement device is part of the game, if it is inside the measured or not decoupled from it, well, then it is coupled. Not very surprisingly, the results of measurement then depend on the activities or the structure of the measurement. As a consequence, we easily experience the illusory threat of paradoxes. In fact, there are no paradoxes; we just experience the duality between information and causality. We find this structure not only in quantum physics, but in sociology as well, or in the philosophy of language. Wittgenstein necessarily organized his investigations as he did, refusing any “explanation,” because that was the only way not to get into struggles with the duality. The same duality is invoked in so-called computer experiments (which, so far, are not experiments at all). Last but not least, any “engineering” in social contexts, such as urban planning, management or business process re-engineering, is affected by the duality of information and causality. Major problems are to be expected in those fields if no attention is paid to the basic structure described here.
Here we cease discussing this issue, although a lot more could of course be said about measurement. For our purpose the important point is the particular link between modeling and action (performance) through measurement.
This obvious centrality of the role of measurement also justifies a particular emphasis on a theory about theory, as there is no measurement without theory. Theory and information are co-extensive. Actually, one could say that we live not only in the information age, but also in the age of theory.
Elements of Information
We already admitted that we embrace the heterogeneity of the various language games about information. (We still do not accept nonsensical, exaggerated claims about information, though.) Besides the duality spinning between information and causality, there are still other challenges waiting.
Everybody talks many times a day about “storing information,” “processing information,” or “deleting” it. We also talk about “having information about” something. By that we indicate that we could act differentially, that we have an alternative, perhaps one thoroughly investigated regarding the differentiality of its consequences. As long as we do not act, nothing happens, trivially enough. Yet the state of “having information” is clearly different from the state of not even thinking about alternatives. (You may excuse that I am referring to “states” here…) So we are again allowed to relate “potentiality” and “information.” Yet “information as potential” is clearly immaterial.
The particular challenge that we meet here is due to those immaterial properties of information. It may be conceived as subjective (due to interpretation), as a potential, as a measure of uncertainty, or as a language game: in any case we regard it as something immaterial. If information is something that exists only within an act of interpretation, and we are convinced of that, how can we then reasonably speak about “storing” it? Acts can no more be stored than potentials can. And what are we then actually storing, if not information? The answer here is quite similar to that for the question about the role of language and the relation between language and meaning, as we will see.
At this point we could set one of the aspects of information above the others, for instance subsuming semantics under the formal approach, or vice versa. Yet that would not match our goals, besides the fact that we would fall back into a reduced view. The question is hence:
How to incorporate incommensurable kinds of properties?
The answer is: through elementarization.
Elementarization is an old and powerful strategy. In a later chapter we will investigate this issue in much more detail. Today, elements are better known as dimensions. Yet elements and dimensions are not always equivalent. They are equivalent only if the elements can be changed independently of each other; the result is then a Cartesian space, or one of its relatives. If one can not change the intensity of elements independently of each other, we should call the resulting structures “aspects.” Note that aspects are not necessarily orthogonal to each other. In fact, mostly they are not.
There is, moreover, a deeper difference between elements and dimensions. The elementarization which we propose here is of a more fundamental character than interpreting variables as dimensions: the elements we propose are not variables but principles.
Elements qua aspects allow us to open a space for incommensurable, yet mutually dependent perspectives. In the case of information we can construct a 3-aspect space that is able to comprise any possible concept of information.
Before we introduce the three aspects (“elements”) we would like to recall Weaver’s distinction:
- technical, e.g. regarding accuracy; can be modeled using Shannon’s approach;
- semantic, e.g. regarding “the interpretation of meaning” (Weaver);
- influential, i.e. concerned with the success of conveyed information.
Weaver’s distinction definitely bears some important aspects; they are, however, not perfectly well chosen. Besides the fact that the formulation of an “interpretation of meaning” is not quite meaningful, its weakness lies particularly in the partial overlap of these categories, which is mainly introduced by his reference to semantics. Semantics, i.e. the dynamics of meaning, certainly influences effectiveness, and technical transmission must also be considered within the semantic aspect, since the semantic aspect is based on an exterior-interior dialectics. Weaver’s terms are also complicated ones, since technicality and semanticality are themselves cultural assemblages. Last but not least, it is inappropriate from a formal point of view to posit semanticality as an aspect if one tries to explain or describe just that semanticality.
We agree with Collier, who argues that information has to be described in non-intentional terms. Thus we propose a slight modification of Weaver’s aspects, resulting in their generalization and giving rise to the following list. The three aspects can be reformulated as:
- (1) Form
- (2) Effectiveness
- (3) Extension
A particular property of that space is that zero is a transcendental value: these aspects can never take the value 0 in real contexts. We shall briefly discuss each of the three aspects.
Information has a form aspect, which refers simply to the form of the encoding. One form of encoding that is quite abundant nowadays is the byte, a grouped sequence of binary symbols. Bytes, and the underlying notion of bits, allow for a very high signal-to-noise ratio, which is a nice property for any transmission task. Bytes are, of course, not the only form in which we can encode information, although any other encoding could be transformed into bytes. This transformation, however, is either not lossless or it requires extrinsic efforts of standardization and symbolic embedding. Even bit-encoding is not without extension (see below), i.e. even a bit-stream in a computer is not semantically neutral. Here we see why it was crucial for Shannon to demand a closed alphabet that is known to both the sender and the receiver.
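As a small, hedged illustration of the form aspect (the character encodings below are just convenient stand-ins for “conventions”): the same message takes different byte-level forms, and decoding with the wrong convention garbles the message, even though every bit arrives intact.

```python
# The "form" aspect as a choice of encoding. Sender and receiver must
# share the convention (Shannon's "closed alphabet") for decoding to work.
message = "Käse"  # a message containing a non-ASCII symbol

utf8_form  = message.encode("utf-8")    # one possible byte-level form
latin_form = message.encode("latin-1")  # another form of the same message

print(utf8_form)    # b'K\xc3\xa4se'  (5 bytes)
print(latin_form)   # b'K\xe4se'      (4 bytes)

# Decoding with the wrong convention destroys the "information",
# although every single bit was transmitted reliably:
print(utf8_form.decode("latin-1"))  # 'KÃ¤se' -- garbled
```

The bytes are perfectly well-formed in both cases; what differs is only the convention needed to turn them back into the message.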
Effectiveness comprises notions of reliability, certainty and relative syntactic completeness. Syntactic completeness simply describes whether, and to which extent, there are recognizable gaps, say white spots, cut holes, etc. Effectiveness also comprises what has been called “information dynamics”. Another concept belonging to this category, as a particular operationalization, is “informativeness,” as recently proposed by Floridi. Floridi equates “informativeness” with the semantic aspect of information. This, however, is definitely inappropriate, since semantics is a result of interpretation and meaning, which of course can not be quantified in principle.
If a piece of information is not reliable or certain, or if it is obviously too incomplete, we will simply not use it as a basis for a decision, i.e. it will not lead to the decoherence into D-causality (see Figure 1 above). The propensity for such a decoherence is proportional to the “quality,” or better, the effectiveness of the information. It is not necessary that the decoherence actually takes place; nevertheless the information will be highly effective in providing potential alternatives.
Of course, given a particular piece of information, we can never know whether it is complete, or how reliable it is. Relative effectiveness is thus always <1.
The extension of information refers to the strictness of the encoding, which is different from the notion of reliability. Strictness is related to the resolution of the encoding process. For instance, if in the case of music the resolution of the encoding, i.e. the sampling rate, falls below 8 kHz, the perceived quality of the musical information deteriorates severely, while the form, the reliability and the completeness may remain fully intact. Of the three aspects, extension refers most strongly to the necessity of further conventions. It thus requires context, which implies conventions and interpretation. Through the extension of information we evade the threat of closing our account through formalization. The context dependency, as well as the recursive necessity for interpretation, could even be operationalized.
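The sampling-rate remark can be sketched as follows (the frequencies are illustrative; the phenomenon is classical aliasing): a tone above half the sampling rate becomes indistinguishable from a lower one, although every single sample is stored perfectly reliably.

```python
# Aliasing as a loss of "extension": the encoding's resolution is too
# coarse to keep two signals apart, even though no sample is corrupted.
import math

def sample(freq_hz, rate_hz, n=8):
    """Sample a sine tone of freq_hz at rate_hz; return n samples."""
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz)
            for k in range(n)]

# A 5 kHz tone sampled at 8 kHz (below the Nyquist rate of 10 kHz)
# yields, up to sign, the same samples as a 3 kHz tone.
a = sample(5000, 8000)
b = [-s for s in sample(3000, 8000)]
print(all(abs(x - y) < 1e-9 for x, y in zip(a, b)))  # True
```

Reliability (every sample exact) and form (a clean list of floats) are intact; what is lost is the conventionally fixed resolution needed to tell the two tones apart.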
The extension of information is inevitably invoked for any amount of information larger than a few bits. As Quine correctly stated, any empirical situation is underdetermined, i.e. we can not find enough “information” in a message to arrive at a single conclusive interpretation of it. Hence Shannon’s assumption of a closed alphabet known to both the sender and the receiver is violated in real-world situations. In turn, messages can not be conceived as “formal strings” for which a single algorithm would be possible; messages in the real world are not Kolmogorov-equivalent. Instead, dealing with the uncertainty of the message requires taking into consideration possible relations between its parts. It is much like in music (or with texts): we only understand it, we only can decode it, if we establish a virtual network of relations between the parts of the received string. This virtual network is not part of the message, though. It is part of the conventions, i.e. of the cultural background in which the message is embedded. Thus we call this aspect “extension.”
The aspect of extension comprises everything related to interpretation. Here, the story about information becomes fractal, since any interpretation comprises the listed three aspects again, yet on a different (nested) level.
So, we conclude that the proposed structure is indeed a holistic representation of the language game of “information.”
The mutual dependency of aspects in our holistic philosophical concept of information can be expressed as a multiplicative relation.
aspected Information = oF.F * oE.E * oX.X
where F=form, E=effectiveness, X=extension; o(i) = the operationalization of the respective aspect.
If any of the terms equals zero, no information is available. The case of unreliable information is pretty clear. More interesting is the case when the extension of information is reduced close to zero, e.g. by removing any possibility to refer to a convention. Even if the alphabet is perfectly known to sender and receiver, as in the case of telegraphy, we could transmit equivalents of Maya symbols for which we do not have a matching convention. We might even recognize after a while that the Maya text is repeated. Still, no information has been transmitted, despite the fact that the signals are clearly recognizable.
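A toy operationalization of the multiplicative relation above may illustrate the zero case (the function and the [0,1] scales are hypothetical placeholders for the o(i), not a definitive metric proposed in the text):

```python
# Hypothetical operationalizations mapped onto [0, 1]: the product
# vanishes as soon as any single aspect does.
def aspected_information(form, effectiveness, extension):
    """F * E * X with each aspect already operationalized into [0, 1]."""
    return form * effectiveness * extension

# Perfectly recognizable signals (form = 1.0, effectiveness = 0.9) in an
# unknown script: no convention available, so extension is zero.
print(aspected_information(1.0, 0.9, 0.0))  # 0.0 -- no information

# The same signals for a receiver who shares the convention:
print(round(aspected_information(1.0, 0.9, 0.8), 2))  # 0.72
```

The Maya-telegram case above corresponds to the first call: form and effectiveness are high, yet the product, and hence the information, is zero.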
We can also see that the “amount” of information depends on the respective operationalization, say, the method used to deal with the aspect. Any of the aspects may be instantiated by many different operationalizations. It is unreasonable to expect that there could be one single set on the level of practiced messaging. There could possibly be a formalization for each of the aspects (but not one for all of them!), yet such a formalization would be very abstract, hence requiring again some contingent instantiation, i.e. an instantiation that is based on conventions.
From a philosophical point of view it is important that we do not need to refer to any kind of truth conception. Truth, or truthfulness, can not be an element of our world on the level of information. It is purely a matter of using a particular interpretation for assigning a value to a relationship. To ask whether a piece of information is true or not is to mistake the category. The impression of falsity or truth is just and only a matter of matching expectations, not a matter on the conceptual level of information. Here we fully agree with Floridi (section 3.2.3 in ).
It should not come as a surprise that we, in contrast to quite a few authors, also do not think it reasonable to conceive of information as “meaningful and well-formed data.” Again, this commits the mistake of assigning to meaning the status of an externalizable object.
Above we stated that there is a conflict between everyday language games and the philosophically extended theory of information. Simply put, we can not store information. As a response to that difficulty we proposed the elementarization of information into aspects. The aspectness of information is not a drawback; it is just a consequence of the facts that (1) meaning is an entity based on conventions, and (2) the brain is part of a body.
It is not the information that is stored, but all the conditions necessary to achieve or reconstruct the “same” interpretation again. Yet, despite the hope that these conditions have been met, it often turns out that they have not. In this case we find a lot of data which we can not render into information again. Often we need a meta-description of the data in order to be able to re-instantiate the information. Usually, many and strong conventions are required to do this successfully.
Similar to the interpretation of a word in a text, the interpretation of stored graphemes into symbols, and then into information, is not guaranteed to reach a particularly desired point.
From what has been said so far, it is clear that information can not be transferred. We can transfer encoded data. Upon decoding into graphemes, data and meta-data, which may even contain a description of the cultural conventions, we may then try to reconstruct the information that is present on the side of the sender. Yet we can never know whether this attempt was successful in a particular case. Thus, large parts of discourses are just about that: trying to establish “resonance.” We would just like to remark that Robert Brandom created a whole philosophy around this problematics. Another important topic here is media philosophy, starting with McLuhan’s “The medium is the message.” Yet there is much more to media theory today.
Information can be transferred only between entities that are able to interpret on the basis of cultural conventions. Outside of such a relation the term “information” does not make any sense. Syntactical machines like computers do not exchange information because they can not interpret.
This brings us to a further complication. First, we assume that information exists only in interpretation. Second, we know that we have no direct access to the reasoning machinery in our brains, i.e. no “direct” access to the models in our brain-mind. In order to transfer information we first have to encode the respective models into external symbols (e.g. language) before we can send them. Yet this encoding is tricky and definitely not unique or “lossless.” We conclude that the pragmatics of “transferring information” just means talking to each other about a determinable subject in order to distribute the ability for the same set of potential decoherences.
So, what do we mean if we say “computers are processing information”? Well, most computers do not process information, because they do not interpret; they syntactically rearrange symbols. What the standard computer does, reproducibly, is to apply the rules determined by the programmer. The role of the computer is simply to automate, to desynchronize and to diaepistemize the encoded interpretation. Nevertheless, it is still the interpretation of the programmer. (Which is the reason why software projects are so expensive: the conventions and the rules have to be defined, which means that a lot of discussions have to be conducted.) The notion of “processing information” thus directly points towards a particular attitude about the automation of social life. We prefer not to live in a world ruled by such an attitude.
In this chapter we investigated the relation between information and causality. A clear picture of information as well as of causality and, as we have seen, of their relation is of course crucial for any theory about epistemic beings. Thus it is also one of the cores of our theory of machine-based epistemology. An entity that is able to derive “insights” about the world is able to derive propositional statements about the causality and the informationality in the world. We have seen that both belong together in a complementary manner.
It is at the core of the concept of information to regard it as something that can be transferred; this holds even for the old meaning in classical Rome. On the basis of our investigation we can clarify the pragmatics of “information” as something that can be transferred or shared. We may conclude that information (as a language game) just means talking to each other about a determinable subject (1) in order to distribute (to share) the ability for the same set of potential decoherences, (2) to do so in a way that allows us to talk about the degree of reliability of the whole process, including its results, and finally (3) to allow for negotiation about the region to take in the aspectional space, used as a theory of information, from which a particular model is selected in a given context.
We thus agree completely with Peter Janich that information can not be naturalized in any way. That is, as a whole it can not be formalized or externalized; information can not be separated from the embedding Lebenswelt. Of course, we can indeed create reductionist models about any concept we can grasp; this is a usual attitude in science. It is, however, naive (and dangerous) to generalize results from an investigation that rests on an overly reductionist model, particularly if that generalization concerns minds or the Lebenswelt of minds.
Peter Janich even calls it a legend, this assumption that “information is a natural thing.” If a theory does not respect this embedding, it can not be called semantic any longer. Terms like “Formal Semantics” come close to (bad) ideology. The escalation into a veritable smoke-discharging misnomer is reached by calling an “information algebra,” the highest level of formalization, a semantic theory, as is practiced in .
An informational being must be able to align its position in that space with the positions of other such informational beings. As a closing word, we would just like to emphasize that it could be dangerous to externalize this negotiation and alignment to syntactic devices or to centralizing institutions like Facebook, Google, or the EU.
This article was created on Oct 20th, 2011, and republished in a considerably revised form on Jul 8th, 2012.
1. We will ultimately develop a particular concept of knowledge (which by now has happened): see the article “A Deleuzean Move” about an appropriate conceptualization of knowledge. Basically, knowledge is not about representational issues at all. Citing from there: “Saying ‘I know’ means that one wants to indicate that she or he is able to perform choreostemically with regard to the subject at hand. In other words, it is a label for a pointer (say reference) to a particular image of thought and its use.”
(as you can see, some effort still has to be spent here…)
-  David J. Saab, Uwe V. Riss, 2011, Information as Ontologization. Journal of the American Society for Information Science and Technology, Vol.62(11), pp. 2236–2246.
-  Rafael Capurro, Birger Hjørland (2003), Annual Review of Information Science and Technology Ed. B. Cronin, Vol. 37. Chapter 8, pp. 343-411.
-  Andrzej S. Zaliwski (2011), Information – is it Subjective or Objective? tripleC 9(1): 77-92. available online http://www.triple-c.at.
-  W. Lenski (2010), Information: A Conceptual Investigation. Information 1, 74-118.
-  José María Díaz Nafría (2010), What is information? A multidimensional concern. tripleC 8(1): 77-108, 2010. available online: http://www.triple-c.at.
-  Rafael Capurro, Information. Ein Beitrag zur etymologischen und ideengeschichtlichen Begründung des Informationsbegriffs. Saur, 1978.
-  Bernhard Siegert, Passage des Digitalen.
-  R.V.L. Hartley (1928), Transmission of Information. Bell System Technical Journal 7(3), pp. 535-563.
-  Claude E. Shannon (1948), A Mathematical Theory of Communication. Bell System Technical Journal 27, pp. 379-423, 623-656.
-  Warren Weaver (1949), The Mathematics of Communication. Scientific American. reprinted in: David M. Messick (ed.), Mathematical Thinking in Behavioral Sciences. Freeman, 1968.
-  Yehoshua Bar-Hillel, Rudolf Carnap (1952), An Outline of a Theory of Semantic Information. Technical Report 247, MIT Research Laboratory of Electronics.
-  Giovanni Sommaruga (2009), One or Many Concepts of Information? in: Giovanni Sommaruga (ed.) Formal Theories of Information, LNCS 5363. pp. 253–267, 2009.
-  about precision
-  Jon Barwise and Jeremy Seligman, Information Flow: The Logic of Distributed Systems. Cambridge University Press, 2008. (books@google)
-  Jeremy Seligman, Channels: From Logic to Probability. in: Giovanni Sommaruga (ed.) Formal Theories of Information, LNCS 5363, pp. 193–233. Springer, 2009.
-  Jeremy Seligman (2009), Looking For a Unified Theory of Information and Communication. 2nd International Workshop on Logic of Rational Interaction, October 10, 2009. South West University (ILI), The University of Auckland (Philosophy).
-  Richard Feynman, about quantum physics
-  Wesley Salmon, Causality and Explanation.
-  Judea Pearl, Causality: Models, Reasoning and Inference. Cambridge University Press, Cambridge 2000.
-  Hans Radder, “Technology and Theory in Experimental Science.” in: Hans Radder (ed.), The Philosophy Of Scientific Experimentation. Univ of Pittsburgh 2003, pp.152-173
-  Isabelle Peschard, Reality Without Representation. The theory of enaction and its epistemological legitimacy. Thesis Ecole Polytechnique, Paris 2004.
-  Bas C. Van Fraassen, Scientific representation: paradoxes of perspective. 2008.
-  MacKay, D. M., 1969, Information, Mechanism and Meaning, Cambridge: MIT Press.
-  Collier John D., “Intrinsic Information.” pp.390-409, in P.P. Hanson (ed.) Information, Language and Cognition: Vancouver Studies in Cognitive Science, Vol. 1 Oxford University Press, 1990.
-  H. Atmanspacher (ed.), Information dynamics. NATO ASI series B (256). Plenum Press, New York, 1991.
-  Luciano Floridi (2011), Semantic Conceptions of Information. Stanford Encyclopedia of Philosophy, available online.
-  Robert Brandom, Making It Explicit. Harvard University Press, 1994.
-  Vera Bühlmann, Inhabiting Media. Thesis, University of Basel (CH), 2009.
-  Peter Janich, Was ist Information? Suhrkamp 2007.