Flusser: Digital Illusion

Digital Illusion [1]

Before our incredulous eyes, alternative worlds composed of particles are beginning to emerge on computers — lines, surfaces, soon bodies and moving bodies. These worlds are colourful and produce sounds, and probably in the near future it will be possible to feel, smell and taste them as well. But that is not all, for these moving bodies can be equipped with artificial intelligence of the type of Turing’s man[2], so that we will be able to engage them in dialogical relationships. Why do we mistrust these synthetic images, sounds and holograms? Why do we demean them with the word “illusion”? Why are they not real for us? The overhasty answer goes: because these alternative worlds are nothing more than computed particles, because they are structures of mist hovering in a void. The answer is overhasty because it mistakes reality for a density of distribution, and we can be sure that future technology will be in a position to distribute particles with the density we find in the things of our given world. The table I am writing on is just a swarm of particles. If the particles could be distributed with equal density in a hologram of this table, our senses would no longer be able to tell the difference. The problem is: either alternative worlds are as real as the given one, or the given one is as ghostly as the alternatives.

As to the question about our mistrust of alternative worlds, however, there is an answer of a completely different kind.  There are worlds that we have designed ourselves, that were not given to us, as was the one that surrounds us.  Alternative worlds are not givens (data) but artificial constructions (facts).  We mistrust these worlds because we mistrust everything artificial, all art. “Art” is beautiful, but a lie, which is what the term “illusion” means. But this answer, too, leads to another question: Why is illusion deceptive, actually? Is there anything that is not deceptive? That is the crucial question, the epistemological question alternative worlds pose. When we speak of “digital illusion,” then, we are raising this question and no other.

Of course this is not a new question, for our eyes have troubled us since we ceased to believe them, at the latest by pre-Socratic times, even if this doubt only achieved its full acuity at the beginning of the Renaissance. Alternative worlds, with their digital illusion, bring the trouble to a head. In considering digitalisation, then, it is appropriate to start with the Renaissance. What happened then? To put it briefly, people discovered that in order to grasp the world, to take it in hand, they needed to calculate it rather than observe or describe it. The world is in fact unimaginable and indescribable, but it is calculable. Only now, with alternative worlds, are we seeing the result of this discovery.

It started in roughly this way: revolutionary craftsmen of the early Renaissance would no longer let the bishop declare a “fair price” for their products. They wanted a “free market” in which the value of the goods established itself cybernetically, through supply and demand. In rejecting the authority of the bishop with respect to “value,” they also rejected everything that until then had been understood by the term “theory”. “Theory” had been the recognition of unchangeable forms. So by means of a “theory” of the “ideal” shoe, the bishop could compare it with one produced by the shoemaker and so establish the value of the produced shoe according to how close it came to the ideal. The revolutionary craftsmen maintained that there was no ideal shoe and no unchangeable form, but rather that they themselves were the ones who invented shoe forms and continually improved them. The forms did not represent eternal ideals, but changeable models, which is the reason the Renaissance is called “modern”. By “theory,” they understood not the passive observation of ideals, but the progressive development of models, worked out in practice, that is, in observation and experiment. Modern science and technology, the industrial revolution and finally digital illusion arose in this way.

As a result, theoreticians from the cathedral and cloister migrated into the workshop (university, polytechnic, industrial laboratory) and began to make models to be used to make better shoes and to make the world in general more graspable and manageable. Surprisingly, it turned out that such working models could not be images or texts, but rather algorithms.  Incidentally, we have not yet fully absorbed the surprise at finding that the world is a book — natura libellum, to say it in the language of the Renaissance — that is codified in numbers. From that point on, theoreticians had to think more and more in numbers and less and less in letters and images. This change had profound consequences that need to be considered in trying to grasp digitalisation.

Theoreticians have always been literate people — litterati — who struggled against thinking in images, that is, magical thinking. They thought in lines of written characters. They developed a linear, processual, logical, historical consciousness. But there were always foreign bodies in the linear written codes of the alphabet, namely written signs that are not structurally linear. Letters are signs for spoken sounds, that is, for discourse, whereas these foreign bodies are ideograms for quantities, that is, numbers. Numbers are not discursive and so don’t fit into the line. This is the reason we’ve always spoken of an alphanumeric code rather than of an alphabetic script. The consciousness that articulates itself in this way was processual and historical as well as formal and calculating. As the need to think in numbers increased and the need to think in letters decreased, a formal consciousness began to encroach on historical consciousness. This was transformative, but not because formal models are a modern invention: there are clay tablets from some three millennia ago incised with marks that can only be understood as models for a drainage system. Such geometers of the Bronze Age are the intellectual ancestors of so-called computer artists. They did not depict their surroundings, but produced designs for as yet unrealized, “projected,” alternative “worlds”. Their designs resemble synthetic computer images in that they express a formal, “mathematical” consciousness. Looking closely at such ancient clay tablets is not a bad way to grasp the essence of the alternative worlds that are now coming into being.

Despite its having been underway for such a long time, the modern recoding of theoretical thinking from letters into numbers must be considered an intellectual transformation. It can be seen clearly in Descartes, but had already begun with Nicolas of Cusa[3] and becomes painfully clear with Galileo. Recoding raises the fundamental epistemological question mentioned earlier of whether there is anything that does not deceive. Descartes’ well-known answer was approximately: that which does not deceive is disciplined, clear and distinct arithmetic thinking. A thing is clear and distinct because it is coded in numbers, with an interval separating each individual number from any other. Such thinking is disciplined in that the rules of numerical code, such as addition and subtraction, must be followed exactly. The reason for giving up literal thinking in favour of numerical thinking is, in fact, that literal thinking is not clear, distinct and disciplined enough to lead to knowledge. The thinking thing — res cogitans — must be arithmetic to be capable of knowing the world.

Yet with this comes a peculiar, typically modern paradox. The thinking thing is clear and distinct — and that means it is full of holes between the numbers. But the world is an extended thing — res extensa — and everything fits into it seamlessly. So when I apply the thinking thing to the extended one, in order to think about it — adequatio intellectus ad rem (the intellect’s conformity to the object under consideration)[4] — then the extended thing gets away from me through the intervals. This is the reason the closing of the intervals between numbers became the epistemological problem of the Renaissance. Descartes tried to solve it simply, believing that every point in the world can be counted out in numbers, making geometry into an epistemological method. Later these methods were refined, by Newton and Leibniz in particular: numbers were brought in to fill in the intervals through “differential integration”. By means of differential equations all possible things in the world can in fact be formalized and formulated. Formal mathematical thinking can know everything and offer models for making anything: we became all-knowing and all-powerful. That is the cognitive transformation already anticipated by Nicolas of Cusa in his assertion that God could not know that one and one are two any better than we ourselves do.

This greatly abbreviated description of the modern recoding of letters into numbers, and of the resulting shift from a processual, historical and enlightened consciousness to a formal, calculating and analytical one, is of course not even close to sufficient to really grasp the development of alternative worlds in computers. For one thing, not everyone has made the leap from linear to null-dimensional, i.e. calculating, consciousness. Most remain oriented toward progressive, enlightened thought: they experience, perceive and evaluate the world as a chain of cause and effect, and their participation is geared toward breaking the chains, to free us from necessity. Their consciousness continues to be linear, literal and literary. There are only a few who have left this consciousness behind, who no longer perceive the world in terms of causality but as a toss of the dice, whose thinking is no longer progressive and enlightened but futurological and systems-analytic or “structural”. They produce the models that guide the majority. For example, they program advertisements, films and political programs according to structural criteria, so that those who are manipulated abandon the idea of accountability.

Because of the level of consciousness at which they operate, most people cannot participate in the alternative worlds that are beginning to appear in computers, and so claim not to want to participate.

The division of society appears to be dramatic: on one side, a few programmers who think formally and numerically; on the other, the many programmed, who think literally. But this is not yet the core of the contemporary issue, which lies in formal thinking’s claim to be all-powerful and all-knowing. In the 20th century, more particularly in its second half, this kind of thinking turned a somersault. It happened for practical and theoretical reasons. The practical aspect is as follows: differential equations formalised everything. In this purely formal sense everything is “recognizable”. But in order to use such equations as working models, they must be “re-numbered,” that is, re-coded back into natural numbers. With complex equations this is a tiresome process, and all interesting problems are complex. The recoding of such equations can take more time than there is in the universe. For this reason such problems remain unsolvable. We are not all-powerful, and although we are all-knowing, our knowledge is of no practical use in the case of complex, which is to say interesting, problems. A widespread cultural pessimism and a sense of the absurdity of life can be traced back to this overturning of formal reason’s claim.

At a theoretical level, calculating thought was penetrating ever more deeply into phenomena. It took them apart, so that the phenomena themselves took on the structure of calculating thought. Not only in physics did phenomena collapse into particles, but also in biology — into genes, for example — in neurophysiology into particle-like stimuli, in linguistics into phonemes, in ethnology into culturemes, in psychology into actomes. A question arises with these tiny parts — with quarks, for example — as to whether we are talking about tiny parts in the world or about symbols, which is to say, signs of calculating thought. Perhaps numerical thought is in fact not so much about knowledge of the world as it is about a projection of numerical code outwards, followed by a recovery of what was projected. Numerical knowledge is, then, problematical.

Against this background, the situation of contemporary consciousness can be summarized as follows: since the Renaissance, a part of the “intellectual elite”, the litterati, began to think in a formal-calculating rather than a discursive-historical way and to express themselves in algorithms rather than literary texts. The motive for this conversion was the expectation that such thinking would be “adequate,” that is, suitable for the perception and handling of the environment, perhaps even of human beings and their society. In fact we owe modern science and technology to this thinking. At first it looked as if technology were nothing other than applied science, and the technical schools were subsumed under the “pure” faculties. Then the relationship between science and technology began to turn around, and the “pure” disciplines became ancillary to technology. At present, theory and practice are so interwoven that we can no longer distinguish between them theoretically or practically. If philosophy is the “purest” discipline, then its technologization, that is, the mathematizing of philosophical discourse, and its reverse, the “philosophizing” of technology, have become the actual goal of our thinking. The expectations for this thinking have not been met. The elite of those who think formally, disappointed in itself, is currently responsible for the perceptual, experiential and behavioural models to which society orients itself. These are the so-called technocrats, media moguls and influencers, who might better be called “programmers”. Because the alternative worlds now arising in computers must be considered designs of this reigning elite, the computer must be examined more closely.

As was already pointed out earlier, differential equations had, since the beginning of the [20th] century, proven themselves to be practically useless. It was an unbearable situation. It was impossible to make use of available knowledge. Hundreds of calculations filling pages upon pages with numbers sat in engineers’ studies without having solved the problems that had been recognized theoretically. Strangely, this collapse of “pure reason” seems not to have penetrated into public awareness at the time. Calculating machines were invented to resolve the unbearable problem. The machines kept getting faster and faster, so that a great many, if not all, problems could be solved, because they could be stated numerically. High-speed calculating machines also had several unforeseen features that transformed our idea of humanity and our understanding of ourselves. For present purposes, we will restrict the discussion to just two of them. As pointed out earlier, a good deal of the epistemological project of modernity was concerned with making numerical code “adequate” to the world, generating ever more clever and elegant mathematical methods. The calculating machines made this work superfluous. They calculate so fast that 1, 0 and the command “digitize” are enough. They can get along without further mathematical refinement. They calculate with two fingers, but do it so quickly as to surpass the greatest mathematicians. This had serious consequences, because mathematical thinking, which until then had been considered the highest human capacity, turned out to be work unworthy of human beings, because it could be mechanized. It also introduced a different kind of work, namely the programming of the calculating machines. Instead of calculating, this involved analysing the universe of numbers. Mathematical thinking had to pass over into systems analysis, and was changed in the process.
One might maintain that what was the case for mathematical thinking affected various other kinds of thinking as well, for example decision-making.
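Flusser’s remark that computers “calculate with two fingers” can be made concrete in a short sketch. The following Python fragment is my own illustration, not the text’s: it builds ordinary addition out of nothing but the two binary symbols and elementary bit operations, with no further mathematical refinement.

```python
# Illustration only: addition "with two fingers", i.e. from the two binary
# symbols 0 and 1 and elementary logic, no higher mathematics required.

def add(a: int, b: int) -> int:
    """Add two non-negative integers using only XOR, AND and shifts."""
    while b:
        carry = a & b   # positions where both bits are 1
        a = a ^ b       # digit-wise sum, ignoring the carries
        b = carry << 1  # carries move one binary place to the left
    return a

print(add(13, 29))  # same result as 13 + 29
```

Run fast enough, this trivial two-symbol loop outpaces any human calculator, which is the whole of the point.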

The second feature of high-speed calculating machines is their surprising capacity not only to calculate, but also to compute: they are capable not only of analysing equations as numbers, but also of synthesizing numbers into forms. Calculating thought had penetrated deeply into phenomena, and with this most recent development the phenomena have finally disintegrated into particles, which is disturbing to realise. With this step, the world has taken on the structure of the numerical universe. When computers showed that calculating thinking can not only disintegrate the world into particles (analyse), but also put them together again (synthesise), they posed dizzying epistemological problems. Here are just two particularly startling examples: first, what we call life can be analysed into particles, into genes, but genes can, with genetic technology, be reassembled into new information to produce “artificial life forms”; second, computers can synthesize alternative worlds, which they project from algorithms, that is, from symbols of calculating thinking, and which can be just as concrete as the environment that surrounds us. In such projected worlds everything that is mathematically thinkable can also actually be made — even that which is “impossible” in the environment, such as four-dimensional bodies or Mandelbrot sets. Computers are not yet technically capable of doing all these things, but in principle nothing stands in the way.
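The Mandelbrot set mentioned above is a convenient miniature of such a projected world: a form that exists nowhere in the given environment and is synthesised entirely from an algorithm. The sketch below is my own illustration in Python; the coarse text rendering is an assumption of convenience, not anything in the text.

```python
# Illustration only: a "world" projected from nothing but an algorithm,
# the iteration z -> z*z + c over the complex plane.

def escapes(c: complex, max_iter: int = 50) -> bool:
    """Return True if the orbit of z -> z*z + c escapes to infinity."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| exceeds 2 the orbit provably diverges
            return True
    return False

def render(width: int = 60, height: int = 24) -> str:
    """Compute a coarse text image of the set over [-2, 1] x [-1.2, 1.2]."""
    rows = []
    for j in range(height):
        im = 1.2 - 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            re = -2.0 + 3.0 * i / (width - 1)
            row += " " if escapes(complex(re, im)) else "*"
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(render())
```

Every point of this image is a pure computation; nothing was observed or depicted, only projected and recovered.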

At this point in our dizzying reflection on “digital illusion” we need to catch our breath and look back over the distance we’ve come so far. We can describe the view that presents itself as follows: people have thought formally since the Bronze Age at the latest, when they designed drainage systems on clay tablets. Throughout history, formal thinking was subordinate to processual thinking, and only in the early Renaissance did “analytical geometry” — geometric forms recoded into numbers — move into the foreground. Such disciplined formal thinking made modern science and technology possible, but led to a theoretical and practical dead end. The computer was invented to clear the practical obstacles, and it simultaneously intensified the theoretical problems. In the early Renaissance, people searched for something that did not deceive, and believed they had found it in clear, distinct and disciplined numerical thought. Then they began to suspect that science only projects numerical code outward, that what appear to be natural laws are equations that have been applied to nature. Later still came the deeper suspicion that the entire universe, from Big Bang to heat death, with all its fields and relations, might be a projection that calculating thought recovers. Finally, computers are now showing that we can project and recover not only this one universe, but as many as we like. In short, our epistemological and with it our existential problem is whether everything, including ourselves, should be regarded as digital illusion.

So the difficulty of alternative worlds must be faced squarely. For if everything deceives, if everything is a digital illusion — not only a synthetic image on the computer screen, but also this typewriter, this typing finger and this idea being expressed by the finger — then the word illusion has become meaningless. We are left to regard everything as digital, everything as a more or less dense distribution of particles, of bits. What we call real and perceive and experience as real are those places, those depressions or expansions in which particles are densely distributed and realise their potential. That is the digital world view as it is proposed by scientists and shown to us on computers. We have to live with it from now on, even if it sticks in the craw.

Not only a new ontology, but a new anthropology will be forced upon us. We ourselves have to grasp the “self” as such a “digital distribution,” a possibility realized by means of densities. We must understand ourselves as dips or bulges in a field of relationships, above all of relationships between human beings. We, too, are “digital computations” of whirling, particulate possibilities. We must not only work through this new anthropology psychoanalytically, neuro-physiologically and epistemologically (acknowledging its Judeo-Christian sources, which took human beings for mere dust), but also actually put it into operation. It is not enough to see our “selves” as intersections of virtual possibilities, as an iceberg floating in a sea of unconsciousness or a computation of nervous synapses; we must also act accordingly. The alternative worlds now appearing in computers put such insights into action.

What do those who sit before computers, pressing keys and producing lines, surfaces and bodies, actually do? They realise possibilities. They compress particles according to exact formulae. What they produce is as much an exterior as an interior: they realize alternative worlds and, in the process, themselves. They “design” realities from possibilities. The more densely the possibilities are compressed, the more effective they become. This actually puts the new anthropology into operation: the “we” is a knot of possibilities that becomes more real the more tightly it compresses the possibilities that swirl in and around it. Computers are apparatuses for the realization of possibilities that are produced within, between and around human beings by means of exact, calculating thought. This formulation can be understood as one possible definition of a “computer”.

We are no longer subjects of a given objective world, but projects of alternative worlds. From a submissive, subjective position we have raised ourselves into projection. We are growing up. We know that we are dreaming.

The existential change from subject to project is not the result of some “free decision”. We are forced to it, just as our distant forebears saw themselves forced onto two legs because the ecological catastrophe then in progress made it necessary for them somehow to cross open ground between sparse trees. We, on the other hand, must see through not only the objects that surround us, but also the selves we once called our minds, souls or simply identities, recognising them as computations of particles. We can no longer be subjects, because there are no more objects whose subjects we could be, and no hard core that could be the subject of some object. The subjective attitude, and with it subjective perception, has become untenable. We have to abandon it as childish illusion and dare a step into the open field of possibilities. The adventure of becoming a human being has entered a new phase. That can be seen most clearly in the loss of our ability to distinguish between truth and illusion or between science and art. Nothing is “given” to us except possibilities to be realised, which are “nothing yet”. What we call “the world” is what has been computed from our senses, through less than transparent methods, into feelings, wishes and perceptions; these are reified computational processes. Science calculates the world as it once was. It is concerned with facts, with made things, not with data. Scientists are computer artists avant la lettre, and the achievement of science consists not in some kind of “objective knowledge” but in models for handling what has been computed. To recognise science as a kind of art is not to demean it, for in this way it becomes a paradigm for all the other arts. It becomes clear that all art forms can actually be real, that is, produce realities, if they renounce empiricism and achieve the theoretical precision of science.
And that is the “digital illusion” that is under discussion here: through digitalisation, all art forms are becoming exact scientific disciplines, and can no longer be distinguished from science.

In ancient Greek philosophy, art is closely associated with illusion, a connection that will now become decisive. When the childish wish for “objective perception” is abandoned, perception will be judged by aesthetic criteria. This isn’t new either: Copernicus is better than Ptolemy, and Einstein better than Newton, because they offer more elegant models. What is new is that beauty will become the only acceptable criterion of truth: “art is better than truth”. In the so-called computer art that we are just beginning to see, alternative worlds are as real and true as digital illusion is beautiful. The person as project, the one who thinks formally, analysing and synthesizing systems, is an artist.

This insight takes us back to the beginning of the line of thinking proposed here. We started with a mistrust of the alternative worlds now appearing, because they are artificial, because we designed them ourselves. We can now see this mistrust in an appropriate context. It is the mistrust of people who think in the old, subjective, historically conscious, linear way when they face the new, which expresses itself in alternative worlds and cannot be grasped with such old categories as “objectively real” or “simulation”. The new depends on a formal, calculating, structural consciousness, for which “real” is that which can be concretely experienced (aisthesthai = to experience). Inasmuch as alternative worlds are perceived as beautiful, they are also the realities in which we live. “Digital illusion” is the light that illuminates for us the yawning emptiness in and around us. We ourselves are the headlights projecting alternative worlds in and against nothingness.

[1] From “Digitaler Schein,” 202-215 IN Medienkultur, Frankfurt am Main: Fischer Taschenbuch, 3rd edition, 2002.

[2] In Turing’s Man: Western Culture in the Computer Age (1984) Jay David Bolter takes a sceptical view of the claims made at the time for AI, expressing greater interest in how these claims affect the ways contemporary human beings imagine themselves [The lower case “m” was in my German edition]. The book is listed in the “travelling library” at the Flusser Archive in Berlin. These are the books Flusser kept with him during his extensive travels in the 1980s.

[3] An instance of a “Renaissance man” of diverse skills and curiosities, Nicolas of Cusa (1401-1464) was a prominent German Catholic bishop whose interests in mathematics, cosmology and astronomy led him to many discoveries well in advance of mainstream science.