Culture Machine, Vol 1 (1999)


The Future Left to its Own Devices

Adrian MacKenzie

device: 1. plan, scheme, trick, contrivance, invention, thing adapted for a purpose or designed for a particular function. 2. drawing, design, figure ... 3. (in pl.) fancy, will [Concise Oxford English Dictionary]

The devices about to appear are meant to activate three senses of the word device: they are contrivances or contraptions, even toys; they are drawn figures; and less predictably, they are fanciful or playful. As a verb, to devise also means to divide. Property is devised in a will: thus "to devise" not only means to contrive, it also means to bequeath to a time to come. Let me set the frame for the two related devices which attempt to extract some contrivance, figure and play in devising a future.

Between an absolute future and a programmed no-future

We know that the future is not programmable as such, that it cannot be calculated, or simulated fully in advance. The future is not the output of a computation. Speaking in terms of computational theory and Alan Sokal notwithstanding, we run absolutely no risk in saying: the future, as the outcome to the set of all problems marked by the present, is undecidable. Does that mean that it remains absolutely unanticipated? That it must be absolutely other if it is to be any future at all? And does this mean that the time of the future is something profoundly alien to calculation, programming or to devising?

One thing that drives recent theory to an aporetic conception of the future is the processing of contingency, and especially the historically recent and still growing technological channelling of events into networks of inscriptions or marks mobilised through algorithmic or programmed devices. In short, contemporary technical action, above all computation, programming, calculation and simulation, deadens singularity. As Lyotard, following Heidegger and others, has it, the ever-faster absorption and processing of events by systems of marks (especially digital writing) tends to inoculate the present against the future (Lyotard, 1991: 68). Despite the rush of images and information, nothing happens in computation that was not already there to begin with. Similarly, Derrida asks:

Today, perhaps because we are too familiar with at least the existence, if not the operation, of machines for programming invention, we dream of reinventing invention on the far side of the programmed matrices. For is a programmed invention still an invention? Is it an event through which the future (l'avenir) comes to us? (Derrida, 1989: 46)

In general terms, the answer will be that nothing new is programmed, that programming only repeats the same, neutralising novelty as it goes through ever more convoluted and intricate computations. Therefore, as I understand it, "the only possible invention is invention of the impossible" (Derrida, 1989: 60). The approaches to an unknowability or unassimilability of the future developed in much of this theory are all similarly motivated: to retard the neutralising technological programming of contingency in favour of a different temporality or eventuality not reducible to the order of calculation or programming.

My concern here, following a suggestion found in Beardsworth (1998), is that the opposition between the programming of no-future and the singularity of an absolute future, an absolutely other future, might remain too disjunctive. There may be a complication working through programmed calculation, where programming is understood as a metonym for the 'inscriptivity' of contemporary technics in general. This complication blurs the line between the calculable and incalculable. The suggestion is that the opposition between calculation and the non-calculable, or the technical and the non-technical or supratechnical, might be more complicated than we think; or at least, that re-thinking the opposition between the technical and the non-technical might implicate a heterogeneous incalculability within the calculable.

In the context of an assessment of deconstruction's attitude to technology, Beardsworth summarises a complex argument by saying:

If ... the evolution of the technical is the very possibility of human indetermination, then, just as arche-writing must be articulated through its differentiations, so the incalculable and the indeterminate must be worked through calculation. [Beardsworth, 1998, 80]

Leaving aside the path that Beardsworth takes to reach this point, the implication is that if the future has a chance in the face of an onrush of programmed information, it is partly to be thought along the pathways and networks of calculation or simulation, not simply in opposition to them or outside them. The framing problem here is: how to think through the opposition between incalculable event and calculation, how to map a passage between indetermination and determination without subordinating one to the other. This problem does suggest the need for a slightly innovative approach to thinking calculation or programming in general, an approach which can at least take into account how the incalculable is inscribed or devised through the calculable.

As it is usually conceived, the incalculable remains caught up with the calculable. That is, it is still defined from within the horizon of theoretical proofs, axiomatic systems or decision methods as what lies beyond the limits of calculation. Derrida states, in the same essay quoted above and in other texts, that the future to come, and the chances of an entirely other future, not only escape all programming and are not integrable within any calculus, but remain "heterogeneous ... to the form of the undecidable that theories of formal systems have to cope with" [Derrida, 1989, 56]. The future of which he speaks would be different: "I am not speaking here of factors of undecidability and incalculability that function as reservations in a calculable decision" [Derrida, 1984, 29]. In his terms, we must therefore distinguish between the incalculability of the future as understood from within calculation as what is not in principle calculable, and the alterity or otherness of a future which is incalculable because it is something other than calculation or program.

The whole question will be: given there is nothing in principle outside calculation, how would something other than the calculable come? Would it come through calculation, no longer conceived as a formal operation, but as technical mediation, media, or as graphing?

Graphing, spacing, mediation

A certain figurative device – the network or graph – constitutes a shared neighbourhood across recent mathematics, biology and computer science, communications systems and the social sciences. Networks epitomise the technical mediations of our collectives through calculation. Here, in the first device, a network or graph grows by laying down a set of paths between nodes. The device might take a minute or two to appear in its window. This delay should not be seen as merely technical. Although it is undoubtedly contingent on network connections, and processor speed, the delay is also necessary. As we will see, there is no technical mediation of human collectives which does not induce delays as well as accelerations. All mediation folds time.

The device in the adjacent window shows the formal elements of what is known in computer science as a graph. A graph, at its simplest, maps a set of relations between elements. In this case, relations expressed as a sequence of links between letters (e.g. a-b, b-f, f-e, f-h-f-x, c-x, c-a, c-b...) are mapped as a network. More specifically, a graph is a collection of nodes and edges that link them together. A node is a zone of relative or provisional stability in the ensemble, while an edge marks a more or less variable relation between two nodes. In the case of the graph-device, the names of the nodes and the relations between them are chosen randomly. Ultimately, given the underlying coding of the program, they are based on the times of the system clock from which random numbers are drawn. The sequence of such a graph permits many different spacings. If the graph is scrambled (by randomising all the locations of the nodes: press the "Scramble" button), it might move back to a similar spatial configuration or a quite different one, which nevertheless still maps the same set of relations. At the most general level, a level that impinges on all technical mediations, the device inscribes a set of relations as spacing [Nancy, 1993, 53]. The graph is the product of a sequence of calculations or computations coupled with a set of rules for mapping relations.
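A minimal sketch of what the device computes may help here. The following is a hypothetical reconstruction in Python (the original runs as an applet in the browser, and all names here are my own): nodes are given random letter names, and random relations are drawn between them, the randomness ultimately seeded, as in the device, from the system clock.

```python
import random
import string

def random_graph(n_nodes=8, n_edges=12):
    """Sketch of the graph device's data: randomly named nodes and
    randomly chosen relations (edges) between them. Python's default
    random seed, like the device's, derives from the system clock."""
    nodes = random.sample(string.ascii_lowercase, n_nodes)
    edges = set()
    while len(edges) < n_edges:
        a, b = random.sample(nodes, 2)
        edges.add((min(a, b), max(a, b)))  # undirected, no duplicate edges
    return nodes, sorted(edges)

nodes, edges = random_graph()
```

The device then couples such a set of relations with rules for mapping them, for instance a layout that spaces the nodes on screen according to their links.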

As it calculates positions according to the relations between nodes, this technical device might begin to re-figure the oppositions between calculation and the incalculable: how can the processing of contingency, actualised in the elaboration and unfurling of streams of collective action across convoluted surfaces of graphing (i.e. contemporary technologies), be understood as working the incalculable or the indeterminate through calculation and simulation? In other words, how can we perceive most sensitively the texture and rhythms of the incalculable amidst the programmed work of calculation which today irreversibly permeates technical action? In what ways does the incalculable inhabit calculation, without being reducible to it?

In approaching this calculative device as a form of technical action, I begin by assuming, following many others ([Latour, 1994], [Serres, 1995], [Stiegler, 1996], [Haraway, 1997], [Ansell Pearson, 1997], [Beardsworth, 1998]), that there is no collective or social sequence that is not technically differentiated, no action which is not a technical action, none that is not spaced or graphed in some way. Technical differentiation triggers different spacings, delays and rhythms within the collective. In terms of Michel Serres' notion of the quasi-object, a technical object, such as the graph device, constitutes a stabilisation of relational time, of the unstable bonds or transient structurings within a multiplicity:

Our relationships, social bonds, would be airy as clouds were there only contracts between subjects. In fact, the object, specific to the Hominidae, stabilizes our relationships; it slows down the time of our revolutions. [Serres, 1995, 87]

So, if we understand the nodes of the graph as introducing the point of view of quasi-subjects, then the edges of the graph, and the wavering fluctuations in their length, show a degree of stabilisation of relational time in a collective due to the inevitable detour of those relations through non-human actors, or through technical mediations. The graph, as it accumulates more nodes, stabilises the relations between them.

At the outset (restart the graph with the reset button to see this), each new relation sends the collective floating or tumbling in new directions. It is unstably sensitive to every fluctuation. As the relations become more complicated, each new addition is absorbed or metabolised within a slower and denser multiplicity. The mesh of relations in the collective grows: the graph begins to look like a game of cat's cradle that has become too complicated [Haraway, 1994, 59-60]. It often happens that the set of relations does become too complicated to be mapped consistently, and the graph will collapse into one corner of the screen. (The reset button will start mapping a new set of relations.)

Again, we might say that the time of calculation slows down because the program underlying the device has to keep track of an increasing number of relations (i.e. the burden of calculating the propagation of perturbations becomes greater). Yet, again, this slowing down is also essential or necessary: as the matrix of relations thickens, more and more effects are propagated through it, but with diminished amplitude. The play (using this term in its mechanical sense of movement within limits) of the collective slows down due to the ramified propagation of relations into other relations. Contrary to expectations, a greater degree of mediation or mediatisation does not simply speed up events in the collective. It can also be seen to slow them down.

At its most dense, when almost a hundred nodes have been added, an inelastic clustering can be seen in the graph. New nodes scarcely add anything palpable to the shape of the network. The very mediation of relations stabilises the collective. In this respect, the graph device perhaps remarks something akin to what recent theories (critical and deconstructive) have been saying about calculation and technics in general: if fluctuations are events, the networked collective absorbs events, almost without a trace.

If we understand technical mediations as stabilising fluctuations within human-nonhuman collectives, then they also imbue collectives with a particular temporal texture, particular rhythms of delay and modes of organising relations of proximity and distance. Technical mediations, especially those involved with processes of signification, can dampen or prolong the fluctuations in ways that a purely human collective, abstracted from all its supplements and left to its own devices, would never be able to do. Without these mediations or differentiations, collectives would indeed be left to their own devices. In the terms I have borrowed from Serres' work, they would become pure diachronic play, without history, calendar or future.

Programmed graphs

The second device, a kind of digital overlay on the wavering mediations of the first, seeks to figure somewhat more specifically the differentiations of technical mediations. The problem with the graph device in its initial form is that it remarks nothing of the specific texture of the mediations associated with programming, calculation and computation. While it is a device contrived to figure the way in which a collective stabilises itself through quasi-objects, it tells us little about the particular rhythms at work in present-day technical mediations. By virtue of their capacity to absorb events (ranging from an idle click of the mouse to the triggering of a missile launch), programs clearly present the most salient instance of calculation available. The second device – technically known as a random Boolean network, for reasons that will soon become clearer – seeks to figure the specificity of the program as a technical contrivance, as a channelling of relations in the collective through rules of writing.

The historical depth of the mediations in our collectives is enmeshed with alphabetisation and writing. It is well known today that within technoscientific networks, writing transmutes into quite different inscriptive ensembles of archival access and realtime simulation. Contemporary technical mediations in the form of programs explicitly expose the collective as generalised writing. Accordingly, the particular technical action of the program, as generalised inscription, again "opens the field of history" [Derrida, 1967, 27]. General writing "explicitly reworks the relation between material inscription and transcendence and at the same time generalises this relation, as one of a radical opening prior to the formation of the inside or outside, across all forms of life." [Beardsworth, 1998, 76]

In what sense is the random Boolean network (which should now be fluctuating in an adjacent window) related to programs? Beyond the trivial sense which it already shares with the graph device – they both run as programs on a "virtual machine" encapsulated within the browser software – the network whose elements flash red and green, in binary states of off and on, figures the interaction of a set of programs working in parallel, and synchronised by their linkage together in a network. To cut short the complexities of what is happening under the rather banal and at times confusing flashing of the nodes, we can simply say that each node of the graph has now been programmed not only with a set of links to other nodes in the network, it has also been programmed with a random set of re-writing rules concerning the states of the nodes it is connected to. Each node is a program.

The device is called a random Boolean network because not only are its connections to other nodes randomly assigned, but how it changes its own external state (its output) is determined or calculated by randomly assigned rules. Each node is random, then, in two senses: firstly, in its connections to other nodes; secondly, in its programmed behaviour in relation to those other nodes. It is Boolean in the sense that each node can only be on or off, true or false. As a whole, the network is random, in a more general sense, in terms of the dynamics of the interactions between the programs. It is this interaction which is shown in the flashing green and red boxes.
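The construction just described can be sketched in a few lines of Python. This is a minimal, hypothetical reconstruction, not the applet's own code: each node receives K randomly chosen input nodes and a randomly filled Boolean truth table over those inputs, and all nodes update synchronously.

```python
import random

def make_rbn(n=10, k=2, seed=1):
    """Random Boolean network: each node gets k randomly chosen input
    nodes and a randomly filled truth table (its re-writing rule)."""
    rng = random.Random(seed)
    inputs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    state = [rng.randint(0, 1) for _ in range(n)]
    return inputs, tables, state

def step(inputs, tables, state):
    """Synchronous update: each node reads the current states of its
    inputs and looks up its next state in its own truth table."""
    new = []
    for i in range(len(state)):
        index = 0
        for j in inputs[i]:
            index = (index << 1) | state[j]
        new.append(tables[i][index])
    return new
```

Each node here really is a small program: its truth table is its rule of writing, applied to the marks (on/off states) of its neighbours.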

To envisage what is figured here, we could imagine, under the tutelage of the deconstructive quasi-concept of generalised writing, the genesis of all events and all structures as sequences of marks which in turn fold back on themselves, transcribing further sequences in an unending, at times circular, mesh of transcriptions. Calculation and the incalculable are all held within this process of transcription. Programs explicitly capture, inscribe, and in the process transmute mediations of collective relations as streams of coded marks. In a significant sense, the scientific model of random Boolean NK networks has been developed with precisely such a context in mind. In the work of Stuart Kauffman, an evolutionary theorist, this model presents a way of viewing the evolutionary emergence of patterns and ordering in terms of series of transcriptive operations (such as those involved in the recombination of genes in sexual reproduction) [Kauffman, 1994] [Kauffman, 1995]. (The more technical name of the model as an NK network points to two further analytical features of the device: (i) it has N elements or nodes; (ii) each is connected to K other elements in the ensemble, as shown in the window at the bottom of the adjacent network device.) By running large numbers of trials on such networks, by tallying the results for many different values of N and K, Kauffman has developed statistical intuitions as to the behaviour of large numbers of elements in a collective over time.

For Kauffman's purposes, this model has been primarily useful in tracking down the general possibilities of spontaneous order in complex metabolisms. Against Darwinist arguments for the overriding importance of natural selection (i.e. survival of the fittest) as the source of all order and function in life, Kauffman has sought to map the spontaneous orderings and losses of order that arise from the interaction of elements in a collective. Although these results are of great interest in contrast to neo-Darwinist and sociobiological orthodoxies, I want to skirt what I see as three possible misinterpretations of my presentation of the random Boolean network as a device here.

Firstly, Kauffman has extended the model and others like it to the co-evolution of technical artefacts in webs of complementarity and substitution (i.e. cars + roads, computers + telephones, etc.) [Kauffman, 1995, 273-304]. There it simulates the unfolding grammar of relations in the collective as a sequence of strings or linear orderings of marks in which each technical action accounts for a mark or 'symbol' occurring in a sequence of operations. Despite these suggestive beginnings, the conclusions he draws from the model are less interesting. They tend simply to corroborate liberal democratic notions of the centrality of individual freedom.

Secondly, unlike other applications of Kauffman's work in recent cultural theory, I do not see the random boolean network functioning here as a model. For instance, in Manuel De Landa's A Thousand Years of Non-Linear History, Kauffman's work, and other similar scientific models are explicitly presented as models of history, as if history exists apart from the differentiations of its inscriptions, or as if history were a state of affairs to be modelled diachronically. For instance, De Landa writes:

[T]here is what we may call "nonlinear combinatorics", which explores the different combinations into which entities derived from the previous processes (crystals, coherent pulses, cyclic patterns) may enter. ... While the concept of self-organization, as applied to purely material and energetic systems, has been sharpened considerably over the last three decades, it still needs to be refined before we can apply it to the case of human societies. [De Landa, 1998, 17]

The project of applying 'nonlinear combinatorics' which De Landa then carries out is different from the purposes of the devices here, which are concerned with the relation between what counts as an event or history and the differentiated conditions of the inscription of events. Insofar as they are figures or drawings, these devices point towards openings of history in the play between structures and events.

Thirdly, Keith Ansell Pearson has recently mentioned a possible convergence between Kauffman's and Deleuze's concepts of multiplicity [Ansell Pearson, 1997, 128-129]. While I accept the general motivation of his discussion – the desirability of concepts of 'innocent becoming' – I am not here attempting to argue for any explicit convergence between philosophical concepts and Kauffman's work in the random network device presented here. While such a convergence between philosophy and science, or the corresponding possibility of elaborating a philosophical concept (in the Deleuzean sense) of the network cannot be ruled out, I am focusing here on the status of the random Boolean network as a device in the tri-partite sense mentioned at the outset of this essay. In short, I am more interested in the device as a contraption than as a concept.

The idiomaticity of random network devices

These provisos aside, we can now walk through some of the dynamics of the random Boolean network device. Simply put, I would suggest that these dynamics figure some of the contemporary differentiations of delay, diachrony and synchrony in our programmatic collectives. In these terms, they show some of the ways in which the work of calculation is closely bound to incalculability, and, conversely, how the incalculable works through calculation.

Encoding by number

At any given point in time, the state of all the nodes in the network is summarised or encoded by the number that appears in the lower right hand corner of the display panel. In a time where so much is encoded by number (access codes, identity numbers, cryptographic techniques, etc), the number figures two things in relation to the device.

(i) At each point in time, the overall state of the network is uniquely signed by this number. If the network is in state number 53704407, then the next state can be uniquely determined from this number – if we can access the program of every node in the network. The number captures at one synchronic point in time the state of every element of the network as a set of marks that can then be processed, stored, and retrieved as the content of future events. In that respect, numbers control events, or numbers become events. Looking at the complex changing patterns in the device, it is hard to see what is being repeated and what is new. The changing states of the elements are confusing. Looking at the number, the repetitions and novelties soon become apparent.

In exactly the same way, I would argue, calculation, and programming more generally, number events so that they can be sequenced, sorted, stored, retrieved. Those acts in turn constitute the production of events, of times and spaces. Hence, calculation is no longer concerned solely with number, but with the ordering, translation and substitution of one set of marks for another. Operations on numbers are a special case of generalised re-writing functions which sort, compare, edit and translate sets of marks to form images, sounds, gestures, and movements. Rather than understand calculation as an abstraction that decontextualises or dislocates a given milieu of application (this is the dominant critical standpoint), we might understand calculation as a process of concretion which re-writes or re-mediates a domain of relations.

(ii) The number's magnitude indicates the overall 'state space' of the network. That is, the higher this number, the greater the number of possible configurations the network can potentially pass through. To compare roughly, this network of 10 elements can only pass through around 1000 different states, whereas this network of 30 elements can have hundreds of millions or even billions of states. Although the magnitude of the state space signifies in coarse terms how many possible states a given ensemble can pass through, the number of states it actually encounters is much smaller. The disparity between the scale of possible states of the collective, and what actually happens, hints at the complexities and bifurcations contained within what might otherwise appear as a monolithic processing of events.

If this number and the colours of the nodes in the network are not changing then the network has reached a steady-state position (or it has been running for more than a few minutes, in which case, it automatically stops: randomize the network to initialise a new set of nodes and relations).
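Both points (i) and (ii) can be sketched directly (a hypothetical reconstruction; the device's actual bit ordering is an assumption): the synchronic state of the network packs into a single integer, and the size of the state space follows as 2 to the power of the number of elements.

```python
def encode(state):
    """Pack the on/off state of every node into one integer: a number
    that uniquely signs the network's current synchronic state."""
    number = 0
    for bit in state:
        number = (number << 1) | bit
    return number

def state_space_size(n):
    """Number of distinct configurations of n Boolean elements."""
    return 2 ** n

encode([0, 0, 1, 1, 0, 1, 0, 1])   # an 8-node state, encoded as 53
state_space_size(10)                # 1024: "around 1000" states
state_space_size(30)                # 1073741824: over a billion states
```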

Sequence and cycle

Due to its deterministic nature, every network will sooner or later repeat itself (although in larger networks, the time required easily and often exceeds cosmological time spans by orders of magnitude). From that point on, nothing new happens.

However, just how soon repetition arises depends on such factors as the number of interconnections that run between the elements: if each element is only connected to one other, the network can quickly lapse into a few stable states repeated endlessly (a short state cycle); conversely, if every element is directly connected to every other element, it can cycle through different network states without repeating an earlier one for a very long time (a long state cycle). Between these two extremes, varying degrees of repetition can occur. While such repetition often occurs, and makes the networks seem predictable and mechanical, there is no guarantee that it will occur soon. Eventually the same set of states will return, but for a given network, with its particular connections, nodes and initial state, there is no way of knowing whether its sequence will be a short cycle or a very long one, perhaps too long to be recognised.

In every case, only the work of calculating or running the programs will show what the network will do. For this reason, the network devices cannot be made to show a long state cycle or a short one without programming a particular initial state into the nodes. Conversely, clicking on nodes in the network (causing them to change state) can tip the whole network into a different sequence. The play of the device is such that this work of discovering whether a particular state is part of a long or short cycle can only be a matter of experiment.

Some statistical generalisations can be made (and these are the products of Kauffman's experiments): (1) every network sooner or later enters one of a relatively small number of so-called "state cycles" in which the same sequence of states repeats over and over until something perturbs the network; state cycles or "state attractors" of various sequence lengths cut the vast number of possible sequences down to a relatively small number; (2) the length of a state cycle reflects the complexity of the relations between elements in the network. With more interconnections between the elements in the network, chaotic, unpredictable behaviour results. In this case, a change in the state of one element in the network may divert the sequence of network states down an entirely different path and onto a different state cycle; (3) the most interesting kinds of state cycles are those which are neither completely stable and imperturbable, nor those which run off into inordinately complex convolutions before repeating, but those that lie on the border between stability and unpredictability, retaining a sensitivity to changes elsewhere in the network.
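Because the update rule is deterministic and the state space finite, detecting a state cycle is itself a calculation: run the network, record when each state first appears, and stop at the first repeat. A sketch (the step function stands for whatever update rule a given network uses; the helper name is my own):

```python
def find_cycle(step_fn, state):
    """Run a deterministic network until a state repeats; return the
    length of the transient and the length of the state cycle."""
    seen = {}
    s, t = tuple(state), 0
    while s not in seen:
        seen[s] = t                          # first time this state appears
        s, t = tuple(step_fn(list(s))), t + 1
    return seen[s], t - seen[s]              # (transient length, cycle length)
```

The function must actually traverse the sequence; there is no shortcut to the answer.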

The work of the incalculable

It is impossible to say, without actually carrying out the computation, whether a given state is related to another through a sequence of states; i.e. whether it forms part of the same state cycle, or whether it will never repeat again. Furthermore, both the extent of a state cycle, and its ability to absorb fluctuations, can be difficult to ascertain. The network device can figure the effects of a fluctuation: clicking with the mouse on one of the network's nodes alters its state to the opposite Boolean value. As I have said, sometimes (although this may require repeated attempts), the change of state of one node is enough to send the network heading towards an entirely different state cycle. On other occasions, it only introduces a temporary detour that soon ends up in the original cycle.
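The click experiment can be sketched the same way (hypothetical helper names, not the applet's code): flip one node's value, run both trajectories until they settle into their cycles, and compare whether they end on the same state cycle.

```python
def cycle_states(step_fn, state):
    """Return the set of states making up the cycle this state falls into."""
    seen = {}
    s, t = tuple(state), 0
    while s not in seen:
        seen[s] = t
        s, t = tuple(step_fn(list(s))), t + 1
    start = seen[s]                            # time the cycle was entered
    return frozenset(k for k, v in seen.items() if v >= start)

def same_cycle_after_flip(step_fn, state, node):
    """Flip one node's Boolean value (a click on the device) and test
    whether the network still ends up on its original state cycle."""
    flipped = list(state)
    flipped[node] ^= 1
    return cycle_states(step_fn, state) == cycle_states(step_fn, flipped)
```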

In terms of such cycles, a "basic time" unfolds as an echo or return within the network. It produces an ordered distribution of spaces in which some elements of the network are stable or unchanging (either permanently on or permanently off), and small portions are actively changing. The network device figures this stabilisation in terms of the movement of the nodes: those that are unchanging are fixed on the screen; only nodes actively changing state keep moving. Such sequences trace out pathways of activity amidst an unchanging background. These nodes are likely to be sensitive to any change or fluctuation in their neighbourhood. In short, the collectives slow down, sort and filter fluctuations through the stable or repeated mediations of elements. Once such a sorting is in place, fluctuations elsewhere in the collective may or may not be readily absorbed into the cycle. The cycle may be sensitive or insensitive to changes in state of elements depending on the degree of connectivity or the density of relations that holds within the network. But just how they sort and filter them depends on the specific programs and connections existing in a given network. Generalisations about the homogeneity of calculation, its capacity to simply absorb or neutralise events, are too coarse to grasp these distinctions.

Furthermore, a wide range of involvements in the active paths occurs across the different state cycles belonging to the same collective. Even though the same underlying programs and connections hold between elements, different state cycles show quite different patterns of activation of elements. In some, nearly all of the elements are stable, while in others, a larger number are actively changing. This means that the same network can be activated in divergent ways, producing different pathways of calculation and a range of stabilisations. Some state cycles are almost imperturbable, whilst others are sensitive to the slightest change. The difference carries back to the underlying connectivity of the network. The same network, the same ensemble of calculations, each node retaining the same program, can give rise to widely divergent sequences of events.

Finally, some state cycles are strongly attractive of 'surrounding' states. This means that they tend to channel divergent sequences back towards the cycle. In that case, a set of elements forms a centre into which most or even all other sequences will sooner or later cascade. Such a cycle forms a centre of calculation in Latour's terms [Latour, 1987, 235]. Every centre of calculation has the effect of polarising the collective towards itself. Once the centre is in place, many paths head towards the same centre: "[i]t is always a matter of discovering the one that polarizes the multiple toward it" [Serres, 1995, 93]. In these terms, the question of how the incalculable is worked through calculation concerns the redundancy or patterns associated with the collective. What I think is interesting or suggestive in the play of the random Boolean networks is not so much the particular forms of organisation that arise from an interaction between the elements, but the fact that these interactions might be understood as different actualisations of the same set of calculations, often iterated on only slightly different sets of starting states. The sequences in which they interact are both highly variable and redundant or patterned.
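For a small enough network, this attractiveness can be measured exhaustively (a sketch, feasible only for small N; the function name is my own): enumerate every possible starting state and tally which cycle each one drains into. The tally for a cycle is the size of its 'basin'.

```python
from itertools import product

def basin_sizes(step_fn, n):
    """Enumerate all 2**n starting states of an n-node network and tally
    how many of them drain into each attractor cycle (its basin)."""
    basins = {}
    for state in product([0, 1], repeat=n):
        seen, s, t = {}, state, 0
        while s not in seen:
            seen[s] = t
            s, t = tuple(step_fn(list(s))), t + 1
        cycle = frozenset(k for k, v in seen.items() if v >= seen[s])
        basins[cycle] = basins.get(cycle, 0) + 1
    return basins
```

A cycle whose basin covers most of the state space polarises the collective in the sense just described: most paths sooner or later cascade into it.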

The role that devices or technical quasi-objects play in collectives is this: they open reserves of repetition, cycling and anticipation which interact with the diachronic instabilities of play, centred on the event. "Every historical event", writes Giorgio Agamben, "represents a differential margin between diachrony and synchrony, instituting a signifying relation between them" [Agamben, 1993, 75]. In his terms, the specificity of an event concerns the way in which it institutes an altered set of conversions between the asymptotically unattainable poles of a purely synchronic collective, absorbing all events, and a purely diachronic collective, completely given over to the transience of events. In these terms, the play of the graph device adds some indication of how that differential margin is constituted through cycles, delays, and the propagation of fluctuations when events are subject to the calculations of graphs or networks. In these terms too, the incalculable and calculation are intensively woven together for us in the play of technical mediation.

References

Agamben, G. Infancy and History: The Destruction of Experience, tr. Heron, L. (London & New York: Verso, 1993)

Ansell Pearson, K. Viroid Life: Perspectives on the Transhuman Condition, (New York & London: Routledge, 1997)

Callon, M. "Techno-economic networks and irreversibility" ed. Law, J. The Sociology of Monsters, Essays on Power, Technology and Domination (London: Routledge, 1991), 132-160

De Landa, Manuel, A Thousand Years of Non-Linear History, (New York: Zone Book, 1998)

Haraway, D. "A Game of Cat's Cradle: Science Studies, Feminist Theory, Cultural Studies", Configurations 2.1 (1994) 59-71

Kauffman, S. At Home in the Universe. The Search for the Laws of Self-Organization and Complexity, (Oxford: Oxford University Press, 1995)

Kauffman, Stuart A. The Origins of Order. Self-organization and Selection in Evolution, (New York & London: Oxford University Press, 1993)

Latour, B. Science in Action. How to follow scientists and engineers through society, (Cambridge: Harvard University Press, 1987)

Latour, B. We Have Never Been Modern, tr. Porter, C. (London: Harvester Wheatsheaf Press, 1993)

Lyotard, J.-F. The Inhuman. Reflections on Time, tr. Bennington. G. & Bowlby, R. (Stanford, Stanford University Press, 1991)

Nancy, J.-L. "War, Law, Sovereignty - Techné" Rethinking Technologies, ed. Conley, V. A. (Minnesota University Press, Minneapolis 1993)

Serres, M. Genesis, tr. James, G.& Nielson, J. (Ann Arbor: The University of Michigan Press, 1995)

Serres, M. The Parasite, (Baltimore: Johns Hopkins University Press, 1982)