
The Man with Two Brains, but no Umbrella

This short piece features in a book produced as part of the Conway Actants exhibit by Deborah Gardner and Jane Millar at Conway Hall, Red Lion Square, London earlier this year – ‘a unique, visual art project which directly responds to Conway Hall’s spaces, ethos, activities and archive.’ https://conwayhall.org.uk/event/conway-actants/. It’s adapted from a talk at a Club Critical Theory event at the exhibit.

It also lightly engages with some of the ideas developed in the forthcoming book The Assemblage Brain: Sense Making in Neuroculture (University of Minnesota Press), due early next year.


Tony D Sampson, reader in digital culture and communications, University of East London

My critical intervention into Conway Actants draws on two books by Deleuze and Guattari which problematize the relation between art and philosophy in very different ways by introducing two kinds of brain. These two brains might seem like an uncanny way to engage with an art exhibit, but I’m interested here in the critical spaces that open up when philosophical and artistic brains meet and the potential to disentangle art from the circuitry of capitalism.

Rhizome Brains

Artists familiar with Deleuze and Guattari might already know the rhizome brain.[i] Although it rejects signifying practices, causing all kinds of problems for the visual arts, it’s easy to see why so many contemporary artists are drawn to its light. It’s a seductive image of thought that brings together philosophical concepts and artistic sensations in mixtures.

Rhizome brains closely follow the neuron doctrine. They are not a continuous reticular fabric; there is a synaptic discontinuity between cells. Thoughts leap across gaps, making the brain an uncertain, probabilistic multiplicity, swimming in its own neuroglia. Indeed, as some artists might notice, many people have a metaphorical tree growing in their head, but axons and dendrites are more like bindweed than roots. The artist’s rhizome does not, as such, produce forms in representational space or objects of art under the scrutiny of subjective acts of signification. Rhizome art is an aesthetic propagation; a contagion, spreading outward in unquantifiable spaces of experience.

Another attraction of the rhizome is that it connects anything to everything. There are indeed many rhizomatic mixtures in Conway Actants. In the Brockway Room, for example, there’s a series of abstract machines, acting like transmitters and receivers. This is not information transmitted through a medium. These are assemblages of disparate materials, objects and structures. As in Franz West’s adaptive and portable art, we find the open possibilities of relationality. This is art that teeters on a threshold between propagation and disintegration.

Conway Actants assembles the historical figures from Conway Hall, connecting them to organic and inorganic materials. We find networks in networks, lumps of matter, crystals, images of splatters, scattered beads, layered and hole-punched surfaces. Matter is strewn across the images, submerged in clear resin and magnified in parts or conversely blurred. It’s a fractured kind of viewing; disturbing, obscuring and connecting by way of blots and blobs. These are also event-sculptures. The hives, for example, are infestations in the library and hallway. They remind us of what is outside, on the roof!


Chaos Brains


The artist seduced by the rhizome might be surprised by Deleuze and Guattari’s second brain. It certainly disturbs the composed mixtures of sensation and concept. But it is here in the chaos brain that we find an unexpected critical turn enabling us to more closely inspect the space of possibilities in which artists and philosophers meet.


In their swansong, What Is Philosophy?, Deleuze and Guattari unexpectedly argue that concepts and sensations do not mix. Forget the nomads that used to cut across art and philosophy. We now have distinct planes. It’s not that artists or philosophers have different kinds of brains. This is not a crude neuroimaging exercise, differentiating between the locations of sensations and concepts in the brain. It is the work of all brains to tear open the umbrella that shields the brain from chaos—and “plunge” into the infinity that confronts us all.[ii] The difference here is found in what artists and philosophers discern from the endless possibilities they encounter. What emerges from these plunges into chaos is a difference in kind. The promises of rhizomatic mixture are replaced by the almost biblical affirmation: Thou shalt not mix![iii]


Philosophers Produce Concepts, Artists Produce Sensations


How do these differences affect the relation between art and philosophy? To begin with, both need to be considered in relation to the endless possibilities we find in events. Some of the mixture of the rhizome is retained in the work of concepts. They give consistency to infinite chaos – they actualize the event. This is not, however, about giving form to the event. A concept is not a representation of events! Concepts are not forms of opinion. They are always relational.


“There is always an area ab that belongs to both a and b.”[iv]


The philosopher’s analytical tool of choice, the conceptual personae, is intended to surpass forms of opinion. But like one of Nietzsche’s personae, concepts can only interfere with sensations. They can never become sensations.


The artist’s sensation never actualizes the event, but instead embodies the possibilities it produces. The sensation is, in fact, neither virtual nor actual. It is always about the possible. Like this, the hive sculptures give a body to the event, providing it with a universe of possibilities in which to live. So while the artist composes with her materials; offering contemplations and enjoyments,[v] she also produces a universe that constructs limits, distances, proximities and constellations.[vi] Importantly then, sensations are not perceptions or acts of signification. They are affects that deploy aesthetic figures to route around perception and signification.

The historical trajectory of art leads Deleuze and Guattari to a junction where the emergence of abstract art and conceptual art seems to promise to bring sensations and concepts together.[vii] The shadow of Duchamp looms large here, since he assembles both in the same gallery space. But to what extent do they really mix? Indeed, the move to conceptual art is explicitly rejected by Deleuze and Guattari. This is because when art becomes too informative it also becomes unclear whether it is a sensation or a concept.[viii] The problem, it seems, is that it’s not in the artwork itself, but in the spectator’s “opinion” that the sensation is, or is not, manifested as art. It is the audience who decide (as receivers of information) “whether it is art or not.”[ix] Like this, Duchamp’s readymade does not mediate affect through the experience of a sensation. It is a subjective act that mixes signification with sensation, and as a consequence, it seems, the artist loses some of the affective power of the work.

Deleuze and Guattari evidently favour abstract art. It refines the sensation by dematerializing matter. Turner’s chaotic seascapes, for example, produce a sensation of the concept of the sea. In contrast, conceptual art is not so refined. It dematerializes through generalization. This is undeniably a surprising denunciation given that conceptual art is the condition for contemporary art and Deleuze and Guattari are seen by many to be the philosopher kings of contemporary art. Nonetheless, the problem is clear: conceptual art produces signification not sensation.[x]

Maybe in its pursuit of mixture, contemporary art has been decidedly selective in its choice of Deleuzean brains. I’m sure the revelation of a second brain will cause a certain amount of discomfort. But perhaps art as sensation retains a political clout that art as signification can never achieve? The former is famously infused with autonomy while the latter has too much information, which might, Deleuze and Guattari feared, collapse into immaterial capitalism.

Art in the Cultural Circuits of Immaterial Capitalism

In his Control Society thesis, Deleuze expressed concern about art’s relation to capitalism. Art had already left the gallery and “entered into the open circuits of the bank.”[xi] As Rivera must have asked himself in the Rockefeller Center in the early 1930s, “how can art find a critical space in this place?” Conceptual art is indeed part of an art system today; made up of critics, uber dealers, advisors, bankers and oligarchs. It’s these nodes in the circuitry, not the spectator, who decide if a concept is art (or not) by attributing a discourse, and ultimately, a price tag to it. This is a system that even Charles Saatchi calls “too toe-curling for comfort.”

How can art escape the cultural circuits of capitalism? Can a post-conceptual art create new conceptual weapons or frame its own concepts?[xii] Can art become non-conceptual?[xiii] Does art need to become, as Deleuze and Guattari contend, non-art? If nothing else, the uncomfortable critical space produced by this second brain asks art to search for new rhizomatic lines of flight that might become disentangled from the circuitry in which it seems to be presently trapped.


[i] Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia (Minneapolis: University of Minnesota Press, 1987).

[ii] Gilles Deleuze and Félix Guattari, What Is Philosophy? (London: Verso, 1994), 202.

[iii] See Isabelle Stengers, “Gilles Deleuze’s Last Message.” published online at: http://www.recalcitrance.com/deleuzelast.htm (accessed May 2016).

[iv] Deleuze and Guattari, 1994, 20.

[v] Ibid., 212.

[vi] Ibid.,177.

[vii] Ibid., 198.

[viii] Ibid.

[ix] Ibid.

[x] This point of contradiction is nicely captured by Stephen Zepke:

  • Deleuze and Guattari explicitly reject Conceptual Art.
  • Conceptual Art is the condition of Contemporary Art.
  • Deleuze and Guattari are touted far and wide as the philosophers of Contemporary Art.
  • . . . Huh?

See Stephen Zepke, “The Post-Conceptual is the Non-Conceptual: Deleuze and Guattari and the Conditions of Contemporary Art,” Art and Film through Deleuze and Guattari (Autumn 2009). http://www.actualvirtualjournal.com/2014/11/the–post–conceptual–is–non–conceptual.html (accessed April 2015).

[xi] Gilles Deleuze, “Postscript on the Control Society,” in Neil Leach (ed.) Rethinking Architecture: A Reader in Cultural Theory (New York: Routledge, 1997), 295.

[xii] As for example, Ricardo Basbaum’s work seeks to do. From a discussion between Basbaum and the author in London (July 2013).

[xiii] As Zepke argues.


Dr. Tony D. Sampson is reader in digital culture and communications at the University of East London. His publications include The Spam Book (coedited with Jussi Parikka, 2009), Virality: Contagion Theory in the Age of Networks (Minnesota, 2012) and The Assemblage Brain: Sense Making in Neuroculture (Minnesota, 2017). He has also published numerous journal articles and book chapters. Tony is a cofounder of Club Critical Theory: Southend and director of the EmotionUX Lab at UEL.

Deleuze in Southend-on-Sea



Tony D Sampson

Text based on a talk given at the first Club Critical Theory night at the Railway Hotel in Southend, Essex, UK on April 17th 2014. Corrections may still be needed.

[Photo by Iry Hor]

Applying Deleuze to Southend in the context of a Club Critical Theory discussion is a doubly difficult task. To begin with, Deleuze introduces a new vocabulary that sits atop an already complex layer of philosophical debate. We will need to grapple with complexity theory and a strange incorporeal materialism. Then there are personal reasons, relating to my own situation here as a Southender, that make this task problematic. Deleuze, for me, represents an escape from certain aspects of my early working life in Southend at the local college, particularly my time spent in what we referred to then as the School of Media and Fascism. Deleuze was part of my escape plan from this horror, so returning to Southend with him in mind presents all kinds of problems, but let’s put those aside for a moment and see where Deleuze in Southend takes us.

[Photo by Iry Hor]

The point of this introduction to Deleuze is to apply some critical distance between our material and expressive experiences of Southend. So let’s begin by saying what I think Deleuze is not. He is not postmodern or poststructuralist. Such generalities are, as I will try to explain, not acceptable in Deleuzian ontology. With regard to the latter, language and linguistic categories, while not discounted, are not the major concern. I also do not see Deleuze’s post-Marxism as necessarily antithetical to the Left or indeed to Marx. He was supposedly planning a book on the latter before he died. Shame we missed that one! What attracted me to Deleuze was his attempt to rally against all kinds of totality, fascism and authoritarian regimes. But there are many Deleuzes. His project is vast. What I’ll focus on here are some aspects of Difference and Repetition which seem to me to map out the trajectory of his philosophical project, born out of the frustrations of 1968 and extending into his work with Guattari – a time marked by an unrequited desire for revolution. A time that instigated a need to rethink what revolution really means.

Simply put, we need to overcome an old philosophical problem; that is to say, the problem of the One and the many. This is a mereological problem – meaning the study of the relation between parts and wholes and an ongoing debate concerning what constitutes an emergent whole. Before applying this to Southend directly I want to draw a little on crowd theory to illustrate what I mean by mereology.  The origins of social theory are rooted in a question concerning what constitutes individuals and crowds. That is, what happens to an individual when she becomes part of a crowd? In the late 1800s Gustave Le Bon thought that once the socially conscious individual became part of a crowd she was incorporated into a stupid and mostly unconscious collectivity. A great influence on Freud’s group psychology and 1930s fascism, Le Bon applied a kind of emergence theory that assumes that the whole has properties independent of the parts that compose it.

We can illustrate Le Bon’s claim by visiting Roots Hall Football ground every week.

[Photo: FA Cup third round, Chelsea v Southend United, Stamford Bridge]

Le Bon’s football crowd is an emergence of a kind of collective intelligence in reverse, in which smart individuality dissipates into the unruly wholeness of the crowd. We can also see this in the discourses of local policy makers when they refer to the Southend community as a whole. The properties of the emergent whole are assumed to become supervenient – meaning that the interaction between parts produces a whole that has its own properties. It is this immutable wholeness that has a downward causal power over the parts from which it has emerged. This is the kind of sociology that claims that we are the product of the society we are born into, i.e. a member of a certain class.

[Photo: Oxford v Southend]

What Deleuze does is replace the One and the many with the multiplicity. We need to draw here on a little complexity theory, but before that, Deleuze and Guattari set out a really nice case against supervenient wholes in their book Anti-Oedipus. In short, all things become parts. Wholes are just bigger parts. Instead of the One, we encounter populations of parts. The multiplicity becomes the organizing principle in a complex relationality between parts, which we will call here assemblages. Unlike the timelessness (synchronic) of emergent supervenient wholes, we find that assemblages are exposed to historical (diachronic) processes. A population of parts becomes a territorialization: a territory held together by relationality, but exposed to deterritorializations and reterritorializations. Assemblages also have material and expressive parts. For instance, material parts might be the buildings we surround ourselves with, or the financial capital we require to build them. Expressive parts might refer to the availability of cultural capital (e.g. knowledge) or the flow of conversations, routines, rituals, habits and discourses that co-determine the social spaces we inhabit.

The question therefore moves away from simply locating the properties of the emergent whole to the question of what brings assemblages together. What are the interactions that give life to a novel territory? We also need to take into account the interactions between parts and the wider environment: the way in which the football crowd interacts with the materiality of the stadium, or the weather, or the expressive voices and gestures of the away supporters.

[Photo: Bury v Southend]

It is nonetheless crucial that the multiplicity is not mistaken for a master process that determines the territory – the crowd, for example. We need to see this rendering of a crowd as what the Deleuzian Manuel DeLanda refers to as a space of possibilities. This is a brand of Deleuze that draws heavily on complexity theory. Herein the properties of the crowd have identities that are not fixed or essential. The crowd has capacities to affect and be affected: the interaction between home and away supporters, for example. The crowd also has tendencies that function as pattern changers. Say Southend United actually manage promotion this season – more virtual than actual at this point, of course – but should it happen, the crowd will swell in number, perhaps requiring a new stadium. Then there is the complexity of universal singularities, which, in simple terms, provide the trajectories or lines of flight that the crowd follows. Continuing with the football theme, this line of flight can be guided by fixtures, but again the design of the stadium the crowd encounters is important here, since to a great extent it provides the crowd with its contours and flows, and plays a role in its emergence. Finally, singularities are drawn to basins of attraction. Again, the shape of the crowd becomes a territory because of the material, expressive and environmental factors it is coupled to.
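DeLanda borrows these terms from the mathematics of dynamical systems, where they can be made concrete. Here is a minimal numerical sketch – a toy system of my own devising, not anything found in Deleuze or DeLanda – of a space of possibilities with two singularities, at -1 and +1. Which one a trajectory settles at depends entirely on the basin of attraction it starts in:

```python
# Toy dynamical system with two attractors (singularities) at x = -1 and x = +1.
# Which attractor a trajectory reaches depends on its starting point in the
# space of possibilities, i.e. on which basin of attraction it begins in.

def step(x, dt=0.01):
    # Gradient flow on the double-well potential V(x) = x**4 - 2*x**2,
    # whose minima sit at x = -1 and x = +1.
    return x - dt * (4 * x**3 - 4 * x)

def attractor(x0, steps=10_000):
    x = x0
    for _ in range(steps):
        x = step(x)
    return round(x, 3)

for x0 in (-1.5, -0.2, 0.2, 1.5):
    print(f"start {x0:+.1f} -> settles at {attractor(x0):+.3f}")
```

Starting points left of zero roll into the basin around -1, those to the right into the basin around +1: the same system, with the same tendencies, yields different territories depending on where in the space of possibilities it begins.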


We can draw on another example: a building. Take Carby House and Heath House in Victoria Avenue. The so-called Gateway to Southend!

[Photos by Iry Hor]

As spaces of possibility, buildings have properties. They are big or small, predominantly this colour or that colour. They generally have windows. These windows also have capacities. They are windows that can be opened or smashed. But importantly, they need someone to open or smash them. Without this interaction taking place the capacity remains virtual rather than actual. It is a double event in this sense. There are tendencies too. Buildings can decay over time for all kinds of reasons: weather, lack of maintenance, vandalism, etc. Investment, or a lack of it, can act as a basin of attraction to which singularities are drawn, forming areas of regeneration or decay. Furthermore, decay can lead to other buildings decaying. This last point is very important to my work in assemblage theory, or what I call contagion theory. Urban spaces can emerge as contagious material assemblages converging with expressive social epidemiologies. What we might call crime waves, for example, can begin with very small interactions between parts. These are events, like one solitary broken window, that can lead to further events, such as more windows being smashed. This leads to break-ins, fire, perhaps even a death.
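The cascade logic can be sketched as a toy simulation – all parameters here are hypothetical illustrations of the argument, not an empirical model of Southend. A single decayed building raises the chance that its neighbours decay, and one broken window can tip much of the street:

```python
import random

def spread_decay(n_buildings=20, seed=1, p_spread=0.5, rounds=10):
    """Toy contagion-of-decay model on a street of buildings.

    One initial event (a solitary broken window mid-street) propagates
    through purely local interactions between adjacent buildings.
    All parameters are illustrative, not empirical.
    """
    rng = random.Random(seed)            # seeded, so the run is repeatable
    decayed = {n_buildings // 2}         # the one solitary broken window
    for _ in range(rounds):
        for b in list(decayed):          # snapshot: new decay spreads next round
            for neighbour in (b - 1, b + 1):
                if 0 <= neighbour < n_buildings and rng.random() < p_spread:
                    decayed.add(neighbour)
    return sorted(decayed)

street = spread_decay()
print(f"{len(street)} of 20 buildings decayed:", street)
```

The point of the sketch is that no master process drives the outcome: the wave of decay emerges from nothing but small, repeated interactions between adjacent parts.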

[Photo by Iry Hor]

What I have tried to do in preparation for this talk is grasp Southend as an assemblage. The questions we could ask about these spaces need to account for historical processes, the material and expressive parts, the spaces of possibility, the interactions between parts and environments, the properties, capacities, tendencies, singularities, and basins of attraction. More than that, we also need to ask what kinds of assemblage we can make from these relations. What novel critical networks, new artworks and performances can interact with existing parts? Our venue, the Railway Hotel, is in many ways one such place. It was, after all, known locally as the BNP pub. The BNP would, I’m told, meet here, in this room. The landlord has transformed this building. He is a true Deleuzian. Hopefully Club Critical Theory can continue to provide an expression to this kind of positive change in Southend.


With photographer Iry Hor I have started to look at some of the urban assemblages that surround us. These include, on the one hand, the lines of flight of regeneration; most notably, the Forum, the so-called Lego Building and the new college and university campuses. This is regeneration we can understand in part as financial territorialization: territories formed around access to vast amounts of capital resources; for example, £54 million in 2004 for the new college campus, and £14 million from the government plus £9 million from EEDA for the new University of Essex building. In the case of the Forum, £27 million has been invested by Southend Council, the University of Essex and South Essex College.

[Photos by Iry Hor]

Love them or loathe them, these new shiny buildings have the capacity to affect and be affected. I found this nice quote about the Forum in the local paper, from an OAP resident living in Sunningdale Court sheltered housing in nearby Gordon Place.

“I love it. When I go out, I have to pass the building and I have this great big smile on my face as I do. I’m just so happy.”

But this area of Southend is not an indelible whole. Its access to resources is never permanent. With £8.6 million in cuts to government funding for the college over the next few years, many of the expressive internal parts of this shiny new building will begin to dissipate. Indeed, the external interactions between the new campuses and the adjacent derelict buildings are a constant reminder of the tendencies of decay that can affect all buildings that fall into financial decline.

[Photos by Iry Hor]

Again, things are never whole. Indeed, on the other hand, there is the trajectory of urban decay in Victoria Avenue, including Carby House, Heath House and the old college building in Carnarvon Road. A report on the planning application for the new campuses in 2003 made it clear that “the disposal of the existing campus buildings is an important part of the delivery process for the new campus.”

[Photo by Iry Hor]

It seems strange to me that these buildings are owned by developers. What kind of perverse development is this? To try to find out, I traced back the various interactions between these so-called developers and Southend Council as reported in the local papers. These interactions are perhaps best summarized by Anna Waite, the former Tory Southend councillor responsible for planning, who said in 2008: “I wish I knew what was happening. I haven’t heard anything from the developers for months.” Is this evidence enough of the need for public intervention into private property?

[Photo by Iry Hor]

Perhaps Victoria Avenue is an example of deterritorialization? Well, things are not that simple in Deleuze’s ontology, because parts are at their most creative when they are deterritorialized. Carby House and Heath House are not only the rotting Gateway to Southend. They are the central hub of a contagion of decay.

Heath House closed when the remaining 300 workers were made redundant in 2000. Along with Carby House it has, in the past 14 years, become a mesmerizing example of a transformative decomposition of material parts brought about by its open interaction with the environment and a withdrawal of access to resources. But Carby House and Heath House are perhaps in the process of expressive and material reterritorialization. They are certainly a defiant example of what affordable housing means in times of austerity in Southend-on-Sea. They have become a home to the homeless.

[Photo by Iry Hor: Tony D Sampson introduces Deleuze at the first Club Critical Theory night upstairs at the Railway Hotel]
[Photo: Andrew Branch from UEL introduces Bourdieu to Club Critical Theory]
[Photo: Giles Tofield (The Cultural Engine) chairs the first Club Critical Theory]

A few notes

The phrase ‘School of Media and Fascism’ is attributed to Jairo Lugo, currently at the University of Sheffield.

Gustave Le Bon’s contribution to Crowd Theory is The Crowd.

For more on parts and wholes see Deleuze and Guattari’s Anti-Oedipus: Capitalism and Schizophrenia, 42–50.

To better grasp Manuel DeLanda’s approach to assemblages via complexity theory see A New Philosophy of Society and Philosophy and Simulation.

For more on contagion theory see Tony D Sampson’s book Virality: Contagion Theory in the Age of Networks.

References to interactions between Southend Council and developers, investments in new buildings and local reaction sourced in the Evening Echo archive.



Talks on Virality (part three)

(Goldsmiths 22nd Oct)

… Continuing on Milgram.

Agentic states are generally traced to the disposition of an individual caught up in a natural chain of command rather than to a dissociated state.

 For Milgram, imitation leads to conformity, but obedience ultimately requires the explicit social action of the individual.

Nevertheless, Milgram was famously the great manipulator of the social encounter. His triggering of crowd contagion was unquestionably socially engineered.

In Virality Milgram is positioned as a hypnotist, planting suggestibility — via the points of fascination provided by his skyward looking actors — into the neurological, biological, and sociological composition of the crowd.

From this privileged position, Milgram not only observed, but also controlled the implicit, involuntary, and contagious responses his experiment induced.

Virality also makes an important distinction between Tarde and his contemporary Gustave Le Bon.

 First, these two crowd theorists seem to be at the base of two distinct theoretical lines of influence.

One characterized by Le Bon’s direct link to Freud’s psychoanalysis.

The other by Tarde’s role in the development of Deleuzian ontology.

Second, there are conflicting ideas about the role contagion plays in social movements.

Unlike Le Bon’s conservative concerns for the stability of an old aristocratic order, Tarde introduces a novel media theory that considers both the potential and improbability of rare movements of democratic contagion.

Last, there are two very different notions of hypnotic power at work in Le Bon’s The Crowd and Tarde’s Laws.

The former falls back on a direct representational means of control (the crowd that thinks, or hallucinates, in images), while the latter speaks of indirect subrepresentational and reciprocal hypnotisms.

The coupling of Tarde/Deleuze and Le Bon/Freud presents a very different relation between conscious and unconscious states. As Deleuze and Guattari argue, Freud tried to approach the crowd from the point of view of the unconscious. But he didn’t see that the unconscious itself was fundamentally a crowd.

He was perhaps myopic and hard of hearing insofar as he misconstrued the crowd for a certain individual. In contrast, schizoid analysis does not “mistake the buzz and shove of the crowd for Daddy’s voice.” Daddy’s hypnotic authority is grasped instead as symptomatic of the psychoanalyst’s predisposition to repress the desiring machine by locking it away (inside) the representational space of the unconscious.

Like this, Le Bon’s crowd contagion acts on the social, forcing it to reproduce a unified collective mentality.

As an alternative to The Crowd’s delusional fantasy, Virality explores how the tendency to pass on real and illusory contagions can be attributed to phantom-events.

Significantly, phantom-events are outcomes, or effects, of actions and passions, not their Oedipal representation. The phantom is paradoxically without a body but is nevertheless a material thing (an incorporeal materiality). The event becomes detached from its causes, spreading itself from surface to surface. This is not the point at which affect turns into fantasy, but rather where the ego spreads to the surface.

It is the hypnotized subject’s distance from the phantom-event that makes her prone to variable appearances of the real and the imagined.

Arguably, this is the logic of sense apparent in the spreading of chain letters, Trojan viruses, false rumors, and fake video virals.

These are the emergent forces of a contagious encounter, in a social field, which function according to an action-at-a-distance. The phantom-event contaminates those caught somewhere in the loop between the imaginary and the real events she encounters and believes in.

(see part four)

The Contagions of The Spam Book Revisited

For those visitors to Virality who haven’t yet read The Spam Book (Hampton Press, 2009), here is the introductory chapter (co-written with Jussi Parikka), including the introduction to the first section of the collection: Contagions.



Jussi Parikka and Tony D. Sampson

In the classic 1970s Monty Python sketch, a couple enters, or rather, in typical Pythonesque mode, descends upon a British cafe and is informed by the waitress that Spam is on the menu.

There’s egg and bacon; egg sausage and bacon; egg and spam; egg bacon and spam; egg bacon sausage and spam; spam bacon sausage and spam; spam egg spam spam bacon and spam; spam sausage spam spam bacon spam tomato and spam.

The joke, of course, refers to Spam, the canned food substance that originated in the 1930s in the United States, but was famously imported into Britain during World War II.1 Spam (spiced ham) became a cheap supplement for pure meat products, which were in severe shortage during the conflict. Perhaps the cheapness and mass consumption of Spam during the period are among the reasons why it became the butt of many music hall jokes. Indeed, following the music hall tradition, Spam becomes central to the Pythons’ often nonsensical sketch as it quickly deterritorializes from the more obvious context of the waitress–customer discussion to a full Viking chorus of spam, spam, spam . . .

Spam, spam, spam, spam. Lovely spam! Wonderful spaaam! Lovely spam! Wonderful spam. Spa-a-a-a-a-a-a-am! Spa-a-a-a-a-a-a-am! Spa-a-a-a-a-a-a-am! Spa-a-a-a-a-a-a-am! Lovely spam! (Lovely spam!) Lovely spam! (Lovely spam!) Lovely spaaam! Spam, spam, spam, spaaaaam!

The joke’s intention, as Monty Python jokes in general tend to do, is to get us to laugh at a major concern of contemporary communications: communication breakdown.2 The habitual repetition of everyday events quickly turns into a chaotic mess and a turbulent example of noncommunication. The familiar communication channels of this archetypal British working-class cafe are suddenly flooded with intruding thirds, a noise that fills the acoustic space with a typically meaningless Python refrain: spam, spam, spam. In this sense (or nonsense), the sketch manages to parody the meaninglessness intrinsic to any meaningful act of communication by increasing the level of environmental noise that accompanies the process of sending messages. In fact, the invading Viking horde (perhaps a veiled reference to the U.S. troops stationed in Britain during World War II) eventually drowns out, or “spams,” the ongoing conversation between the waitress and the customers, transforming the chaotic scene into a closing title sequence filled with more references to spam, spam, spam . . .

More than 30 years later, the analogy made between Python’s sketch and the unsolicited sending of bulk e-mail has provided new impetus to the word spam. Perhaps for many of us digital spam is less funny. For those of us increasingly reliant on e-mail networks in our everyday social interactions, spam can be a pain; it can annoy; it can deceive; it can overload. Yet spam can also entertain and perplex us. For example, how many of you have recently received an e-mail from “a Nigerian Frind” (sic) or a Russian lady looking for a relationship? Has your inbox overflowed with the daily announcements of lottery prizes and cut-price Viagra? Perhaps you have experienced this type of Dadaist message, which appears at the zero degree of language.

Dehasque Little Bergmann Dewald Murray Eriksson Tripathy Gloo Janusauskas Nikam Lozanogmjpkjjpjrfpklkijnjkjflpkqkrfijmjgkkj kgrkkksgpjmkqjmkggujfkrkpktkmmmjnjogjkhkhknjpgghlhnkofjgp gngfgrgpkpgufifggmgmgugkirfsftkgtotutmumpptituuppmqqpgpjpkq qquqkuqqiqtqhqnoppqpruiqmqgnkokrrnknslsifhtimiliumgghftfpfnfsf nfmftflfrfjhqgrgsjfflgtgjflksghgrrgornhnpofsjoknoofoioplrlnlrjim jmkhnltlllrmthklpljpuuhtruhupuhujqfuirorsrnrhrprtrotmsnsonjrh rhrnspngslsnknfkfofigogpkpgfgsgqfsgmgti qfrfskfgltttjulpsthtrmkhnilh rhjlnhsisiriohjhfhrftiuhfmuiqisighgmnigi gnjsorgstssslolsksiskrnrnsf­spptngqhqitpprpnphqrtmprph.3

Generally speaking, however, spam arouses a whole panorama of negative and bemused emotions, in much the same way as computer viruses, worms, and the uninvited excesses of Internet pornography often do. In fact, we might collectively term these examples digital pollution and identify them as a major downside (or setback) to a communication revolution that promised to be a noiseless and friction-free Road Ahead.4 In this context, and against the prescribed and often idealized goals of the visionaries of digital capitalism, they appear to us as anomalies. Nevertheless, despite the glut of security advice—a lot of which is spuriously delivered to our e-mail inboxes, simply adding to the spam—little attention has been paid to the cultural implications of these anomalous objects and processes by those of us engaged in media and communication studies, and particularly studies linked to digital network culture. Perhaps we have been too busy dealing with the practical problem and have failed to ask questions of anomalies in themselves.5 The innovation of this volume is to answer these questions by considering the role of the anomaly in a number of contexts related to digital communication and network culture. However intrusive and objectionable they may be, we argue that digital anomalies have become central to contemporary communication theory. Along these lines, we begin this book by asking: “In what sense are these objects anomalous?”

If we constrain ourselves to the dictionary definition of the anomalous, as the unequal, unconformable, dissimilar, and incongruous, in other words, something that deviates from the rule and demonstrates irregular and abnormal behaviour or patterns,6 then arguably our question becomes problematized by everyday experiences of network culture. To be sure, spam, viruses, worms, and Internet porn are not irregular or abnormal in this sense. This junk fills up the material channels of the Internet, transforming our communications experiences on a daily or even hourly basis. For example, according to recent moderate sources, 40% of e-mail traffic is spam, meaning some 12.4 billion spam mails are being sent daily.7 Similarly, in an experiment using a “honeypot” computer as a forensic tool for “tracking down high-technology crime,” a team from the BBC in the United Kingdom recently logged, on average, one attack per hour that could render an unprotected machine “unusable or turn it into a [zombie] platform for attacking other PCs.”8 It is therefore not surprising that many network users fear everyday malicious Internet crime more than they do burglary, muggings, or car theft.9 Indeed, within the composite mixture of the everyday and the anomalous event, the fixed notion that the normal is opposed to the abnormal is increasingly difficult to reconcile.

It is from this cultural perspective that we approach the network anomaly, arguing that the unwelcome volume of anomalous traffic informs multiple articulations concerning the definition of the Internet and how the network space is becoming transformed as a means of communication. For example, in the late 1990s, network culture was very much defined in terms of the economic potential of digital objects and tools, but recently the dominant discourse has tilted toward describing a space seemingly contaminated by digital waste products, dirt, unwanted, and illicit objects.10 There are, indeed, a number of ways in which anomalies feed back into the expressive and material components of the assemblages that constitute network culture. On one hand, network security businesses have established themselves in the very fabric of the digital economy (waste management is the future business model of late modernity). The discourses formed around this billion-dollar security industry, ever more dependent on anomalies for its economic sustenance, lay claim to the frontline defense of network culture against the hacker, the virus writer, and the spammer, but they also shape the experiences of the network user. On the other hand, analysis of the buildup of polluted traffic means that evaluations are made, data is translated into prediction models, and future projects, such as Internet 2.0 and other “spam and virus-free” networks, are proposed as probable solutions to the security problems facing online businesses and consumers. In other words, anomalies are continuously processed and rechanneled back into the everyday of network culture. Whether they are seen as novel business opportunities or playing the part of the unwanted in the emerging political scenarios of network futures, anomalous objects, far from being abnormal, are constantly made use of in a variety of contexts, across numerous scales. 
Therefore, our aim in this introduction is primarily to address the question concerning anomalies by seeking conceptual, analytic, and synthetic pathways out of the binary impasse between the normal and the abnormal.

In our opinion, what makes this collection stand out, however, is not only its radical rethinking of the role of the anomalous in digital culture, but that all of the contributions in the book in one way or another mark an important conceptual shift away from a solely representational analysis (the mainstay of the media and cultural studies approach to communication). Rather than present an account of the digital anomaly in terms of a representational space of objects, our aim as editors has been to steer clear of the linguistic categorizations founded on resemblances, identities, oppositions, and metaphorical analogies. In our opinion, the avoidance of such representational categorizations is equal to rejecting the implicit positioning of a prefabricated grid on which the categories identified constrain or shackle the object. For us, judging a computer virus as a metaphor of a biological virus all too easily reduces it to the same fixed terms conjured up by the metaphor itself and does not provide any novel information concerning the intensive capacities of, for example, a specific class of software program. Hence, our desire to avoid metaphorics as a basis of cultural analysis is connected to our wish to focus “less on a formation’s present state conceived as a synchronic structure than on the vectors of potential transformation it envelops,” to use Brian Massumi’s words.11 Furthermore, we do not wish to blindly regurgitate the rhetoric of a computer security industry that peddles metaphorical analogies between the spread of computer viruses and AIDS. 
For that reason, we have attempted to avoid the tendency to conjure up the essence of the digital anomaly from a space considered somehow outside—a space populated by digital Others or out-of-control Artificial Intelligence pathogens engaged in an evolutionary arms race with metaphorical immune systems.12 In this sense, the reference to the dark side of digital culture in the subtitle of this book is more closely allied to our understanding of the darkness surrounding this type of representational analysis than it is to the darkness of the object in itself. We intend to address this lack of light (or lack of analysis) by considering a conceptual approach that is more fluid, precise, and inventive in terms of a response to the question of the anomaly. It is designed to grasp the liminal categories and understand the materiality and paradoxical inherency of these weird “objects” and processes from theoretical and political points of view.

We do nevertheless recognize that on material and representational levels, spam and other anomalies do have effects. But in this volume, we acknowledge the problems inherent to the deployment of a media theory based exclusively on effect.13 To be sure, in the past, this is how media anomalies such as violence and pornography have been treated by certain field positions within media and communication studies—the effects of which were considered to cultivate an audience’s sense of reality.14 Perhaps our approach will be seen as more Monty Python than George Gerbner, inasmuch as we are less interested in the causality afforded to the impression of media meanings than we are in the process of communication in itself. Yet, this does not indicate a return to the transmission model so prevalent in early communication theory, wherein the establishment of communicative fidelity between Sender A and Recipient B, in the midst of signal noise, is the basic setting. On the contrary, instead of the linear channeling of messages and the analysis of effects, one might say that this book is concerned with affect and ethology: how various assemblages of bodies (whether technological, biological, political, or representational) are composed in interaction with each other and how they are defined, not by forms and functions, but by their capabilities or causal capacities. In other words, we are interested in how one assemblage, a heterogeneous composition of forces, may affect another.15 Later, we refer to this approach as topological, because we argue that it releases us from the analytical dichotomy between causal (fatal) effects and complete indeterminism, and allows us to instead consider a co-causal, intermediate set of determinisms and nonlinear bodies. 
Significantly then, we use the term topology to address the complex assemblages of network society, which are not restricted to technological determinism, or the effects technology has on society, but encompass the complex foldings of technological components with other aspects of social and cultural reality.16

Importantly, in this analytical mode, we are not seeking out the (predefined) essence of the anomaly (whether expressed in terms of a representational category or intrinsic technical mechanism), but instead a process in a larger web of connections, singularities, and transformations. Therefore, our question positions the anomaly in the topological fabric of an assemblage from where new questions emerge. For example, how do operating systems and software function in the production of anomalous objects? In what kind of material networks do such processes interact? How are certain software processes and objects translated into criminal acts, such as vandalism, infringement, and trespass?17 We now elaborate on this theoretical position from a historical perspective, before addressing the questions of affects, topology, and anomalous objects.


Analysis of media in terms of the anomaly is nothing new. There are, in fact, many approaches that implicitly or explicitly address anomalous media. A number of well-known approaches that should be familiar to the media and communication field, including the Frankfurt School and the media-ecological writings of the Toronto School (including Neil Postman), have regarded (mass) media in itself as an anomaly. Of course, these approaches do not concern a strict deviation from the norm or events outside of a series, as such. Instead, the dangerous anomaly has long been regarded as a function of the homogenizing powers of popular media. The repetitious standardization of media content is seen as a result of the ideological capitalist-state apparatus, which applies the logic of the factory assembly line to the production of cultural artefacts. For the Frankfurt School, particularly Adorno and Horkheimer, analysis of mass media revealed a system of consumer production in conflict with (but also as a paradoxical fulfillment of) the enlightenment project via mass ideological deception.18 Later, Postman continues along similar lines by conceptualizing the modern mass media, especially television, as a kind of filter that hinders public discourse by allowing only programs and other “objects” with entertainment value to pass through communication channels.19 As an index of this dystopic understanding of mass media, some years later the former Pink Floyd songwriter Roger Waters transposed these weapons of mass distraction and apocalyptic visions of Western media culture into his conceptual album Amused to Death (1992), where the TV sucks in all human emotion while the human species amuses itself to death watching Melrose Place, the Persian Gulf War, and copious amounts of porn. Indeed, in this way the media machine is treated as a monstrous anomaly, and significantly, a totality rather than a singularity.

In a historical context, the shock of the “new” media seems to have always occupied a similar polemical space as the one that obsessed the conservative approaches of media effects theorists, like Gerbner. The anomalies of the new media are most often surrounded by moral panics. Such panics, whether around cinema, television, video, computer games, or the Internet, with its malicious dark side, populated by perverts lurking around every virtual corner, can perhaps be seen as an attempt to contextualize new media in existing social conventions and habits of the everyday. The media panics surrounding the Internet, for example, have highlighted the contradiction between the ideals of a reinvigorated public sphere—an electronic agora for scientists, academics, politicians, and the rest of civil society—and the reality of a network overflowing with pornography, scams, political manipulation, piracy, chat room racists, bigots, and bullies. In recent years we have seen how the Internet has been transformed from a utopian object into a problematic modulator of behavior, including addiction, paedophilia, and illicit downloading. It has become an object for censorship—necessitating the weeding out of unpleasant and distasteful content, but also the filtering of politically sensitive and unwanted exchange.20 In fact, in the wake of the Jokela high school shootings in Finland in November 2007, there are those who claim, as they did after Columbine, that it is not the guns, but the Internet that is to blame. The uncontrollable and uncensorable flood of damaging information is still grasped as more dangerous than the impact of firearms.

The emergence of inconsistencies and deviations in media history has led Lisa Gitelman to argue that we should “turn to the anomaly” and concentrate on the patterns of dissonance that form when new media encounter old practices. For Gitelman, “transgressions and anomalies . . . always imply the norm and therefore urge us to take it into account as well.”21 Therefore, anomalies become a tool of the cultural analyst, enabling him or her to dig into the essential, so to speak. They can be imagined as vehicles taking us along the lines of a logic that delineates the boundaries between the normal and the abnormal. But in our view such approaches do not dig deeply enough into the logical mode of the anomaly since there is always a danger that such a representational analysis will continue to treat it as an excluded partner (Other) who haunts the normalized procedures of the Same.

Alternatively, we argue that network culture presents us with a new class of anomalous software object and process, which cannot be solely reduced to, for example, a human-determined representation of the capitalist mode of consumerism.22 The examples given in this collection—contagious software, bad objects, porn exchange, and modes of network censorship—may well derive some benefit from representational analysis (particularly in the context of porn and spam e-mail content),23 but our anomalies are not simply understood as irregular in the sense that their content is outside of a series. On the contrary, they are understood as expressing another kind of a topological structuring that is not necessarily derived from the success of friction-free ideals as a horizon of expectancy. The content of a porn site,24 a spam e-mail, or a computer virus, for instance, may represent aspects of the capitalist mode of production, but these programs also express a materiality, or a logic of action, which has been, in our opinion, much neglected in the media and communication field. This is a logical line in which automated excessive multiple posting, viral replication, and system hijacking are not necessarily indices of a dysfunctional relation with a normalized state of communication, but are rather capacities of the software code. Software is not here understood as a stable object or a set of mathematically determined, prescribed routines, but as the emergent field of critical software studies is proposing, it is a process that reaches outside the computer and folds as part of the digital architectures, networks, social, and political agendas. When we combine this capacity of software with our focus on the dynamics of the sociotechnical network assemblage, in its entire broadband spectrum, we experience systems that transfer massive amounts of porn, spam, and viral infection. 
Such capacity, which in our view exceeds the crude distinction between normal and abnormal, becomes a crucial part of the expressive and material distribution of network culture. Porn, spam, and viruses are not merely representational; they are also component parts of a sociotechnological praxis. For us, they are a way of tapping into and thinking through the advanced capitalist mode in the context of the network.

We therefore suggest that the capacity of the network topology intimately connects us to a post-Fordist mode of immaterial labour and knowledge production. We do not, however, subscribe to a strictly defined cybernetic or homeostatic model of capitalist control (a point explained in more detail later), which is designed to patch up the nonlinear flows deemed dangerous (like contagions) to the network. On the contrary, our conception of capitalism is a machine that taps into the creative modulations and variations of topological functioning.25 Networks and social processes are not reducible to a capitalist determination, but capitalism is more akin to a power that is able to follow changes and resistances in both the extensive and intensive redefining of its “nature.” It is easy at this point to see how our vision of the media machine no longer pertains to the anomalous totality described by the Frankfurt and Toronto Schools. Like Wendy Chun, we see this machine as an alternative to the poverty of an analysis of the contemporary media sphere as continuously articulated between the polarity of narratives of total paranoid surveillance and the total freedom of digitopia. Therefore, following Chun, in order to provide a more accurate account of the capacities of media technologies as cultural constellations, this book looks to address networked media on various, simultaneously overlapping scales or layers: hardware, software, interface, and extramedial representation (“the representation of networked media in other media and/or its functioning in larger economic and political systems”).26

Such an approach has led us and other contributors to draw on a Deleuze-Guattarian framework. We might also call this approach, which connects the various chapters of this book, an assemblage theory of media.

Yet, in order to fully grasp this significant aspect of our analysis, it is important to see how Deleuze and Guattari’s meticulous approaches to network society can be applied beyond the 1990s hype of “the rhizome” concept and what we see as its misappropriation as a metaphor for the complexities of networked digital media. In this sense, most contemporary writers using Deleuze and Guattari have been keen to distance themselves from a metaphorical reading of cultural processes in the sense that “metaphoricity” implies a dualistic ontology and positions language as the (sole) active force of culture (see the Contagions section introduction for more discussion). In this context, Deleuze and Guattari have proved useful in having reawakened an appreciation of the material forces of culture, which not only refer to economic relationships, but to assemblages, events, bodies, technologies, and also language expressing itself in modalities other than meaning. Not all of the chapters are in fact locked into this framework; yet, even if they do not follow the precise line, they do, in our opinion, attempt to share a certain post-representational take which is reluctant to merely reproduce the terms it criticizes and instead explores the various logics and modes of organization in network culture in which anomalies are expressed. Importantly, the chapters are not focused on the question of how discourses of anomalous objects reproduce or challenge the grids of meaning concerning ideology and identity (sex, class, race, etc.), but rather they attempt to explore new agendas arising beyond the “usual suspects” of ideology.

We now move on to explore the topological approach in more detail, proposing that it can do more than simply counter representational reductionism. First, we specify how it can respond to the fault lines of essentialism. Then we use it to readdress a mode of functionalism that has pervaded the treatment of the anomaly from Durkheim to cyberpunk.


In order to further illuminate our question concerning the anomalies of contemporary communication, let us return to the Monty Python sketch for further inspiration and a way in which we might clearly distinguish between a prevalent mode of essentialism and our topological approach. Following strictly essentialist terms we might define Python’s cafe by way of the location of the most important and familiar communication codes,27 looking for the effective functioning of communication norms. In this mode, we would then interpret the “spamming” of the cafe as an oppositional function, setting up certain disparate relations between, on the one hand, a series of perfected communication norms, and on the other hand, the imperfection of our anomaly. Yet, arguably, the Python sketch does more than establish dialectical relations between what is in and what is outside a series. Instead, Python’s comedy tactic introduces a wider network of reference, which unshackles the unessential, enabling the sketch to breach the codes of a closed communication channel, introducing fragments of an altogether different code. Thus, in the novel sense of topological thinking, the British cafe becomes exposed to the transformational force of spontaneous events rather than the static essences or signs of identity politics.

In a way, Monty Python suggests an anti-Aristotelian move, at least in the sense proposed by Paul Virilio: that is, a need to reverse the idea of accidents as contingent and substances as absolute and necessary. Virilio’s apocalyptic take on Western media culture argues for the inclusion of the potential (and gradual actualization) of the general accident that relates to a larger ontological shift undermining the spatio-temporal coordinates of culture. In a more narrow sense, Virilio has argued that accidents should be seen as incidental to technologies and modernity. This stance recapitulates the idea that modern accidents do not happen through the force of an external influence, like a storm, but are more accurately follow-ups of, or at least functionally connected with, the original design of that technology. In this way, Virilio claimed that Aristotelian substances do not come without their accidents, and breakdowns are not the absence of the presumed order, but are rational, real, and designed parts of a media cultural condition: the “normal” state of things operating smoothly.28 With Monty Python, as with Deleuze, the structures of anticipation and accidentality are not simply reversed, but the anomalous communication event itself emerges from within a largely accidental or inessential environment.29

To analyze the material reality of anomalous objects, we must therefore disengage from a perspective that sees the presumed friction-free state of networking, the ideal non-erring calculation machine, or a community of rational individuals using technologies primarily for enlightenment as more important than the anomaly (spam, viruses, and porn merely regarded as secondary deviations). Indeed, in our view, accidents are not simply sporadic breakdowns in social structure or cultural identity, but express the topological features of the social and cultural usage of media technologies. In this context, we concur with Tiziana Terranova,30 who discusses network dynamics as not simply a “space of passage for information,” but a milieu that exceeds the mechanism of established communication theory (senders, channels, and receivers). The surplus production of information comprises a turbulent mixture of mass distribution, contagion, scams, porn, piracy, and so on. The metastability of these multiple communication events is not merely a matter of occurrences hindering the essence of the sender–receiver relation, which generally aims to suppress, divide, or filter out disparities altogether; these are instead events of the network topology in itself. The challenge, then, is not to do away with such metastabilities, but to look at them in terms of an emergent series and experience them as the opening up of a closed communication system to environmental exteriority and the potentialities that arise from that condition. A condition we can refer to as the inessential of network culture.


If all things have followed from the necessity of the most perfect nature of God, how is it that so many imperfections have arisen in nature—corruption, for instance, of things till they stink; deformity, exciting disgust; confusion, evil, crime, etc.? But, as I have just observed, all this is easily answered. For the perfection of things is to be judged by their nature and power alone; nor are they more or less perfect because they delight or offend the human senses, or because they are beneficial or prejudicial to human nature.31

We have thus far argued that the anomaly is best understood in terms of its location in the topological dynamics of network culture. Significantly, in this new context then, we may also suggest that anomalies are not, as Spinoza realized, judged by the “presumed imperfections of nature” (nature representing a unity, as such), but instead they are judged by “their nature and power alone.” In other words, it matters not if objects “delight or offend the human senses.” Particular “things” and processes are not to be judged from an outside vantage point or exposed to “good” or “bad” valuations. Instead, the ethological turn proposes to look at the potentials of objects and ask how they are capable of expression and making connections.

In this way, the shift toward topological analysis becomes parallel to a perspective that claims to be “beyond good and evil” and instead focuses on the forces constituent of such moral judgments. This marks the approach out as very different from the historical tradition of social theory, particularly the early response of organic functionalists to the good and bad of social events. For example, Emile Durkheim was perhaps the first social scientist to show how anomie played an important part in social formations, but he negated the productive capacities we have pointed to in favor of describing the anomaly as a state of social breakdown. For Durkheim, the ultimate anomalous social act—suicide—stemmed from a sense of a lack of belonging and a feeling of remoteness from the norm. Anomaly as a social phenomenon therefore referred to a deprivation of norms and standards. Although suicide was positively disregarded as an act of evil, it did however signal a rupture in the organics of society, an abnormality, a falling out of series, as such.32 Indeed, his statistical container model of macro society—much appreciated by the society builders of 19th-century Europe—judged social phenomena against the average, the essential, and the organic unity of social functionalism. This of course ruled out seeing anomalies as social phenomena with their own modes of operation and co-causal capacity to affect.

Baudrillard’s notion of the perverse logic of the anomaly intervenes in the functionalist exorcism of the anomalous, as a thing that doesn’t fit in.33 Writing mainly about another bad object, drugs, Baudrillard argued that the anomaly becomes a component part of the logic of overorganization in modern societies. As he put it:

In such systems this is not the result of society’s inability to integrate its marginal phenomena; on the contrary, it stems from an overcapacity for integration and standardization. When this happens, societies which seem all-powerful are destabilized from within, with serious consequences, for the more efforts the system makes to organize itself in order to get rid of its anomalies, the further it will take its logic of over-organization, and the more it will nourish the outgrowth of those anomalies.34

Beyond the law-abiding notion of Durkheim’s anomie, Baudrillard therefore proposed to consider contemporary phenomena (the writing stems from 1987) as labeled by excess—a mode of hyperrational anomaly. He argued that the modern emphasis placed on control management has itself spurred on these excesses of standardization and rationality. The strange malfunctions become the norm, or more accurately, they overturn the logic of thinking in terms of self versus other. Moreover, in the perverse logic of Baudrillard’s anomalous, the object, as an extensive target of social control, is preceded by an intensive logic that exceeds the grid of explanation imposed by social scientists, educationalists, and therapeutic practitioners. Instead of external deviations contrary to the internal functioning of the social, anomalies start to exhibit an intensive and integral social productivity.

The distinction made here between the intensive productivity of the anomaly and a social model developed around organic unity and functionalism is perhaps better grasped in DeLanda’s similar distinction between relations of interiority, and relations of exteriority.35 In the former, societies are regarded as solely dependent on reciprocal internal relations in order that they may exhibit emergent properties. In the latter, DeLanda seemingly turns the generalized social organism inside out, opening up its component parts to the possibilities and capacities of complex interactions with auxiliary assemblages. In fact, what he does is reconceive the social organism as an assemblage.

So as to further explore this notion of the social assemblage, let’s return to the example of the forensic honeypot computer introduced in the first section of this introduction. Previously understood as a closed system, the rationalized logic machine soon becomes exposed to the disparities of the network. Emergent relations hijack the honeypot’s functionality. Its relation to an exteriority links up these disparities and in turn connects it to other assemblages. It is at this juncture that we locate the transformational differentiation and alterity of the honeypot as it becomes inseparable from the relations it establishes with a multiplicity of other assemblages, populated by technosocial actors, including netbots, virus writers, cookies, and hacker groups and their software. Nevertheless, the anomalies that traverse the assemblage are not simply disparities that suppress or divide; instead, the anomaly intervenes in the process of the becoming of the honeypot. It establishes communication with other objects related to the assemblage, potentializing new territories or deterritorializing other assemblages.

Anomalies transform our experiences of contemporary network culture by intervening in relational paths and connecting the individual to new assemblages. In fact, the anomaly introduces a considerable amount of instability to what has been described in the past as a cybernetic system of social control.36 In practice, the programs written by hackers, spammers, virus writers, and those pornographers intent on redirecting our browsers to their content have problematized the intended functionality and deployment of cybernetic systems. This has required cyberneticians to delve deeply into the tool bag of cybernetics in an effort to respond to the problem engendered: How to keep the system under control? For experts in the computing field, defensive software, such as antivirus technology, represents a new mobilization of security interests across the entire networked computing environment instead of being exclusively aimed at single computers,37 and it is interesting to see how many of these defences appear to play to the notion of organic unity as described earlier. For example, computer scientists based at IBM’s T. J. Watson Research Center during the early 1990s attempted to tackle the problem of computer viruses by developing a cybernetic immune system.38 Using mathematical models borrowed from epidemiology, these researchers began to trace the diffusion patterns of computer viruses as analogous to the spread of biological viruses. Along with other commercial vendors, they sought out methods that would distinguish between so-called legitimate and viral programs. In other words, their cybernetic immune system was designed to automate the process of differentiating self from non-self and ultimately suppress the threshold point of a viral epidemic (the point at which a disease tips over into a full-blown epidemic).
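The threshold logic borrowed from epidemiology can be sketched in a standard SIR formulation (a textbook model, given here for orientation rather than drawn from the cited IBM research):

```latex
% Standard SIR epidemic model under homogeneous mixing:
% S = susceptible, I = infected, R = removed;
% beta = infection rate, gamma = removal (cure/quarantine) rate.
\begin{aligned}
\frac{dS}{dt} &= -\beta S I, \qquad
\frac{dI}{dt} = \beta S I - \gamma I, \qquad
\frac{dR}{dt} = \gamma I, \\[4pt]
R_0 &= \frac{\beta S_0}{\gamma}, \qquad
\text{an epidemic takes off only if } R_0 > 1 .
\end{aligned}
```

An automated immune system of the kind described aims, in effect, to push the effective value of $R_0$ below 1 before the tipping point is reached.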

However, the increasing frequency of digital anomalies has so far confounded the application of the immunological analogy. In fact, research in this area has recently shifted to a focus on topological vulnerabilities in the network itself, including a tendency for computer viruses to eschew epidemiological threshold points altogether.39 Maps of the Internet and the World Wide Web (www), produced by complex network theorists in the late 1990s,40 demonstrate how networks become prone to viral propagation, passing on viruses as they would any other program. There is, as such, a somewhat fuzzy distinction between what can be determined as self and non-self. As we have already pointed out, the anomaly is not, in this sense, outside the norm.
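The vanishing-threshold result alluded to here is widely associated with the complex network literature of this period; we sketch it from standard accounts rather than the specific studies cited:

```latex
% Degree-based mean-field threshold for an SIS epidemic on a network:
% lambda = effective spreading rate; <k>, <k^2> = first and second
% moments of the degree distribution P(k).
\lambda_c = \frac{\langle k \rangle}{\langle k^2 \rangle},
\qquad
P(k) \sim k^{-\gamma},\; 2 < \gamma \leq 3
\;\Longrightarrow\;
\langle k^2 \rangle \to \infty
\;\Longrightarrow\;
\lambda_c \to 0 .
```

On such heavy-tailed topologies even an arbitrarily small spreading rate can sustain an epidemic, which is one way of making precise the claim that digital viruses eschew epidemiological threshold points altogether.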

The history of cybernetics provides many more examples of this problem where logic encounters network politics. The origins of Turing’s theory of computable numbers were arguably realized in a paradoxical and largely unessential composition of symbolic logic, inasmuch as he set out to prove that anomalies coexisted alongside the axioms of formal logic.41 Not surprisingly, then, Turing’s halting problem, or the undecidability problem, eventually resurfaced in Cohen’s formal study of computer viruses, a doom-laden forecast in which there is no algorithmic solution to the detection of all computer viruses.42 Indeed, logic systems have long been troubled by their inability to cope with virals. The problem of the self-referencing liar bugged the ancient Greek syllogistic system as much as it has bugged the contemporary cybernetics of network culture.
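Cohen’s undecidability result follows the same diagonal pattern as the halting problem and the liar paradox. The following is an illustrative sketch only (the names are ours, not Cohen’s notation): assume a perfect detector exists, then build a program that does the opposite of whatever the detector predicts about it.

```python
# Diagonal argument behind the undecidability of virus detection:
# a program that consults the (hypothetical) detector about itself
# and then behaves in exactly the contrary way.

def make_contrary_program(claims_viral):
    """Return a program that contradicts the detector `claims_viral`."""
    def contrary():
        if claims_viral(contrary):
            return "benign"   # flagged as viral -> refuses to spread
        return "spreads"      # passed as clean -> spreads
    return contrary

# Whatever the detector answers, it is wrong about the contrary program:
flagged = make_contrary_program(lambda p: True)()
passed = make_contrary_program(lambda p: False)()
assert flagged == "benign"   # detector said "virus"; behavior is benign
assert passed == "spreads"   # detector said "clean"; behavior is viral
```

Since every candidate detector is refuted by its own contrary program, no algorithm can correctly classify all programs, which is the substance of Cohen’s doom-laden forecast.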

In this light, it is interesting to draw attention to the way in which these fault lines in cybernetics and Durkheim’s anomie have converged in cyberculture literature. With its many references to Gaia43 (a theory of natural balance and equilibrium akin to immunology), cyberculture has co-opted the principle of the self-referencing maintenance of organic unity into the fabric of the collectivities of cyberspace. For example, John Perry Barlow argued that the immune system response of the network is “continuously” defining “the self versus the other.”44 In this way, he typified the tendency of cyberpunk’s frontier mentality to discursively situate the digital anomaly firmly outside of the homeostatic system of network survivability. In fact, as Bruce Sterling revealed, cyberpunks and the cyberneticists of the antivirus industry have become strange bedfellows:

They [virus writers] poison the digital wells and the flowing rivers. They believe that information ought to be poisonous and should hurt other people. Internet people build the networks for the sake of the net, and that’s a fine and noble thing. But virus people vandalize computers and nets for the pure nasty love of the wreckage.45

It seems that the much wished-for stability of the cyberpunk’s Daisyworld is increasingly traversed by the instabilities produced by the anomaly. As Sterling noted in another context, “the Internet is a dirty mess”46 that has lost its balance, mainly because of the increasing outbreaks of cyberterrorism and cybercrime, but also because of the failure of the authorities to adequately address the problems facing network culture. In Sterling’s vision, which increasingly echoes those of the capitalist digerati, there is a horizon on which the network eventually becomes a clean and frictionless milieu. Yet such a sphere of possibility rests conceptually on the notion of homeostasis and stability, which in turn implies a conservative (political) stance. In our view, it is more insightful to follow Geert Lovink’s position that networking is more akin to notworking:

What makes out today’s networking is the notworking. There would be no routing if there were no problems on the line. Spam, viruses and identity theft are not accidental mistakes, mishaps on the road to techno perfection. They are constitutional elements of yesterday’s network architectures. Networks increase levels of informality and also pump up noise levels, caused by chit-chat, misunderstandings and other all too human mistakes.47

We argue that the noise of Lovink’s notworking not only throws a spanner in the works of the cybernetic system, but also more intimately connects us to the capacity of the network to affect and thus produce anomalies. Instead of seeing the network as a self-referential homeostatic system, we therefore want to propose an autopoietic view of networks wherein alterity becomes the mode of operation of this sociotechnical machine (even though, e.g., Lovink might be reluctant to use these concepts). If we want to approach network systems, broadly framed, as autopoietic systems, we need to emphasize their difference from an old ideal of harmonious, determined Nature. Following Guattari,48 we argue that systems are not structures that merely stabilize according to a predetermined task, but are instead machines composed in disequilibrium and a principle of abolition. Here, re-creation works only through differentiation and change, which are ontological characteristics of a system that relies continuously on its exterior (a network). The digital network is consequently composed in terms of a phylogenetic evolution (change) of machines, and importantly understood as part of a collective ecological environment. In this context, the maintenance project of any machine (social, technical, or biological system) cannot be simply confined to the internal (closed-in) production of self, or for that matter the detection of non-self, but instead returns us to the individuation process (discussed earlier) and the continuance of what Guattari called the “diverse types of relations of alterity.”49 We argue that a condition akin to a horror autotoxicus of the digital network, the capacity of the network to propagate its own imperfections, exceeds the metaphor of natural unity. Indeed, despite a rather vague notion about the purposeful essence of network production as described by individuals like Bill Gates (something perhaps akin to Spinoza’s “perfect nature of God”), the network itself is without a doubt the perfect medium for both perfection and imperfection.


We do not doubt that what we are dealing with here are very curious objects indeed. They present mind-boggling problems to system managers and network controllers. Yet the failure to adequately overcome the computer virus problem perhaps pales in comparison to what Wired Magazine described as the next big issue for network security: the autonomous software netbots (or spambots) that are more flexible and responsive to system defences than the familiar model of pre-programmed computer viruses and worms. As Wired described the latest threat:

The operational software, known as command and control, or C&C, resides on a remote server. Think of a botnet as a terrorist sleeper cell: Its members lurk silently within ordinary desktop computers, inert and undetected, until C&C issues orders to strike.50

Here we see that the netbot becomes discursively contemporised in terms of a latent terrorist cell that evades the identification grid of an immune system. Possibly this marks a discursive shift away from the biological analogy with viruses and worms toward the new anxieties of the war on terror. Whatever the rhetoric, identification is perhaps the key contemporary (and future) problem facing not just computer networks, but networks of political power, wherein nonexistence (becoming invisible) can become a crucial tactical gesture, as Galloway and Thacker suggest in Chapter 13.

The invisibility of software objects has in practice confounded a media studies approach orientated toward a representational analysis of phenomenological “content.” Software, considered as a specific set of instructions running inside a computer, is obviously something more akin to a performance than a product of visual culture. To combat the often-simplistic analysis of software, Lev Manovich proposed, back in 2001, that media studies should move toward “software studies,” and in doing so he provided an early set of principles for an analysis of new media objects. Manovich’s principles of new media include numerical representation, modularity, automation, variability, and transcoding. New media in this way is based on the primary layer of computer data—code—that in its programmability separates “new” from “old” media, such as print, photography, or television.51 However, since then, Chun has noted how Manovich’s notion of transcoding—that software culture and computation is about translating texts, sounds, and images into code—is not a sufficiently rich notion.52 Instead of registering (repeating) differences that pre-exist, Chun argued that computation makes differences and actively processes code in and out of various phenomenological contexts, such as text or sound. Her argument is supported by virus researchers, who note that even a simple opening and closing of an application, or rebooting of a system, can make changes to boot sector files, log files, system files, and the Windows registry.
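Manovich’s first principle, numerical representation, can be illustrated with a trivial sketch (ours, not Manovich’s): one and the same byte sequence is available to several phenomenological renderings.

```python
# The same four bytes read out in different registers:
# a word at the cultural layer, integers and bit patterns at the
# computer layer. Illustrative only.
data = b"SPAM"

as_text = data.decode("ascii")              # cultural layer: a word
as_numbers = list(data)                     # computer layer: integers
as_bits = [format(b, "08b") for b in data]  # machine layer: bit patterns

assert as_text == "SPAM"
assert as_numbers == [83, 80, 65, 77]
assert as_bits[0] == "01010011"             # 'S' as eight bits
```

Chun’s objection is that computation does not stop at re-encoding such pre-existing differences; it actively produces new ones as code moves in and out of these contexts.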

For example, opening and closing a Word document is a computational process that may result in the creation of temporary files, changes to macros, and so forth.53 However, these processes do not directly come into contact with the human senses (we cannot always see, hear, touch, taste, or indeed smell an algorithmic procedure), and there is consequently a deficit in our cognitive and conceptual grasping of software objects and processes as such. Yet, despite the abstract nature of mathematical media, these processes are completely real and demand attention from cultural theory, not least because the contemporary biopower of digital life functions very much on the level of the nonvisual temporality of computer networks. This is why cultural theory needs to stretch its conceptual capacities beyond representational analysis and come up with new notions and ideas in order to better grasp the technological constellations and networked assemblages of “anomalous media culture.” By proposing novel concepts, like those suggested by Deleuze for example, we do not aim to prescribe a trendy cultural theory, but rather to enable a rethink of the processes and emerging agendas of a networked future.

The anomalous objects discussed in this volume can therefore be taken as indices of this novel media condition in which complex transformations occur. Yet, while on an algorithmic and compositional level the objects and processes highlighted in spam e-mails, computer viruses, and porn communities are not in any way different from other objects and processes of digital culture, there is clearly a repetitious and discursive filtering process going on: If software is computation that makes a difference (not just a coding of differences), then there is also a continuous marking out of what kind of processes are deemed as normal, abnormal, and/or anomalous. In other words, there is an incessant definition and redefinition of what, on the one hand, makes a good computation, a good object, and a good process, and, on the other hand, what is defined as irresponsible and potentially a bad object or process. However, as noted earlier, the material and expressive boundaries of these definitions are not at all clear. We may, in this light, therefore suggest that such turbulent objects be considered standard objects of network culture.54 Instead of merely being grasped as elements that should be totally excluded from the economic, productive, and discursive spheres of the knowledge society, they are equally understood as captured and used inclusively within the fabrication of digital assemblages. For example, the anomaly takes on new functions as an innovative piece of evolutionary “viral” or “spam” software (in digital architecture or sound production, for instance), or is translated into new modes of consumer organization and activation (viral marketing), or becomes adapted to serve digital sociality in practices and communities (pornographic exchange). Ultimately, if capitalism is able to make novel use of these critical practices of resistance, then cultural and media theorists should do likewise. Otherwise, they will remain anomalies for a theory unable to perceive new modulations of power and politics functioning on the level of software.

From the varied perspectives offered in this volume, the reader will notice that our take on the anomaly is not considered sacrosanct—anomalous digital objects are distributed across many scales and platforms. However, we do feel that all of the following chapters intersect with our notion of the anomalous object, albeit provoking a controversy around its compositional theme. Therefore, in order to introduce a sense of organization to the mixture of viewpoints put forward in The Spam Book, we have divided the chapters into subsections: Contagions, Bad Objects, Porn, and Censored. Each subsection has an introduction setting out how we, the editors, grasp the position and the value of each chapter. As we have already suggested, there are of course many takes on the digital anomaly, but what The Spam Book proposes to do is shed some light on what has, until now, remained on the dark side of media, communication, and cultural analysis.



Digital contagions are often couched in analogical metaphors concerning biological disease. When framed in the linguistic structures of representational space, the biological virus becomes a master referent, widely dispersed in the fields of cultural studies, computer science, and the rhetoric of the antivirus and network security industries.55 The figurative viral object becomes part of a semiotic regime of intrusive power, bodily invasion, uncontrollable contamination, and even new modes of auto-consumerism. Nevertheless, although representational analysis may have an application in a media age dominated by the visual image, the approach does not, in our opinion, fully capture the imperceptible constitutive role of contagion in an age of digital networks.

Indeed, when contemplating the metaphor of contagion, it is important to acknowledge two constraining factors at work. First, the analytical focus of metaphorical reasoning may well establish equivalences, but these resemblances only really scratch the surface of an intensive relation established between a viral abstraction and concrete contagious events. Second, it is important to recognize the political import of the analogical metaphor in itself. It has an affective charge and organizational role in the spaces, practices, and productions of digital network culture. For example, as seen in this section, the resemblances established between neo-Darwinian genes and computer viruses have imposed the logic of the arms race on evolutionary computing.56 In conjunction with epidemiological and immunological analogies, the digital gene delimits the patterning of software practices, excluding the contagious anomaly from the norms of code reproduction.

In light of the often-divisive imposition of the metaphor in the materiality of digital network culture, it is important that this volume provides a potential escape route out of the analytical constraints of representation. Gilles Deleuze could, in our opinion, function as one alternative thinker who provides a set of tools for a post-representational cultural analysis. Via Deleuze we can substitute the metaphoric burden of the biological referent on its digital “viral cousins” with an exposition of the constitutive role contagion plays in material spaces, time-based practices, and productions. In place of the negatives of the metaphor we find an abstract diagram with an affirmative relation to the concrete contagious assemblages of digitality. To be more concise, the diagrammatic refrain is what holds these assemblages together, or even more succinctly, attempts to delimit and control the identities of these larger unities. Think of the abstract diagrams used in this section as descriptions of the intensity of relations, repetitiously “installed” in the concreteness of the digital assemblages addressed in each chapter.

We begin the section with John Johnston’s chapter on the computer viruses’ relation to artificial life (ALife) research. Johnston drops the familiar metaphorical references to the spread of biological disease and instead explores the complex relationality between illicit virus production and the futures of ALife research. The chapter is stripped bare of the verbosity of Deleuzian ontology, yet arguably, the abstract diagram is ever present. It is apparent in the chapter’s endeavor to dig beneath the surface of analogical reasoning and instead explore the limitations and mysteries of “imitating biology.” In this way, Johnston refocuses our attention on the problematics of establishing a link between organic life and nonorganic digitality. In fact, Johnston’s diagram presents a somewhat challenging distinction between the two, and as a result he questions the viability of virally coded anomalies, which are both outside of the natural order of things and at risk of exceeding the services of human interest.

Tony Sampson’s chapter (chap. 2) uses three questions to intervene in a conception of universal contagion founded on the premise of “too much connectivity.” Beginning with a brief account of the universality of the contagious event, he locates the prominence of a distributed network hypothesis applied to the digital epidemic. However, Sampson points to an alternative viewpoint in which evolving viral vulnerabilities emerge from a composition of stability and instability seemingly arising from the connectivity and interaction of users. Chapter 2 then moves on to argue that efforts made by antivirus researchers to impose epidemiological and immunological analogies on these emerging susceptibilities in digital architecture are very much flawed. For example, the crude binary distinction between self and non-self is regarded here as ill-equipped to manage the fuzzy logics of an accidental topology. Indeed, Sampson sees the problems encountered in antivirus research extending beyond the defense of the body of a network to the control of a wider network of interconnecting social bodies and events. He concludes by speculating that the problematics of anomaly detection become part of a broader discursive and nondiscursive future of network conflict and security.

In Chapter 3, Luciana Parisi pushes forward the debate on the universality of viral ecologies by seeking to avoid distinctions made between digital and analogue, technical and natural, and mathematical and biological architectures. The focus of her analysis moves instead to the material capacity of infectious processes, which exceed the organizational tendencies of algorithmic logic. For example, Parisi investigates how the neo-Darwinian genetic code imposes an evolutionary schema on the diagram of digital contagion. However, unlike Johnston, Parisi argues that the organic and nonorganic of digitality are assembled together and that assumptions made in the practice of writing genetic algorithms fail to grapple with the symbiotic nature of what she terms the abstract extensiveness of digital architecture. Parisi employs a complexity of reasoning to explain her alternative blob architectures. If the reader is unfamiliar with the influential events theory of Alfred N. Whitehead or Lynn Margulis’ notion of endosymbiosis, then the ideas expressed can be difficult to tap into. Nevertheless, a concentrated deep read will offer great rewards to those wanting to discover digital contagion in a novel and profound light.

Finishing this section, Roberta Buiani (chap. 4) proposes that virality is not merely an inherent “natural” part of the software code, but is continuously distributed as figures of contagion, virulence, and intensivities across popular cultural platforms. Making a useful distinction here between being viral and becoming viral, Buiani returns us to the limits imposed by the metaphoric regime of disease and the nonlimitative distribution of a flexible “single expression.” In its becoming, virality has the potential to produce creative outcomes, rather than just new threats—new diagrams perhaps?

Remodelling Communication by Gary Genosko

As a prelude to Virality, there are some very interesting references to an early take on the scrambling of the Shannon and Weaver communication model, which I wrote about back in 2006. Genosko’s book is a much better read though, finding a way to slot my rhizome model into a fascinating account of the development of communication models. I have just received a hardback copy from Toronto Press. It must be out soon. A really good read!

Here’s the blurb.

Remodelling Communication

Covering major developments from post-war cybernetics and telegraphy to the Internet and our networked society, Remodelling Communication explores the critical literature from across disciplines and eras on the models used for studying communications and culture. Proceeding model-by-model, Genosko provides detailed explanations of mathematical, semiotic, and reception theory’s encoding/decoding models, as well as Baudrillard’s critique of models and general models that bring together a variety of disciplinary perspectives. Providing a dynamic, forward-looking reorientation towards a new universe of reference, Remodelling Communication makes a significant, productive contribution to communication theory.

Gary Genosko is Canada Research Chair in Technoculture and a professor in the Department of Sociology at Lakehead University.

Here’s the model, which, as Genosko points out, was inspired by Sylvano Bussotti’s rhizomatic score featured in Deleuze and Guattari’s A Thousand Plateaus.

See “Senders, Receivers and Deceivers: How Liar Codes Put Noise Back on the Diagram of Transmission” (some problems with this link. Also published here, but again some problems here too! – need to explore what is happening at VX Heaven!)