
For those visitors to Virality who haven’t yet read The Spam Book (Hampton Press, 2009), here is the introductory chapter (co-written with Jussi Parikka), which includes the introduction to the first section of the collection: Contagions.
ON ANOMALOUS OBJECTS OF DIGITAL CULTURE
An Introduction
Jussi Parikka and Tony D. Sampson
In the classic 1970s Monty Python sketch, a couple enters, or rather, in typical Pythonesque mode, descends upon a British cafe and is informed by the waitress that Spam is on the menu.
There’s egg and bacon; egg sausage and bacon; egg and spam; egg bacon and spam; egg bacon sausage and spam; spam bacon sausage and spam; spam egg spam spam bacon and spam; spam sausage spam spam bacon spam tomato and spam.
The joke, of course, refers to Spam, the canned food substance that originated in the 1930s in the United States, but was famously imported into Britain during World War II.1 Spam (spiced ham) became a cheap supplement for pure meat products, which were in severe shortage during the conflict. Perhaps the cheapness and mass consumption of Spam during the period are among the reasons why it became the butt of many music hall jokes. Indeed, following the music hall tradition, Spam becomes central to the Pythons’ often nonsensical sketch as it quickly deterritorializes from the more obvious context of the waitress–customer discussion to a full Viking chorus of spam, spam, spam . . .
Spam, spam, spam, spam. Lovely spam! Wonderful spaaam! Lovely spam! Wonderful spam. Spa-a-a-a-a-a-a-am! Spa-a-a-a-a-a-a-am! Spa-aa-a-a-a-a-am! Spa-a-a-a-a-a-a-am! Lovely spam! (Lovely spam!) Lovely spam! (Lovely spam!) Lovely spaaam! Spam, spam, spam, spaaaaam!
The joke’s intention, as with Monty Python jokes in general, is to get us to laugh at a major concern of contemporary communications: communication breakdown.2 The habitual repetition of everyday events quickly turns into a chaotic mess and a turbulent example of noncommunication. The familiar communication channels of this archetypal British working-class cafe are suddenly flooded with intruding thirds, a noise that fills the acoustic space with a typically meaningless Python refrain: spam, spam, spam. In this sense (or nonsense), the sketch manages to parody the meaninglessness intrinsic to any meaningful act of communication by increasing the level of environmental noise that accompanies the process of sending messages. In fact, the invading Viking horde (perhaps a veiled reference to the U.S. troops stationed in Britain during World War II) eventually drowns out, or “spams,” the ongoing conversation between the waitress and the customers, transforming the chaotic scene into a closing title sequence filled with more references to spam, spam, spam . . .
More than 30 years later, the analogy made between Python’s sketch and the unsolicited sending of bulk e-mail has given the word spam new impetus. Perhaps for many of us digital spam is less funny. For those of us increasingly reliant on e-mail networks in our everyday social interactions, spam can be a pain; it can annoy; it can deceive; it can overload. Yet spam can also entertain and perplex us. For example, how many of you have recently received an e-mail from “a Nigerian Frind” (sic) or a Russian lady looking for a relationship? Has your inbox overflowed with the daily announcements of lottery prizes and cut-price Viagra? Perhaps you have experienced this type of Dadaist message, which appears at the zero degree of language.
Dehasque Little Bergmann Dewald Murray Eriksson Tripathy Gloo Janusauskas Nikam Lozanogmjpkjjpjrfpklkijnjkjflpkqkrfijmjgkkj kgrkkksgpjmkqjmkggujfkrkpktkmmmjnjogjkhkhknjpgghlhnkofjgp gngfgrgpkpgufifggmgmgugkirfsftkgtotutmumpptituuppmqqpgpjpkq qquqkuqqiqtqhqnoppqpruiqmqgnkokrrnknslsifhtimiliumgghftfpfnfsf nfmftflfrfjhqgrgsjfflgtgjflksghgrrgornhnpofsjoknoofoioplrlnlrjim jmkhnltlllrmthklpljpuuhtruhupuhujqfuirorsrnrhrprtrotmsnsonjrh rhrnspngslsnknfkfofigogpkpgfgsgqfsgmgti qfrfskfgltttjulpsthtrmkhnilh rhjlnhsisiriohjhfhrftiuhfmuiqisighgmnigi gnjsorgstssslolsksiskrnrnsfspptngqhqitpprpnphqrtmprph.3
Generally speaking, however, spam arouses a whole panorama of negative and bemused emotions, in much the same way as computer viruses, worms, and the uninvited excesses of Internet pornography often do. In fact, we might collectively term these examples digital pollution and identify them as a major downside (or setback) to a communication revolution that promised to be a noiseless and friction-free Road Ahead.4 In this context, and against the prescribed and often idealized goals of the visionaries of digital capitalism, they appear to us as anomalies. Nevertheless, despite the glut of security advice—a lot of which is spuriously delivered to our e-mail inboxes, simply adding to the spam—little attention has been paid to the cultural implications of these anomalous objects and processes by those of us engaged in media and communication studies, and particularly studies linked to digital network culture. Perhaps we have been too busy dealing with the practical problem and have failed to ask questions of anomalies in themselves.5 The innovation of this volume is to answer these questions by considering the role of the anomaly in a number of contexts related to digital communication and network culture. However intrusive and objectionable they may be, we argue that the digital anomaly has become central to contemporary communication theory. Along these lines, we begin this book by asking: “In what sense are these objects anomalous?”
If we constrain ourselves to the dictionary definition of the anomalous, as the unequal, unconformable, dissimilar, and incongruous, in other words, something that deviates from the rule and demonstrates irregular and abnormal behaviour or patterns,6 then arguably our question becomes problematized by everyday experiences of network culture. To be sure, spam, viruses, worms, and Internet porn are not irregular or abnormal in this sense. This junk fills up the material channels of the Internet, transforming our communications experiences on a daily or even hourly basis. For example, according to recent moderate sources, 40% of e-mail traffic is spam, meaning some 12.4 billion spam mails are being sent daily.7 Similarly, in an experiment using a “honeypot” computer as a forensic tool for “tracking down high-technology crime,” a team from the BBC in the United Kingdom recently logged, on average, one attack per hour that could render an unprotected machine “unusable or turn it into a [zombie] platform for attacking other PCs.”8 It is therefore not surprising that many network users fear everyday malicious Internet crime more than they do burglary, mugging, or car theft.9 Indeed, within the composite mixture of the everyday and the anomalous event, the fixed notion that the normal is opposed to the abnormal is increasingly difficult to reconcile.
It is from this cultural perspective that we approach the network anomaly, arguing that the unwelcome volume of anomalous traffic informs multiple articulations concerning the definition of the Internet and how the network space is becoming transformed as a means of communication. For example, in the late 1990s, network culture was very much defined in terms of the economic potential of digital objects and tools, but recently the dominant discourse has tilted toward describing a space seemingly contaminated by digital waste products, dirt, unwanted, and illicit objects.10 There are, indeed, a number of ways in which anomalies feed back into the expressive and material components of the assemblages that constitute network culture. On one hand, network security businesses have established themselves in the very fabric of the digital economy (waste management is the future business model of late modernity). The discourses formed around this billion-dollar security industry, ever more dependent on anomalies for its economic sustenance, lay claim to the frontline defense of network culture against the hacker, the virus writer, and the spammer, but they also shape the experiences of the network user. On the other hand, analysis of the build-up of polluted traffic means that evaluations are made, data is translated into prediction models, and future projects, such as Internet 2.0 and other “spam and virus-free” networks, are proposed as probable solutions to the security problems facing online businesses and consumers. In other words, anomalies are continuously processed and rechanneled back into the everyday of network culture. Whether they are seen as novel business opportunities or playing the part of the unwanted in the emerging political scenarios of network futures, anomalous objects, far from being abnormal, are constantly made use of in a variety of contexts, across numerous scales. Therefore, our aim in this introduction is primarily to address the question concerning anomalies by seeking conceptual, analytic, and synthetic pathways out of the binary impasse of the normal versus the abnormal.
In our opinion, what makes this collection stand out, however, is not only its radical rethinking of the role of the anomalous in digital culture, but that all of the contributions in the book in one way or another mark an important conceptual shift away from a solely representational analysis (the mainstay of media and cultural studies approaches to communication). Rather than present an account of the digital anomaly in terms of a representational space of objects, our aim as editors has been to steer clear of the linguistic categorizations founded on resemblances, identities, oppositions, and metaphorical analogies. In our opinion, the avoidance of such representational categorizations is equal to rejecting the implicit positioning of a prefabricated grid on which the categories identified constrain or shackle the object. For us, judging a computer virus as a metaphor of a biological virus all too easily reduces it to the same fixed terms conjured up by the metaphor itself and does not provide any novel information concerning the intensive capacities of, for example, a specific class of software program. Hence, our desire to avoid metaphorics as a basis of cultural analysis is connected to our wish to focus “less on a formation’s present state conceived as a synchronic structure than on the vectors of potential transformation it envelops,” to use Brian Massumi’s words.11 Furthermore, we do not wish to blindly regurgitate the rhetoric of a computer security industry that peddles metaphorical analogies between the spread of computer viruses and AIDS. For that reason, we have attempted to avoid the tendency to conjure up the essence of the digital anomaly from a space considered somehow outside—a space populated by digital Others or out-of-control Artificial Intelligence pathogens engaged in an evolutionary arms race with metaphorical immune systems.12 In this sense, the reference to the dark side of digital culture in the subtitle of this book is more closely allied to our understanding of the darkness surrounding this type of representational analysis than to the darkness of the object in itself. We intend to address this lack of light (or lack of analysis) by considering a conceptual approach that is more fluid, precise, and inventive in terms of a response to the question of the anomaly. It is designed to grasp the liminal categories and understand the materiality and paradoxical inherency of these weird “objects” and processes from theoretical and political points of view.
We do nevertheless recognize that on material and representational levels, spam and other anomalies do have effects. But in this volume, we acknowledge the problems inherent to the deployment of a media theory based exclusively on effect.13 To be sure, in the past, this is how media anomalies such as violence and pornography have been treated by certain field positions within media and communication studies—the effects of which were considered to cultivate an audience’s sense of reality.14 Perhaps our approach will be seen as more Monty Python than George Gerbner, inasmuch as we are less interested in the causality afforded to the impression of media meanings than we are in the process of communication in itself. Yet, this does not indicate a return to the transmission model so prevalent in early communication theory, wherein the establishment of communicative fidelity between Sender A and Recipient B, in the midst of signal noise, is the basic setting. On the contrary, instead of the linear channeling of messages and the analysis of effects, one might say that this book is concerned with affect and ethology: how various assemblages of bodies (whether technological, biological, political, or representational) are composed in interaction with each other and how they are defined, not by forms and functions, but by their capabilities or causal capacities. In other words, we are interested in how one assemblage, a heterogeneous composition of forces, may affect another.15 Later, we refer to this approach as topological, because we argue that it releases us from the analytical dichotomy between causal (fatal) effects and complete indeterminism, and allows us to instead consider a cocausal, intermediate set of determinisms and nonlinear bodies. Significantly then, we use the term topology to address the complex assemblages of network society, which are not restricted to technological determinism, or the effects technology has on society, but encompass the complex foldings of technological components with other aspects of social and cultural reality.16
Importantly, in this analytical mode, we are not seeking out the (predefined) essence of the anomaly (whether expressed in terms of a representational category or intrinsic technical mechanism), but instead a process in a larger web of connections, singularities, and transformations. Therefore, our question positions the anomaly in the topological fabric of an assemblage from where new questions emerge. For example, how do operating systems and software function in the production of anomalous objects? In what kind of material networks do such processes interact? How are certain software processes and objects translated into criminal acts, such as vandalism, infringement, and trespass?17 We now elaborate on this theoretical position from a historical perspective, before addressing the questions of affects, topology, and anomalous objects.
MEDIA ANOMALIES: HISTORICAL CONTEXT
Analysis of media in terms of the anomaly is nothing new. There are, in fact, many approaches that implicitly or explicitly address anomalous media. A number of well-known approaches that should be familiar to the media and communication field, including the Frankfurt School and the media-ecological writings of the Toronto School (including Neil Postman), have regarded (mass) media in itself as an anomaly. Of course, these approaches do not concern a strict deviation from the norm or events outside of a series, as such. Instead, the dangerous anomaly has long been regarded as a function of the homogenizing powers of popular media. The repetitious standardization of media content is seen as a result of the ideological capitalist-state apparatus, which applies the logic of the factory assembly line to the production of cultural artefacts. For the Frankfurt School, particularly Adorno and Horkheimer, analysis of mass media revealed a system of consumer production in conflict with (but also as a paradoxical fulfillment of) the enlightenment project via mass ideological deception.18 Later, Postman continued along similar lines by conceptualizing the modern mass media, especially television, as a kind of filter that hinders public discourse by allowing only programs and other “objects” with entertainment value to pass through communication channels.19 As an index of this dystopic understanding of mass media, some years later the former Pink Floyd songwriter Roger Waters transposed these weapons of mass distraction and apocalyptic visions of Western media culture into his concept album Amused to Death (1992), where the TV sucks in all human emotion while the human species amuses itself to death watching Melrose Place, the Persian Gulf War, and copious amounts of porn. Indeed, in this way the media machine is treated as a monstrous anomaly, and significantly, a totality rather than a singularity.
In a historical context, the shock of the “new” media seems to have always occupied a similar polemical space as the one that obsessed the conservative approaches of media effects theorists, like Gerbner. The anomalies of the new media are most often surrounded by moral panics. Such panics, whether around cinema, television, video, computer games, or the Internet, with its malicious dark side, populated by perverts lurking around every virtual corner, can perhaps be seen as an attempt to contextualize new media in existing social conventions and habits of the everyday. The media panics surrounding the Internet, for example, have highlighted the contradiction between the ideals of a reinvigorated public sphere—an electronic agora for scientists, academics, politicians, and the rest of civil society—and the reality of a network overflowing with pornography, scams, political manipulation, piracy, chat room racists, bigots, and bullies. In recent years we have seen how the Internet has been transformed from a utopian object into a problematic modulator of behavior, including addiction, paedophilia, and illicit downloading. It has become an object for censorship—necessitating the weeding out of unpleasant and distasteful content, but also the filtering of politically sensitive and unwanted exchange.20 In fact, in the wake of the Jokela high school shootings in Finland in November 2007, there are those who claim, as they did after Columbine, that it is not the guns, but the Internet that is to blame. The uncontrollable and uncensorable flood of damaging information is still grasped as more dangerous than the impact of firearms.
The emergence of inconsistencies and deviations in media history has led Lisa Gitelman to argue that we should “turn to the anomaly” and concentrate on the patterns of dissonance that form when new media encounter old practices. For Gitelman, “transgressions and anomalies . . . always imply the norm and therefore urge us to take it into account as well.”21 Therefore, anomalies become a tool of the cultural analyst, enabling him or her to dig into the essential, so to speak. They can be imagined as vehicles taking us along the lines of a logic that delineates the boundaries between the normal and the abnormal. But in our view such approaches do not dig deeply enough into the logical mode of the anomaly since there is always a danger that such a representational analysis will continue to treat it as an excluded partner (Other) who haunts the normalized procedures of the Same.
Alternatively, we argue that network culture presents us with a new class of anomalous software object and process, which cannot be solely reduced to, for example, a human determined representation of the capitalist mode of consumerism.22 The examples given in this collection—contagious software, bad objects, porn exchange, and modes of network censorship—may well derive some benefit from representational analysis (particularly in the context of porn and spam e-mail content),23 but our anomalies are not simply understood as irregular in the sense that their content is outside of a series. On the contrary, they are understood as expressing another kind of a topological structuring that is not necessarily derived from the success of friction-free ideals as a horizon of expectancy. The content of a porn site,24 a spam e-mail, or a computer virus, for instance, may represent aspects of the capitalist mode of production, but these programs also express a materiality, or a logic of action, which has been, in our opinion, much neglected in the media and communication field. This is a logical line in which automated excessive multiple posting, viral replication, and system hijacking are not necessarily indices of a dysfunctional relation with a normalized state of communication, but are rather capacities of the software code. Software is not here understood as a stable object or a set of mathematically determined, prescribed routines; rather, as the emergent field of critical software studies proposes, it is a process that reaches outside the computer and folds into digital architectures, networks, and social and political agendas. When we combine this capacity of software with our focus on the dynamics of the sociotechnical network assemblage, in its entire broadband spectrum, we experience systems that transfer massive amounts of porn, spam, and viral infection. Such capacity, which in our view exceeds the crude distinction between normal and abnormal, becomes a crucial part of the expressive and material distribution of network culture. Porn, spam, and viruses are not merely representational; they are also component parts of a sociotechnological praxis. For us, they are a way of tapping into and thinking through the advanced capitalist mode in the context of the network.
We therefore suggest that the capacity of the network topology intimately connects us to a post-Fordist mode of immaterial labour and knowledge production. We do not, however, subscribe to a strictly defined cybernetic or homeostatic model of capitalist control (a point explained in more detail later), which is designed to patch up the nonlinear flows deemed dangerous (like contagions) to the network. On the contrary, our conception of capitalism is of a machine that taps into the creative modulations and variations of topological functioning.25 Networks and social processes are not reducible to a capitalist determination, but capitalism is more akin to a power that is able to follow changes and resistances in both the extensive and intensive redefining of its “nature.” It is easy at this point to see how our vision of the media machine no longer pertains to the anomalous totality described by the Frankfurt and Toronto Schools. Like Wendy Chun, we see this machine as an alternative to the poverty of analyses that continuously articulate the contemporary media sphere between the poles of total paranoid surveillance and the total freedom of digitopia. Therefore, following Chun, in order to provide a more accurate account of the capacities of media technologies as cultural constellations, this book looks to address networked media on various, simultaneously overlapping scales or layers: hardware, software, interface, and extramedial representation (“the representation of networked media in other media and/or its functioning in larger economic and political systems”).26
Such an approach has led us and other contributors to draw on a Deleuze-Guattarian framework. We might also call this approach, which connects the various chapters of this book, an assemblage theory of media.
Yet, in order to fully grasp this significant aspect of our analysis, it is important to see how Deleuze and Guattari’s meticulous approaches to network society can be applied beyond the 1990s hype of “the rhizome” concept and what we see as its misappropriation as a metaphor for the complexities of networked digital media. In this sense, most contemporary writers using Deleuze and Guattari have been keen to distance themselves from a metaphorical reading of cultural processes in the sense that “metaphoricity” implies a dualistic ontology and positions language as the (sole) active force of culture (see the Contagions section introduction for more discussion). In this context, Deleuze and Guattari have proved useful in having reawakened an appreciation of the material forces of culture, which not only refer to economic relationships, but to assemblages, events, bodies, technologies, and also language expressing itself in modalities other than meaning. Not all of the chapters are in fact locked into this framework, yet, even if they do not follow the precise line, they do, in our opinion, attempt to share a certain post-representational take which is reluctant to merely reproduce the terms it criticizes and instead explores the various logics and modes of organization in network culture in which anomalies are expressed. Importantly, the chapters are not focused on the question of how discourses of anomalous objects reproduce or challenge the grids of meaning concerning ideology and identity (sex, class, race, etc.) but rather they attempt to explore new agendas arising beyond the “usual suspects” of ideology.
We now move on to explore the topological approach in more detail, proposing that it can do more than simply counter representational reductionism. First, we specify how it can respond to the fault lines of essentialism. Then we use it to readdress a mode of functionalism that has pervaded the treatment of the anomaly from Durkheim to cyberpunk.
TOPOLOGICAL THINKING: THE ROLE OF THE ACCIDENT
In order to further illuminate our question concerning the anomalies of contemporary communication, let us return to the Monty Python sketch for further inspiration and a way in which we might clearly distinguish between a prevalent mode of essentialism and our topological approach. Following strictly essentialist terms we might define Python’s cafe by way of the location of the most important and familiar communication codes;27 looking for the effective functioning of communication norms. In this mode, we would then interpret the “spamming” of the cafe as an oppositional function, setting up certain disparate relations between, on the one hand, a series of perfected communication norms, and on the other hand, the imperfection of our anomaly. Yet, arguably, the Python sketch does more than establish dialectical relations between what is in and what is outside a series. Instead, Python’s comedy tactic introduces a wider network of reference, which unshackles the unessential, enabling the sketch to breach the codes of a closed communication channel, introducing fragments of an altogether different code. Thus, in the novel sense of topological thinking, the British cafe becomes exposed to the transformational force of spontaneous events rather than the static essences or signs of identity politics.
In a way, Monty Python suggests an anti-Aristotelian move, at least in the sense proposed by Paul Virilio: that is, a need to reverse the idea of accidents as contingent and substances as absolute and necessary. Virilio’s apocalyptic take on Western media culture argues for the inclusion of the potential (and gradual actualization) of the general accident that relates to a larger ontological shift undermining the spatio-temporal coordinates of culture. In a more narrow sense, Virilio has argued that accidents should be seen as incidental to technologies and modernity. This stance recapitulates the idea that modern accidents do not happen through the force of an external influence, like a storm, but are, much more accurately, follow-ups to, or at least functionally connected with, the original design of that technology. In this way, Virilio claimed that Aristotelian substances do not come without their accidents, and breakdowns are not the absence of the presumed order, but are rational, real, and designed parts of a media cultural condition: the “normal” state of things operating smoothly.28 With Monty Python, as with Deleuze, the structures of anticipation and accidentality are not simply reversed, but the anomalous communication event itself emerges from within a largely accidental or inessential environment.29
To analyze the material reality of anomalous objects, we must therefore disengage from a perspective that sees the presumed friction-free state of networking, the ideal non-erring calculation machine, or a community of rational individuals using technologies primarily for enlightenment as more important than the anomaly (spam, viruses, and porn being merely regarded as secondary deviations). Indeed, in our view, accidents are not simply sporadic breakdowns in social structure or cultural identity, but express the topological features of the social and cultural usage of media technologies. In this context, we concur with Tiziana Terranova,30 who discusses network dynamics as not simply a “space of passage for information,” but a milieu that exceeds the mechanism of established communication theory (senders, channels, and receivers). The surplus production of information comprises a turbulent mixture of mass distribution, contagion, scams, porn, piracy, and so on. The metastabilities of these multiple communication events are not merely occurrences hindering the essence of the sender–receiver relation, which generally aims to suppress, divide, or filter out disparities altogether, but are instead events of the network topology in itself. The challenge, then, is not to do away with such metastabilities, but to look at them in terms of an emergent series and experience them as the opening up of a closed communication system to environmental exteriority and the potentialities that arise from that condition—a condition we can refer to as the inessential of network culture.
THE TOPOLOGICAL SPACE OF “BAD” OBJECTS
If all things have followed from the necessity of the most perfect nature of God, how is it that so many imperfections have arisen in nature—corruption, for instance, of things till they stink; deformity, exciting disgust; confusion, evil, crime, etc.? But, as I have just observed, all this is easily answered. For the perfection of things is to be judged by their nature and power alone; nor are they more or less perfect because they delight or offend the human senses, or because they are beneficial or prejudicial to human nature.31
We have thus far argued that the anomaly is best understood in terms of its location in the topological dynamics of network culture. Significantly, in this new context then, we may also suggest that anomalies are not, as Spinoza realized, judged by the “presumed imperfections of nature” (nature representing a unity, as such), but instead they are judged by “their nature and power alone.” In other words, it matters not if objects “delight or offend the human senses.” Particular “things” and processes are not to be judged from an outside vantage point or exposed to “good” or “bad” valuations. Instead, the ethological turn proposes to look at the potentials of objects and ask how they are capable of expression and making connections.
In this way, the shift toward topological analysis becomes parallel to a perspective that claims to be “beyond good and evil” and instead focuses on the forces constituent of such moral judgments. This marks the approach out as very different from the historical tradition of social theory, particularly the early response of organic functionalists to the good and bad of social events. For example, Emile Durkheim was perhaps the first social scientist to show how anomie played an important part in social formations, but he negated the productive capacities we have pointed to in favor of describing the anomaly as a state of social breakdown. For Durkheim, the ultimate anomalous social act—suicide—stemmed from a sense of a lack of belonging and a feeling of remoteness from the norm. Anomaly as a social phenomenon therefore referred to a deprivation of norms and standards. Although suicide was positively disregarded as an act of evil, it did, however, signal a rupture in the organics of society, an abnormality, a falling out of series, as such.32 Indeed, his statistical container model of macro society—much appreciated by the society builders of 19th-century Europe—judged social phenomena against the average, the essential, and the organic unity of social functionalism. This of course ruled out seeing anomalies as social phenomena with their own modes of operation and co-causal capacity to affect.
Baudrillard’s notion of the perverse logic of the anomaly intervenes in the functionalist exorcism of the anomalous, as a thing that doesn’t fit in.33 Writing mainly about another bad object, drugs, Baudrillard argued that the anomaly becomes a component part of the logic of overorganization in modern societies. As he put it:
In such systems this is not the result of society’s inability to integrate its marginal phenomena; on the contrary, it stems from an overcapacity for integration and standardization. When this happens, societies which seem all-powerful are destabilized from within, with serious consequences, for the more efforts the system makes to organize itself in order to get rid of its anomalies, the further it will take its logic of over-organization, and the more it will nourish the outgrowth of those anomalies.34
Beyond the law-abiding notion of Durkheim’s anomie, Baudrillard therefore proposed to consider contemporary phenomena (the writing stems from 1987) as labeled by excess—a mode of hyperrational anomaly. He argued that the modern emphasis placed on control management has itself spurred on these excesses of standardization and rationality. The strange malfunctions become the norm, or more accurately, they overturn the logic of thinking in terms of self versus other. Moreover, in the perverse logic of Baudrillard’s anomalous, the object, as an extensive target of social control, is preceded by an intensive logic that exceeds the grid of explanation imposed by social scientists, educationalists, and therapeutic practitioners. Instead of external deviations contrary to the internal functioning of the social, anomalies start to exhibit an intensive and integral social productivity.
The distinction made here between the intensive productivity of the anomaly and a social model developed around organic unity and functionalism is perhaps better grasped in DeLanda’s similar distinction between relations of interiority and relations of exteriority.35 In the former, societies are regarded as solely dependent on reciprocal internal relations in order that they may exhibit emergent properties. In the latter, DeLanda seemingly turns the generalized social organism inside out, opening up its component parts to the possibilities and capacities of complex interactions with auxiliary assemblages. In fact, what he does is reconceive the social organism as an assemblage.
So as to further explore this notion of the social assemblage, let’s return to the example of the forensic honeypot computer introduced in the first section of this introduction. Previously understood as a closed system, the rationalized logic machine soon becomes exposed to the disparities of the network. Emergent relations hijack the honeypot’s functionality. Its relation to an exteriority links up these disparities and in turn connects it to other assemblages. It is at this juncture that we locate the transformational differentiation and alterity of the honeypot as it becomes inseparable from the relations it establishes with a multiplicity of other assemblages, populated by technosocial actors, including netbots, virus writers, cookies, and hacker groups and their software. Nevertheless, the anomalies that traverse the assemblage are not simply disparities that suppress or divide; instead, the anomaly intervenes in the process of the becoming of the honeypot. It establishes communication with other objects related to the assemblage, potentializing new territories or deterritorializing other assemblages.
Anomalies transform our experiences of contemporary network culture by intervening in relational paths and connecting the individual to new assemblages. In fact, the anomaly introduces a considerable amount of instability to what has been described in the past as a cybernetic system of social control.36 In practice, the programs written by hackers, spammers, virus writers, and those pornographers intent on redirecting our browsers to their content have problematized the intended functionality and deployment of cybernetic systems. This has required cyberneticians to delve deeply into the tool bag of cybernetics in an effort to respond to the problem engendered: How to keep the system under control? For experts in the computing field, defensive software, such as antivirus technology, represents a new mobilization of security interests across the entire networked computing environment instead of being exclusively aimed at single computers,37 and it is interesting to see how many of these defences appear to play to the notion of organic unity as described earlier. For example, computer scientists based at IBM’s TJ Watson Research Centre during the early 1990s attempted to tackle the problem of computer viruses by developing a cybernetic immune system.38 Using mathematical models borrowed from epidemiology, these researchers began to trace the diffusion patterns of computer viruses as analogous to the spread of biological viruses. Along with other commercial vendors, they sought out methods that would distinguish between so-called legitimate and viral programs. In other words, their cybernetic immune system was designed to automate the process of differentiating self from non-self and ultimately suppress the threshold point of a viral epidemic (the point at which a disease tips over into a full-blown epidemic).
However, the increasing frequency of digital anomalies has so far confounded the application of the immunological analogy. In fact, research in this area has recently shifted to a focus on topological vulnerabilities in the network itself, including a tendency for computer viruses to eschew epidemiological threshold points altogether.39 Maps of the Internet and the World Wide Web (www), produced by complex network theorists in the late 1990s,40 demonstrate how networks become prone to viral propagation, passing on viral code just as they would any other program. There is, as such, a somewhat fuzzy distinction between what can be determined as self and non-self. As we have already pointed out, the anomaly is not, in this sense, outside the norm.
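To give a rough sense of the threshold argument at stake here (a toy illustration on our part, not drawn from the studies cited above): in the standard mean-field treatment of an SIS epidemic on an uncorrelated network, the critical spreading rate is the ratio of the mean degree to the mean squared degree, and on heavy-tailed, scale-free topologies that ratio collapses toward zero, which is the sense in which such networks can be said to eschew epidemiological thresholds. The short Python sketch below computes this ratio for two illustrative degree distributions; the parameters and the numpy dependency are our assumptions, not part of the research referenced.

import numpy as np

def epidemic_threshold(degrees: np.ndarray) -> float:
    # Mean-field SIS threshold: spreading rates above <k>/<k^2> sustain
    # an epidemic; rates below it let outbreaks die out.
    return degrees.mean() / (degrees ** 2).mean()

rng = np.random.default_rng(0)

# Homogeneous network: most nodes have roughly the same number of links.
homogeneous = rng.poisson(lam=6, size=100_000).astype(float) + 1

# Heavy-tailed, scale-free-like network: a few hubs with enormous degree.
heavy_tailed = np.floor(rng.pareto(a=1.5, size=100_000) + 1.0) * 3

print(f"homogeneous threshold ~ {epidemic_threshold(homogeneous):.3f}")
print(f"heavy-tailed threshold ~ {epidemic_threshold(heavy_tailed):.4f}")
# The heavy-tailed value comes out far smaller: even weakly infectious code
# can persist, blurring any fixed boundary between normal and anomalous traffic.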
The history of cybernetics provides many more examples of this problem where logic encounters network politics. The origins of Turing’s theory of computable numbers were arguably realized in a paradoxical and largely unessential composition of symbolic logic, inasmuch as he set out to prove that anomalies coexisted alongside the axioms of formal logic.41 Not surprisingly, then, Turing’s halting problem, or the undecidability problem, eventually resurfaced in Cohen’s formal study of computer viruses, a doom-laden forecast in which there is no algorithmic solution to the detection of all computer viruses.42 Indeed, logic systems have long been troubled by their inability to cope with virals. The problem of the self-referencing liar bugged the ancient Greek syllogistic system as much as it has bugged the contemporary cybernetics of network culture.
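The logical trouble alluded to here can be sketched in a few lines of code (our own illustrative rendering of the diagonal argument, not Cohen’s notation; the function names are hypothetical placeholders rather than any real API). Suppose a perfect, general virus detector existed; one can then write a program that consults the detector about itself and does the opposite of whatever it predicts.

def is_virus(program) -> bool:
    # Stand-in for a supposed perfect detector that returns True exactly
    # when `program` would replicate. The argument shows that no such
    # total, exact function can ever be written.
    raise NotImplementedError

def spread() -> None:
    # Stand-in for replication behaviour; left empty for illustration.
    pass

def contrary() -> None:
    # Built to contradict whatever the detector says about it:
    if is_virus(contrary):
        return   # declared viral, so it refuses to replicate
    spread()     # declared benign, so it replicates

Whichever answer is_virus(contrary) could give turns out to be wrong, so exact detection of all viruses is undecidable, and real scanners settle for heuristics and approximations.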
In this light, it is interesting to draw attention to the way in which these fault lines in cybernetics and Durkheim’s anomie have converged in cyberculture literature. With its many references to Gaia43 (a theory of natural balance and equilibrium akin to immunology), cyberculture has co-opted the principle of the self-referencing maintenance of organic unity into the fabric of the collectivities of cyberspace. For example, John Perry Barlow argued that the immune system response of the network is “continuously” defining “the self versus the other.”44 In this way, he typified the tendency of cyberpunk’s frontier mentality to discursively situate the digital anomaly firmly outside of the homeostatic system of network survivability. In fact, as Bruce Sterling revealed, cyberpunks and the cyberneticists of the antivirus industry have become strange bedfellows:
They [virus writers] poison the digital wells and the flowing rivers. They believe that information ought to be poisonous and should hurt other people. Internet people build the networks for the sake of the net, and that’s a fine and noble thing. But virus people vandalize computers and nets for the pure nasty love of the wreckage.45
It seems that the much wished-for stability of the cyberpunk’s Daisyworld is increasingly traversed by the instabilities produced by the anomaly. As Sterling noted in another context, “the Internet is a dirty mess”46 that has lost its balance mainly because of the increasing outbreaks of cyberterrorism and cybercrime, but also because of the authorities’ negligence in adequately addressing the problems facing network culture. In Sterling’s vision, which increasingly echoes those of the capitalist digerati, there is a horizon on which the network eventually becomes a clean and frictionless milieu. Yet such a sphere of possibility rests conceptually on the notion of homeostasis and stability, which consequently implies a conservative (political) stance. In our view, it is more insightful to follow Geert Lovink’s position that networking is more akin to notworking:
What makes out today’s networking is the notworking. There would be no routing if there were no problems on the line. Spam, viruses and identity theft are not accidental mistakes, mishaps on the road to techno perfection. They are constitutional elements of yesterday’s network architectures. Networks increase levels of informality and also pump up noise levels, caused by chit-chat, misunderstandings and other all too human mistakes.47
We argue that the noise of Lovink’s notworking not only throws a spanner in the works of the cybernetic system, but also more intimately connects us to the capacity of the network to affect and thus produce anomalies. Instead of seeing the network as a self-referential homeostatic system, we therefore want to propose an autopoietic view of networks wherein alterity becomes the mode of operation of this sociotechnical machine (even though, e.g., Lovink might be reluctant to use these concepts). So if we want to approach network systems, within a broad framework, as autopoietic systems, we need to emphasize their difference from an old ideal of a harmoniously determined Nature. Following Guattari,48 we argue that systems are not structures that merely stabilize according to a predetermined task, but are instead machines composed in disequilibrium and a principle of abolition. Here, re-creation works only through differentiation and change, which are ontological characteristics of a system that relies continuously on its exterior (a network). The digital network is consequently composed in terms of a phylogenetic evolution (change) of machines, and importantly understood as part of a collective ecological environment. In this context, the maintenance project of any machine (social, technical, or biological system) cannot be simply confined to the internal (closed in) production of self, or for that matter the detection of non-self, but instead returns us to the individuation process (discussed earlier) and the continuance of what Guattari called the “diverse types of relations of alterity.”49 We argue that a condition akin to a horror autotoxicus of the digital network, the capacity of the network to propagate its own imperfections, exceeds the metaphor of natural unity. Indeed, despite a rather vague notion about the purposeful essence of network production as described by individuals like Bill Gates (something perhaps akin to Spinoza’s “perfect nature of God”), the network itself is without a doubt the perfect medium for both perfection and imperfection.
CONCLUSION: STANDARD OBJECTS?
We do not doubt that what we are dealing with here are very curious objects indeed. They present mind-boggling problems to system managers and network controllers. Yet the failure to adequately overcome the computer virus problem perhaps pales in comparison to what Wired Magazine described as the next big issue for network security: the autonomous software netbots (or spambots) that are more flexible and responsive to system defences than the familiar model of pre-programmed computer viruses and worms. As Wired described the latest threat:
The operational software, known as command and control, or C&C, resides on a remote server. Think of a botnet as a terrorist sleeper cell: Its members lurk silently within ordinary desktop computers, inert and undetected, until C&C issues orders to strike.50
Here we see that the netbot becomes discursively contemporised in terms of a latent terrorist cell that evades the identification grid of an immune system. Possibly this marks a discursive shift away from the biological analogy with viruses and worms toward the new anxieties of the war on terror. Whatever the rhetoric, identification is perhaps the key contemporary (and future) problem facing not just computer networks, but networks of political power, wherein nonexistence (becoming invisible) can become a crucial tactical gesture, as Galloway and Thacker suggest in Chapter 13.
The invisibility of software objects has in practice confounded a media studies approach orientated toward a representational analysis of phenomenological “content.” Software considered as a specific set of instructions running inside a computer is obviously something more akin to a performance, rather than a product of visual culture. To combat the often-simplistic analysis of software, Lev Manovich proposed, back in 2001, that media studies should move toward “software studies,” and in doing so he provided an early set of principles for an analysis of new media objects. Manovich’s principles of new media include numerical representation, modularity, automation, variability, and transcoding. New media in this way is based on the primary layer of computer data—code—that in its programmability separates “new” from “old” media, such as print, photography, or television.51 However, since then, Chun noted how Manovich’s notion of transcoding—that software culture and computation is about translating texts, sounds, and images into code—is not a sufficiently rich notion.52 Instead of registering (repeating) differences that pre-exist, Chun argued that computation makes differences and actively processes code in and out of various phenomenological contexts, such as text or sound. Her argument is supported by virus researchers who note that even a simple opening and closing of an application, or rebooting of a system, can make changes to boot sector files, log files, system files, and the Windows registry.
For example, opening and closing a Word document is a computational process that may result in the creation of temporary files, changes to macros, and so forth.53 However, these processes do not directly come into contact with the human senses (we cannot always see, hear, touch, taste, or indeed smell an algorithmic procedure) and there is consequently a deficit in our cognitive and conceptual grasping of software objects and processes, as such. Yet, despite the abstract nature of mathematical media, these processes are completely real and demand attention from cultural theory, not least because the contemporary biopower of digital life functions very much on the level of the nonvisual temporality of computer networks. This is why cultural theory needs to stretch its conceptual capacities beyond representational analysis and come up with new notions and ideas in order to better grasp the technological constellations and networked assemblages of “anomalous media culture.” By proposing novel concepts, like those suggested by Deleuze for example, we do not aim to prescribe a trendy cultural theory, but rather enable a rethink of the processes and emerging agendas of a networked future.
The anomalous objects discussed in this volume can therefore be taken as indices of this novel media condition in which complex transformations occur. Yet, while on an algorithmic and compositional level, the objects and processes highlighted in spam e-mails, computer viruses, and porn communities are not in any way different from other objects and processes of digital culture, there is clearly a repetitious and discursive filtering process going on: If software is computation that makes a difference (not just a coding of differences), then there is also a continuous marking out of what kind of processes are deemed as normal, abnormal, and/or anomalous. In other words, there is an incessant definition and redefinition of what, on the one hand, makes a good computation, a good object, and a good process, and on the other hand, what is defined as irresponsible and potentially a bad object or process. However, as noted earlier, the material and expressive boundaries of these definitions are not at all clear. We may, in this light, therefore suggest that such turbulent objects are considered as standard objects of network culture.54 Instead of merely being grasped as elements that should be totally excluded from the economic, productive, and discursive spheres of the knowledge society, they are equally understood as captured and used inclusively within the fabrication of digital assemblages. For example, the anomaly takes on new functions as an innovative piece of evolutionary “viral” or “spam” software (in digital architecture or sound production for instance), or is translated into new modes of consumer organization and activation (viral marketing), or becomes adapted to serve digital sociality in practices and communities (pornographic exchange). Ultimately, if capitalism is able to make novel use of these critical practices of resistance, then cultural and media theorists should do likewise. Otherwise, they will remain anomalies for a theory unable to perceive new modulations of power and politics functioning on the level of software.
From the varied perspectives offered in this volume the reader will notice that our take on the anomaly is not considered sacrosanct—anomalous digital objects are distributed across many scales and platforms. However, we do feel that all of the following chapters intersect with our notion of the anomalous object, albeit provoking a controversy around its compositional theme. Therefore, in order to introduce a sense of organization to the mixture of viewpoints put forward in The Spam Book, we have divided the chapters into subsections: Contagions, Bad Objects, Porn, and Censored. Each subsection has an introduction setting out how we, the editors, grasp the position and the value of each chapter. As we have already suggested, there are of course many takes on the digital anomaly, but what The Spam Book proposes to do is shed some light on what has, until now, remained on the dark side of media, communication, and cultural analysis.
PART I CONTAGIONS
NO METAPHORS, JUST DIAGRAMS . . .
Digital contagions are often couched in analogical metaphors concerning biological disease. When framed in the linguistic structures of representational space, the biological virus becomes a master referent, widely dispersed in the fields of cultural studies, computer science, and the rhetoric of the antivirus and network security industries.55 The figurative viral object becomes part of a semiotic regime of intrusive power, bodily invasion, uncontrollable contamination, and even new modes of auto-consumerism. Nevertheless, although representational analysis may have an application in a media age dominated by the visual image, the approach does not, in our opinion, fully capture the imperceptible constitutive role of contagion in an age of digital networks.
Indeed, when contemplating the metaphor of contagion, it is important to acknowledge two constraining factors at work. First, the analytical focus of metaphorical reasoning may well establish equivalences, but these resemblances only really scratch the surface of an intensive relation established between a viral abstraction and concrete contagious events. Second, it is important to recognize the political import of the analogical metaphor in itself. It has an affective charge and organizational role in the spaces, practices, and productions of digital network culture. For example, as seen in this section, the resemblances established between neo-Darwinian genes and computer viruses have imposed the logic of the arms race on evolutionary computing.56 In conjunction with epidemiological and immunological analogies, the digital gene delimits the patterning of software practices, excluding the contagious anomaly from the norms of code reproduction.
In light of the often-divisive imposition of the metaphor in the materiality of digital network culture, it is important that this volume provides a potential escape route out of the analytical constraints of representation. Gilles Deleuze could, in our opinion, function as one alternative thinker who provides a set of tools for a post-representational cultural analysis. Via Deleuze we can substitute the metaphoric burden of the biological referent on its digital “viral cousins” with an exposition of the constitutive role contagion plays in material spaces, time-based practices, and productions. In place of the negatives of the metaphor we find an abstract diagram with an affirmative relation to the concrete contagious assemblages of digitality. To be more concise, the diagrammatic refrain is what holds these assemblages together, or even more succinctly, attempts to delimit and control the identities of these larger unities. Think of the abstract diagrams used in this section as descriptions of the intensity of relations, repetitiously “installed” in the concreteness of the digital assemblages addressed in each chapter.
We begin the section with John Johnston’s chapter on the relation of computer viruses to artificial life (ALife) research. Johnston sets aside the familiar metaphorical references to the spread of biological disease and instead explores the complex relationality between illicit virus production and the futures of ALife research. The chapter is stripped bare of the verbosity of Deleuzian ontology, yet arguably, the abstract diagram is ever present. It is apparent in the chapter’s endeavor to dig beneath the surface of analogical reasoning and instead explore the limitations and mysteries of “imitating biology.” In this way, Johnston refocuses our attention on the problematics of establishing a link between organic life and nonorganic digitality. In fact, Johnston’s diagram presents a somewhat challenging distinction between the two, and as a result he questions the viability of virally coded anomalies, which are both outside of the natural order of things and at risk of exceeding the services of human interest.
Tony Sampson’s chapter (chap. 2) uses three questions to intervene in a conception of universal contagion founded on the premise of “too much connectivity.” Beginning with a brief account of the universality of the contagious event, he locates the prominence of a distributed network hypothesis applied to the digital epidemic. However, Sampson points to an alternative viewpoint in which evolving viral vulnerabilities emerge from a composition of stability and instability seemingly arising from the connectivity and interaction of users. Chapter 2 then moves on to argue that efforts made by antivirus researchers to impose epidemiological and immunological analogies on these emerging susceptibilities in digital architecture are very much flawed. For example, the crude binary distinction between self and non-self is regarded here as ill-equipped to manage the fuzzy logics of an accidental topology. Indeed, Sampson sees the problems encountered in antivirus research extending beyond the defense of the body of a network to the control of a wider network of interconnecting social bodies and events. He concludes by speculating that the problematics of anomaly detection become part of a broader discursive and nondiscursive future of network conflict and security.
In Chapter 3, Luciana Parisi pushes forward the debate on the universality of viral ecologies by seeking to avoid distinctions made between digital and analogue, technical and natural, and mathematical and biological architectures. The focus of her analysis moves instead to the material capacity of infectious processes, which exceed the organizational tendencies of algorithmic logic. For example, Parisi investigates how the neo-Darwinian genetic code imposes an evolutionary schema on the diagram of digital contagion. However, unlike Johnston, Parisi argues that the organic and nonorganic of digitality are assembled together and that assumptions made in the practice of writing genetic algorithms fail to grapple with the symbiotic nature of what she terms the abstract extensiveness of digital architecture. Parisi employs a complexity of reasoning to explain her alternative blob architectures. If the reader is unfamiliar with the influential events theory of Alfred N. Whitehead or Lynn Margulis’s notion of endosymbiosis, then the ideas expressed can be difficult to tap into. Nevertheless, a concentrated deep read will offer great rewards to those wanting to discover digital contagion in a novel and profound light.
Finishing this section, Roberta Buiani (chap. 4) proposes that virality is not merely an inherent “natural” part of the software code, but is continuously distributed as figures of contagion, virulence, and intensities across popular cultural platforms. Making a useful distinction here between being viral and becoming viral, Buiani returns us to the limits imposed by the metaphoric regime of disease and the nonlimitative distribution of a flexible “single expression.” In its becoming, virality has the potential to produce creative outcomes, rather than just new threats—new diagrams perhaps?