Radical media, politics and culture.

Alex Galloway, "Protocol, or, How Control Exists after Decentralization"

jim writes: from Rethinking Marxism Volume 13, Number 3/4 (Fall/Winter 2001)


Protocol, or, How Control Exists after Decentralization


Alex Galloway

So far there is no Marxist theory of the media.

-Hans Magnus Enzensberger

The basic question asked by Empire is this: How does control exist after decentralization?

In former times the answer was clear. In what Michel Foucault called the sovereign societies of the classical era, characterized by centralized power and sovereign fiat, control existed as an extension of the word and deed of the master, assisted by violence and other coercive factors. Later, the disciplinary societies of the modern era took hold, replacing violence with more bureaucratic forms of command and control. Gilles Deleuze has extended this periodization into the present day by suggesting that after the disciplinary societies come the societies of control. Hardt and Negri (2000) fundamentally agree with this periodization, calling it instead the move from the modern disciplinary societies to societies of imperial control.

As a periodizing concept, Empire allows us to recognize wholly new technologies concurrent with imperial power. Taking a cue from Deleuze we read: "The old sovereign societies worked with simple machines, levers, pulleys, clocks; but recent disciplinary societies were equipped with thermodynamic machines . . . control societies operate with a third generation of machines, with information technology and computers" (Deleuze 1990, 180) (1). Just as Marx rooted his economic theory in a strict analysis of the factory's productive machinery, Empire heralds the coming productive power of computers to explain the sociopolitical logics of our own age. And just as Capital begins somewhat synchronically with an examination of the commodity, then shifts into a more grounded examination of production, Empire decides also to descend "into the hidden abode of production" (2) after laying out the form of imperial world order.

The reason for my essay is this curious "descending into the hidden abode of production." The hidden abode of production means many things in the age of Empire. It means descending into the real conditions of Third World chip-making factories populated by the "destitute, excluded, repressed, exploited - and yet living!" working poor (156), just as it means descending into the boardrooms of dotcom start-ups. These are material referents. Still, descending into the hidden abode of production means something else in the digital age. It means that we must also descend into the somewhat immaterial technology of modern-day computing, and examine the formal qualities of the machines that constitute the factory loom and industrial Colossus of our age. The factory was modernity's site of production. The "non-place" of Empire refuses such an easy localization. For Empire, we must descend instead into the distributed networks, the programming languages, the computer protocols, and other digital technologies that have transformed twenty-first-century production into a vital mass of immaterial flows and instantaneous transactions. Indeed, we must read the never ending stream of computer code as we read any text (the former having yet to achieve recognition as a "natural language"), decoding its structure of control as we would a film or novel.

At the core of networked computing technologies is the concept of protocol. A computer protocol is a set of rules that govern networked relations. The protocols that govern the Internet, for example, are contained in what are called Request For Comments (or RFC) documents. They are the written definitions of the protocols and policies of the Internet. The RFCs are managed by the Networking Division of the University of Southern California's Information Sciences Institute (ISI). They are freely available to anyone wishing to build hardware or software that meets their specifications. The ISI is itself under the directive of the Internet Engineering Task Force, an international, technocratic community of network scientists, and the Internet Society, an altruistic but equally technocratic organization that wishes "[t]o assure the open development, evolution and use of the Internet for the benefit of all people throughout the world" (Internet Society). The lesser protocols that aid communication on the World Wide Web (a network within the Internet) are governed by the World Wide Web Consortium. This international consortium was created in October 1994 to develop common protocols such as Hypertext Markup Language and Cascading Style Sheets. Scores of other protocols have been created for other uses.

To help understand the concept of computer protocols, consider the analogy of the highway system. Many different combinations of roads are available to a person driving from point A to point B. However, en route one is compelled to stop at red lights, stay between the white lines, follow a reasonably direct path, and so on. These conventional rules that govern the set of possible behavior patterns within a heterogeneous network are what computer scientists call protocol. Protocol is voluntary regulation within a contingent environment. Protocols operate at the level of coding: they encode packets of information so that they may be transported, they code documents so that they may be effectively parsed, and they code communication so local devices may effectively communicate with foreign devices. Protocols are highly formal - that is, they encapsulate information inside a technically defined wrapper while remaining relatively indifferent to the content of the information. Protocol is not a synonym for informatization or digitization, or at least not only. While a knowledge of informatization is crucial for understanding the new economy (as Empire illustrates in section 3.4), protocol is something different. It is a distributed management system that allows control to exist within immanent material relations.
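The formal, content-indifferent character of protocol described above can be sketched in a few lines of code. The function and field names below are illustrative assumptions, not any actual protocol implementation; the point is only that the wrapper is built from technical descriptions of the payload while remaining indifferent to what the payload says.

```python
def encapsulate(payload: bytes, header_fields: dict) -> bytes:
    """Wrap a payload inside a technically defined header.

    The header describes the payload (size, type) but never
    interprets its content: protocol is formal, not semantic.
    """
    header = "".join(f"{k}: {v}\r\n" for k, v in header_fields.items())
    return header.encode("ascii") + b"\r\n" + payload

# The same wrapper works for any content whatsoever.
packet = encapsulate(
    b"<p>hello</p>",
    {"Content-Length": 12, "Content-Type": "text/html"},
)
```

Whether the payload is a poem or a spreadsheet makes no difference to the encapsulating routine; only its formal properties (here, its length in bytes) enter the header.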

It is common for contemporary critics to describe the Internet as an unpredictable mass of data, rhizomatic and lacking central organization. This position goes roughly like this: since new communication technologies are based on the elimination of centralized command and hierarchical control, it follows that we are witnessing a general disappearance of control as such. This could not be further from the truth. Empire does much to dispel this myth in the social and political arenas. Whereas Empire is how political control exists under decentralization, protocol is how technological control exists under decentralization. Although Hardt and Negri flirt with this gee-whiz position on new technologies, writing that, within the Internet, "[a]n indeterminate and potentially unlimited number of interconnected nodes communicate with no central point of control," and that this decentralized architecture is "what makes control of the network so difficult" (299), the attentive reader will notice that here they actually mean modern control and not imperial control. What they say elsewhere about Empire should also be true of new media. A decentralized architecture is precisely that which makes protocological/imperial control of the network so easy. In fact, the various Internet protocols mandate that control may only be derived from such a distributed architecture. Hardt and Negri confirm this position by writing elsewhere that "the passage to the society of control does not in any way mean the end of discipline. In fact, the immanent exercise of discipline . . . is extended even more generally in the society of control" (330).

But why should this be the case at the technological level? Protocol is based on a contradiction between two opposing technologies: one radically distributes control into autonomous locales, and the other focuses control into rigidly defined hierarchies. I argue that the tension between these two technologies creates the conditions of existence of protocological control.

The first technology, and the one that gives the Internet its common image as an uncontrollable network, derives mainly from the family of protocols known as TCP/IP. Transmission Control Protocol (TCP) and Internet Protocol (IP) are the leading protocols for the actual transmission of data from one computer to another over the network. TCP and IP work together to establish connections over the network and move data packets effectively through those connections. Due to the way TCP/IP was designed, any machine on the network can talk to any other machine, resulting in an antihierarchical, many-to-many structure. As one technical manual puts it: "IP uses an anarchic and highly distributed model, with every device being an equal peer to every other device on the global Internet" (Hall 2000, 407).
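The many-to-many structure described above can be seen in miniature in any TCP exchange: either endpoint could just as well be any reachable host, with no hierarchical intermediary brokering the connection. The following sketch, confined to the loopback interface for self-containment, is illustrative only; the message text is an assumption of this example.

```python
import socket
import threading

# One peer listens; port 0 asks the OS for any free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve():
    """Accept one connection and echo the peer's bytes back."""
    conn, _ = server.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

threading.Thread(target=serve).start()

# The other peer connects directly: device to device, no central broker.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"any node may address any other")
echoed = client.recv(1024)
client.close()
server.close()
```

Nothing in TCP/IP itself distinguishes "server" from "client" as a privileged class of machine; the roles are assumed per connection, which is what makes every device "an equal peer to every other device."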

The second technology, which focuses control into rigidly defined hierarchies, is called the Domain Name System (DNS). DNS is a large, decentralized database that contains information on how to map a network address to a network name. This mapping is required for every network transaction. For example, in order to visit "www.rhizome.org" on the Internet, one's computer must first translate the name "www.rhizome.org," itself geographically vague, into a specific address on the network. These specific addresses are called IP addresses and are written as a series of four numbers like so: 206.252.131.211. All DNS information is controlled in a hierarchical, inverted-tree structure. At the top of this inverted tree are a handful of so-called root servers holding ultimate control and delegating lesser control to lower branches in the hierarchy. The address "www.rhizome.org" is parsed in reverse, starting with the "org." Thus, in theory, the root server receives a request from the user and directs the user to another machine that has authority over the "org" domain, which in turn directs the user to another machine that has authority over the "rhizome" subsection, which in turn returns the IP address for the specific machine known as "www." This process of deriving an IP address from a domain name is called resolution.

If some controlling authority wished to ban all Web pages ending in ".org" from the Internet, they could do so in an instant through a simple modification of the information contained in the dozen or so root servers scattered around the globe. Without the foundational support of the root servers, all lesser branches of the DNS network become unusable. Such a reality should shatter our image of the Internet as a vast, uncontrollable rhizome.

Any networked relation will have multiple, nested protocols. Take, for example, a typical transaction on the World Wide Web. A Web page containing text and graphics (themselves protocological artifacts) is marked up in the Hypertext Markup Language (HTML) protocol. The protocol known as Hypertext Transfer Protocol (HTTP) encapsulates this HTML object and allows it to be served by an Internet host. However, both client and host must abide by the TCP protocol to ensure that the HTTP object arrives in one piece. Finally, TCP is itself nested within the Internet Protocol, a protocol that is in charge of actually moving data packets from one machine to another. Ultimately the entire bundle (the primary data object encapsulated within each successive protocol) is transported according to the rules of the only "privileged" protocol, that of the physical media itself (fiberoptic cables, telephone lines, airwaves, etc.).

Protocol is materially immanent. That is, protocol does not follow a model of command and control that places the commanding agent outside that which is being commanded. For example, the protocological manipulation of an HTML object by an HTTP object begins first with the parsing of the HTML object and the creation of a special HTTP header. The header contains various pieces of information about the HTML object such as the size of the file, the date the file was last modified, and the make and model of the server offering the file. The HTTP object, then, is simply the HTML object plus its new header, all wrapped up into a new form. The new header is prefixed to the original content, becoming part of its material body. But since the HTTP header is nothing but a description of the material contents of the HTML object, the larger protocol (HTTP) is simply a way of rewriting the smaller one (HTML); the smaller data object is encapsulated by the larger one. In doing so, the HTML object is disciplined to work in the larger context of HTTP.
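The wrapping of an HTML object into an HTTP object described above can be sketched as follows. This is a simplified, illustrative construction of an HTTP/1.1 response, not a complete server; the function and server names are assumptions of the example. Note how every header field is derived from the material body of the HTML object itself.

```python
from email.utils import formatdate

def http_wrap(html: bytes, server_name: str = "ExampleServer/1.0") -> bytes:
    """Prefix an HTML object with an HTTP header that describes it.

    The header (date, server, size) is nothing but a description of
    the object's material contents, prefixed to become part of its body.
    """
    header = (
        "HTTP/1.1 200 OK\r\n"
        f"Date: {formatdate(usegmt=True)}\r\n"
        f"Server: {server_name}\r\n"
        "Content-Type: text/html\r\n"
        f"Content-Length: {len(html)}\r\n"
        "\r\n"
    )
    return header.encode("ascii") + html

response = http_wrap(b"<html><body>hello</body></html>")
```

The HTTP object is simply the HTML object plus its new header: the smaller data object survives intact inside the larger one, disciplined to work in the larger context.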

While protocol is immanent to a particular milieu, protocological objects never contain their own protocol. Thus, TCP/IP houses HTTP, which houses HTML, which houses ASCII text, and so forth. New headers are added at each level but, in terms of content, protocols are never continuous with themselves. At each phase shift one is able to identify a data object from the intersection of two articulated protocols. In order to see HTML, one must actually view it as it intersects with HTTP. Otherwise, one looks at HTML and sees nothing but its own internal protocols: text and mark-up tags.

The computer protocol is thus in lockstep with Hardt and Negri's analysis of Empire's logics, particularly the third mode of imperial command: the managerial economy of command (2000, 199). This command protocol knows from the start that "[c]ontingency, mobility, and flexibility are Empire's real power" (200). The flexible networks and flows identified in the world economy by Manuel Castells and other anchormen of the Third Machine Age are not mere metaphors; they are in fact built directly into the technical specifications of network protocols. By design, the Internet Protocol cannot be centralized. The RFC documents declare this in black and white.

Protocol's native landscape is the distributed network. A distributed network differs from others, such as centralized and decentralized networks, in the arrangement of its internal structure. A centralized network consists of a single central power point (a host), from which are attached radial nodes. The central point is connected to all the satellite nodes which are themselves connected only to the central host: "All roads lead to Rome." A decentralized network, by contrast, has multiple central hosts, each with its own set of satellite nodes. A satellite node may have connectivity with one or more hosts, but not with other nodes. Communication generally travels unidirectionally within both centralized and decentralized networks: from the central trunk to the radial leaves.

The distributed network is an entirely different matter: "Empire has no Rome" (317). Distributed networks are native to the age of Empire. A point in a distributed network is neither a central hub nor a satellite node. The network contains nothing but "intelligent end-point systems that are self-deterministic, allowing each end-point system to communicate with any host it chooses" (Hall 2000, 6). Like the rhizome, each node in a distributed network may establish direct communication with another node without having to appeal to a hierarchical intermediary. In order to initiate communication, the two nodes must speak the same language. Thus, distributed networks are defined by these same languages - these shared protocols. As Branden Hookway writes, "Distributed systems require for their operation a homogenous standard of interconnectivity" (1999, 77). A common protocol leads to network articulation, while incompatible protocols lead to network disarticulation. For example, two computers running the DNS addressing protocol will be able to communicate effectively with each other about network addresses. Sharing the DNS protocol allows them to be networked. However, the same computers will not be able to communicate with a foreign device running, for example, the NIS addressing protocol or the WINS protocol (3). Without a shared protocol, there is no network.
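The point that shared protocol is the condition of network articulation can be put schematically. The sketch below is a toy model, with hypothetical host names invented for this example; each node is represented simply by the set of addressing protocols it speaks.

```python
def can_articulate(node_a: set, node_b: set) -> bool:
    """Two nodes form a network only if they share at least one protocol."""
    return bool(node_a & node_b)

# Hypothetical hosts, described by the addressing protocols they run.
unix_host = {"DNS", "NIS"}
windows_host = {"DNS", "WINS"}
legacy_host = {"WINS"}

can_articulate(unix_host, windows_host)  # True: both speak DNS
can_articulate(unix_host, legacy_host)   # False: no shared protocol
```

The model is crude, but it captures the essential symmetry: articulation is a property of the shared standard, not of either node alone.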

Protocological control mirrors the movements of Empire. Empire is the social theory, protocol the technical. Thus, Hardt and Negri are precisely accurate in their analysis of the "symptoms of passage." An analysis of computer protocols proves this for it reassigns the former weapons of Leftists (celebration of difference, attack on essentialism, etc.) as the new tools of Empire: "This new enemy not only is resistant to the old weapons but actually thrives on them, and thus joins its would-be antagonists in applying them to the fullest. Long live difference! Down with essentialist binaries!" (138). For example, a decentralized network is precisely what gives the Internet Protocol its effectivity as a dominant protocol. Or to take another example, the flimsy, cross-platform nature of HTML is precisely what gives it its power as a protocological standard. Like Empire, if protocol dared to centralize, or dared to hierarchize, or dared to essentialize, it would fail.

We turn now to Michel Foucault to derive one final quality of protocol: the special existence of protocol in the "privileged" physical media of bodies. Protocol is not merely confined to the technical arena. As Deleuze (1990) shows, protocological control also affects the functioning of bodies within social space, and the creation of those bodies as forms of "artificial life" that are dividuated (4), sampled, and coded. Artificial life is protocol in the sociopolitical theater, the active production of vital forms by other vital forms - what Foucault calls the "work of the self on the self," a slogan that Hardt and Negri also reference regarding "Humanism After the Death of Man" (91-2).

In Foucault, there is a distinct desire to be antianthropological - that is, a desire to write a theory of history that is not based on living human beings. In The Archaeology of Knowledge he expresses his desire to "define a method of historical analysis freed from the anthropological theme . . . a method of analysis purged of all anthropomorphism" (1999, 16). He claims that he wants to uncover the principles of an "autochthonic transformation" - that is, a transformation in the world of words and things that is immanent, particular, spontaneous, and anonymous.

This antianthropological instinct is also evident in his definition of discourse. For example, he writes that "discourse is not the majestically unfolding manifestation of a thinking, knowing, speaking subject, but, on the contrary, a totality, in which the dispersion of the subject and his discontinuity with himself may be determined" (55). Foucault's interests lie not in the social, but in the "anonymous and general subject of history" (16), a diffuse subjective logic that foreshadows the logic of protocol.

Thus, what I would like to suggest is that Foucault's relationship to life forms is a protocological one. This is expressed most clearly in his later work, particularly in the twin concepts of biopolitics and biopower. Foucault defines the former as "the endeavor, begun in the eighteenth century, to rationalize the problems presented to governmental practice by the phenomena characteristic of a group of living human beings constituted as a population: health, sanitation, birthrate, longevity, race" (1997, 73). Thus, we can assume that the whole mess of biometrics and statistical analysis (from the Bertillon identification system, to the Social Security Act of 1935, to the tabulation of birthrates by the Children's Defense Fund) all fall under the category of biopolitics. Further, he writes that biopolitics "tends to treat the 'population' as a mass of living and coexisting beings who present particular biological and pathological traits and who thus come under specific knowledge and technologies" (71). Biopolitics, then, connects to a certain statistical knowledge about populations. It is a species-knowledge.

Still, Foucault puts equal stress on "technologies" as he does "knowledge" in his definition of biopolitics. But which technologies in particular would correspond to Foucault's biopolitical scenario? They are the distributed forms of management that characterize the contemporary computer network, and within which protocological control exists.

In The History of Sexuality, Foucault contrasts the older power of the sovereign over life (one characterized by the metaphysical concern of either the absence or presence of life) with a new mode in which life is either created or destroyed: "One might say that the ancient right to take life or let live was replaced by a power to foster life or disallow it to the point of death" (1978, 138). He continues: "The old power of death that symbolized sovereign power was now carefully supplanted by the administration of bodies and the calculated management of life" (138-40, emphasis mine). Foucault's treatment of biopower is entirely protocological. Protocol is to control societies as the panopticon is to disciplinary societies. Protocol is more democratic than the panopticon, but it is still structured around command and control. Deleuze recognized this - that the very site of Foucault's biopower was also a site of resistance. Lest we overlook its importance, he repeats his realization three times consecutively in an important section of his book Foucault: "[1] when power . . . takes life as its aim or object, then resistance to power already puts itself on the side of life, and turns life against power . . . [2] Life becomes resistance to power when power takes life as its object . . . [3] When power becomes bio-power resistance becomes the power of life, a vital power that cannot be confined within species, environment or the paths of a particular diagram" (1986, 92). Is life resistance a way of engaging with distributed forms of protocological management?

Hardt and Negri answer yes to this question in parts 3 and 4 of Empire. Those who are born against, those who constitute the new barbarians, those who have "realize[d] their gift" (216), those who love community and hate imperial power - they are the protocological actors who inhabit the underbelly of distributed networks. This essay does not have its own parts 3 and 4, its own exegesis of "the plastic and fluid terrain of the new communicative, biological, and mechanical technologies" (218). But the effects of distributed networks and protocological control on the marxist liberatory project are massive indeed. As Deleuze remarked to Negri several years before Empire:

It's true that, even before control societies are fully in place, forms of delinquency or resistance (two different things) are also appearing. Computer piracy and viruses, for example, will replace strikes and what the nineteenth century called "sabotage" . . . You ask whether control or communication societies will lead to forms of resistance that might reopen the way for a communism . . . The key thing may be to create vacuoles of noncommunication, circuit breakers, so we can elude control. (1990, 175)

The key here is not the eluding or the breaking or the noncommunication, but simply that Deleuze had the foresight to situate resistive action within the protocological field. In the same way that biopower is a species-level knowledge, protocol is a type of species-knowledge for coded life forms. The connection to Marx's "species-being" is crucial here, for this is its redemptive quality. I am not suggesting that we should learn to love our various apparatuses of control, but rather that, for all its misgivings, protocological control is still an improvement over other modes of social control. It is through protocol that we must guide our efforts, not against it.

Notes


1. An alternative translation is available as "Postscript on the Societies of Control" in October: The Second Decade, 1986-1996, ed. Krauss et al. (Cambridge, Mass.: MIT Press, 1997).

2. Hardt and Negri's words, which appear both in the preface and on page 205 of Empire, are borrowed from Marx (1990, 279).

3. WINS, or Windows Internet Name Service, is an addressing technology developed by Microsoft for distributed networks. NIS, or Network Information Service, is a similar technology developed by Sun Microsystems.

4. Deleuze's neologism comes from the word "individuate." Dividuation would thus be the opposite: the dissolving of individual identity into distributed networks of information.

References

Deleuze, G. 1986. Foucault. Trans. S. Hand. Minneapolis: University of Minnesota Press.

---. 1990. Negotiations. Trans. M. Joughin. New York: Columbia University Press.

Foucault, M. 1978. The history of sexuality. Vol. 1. Trans. R. Hurley. New York: Vintage.

---. 1997. Ethics: Subjectivity and truth. Ed. P. Rabinow. New York: New Press.

---. 1999. The archaeology of knowledge. New York: Pantheon.

Hall, E. 2000. Internet core protocols: The definitive guide. Sebastopol, Calif.: O'Reilly.

Hardt, M., and A. Negri. 2000. Empire. Cambridge, Mass.: Harvard University Press.

Hookway, B. 1999. Pandemonium: The rise of predatory locales in the postwar world. New York: Princeton Architectural Press.

Internet Society. Mission statement. http://www.isoc.org/isoc/mission.

Marx, K. 1990. Capital. Vol. 1. Trans. B. Fowkes. New York: Penguin.