Radical media, politics and culture.

hydrarchist's blog

File                          | Modem, 56 Kbps    | Cable, 512 Kbps   | T1, 2 Mbps
Picture, 200 Kb               | 40 seconds        | 2 seconds         | 2 seconds
Music track, 4 Mb             | 13 min 30 seconds | 1 minute          | 15 seconds
Full-length movie, 400 Mb     | 22 hours          | 1 hour 45 minutes | 25 minutes
Five-minute video clip, 20 Mb | 1 hour            | 6 minutes         | 2 minutes
200-page novel, 1 Mb          | 4 minutes         | 15 seconds        | 4 seconds
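Figures like those in the table above follow from simple bandwidth arithmetic; a minimal sketch (the unit conversions are assumptions, and real transfers run longer because of protocol overhead and congestion, which is why the quoted modem times exceed the raw calculation):

```python
def transfer_time_seconds(size_megabytes, link_kbps):
    """Rough transfer time for a file of the given size (megabytes)
    over a link of the given capacity (kilobits per second).
    Ignores protocol overhead and congestion."""
    size_kilobits = size_megabytes * 8 * 1024  # megabytes -> kilobits
    return size_kilobits / link_kbps

# A 4 Mb music track over a 56 Kbps modem vs. a 2 Mbps T1:
modem = transfer_time_seconds(4, 56)    # roughly ten minutes
t1 = transfer_time_seconds(4, 2048)    # roughly fifteen seconds
```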

distributed distribution

Peer networks can be used to deliver the services known as Content Distribution Networks (CDNs), essentially comprising the storage, retrieval and dissemination of information. Companies such as Akamai and Digital Harbour have already achieved significant success by installing their own proprietary models of this function at a global network level, yet the same functions can be delivered by networks of users even where they have only a dial-up connection. Napster constituted the first instantiation of this potential, and subsequent generations of file-sharing technology have delivered important advances in increasing the robustness and efficiency of such networks. In order to understand the role that peers can play in this context we must first examine the factors which determine data flow rates in the network in general.

The slow roll-out of broadband connections to home users has concentrated much attention on the problem of the so-called 'last mile' in terms of connectivity. Yet the connection between the user and their ISP is but one of four crucial variables deciding the rate at which we access the data sought. Problems of capacity exist at multiple other points in the network, and as the penetration of high-speed lines into the 'consumer' population increases, these other bottlenecks will become more apparent.

If the desired information is stored at a central server, the first shackle on speed is the nature of the connection between that server and the internet backbone. Inadequate bandwidth, or attempts to access by an unexpected number of clients making simultaneous requests, will handicap transfer rates. This factor is known as the 'first mile' problem and is highlighted by instances such as the difficulty in accessing documentation released during the Clinton impeachment hearings, and more frequently by the 'Slashdot effect'.

In order to reach its destination the data must flow across several networks, which are connected on the basis of what are known as 'peering' arrangements between the networks and facilitated by routers which serve as the interface. Link capacity tends to be underprovided relative to traffic, leading to router queuing delays. As the number of ISPs continues to grow, this problem is expected to persist, as whether links are established is essentially an economic question.

The third point of congestion is located at the level of the internet backbone, through which almost all traffic currently passes at some point. The backbone's capacity is a function of its cables and, more problematically, its routers. There is a mismatch between the growth of traffic and the pace of technological advance in router hardware and software packet forwarding. As more data-intensive transfers proliferate, this discrepancy between demand and capacity is further exacerbated, leading to delays.

Only after negotiating these three congestion points do we arrive at the delay imposed at the last mile.

What are the benchmarks used to evaluate Quality of Service? "Typically, QoS is characterized by packet loss, packet delay, time to first packet (time elapsed between a subscribe request send and the start of stream), and jitter. Jitter is effectively eliminated by a huge client side buffer [SJ95]." (Deshpande, Hrishikesh; Bawa, Mayank; Garcia-Molina, Hector, Streaming Live Media over a Peer-to-Peer Network)
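The quoted metrics can be computed from packet timestamps; a sketch using simplified definitions (jitter here is mean deviation of inter-arrival gaps, an illustrative choice rather than the paper's exact formula):

```python
def qos_metrics(request_time, arrival_times, packets_sent):
    """Illustrative QoS numbers for a stream: packet loss rate,
    time-to-first-packet, and jitter measured as the average
    deviation of packet inter-arrival gaps from their mean."""
    received = len(arrival_times)
    loss_rate = 1 - received / packets_sent
    ttfp = arrival_times[0] - request_time  # time to first packet
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    mean_gap = sum(gaps) / len(gaps)
    jitter = sum(abs(g - mean_gap) for g in gaps) / len(gaps)
    return loss_rate, ttfp, jitter
```

A large client-side buffer masks jitter, as the quote notes, by smoothing variable gaps into steady playback.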

Current Technologies / Current Implementations
1. Storage Service Providers
Descriptions of Akamai FreeFlow hardware/software mix: algorithms plus machines; mapping server (fast to check hops to region) and content server. http://www.wired.com/wired/archive/7.08/akamai_pr.html Sandpiper applications

Akamai

13,000 network provider data center locations; edge servers; click-thru - 20%; 10-15% abandonment rates; 15%+ order completion

- overloaded web servers - reduce delays

first static now dynamic and customized (edge server)

fig. 1: traditional server vs. distributed server

illustrate delivery speed determinants

database/legacy ----- middleware ----- client browser

middle - performance/security/simplification of client program operation

IRAC issue: cache management TTL value
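The cache-management issue flagged here is normally handled by attaching a time-to-live (TTL) to each cached object; a toy sketch of the idea (the class and its interface are invented for illustration, not any CDN's actual API):

```python
import time

class TTLCache:
    """Toy edge cache: entries expire after ttl seconds, mimicking the
    TTL-based consistency used by CDN edge servers."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (content, fetch_time)

    def get(self, url, fetch_from_origin, now=None):
        now = time.time() if now is None else now
        entry = self.store.get(url)
        if entry and now - entry[1] < self.ttl:
            return entry[0]                  # still fresh: serve from cache
        content = fetch_from_origin(url)     # stale or missing: refetch
        self.store[url] = (content, now)
        return content
```

Choosing the TTL is the hard part: too long and personalised or frequently changing content goes stale, too short and the origin server sees most of the load anyway.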

Issue: personalisation/cookie/cms driven content

Load Balancing: "Load balancing is a technique used to scale an Internet or other service by spreading the load of multiple requests over a large number of servers. Often load balancing is done transparently, using a so-called layer 4 router." [Wikipedia] LB appliances, LB software, LB intelligent switches, traffic distributors.
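The spreading policy itself can be as simple as round robin; a minimal sketch (real appliances also weigh server load, health checks and session affinity):

```python
import itertools

class RoundRobinBalancer:
    """Minimal load balancing: spread requests over a server pool
    by cycling through it in order."""
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def pick(self):
        return next(self._cycle)

lb = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
assignments = [lb.pick() for _ in range(6)]
# six requests spread evenly: each server receives two
```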

Supernodes: Gnucleus, Bearshare and Limewire are all compatible.

Cisco (DistributedDirector), GTE Internetworking (which acquired BBN and with it Genuity's Hopscotch), and Resonate (Central Dispatch) have been selling such solutions as installable software or hardware. Digex and GTE Internetworking (Web Advantage) offer hosting that uses intelligent load balancing and routing within a single ISP. These work like Akamai's and Sandpiper's services, but with a narrower focus. - wired

Data providers concerned to provide optimal delivery to end users are increasingly opting to use specialist services such as Akamai to overcome these problems. Akamai delivers content faster through a combination of proprietary load-balancing and distribution algorithms and a network of machines installed across hundreds of networks, where popularly requested data is cached (11,689 servers across 821 networks in 62 countries). This spread of servers obviates much congestion, as the data is provided from a server cache either on the user's own network (bypassing the peering and backbone router problems and mitigating that of the first mile) or on the most efficient available network given load-balancing requirements.
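The "serve from the nearest cache" step can be sketched as follows, with the latency table standing in (as an assumption) for Akamai's proprietary mapping data:

```python
def pick_edge_server(client_network, servers, latency_ms):
    """Choose the cache server 'closest' to the client, where closeness
    is an estimated latency per (client network, server) pair.
    A real mapping service also weighs server load and availability."""
    candidates = [(latency_ms[(client_network, s)], s) for s in servers]
    return min(candidates)[1]

# assumed measurements: the client's ISP reaches edge-1 much faster
latency_ms = {
    ("isp-a", "edge-1"): 12,
    ("isp-a", "edge-2"): 48,
}
server = pick_edge_server("isp-a", ["edge-1", "edge-2"], latency_ms)
```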

File Sharing Technologies. Popular file-sharing utilities arose to satisfy a more worldly demand than the need to ameliorate infrastructural shortfalls. When Shawn Fanning released his Napster client, the intention was to allow end-users to share MP3 files by providing a centralised index of all songs available on the network at a given moment, and the ability for users to connect to one another directly to receive the desired file. Essentially, popular file-sharing utilities enable content pooling. Napster's legal woes generated the necessary publicity to encourage user adoption, and for new competitors to enter the market and innovate further. In the following section I describe some of the later generations of file-sharing software and chart the innovations which have brought them into a space of competition with Akamai et al.

The original implementation has been credited to Justin Frankel and Tom Pepper of a programming division of AOL (the then recently purchased Nullsoft Inc.) in 2000. On March 14th, the program was made available for download on Nullsoft's servers. The source code was to be released later, supposedly under the GPL license. The event was announced on Slashdot, and thousands downloaded the program that day. The next day, AOL stopped the availability of the program over legal concerns and restrained the Nullsoft division from doing any further work on the project. This did not stop Gnutella; after a few days the protocol had been reverse engineered and compatible open source clones started showing up. (from Wikipedia)

[ENTER DESCRIPTION]

The greatest blind spot in McChesney's analysis, however, concerns his silence on the issue of intellectual property. Thus, he devotes a section of his internet chapter to examining the role played by traditional media manufacturers in determining the contours of the new landscape: their advertising forecasts, their partnerships for the distribution of music, their ownership of high-profile brands etc., without so much as mentioning the important evolution taking place in file-sharing technology that is revolutionizing media distribution. What began as a basically centralized model vulnerable to legal attack (Napster) has evolved through at least two further generations.

The Gnutella network (Bearshare/Limewire) represents the first: a decentralized client-server application. This allows a much more robust network in the sense that connectivity is not dependent on the legal health of a single operator. The trade-off is inefficiency in locating files and the problem of free-riding users, who actually impede the functionality of the system beyond simply failing to contribute material. Limewire addresses this problem to some degree by providing the option to refuse downloads to users who do not share a threshold number of files. Unfortunately this cannot attenuate the problem of inefficient searches per se, merely offering a disciplinary instrument to force users to contribute. In order to sharpen search capacities in the context of a problematic network design, these networks have taken recourse to nominating certain nodes as super-peers, by virtue of the large number of files they are serving themselves. While essentially efficacious, the consequence is to undermine the legal robustness of the network. The threat is made clear in a paper published last year by researchers at Xerox PARC that analyzed traffic patterns over the Gnutella network and found that one per cent of nodes were supplying over ninety per cent of the files.
These users are vulnerable to criminal prosecution under the No Electronic Theft Act and the Digital Millennium Copyright Act. The music industry has been reluctant to invoke this form of action thus far, principally because of its confidence that the scaling problems of the Gnutella community reduce the potential commercial harm it can inflict. As super-peering etc. becomes more effective, this may change.

Another interesting attribute of the Limewire system is the option it provides to set up virtual private networks, so that users can establish perimetered communities based upon their own social affinities. This is the nightmare of the IP police.

Third-generation file-sharing systems begin with the Freenet architecture outlined by Ian Clarke in 1999. Although the Freenet network has not achieved anything like the same adoption scale as other systems, its design characteristics set the standard which has been emulated by others, specifically those built on top of the 'FastTrack' system. The crux of Freenet's genius is its adoption of 'small world' organization. This refers to the experiment carried out by Milgram in the 1960s, where 160 people throughout the United States were given letters to be delivered to stockbrokers and asked to pass them only through people that they knew to get them to their final destination. Forty-two of the letters arrived, using an average of 5.5 intermediaries. The purpose was to illustrate the level of social interconnectivity, and it is an experience with which most of us are familiar, as when one meets a stranger from a distant clime and discovers that you know someone in common. It is not that everyone has such an expansive social sphere, but rather that there are individuals whose circle of acquaintance cuts across a wide range of social groups. Freenet utilizes this principle by giving its software a feature which allows it to retain knowledge of the content available on other nodes; information is retained between sessions. The result is an extremely effective search, storage and retrieval system. This feature has since been emulated by systems such as AudioGalaxy and Kazaa.
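The small-world routing idea can be sketched as a greedy forwarding loop: at each hop, a node passes the request to the neighbour whose known content lies closest to the target key. The numeric keys and distance measure below are simplifying assumptions; Freenet's actual key space and routing details differ.

```python
def route_request(key, node, graph, keys_known, max_hops=10):
    """Greedy 'small world' routing sketch. graph maps each node to its
    neighbours; keys_known maps each node to the set of content keys it
    knows about (knowledge retained between sessions)."""
    path = [node]
    for _ in range(max_hops):
        if key in keys_known[node]:
            return path  # content located
        # forward to the neighbour whose known keys are nearest the target
        node = min(graph[node],
                   key=lambda n: min(abs(key - k) for k in keys_known[n]))
        path.append(node)
    return None  # request expired before reaching the content

# a tiny chain of acquaintances: A only knows B, who knows C
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
keys_known = {"A": {10}, "B": {40}, "C": {90}}
```

Because each node remembers what its neighbours hold, requests converge on the content in a few hops, much as Milgram's letters converged on the stockbroker.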

A crucial point in all of this is that both Gnutella and Freenet are open source/free software, thus allowing non-commercially motivated individuals and groups to take up the baton as the main players progressively move towards a rapprochement with industry. Napster has died attempting to placate its erstwhile enemies, whilst Kazaa will not allow downloads of files encoded above 128 kbps in an attempt to appease the same industry, with whose representatives it is currently in negotiation for a license to move to a full commercial platform. These are both proprietary technologies, so they can exclude any rivalrous non-compliant competitors. AudioGalaxy, however, is under the General Public License. AG deals with the 'tragedy of the commons' in a more determined manner(!): it only allows the user to transfer more than one file at a time if they are sharing a minimum of 25 files. Likewise, there is no option not to share - the only means of not sharing is to exit AG, which means of course that the user cannot download files either.

Similar systems are now being offered by these companies to commercial media distributors, such as Cloudcast (FastTrack) and Swarmcast, using technical devices to allow distributed downloads that automate transfer from other nodes when one user logs off. The intention here is clearly the development of software-based alternatives to the hardware offered by Akamai, the principal player in delivering accelerated downloads, used by CNN, Apple and ABC amongst others.

The point of all this is that there is a distribution system available now that can allow the global distribution of critical media. This network is not globally inclusive and is predicated upon access to a telephone line, a computer and (preferably) a high-speed network connection, but other more powerful economic forces are driving the permeation of all these technologies, so this is a problem which will be progressively mitigated. In any case, exclusion is a fact of all media, whether one considers literacy (print) or purchase capacity (television/satellite). Radio is probably the most fundamentally democratic medium in an ideal sense, since the cost of acquiring a receiver is relatively low, and the spread of linguistic range in the content available is basically quite comprehensive.

Technical descriptions: Napster, Gnutella, FastTrack innovations, Freenet (search algorithms, Theodore Hong)

Milgram anecdote; open source v proprietary; commercial implementations: Swarmcast, Cloudcast, Upriser. The top four file-sharing systems -- FastTrack, Audiogalaxy, iMesh, and Gnutella -- were used to download 3.05 billion files during August, according to Webnoize. eDonkey: client-server based sharing/chat network with sophisticated multi-source downloading (download from someone else even when he's still downloading the same file).

FastTrack -- the technology used by Consumer Empowerment, one of the companies sued on Wednesday -- has seen traffic grow 60 percent a month over the course of the year. With 970 million files shared, it's the most used file-trading application on the Internet.

The other three services -- Audiogalaxy, iMesh and Gnutella -- had 2.08 billion files swapped using the decentralized networks. While none of the systems tops Napster's peak performance of 2.79 billion files shared, industry experts believe it is only time before these services surpass Napster.

eDonkey: ShareReactor, FileDonkey, FileNexus

Economic Factors Influencing Peer Distribution

The motivation attracting participation in these networks remains that which inspired Napster's inventor: the opportunity to acquire practically unlimited content. Early in the growth of Napster's popularity users realised that other types of files could be exchanged apart from music, as all that was required was a straightforward alteration of the naming protocol such that the file appeared to be an MP3 (Unwrapper). Later applications were explicitly intended to facilitate the sharing of other media, such that today huge numbers of films, television programs, books, animations, pornography of every description, games and software are available. The promise of such goodies is obviously an adequate incentive for users to search out, select and install a client-server application and to acquire the knowledge necessary to its operation. Intuitive graphical user interfaces enable a fairly rapid learning curve, in addition to which a myriad of user discussion forums, weblogs and newsgroups provide all that the curious or perplexed could demand.

Internet access pricing plans are obviously the key determinant.

Motivation - performance - access to goods in kind. Whilst it is obvious why users utilise these tools to extract material, it is not so plain why they should also use them to provide material in turn to others and avoid a tragedy of the commons. Key to the willingness to provide bandwidth has been the availability of cable and DSL lines, which provide capacity in excess of most individuals' needs at a flat-rate cost. There is thus no correlation between the amount of bandwidth used and the price paid; in brief, there is no obvious financial cost to the provider. In areas where there are total transfer caps or use is strictly metered, participation is lower for the same reason.

For those on flat-pricing packages there are some costs imposed, such as a slow-down in web access rates. A combination of these factors has given rise to free-riding problems, as evidenced by the study carried out by researchers at Xerox PARC on the composition of the Gnutella network [ENTER MORE DATA]. There is a fairly high degree of consciousness of this problem, however (such users are referred to as 'leeches' and are the subject of endless vitriol on file-sharing boards), and many applications have implemented features to address the issue, a matter to which we will return below under the rubric of collective action mechanisms.

Dangers

Appropriation: fill in story about Morpheus switch from FastTrack to Gnutella; free riding.

Free Riding and Gnutella: The Return of the Tragedy of the Commons. Bandwidth, crisis of P2P, tragedy of the commons, Napster's coming difficulty with a business plan, and Mojo Karma. Doing things the Freenet way. Eytan Adar & Bernardo Huberman (2000):

Hypothesis 1: A significant portion of Gnutella peers are free riders. Hypothesis 2: Free riders are distributed evenly across different domains (and by speed of their network connections). Hypothesis 3: Peers that provide files for download are not necessarily those from which files are downloaded.

"In a general social dilemma, a group of people attempts to utilize a common good in the absence of central authority. In the case of a system like Gnutella, one common good is the provision of a very large library of files, music and other documents to the user community. Another might be the shared bandwidth in the system. The dilemma for each individual is then to either contribute to the common good, or to shirk and free ride on the work of others. Since files on Gnutella are treated like a public good and the users are not charged in proportion to their use, it appears rational for people to download music files without contributing by making their own files accessible to other users. Because every individual can reason this way and free ride on the efforts of others, the whole system's performance can degrade considerably, which makes everyone worse off - the tragedy of the digital commons."

Figure 1 illustrates the number of files shared by each of the 33,335 peers we counted in our measurement. The sites are rank ordered (i.e. sorted by the number of files they offer) from left to right. These results indicate that 22,084, or approximately 66%, of the peers share no files, and that 24,347 or 73% share ten or less files.

Top hosts         | Files shared | As percent of the whole
333 hosts (1%)    | 1,142,645    | 37%
1,667 hosts (5%)  | 2,182,087    | 70%
3,334 hosts (10%) | 2,692,082    | 87%
5,000 hosts (15%) | 2,928,905    | 94%
6,667 hosts (20%) | 3,037,232    | 98%
8,333 hosts (25%) | 3,082,572    | 99%
(Table 1)

And providing files actually downloaded? Again, we measured a considerable amount of free riding on the Gnutella network. Out of the sample set, 7,349 peers, or approximately 63%, never provided a query response. These were hosts that in theory had files to share but never responded to queries (most likely because they didn't provide "desirable" files).
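Concentration figures of the kind reported in Table 1 can be reproduced from any rank-ordered list of per-host file counts; a sketch with an invented toy population:

```python
def share_concentration(files_per_host, top_fraction):
    """What share of all files do the top `top_fraction` of hosts hold?
    Mirrors the rank-ordered measurement in the Adar/Huberman study."""
    ranked = sorted(files_per_host, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# a toy, highly skewed population: one host with 900 files, nine with 10 each
hosts = [900] + [10] * 9
top_decile_share = share_concentration(hosts, 0.10)  # ~0.91
```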

Figure 2 illustrates the data by depicting the rank ordering of these sites versus the number of query responses each host provided. We again see a rapid decline in the responses as a function of the rank, indicating that very few sites do the bulk of the work. Of the 11,585 sharing hosts the top 1 percent of sites provides nearly 47% of all answers, and the top 25 percent provide 98%.

Quality? We found the degree to which queries are concentrated through a separate set of experiments in which we recorded a set of 202,509 Gnutella queries. The top 1 percent of those queries accounted for 37% of the total queries on the Gnutella network. The top 25 percent account for over 75% of the total queries. In reality these values are even higher due to the equivalence of queries ("britney spears" vs. "spears britney").

Tragedy? First, peers that provide files are set to only handle some limited number of connections for file download. This limit can essentially be considered a bandwidth limitation of the hosts. Now imagine that there are only a few hosts that provide responses to most file requests (as was illustrated in the results section). As the connections to these peers are limited they will rapidly become saturated and remain so, thus preventing the bulk of the population from retrieving content from them.

A second way in which quality of service degrades is through the impact of additional hosts on the search horizon. The search horizon is the farthest set of hosts reachable by a search request. For example, with a time-to-live of five, search messages will reach at most peers that are five hops away. Any host that is six hops away is unreachable and therefore outside the horizon. As the number of peers in Gnutella increases more and more hosts are pushed outside the search horizon and files held by those hosts become beyond reach.
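The horizon effect can be illustrated with a breadth-first walk that stops at the TTL: hosts more hops away than the TTL simply never see the query.

```python
from collections import deque

def hosts_within_horizon(graph, start, ttl):
    """Return the set of hosts a query with the given time-to-live can
    reach from `start`. graph maps each host to its neighbours."""
    seen = {start: 0}  # host -> hop count
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == ttl:
            continue  # query expires here, not forwarded further
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen[neighbour] = seen[node] + 1
                queue.append(neighbour)
    return set(seen) - {start}

# a simple chain a-b-c-d: with ttl=2, host d is beyond the horizon
chain = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
```

As the network grows, more hosts (and their files) fall outside the horizon of any given peer, which is exactly the scaling problem described above.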

Easily isolated providers are set up for litigation by the RIAA etc.

Solutions?
i. In the "old days" of the modem-based bulletin board services (BBS), users were required to upload files to the bulletin board before they were able to download.
ii. FreeNet, for example, forces caching of downloaded files in various hosts. This allows for replication of data in the network, forcing those who are on the network to provide shared files.
iii. Another possible solution to this problem is the transformation of what is effectively a public good into a private one. This can be accomplished by setting up a market-based architecture that allows peers to buy and sell computer processing resources, very much in the spirit in which Spawn was created.

Trust - collective action mechanisms - hashing. Security and privacy threats constitute other elements deterring participation, both for reasons relating to users' normative beliefs opposed to surveillance and for fear of system penetration by untrustworthy daemons.

The security question has recently been scrutinised in light of the revelation that the popular application Kazaa had been packaging a utility for distributed processing known as Brilliant Digital in its installer package. Although unused thus far, it emerged that there was the potential for it to be activated in the future without the knowledge of the end-user.

Viruses: .vbs and .exe files can be excluded from searches. MP3s etc. are data, not executables. Virus spreads via Kazaa (but the article wrongly identifies it as a worm): http://www.bitdefender.com/press/ref2706.php

Audio Galaxy: Contains really ugly webHancer spyware that may make your Internet connection unusable.

Other Costs: CPU resources. A Kazaa supernode will use a maximum of 10% of total CPU resources. Allows an opt-out.

Commercial Implementations. According to a study executed in early 2001 by Viant Consulting, there are now more than 500,000 television and film files being exchanged every day over file-sharing networks and through connections made in IRC [TCC p.16 for stats and methodology]. That this is bad news for the copyright owners will not be explored here; rather, the point is that this form of P2P provision of the archetypal data-heavy content is already taking place between users. In the same report the authors assert that content companies have themselves been experimenting with the distributional potential of networks such as Gnutella. (Viant, The Copyright Crusade see fn 47).

Interesting comparison of acquisition times in TCC at p. 28. http://www.badblue.com/w020408.htm http://www.gnumarkets.com/ Commercial implementations: Swarmcast, Cloudcast, Upriser, Mojo Nation's market in distributed CDN.

Design considerations impeding performance for the sake of other normative objectives

freenet - censorship resistance impediment; Tangler; Kazaa/Morpheus bitrate encoding limit for copyright reasons, easily hacked.

Open Source or locked up?

closed Bearshare Kazaa Grokster Edonkey

Open Limewire GPL Gnucleus

Collective Action Mechanisms Limewire

Slots & Bandwidth Throttling

Gnotella (Windows): easy-to-use and very popular client written in VB, with many powerful features (search & response filtering, decent bandwidth regulation, multiple searches, private networks, skins...).

Limewire: upload slots represent the number of files other users can download from you at any one time. The default number of slots varies based upon the connection speed you set at installation, and the default bandwidth usage is set at 50 percent of your connection speed. You can self-configure your number of upload slots and percentage of bandwidth usage by clicking on tools>options>uploads.

Gnucleus: another new feature is Scheduling, which lets you tell Gnucleus to run on the Gnutella network at certain times during the day. This is useful for people who want to run Gnucleus only at times when the load on their local network is low: at a college someone might configure Gnucleus to run at night, so that during the day academic use of the network would not be bogged down, or at a company so that day-time business traffic would not be affected.

storage

Swarmed Downloads. LimeWire 2: "First of all, we've allowed for 'swarmed' downloads. If the file you are looking for can be located at multiple hosts, LimeWire will attempt simultaneous downloads from those sources, spidering different portions of the file. Consequently, you'll get your files MUCH faster than what you are used to."

Multi-Source Downloads. Q: What is multi-source downloading? A: A particular file may be available on more than one remote computer. With multi-source downloading these various sources are grouped together, and if one source fails for some reason then another host can take its place.
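The range-splitting behind swarmed and multi-source downloads can be sketched like this; the function and its even chunking policy are illustrative, not any client's actual algorithm:

```python
def plan_multisource_download(file_size, hosts):
    """Split a file into contiguous byte ranges, one per source host.
    If a host fails mid-transfer, its range can simply be reassigned
    to one of the surviving hosts."""
    chunk = file_size // len(hosts)
    plan = []
    for i, host in enumerate(hosts):
        start = i * chunk
        # the last host absorbs any remainder from integer division
        end = file_size if i == len(hosts) - 1 else start + chunk
        plan.append((host, start, end))
    return plan

plan = plan_multisource_download(1000, ["h1", "h2", "h3"])
```

Because the pieces come from different hosts, the hashing described below matters: the client must be sure all sources hold byte-identical copies before stitching ranges together.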

Importance of Hashing and CPU consumption BearShare hashes all your existing files when launched. This is a one-time activity and should not consume more than 25% CPU utilization.

Q: What is hashing? A: Hashing is a calculation done on each file to produce a small, unique "hash". BearShare compares hashes to determine if two files are identical. It is important to do this sort of comparison to guarantee that the files being compared are the same, especially when swarming.
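A minimal illustration of the idea, using SHA-1 as a stand-in (which hash each client actually uses varies): identical contents give identical hashes, and a single changed byte gives a different one.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """A small, unique 'hash' of file contents. Two files with the same
    hash can be treated as identical, which is what makes it safe to
    swarm different pieces of 'the same' file from different hosts."""
    return hashlib.sha1(data).hexdigest()

a = file_hash(b"some song data")
b = file_hash(b"some song data")   # same bytes -> same hash
c = file_hash(b"some song datA")   # one byte differs -> different hash
```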

Superpeering and the erosion of pure peer-to-peer. In early 2001 Limewire introduced a new Gnutella hierarchy, whereby high-performance machines become 'Ultrapeers'. These machines accept connections from many LimeWire clients while also connecting to the rest of the Gnutella network. Moreover, the Ultrapeer shields these 'regular' LimeWire clients from the CPU and bandwidth requirements associated with Gnutella, directing traffic to clients in an efficient manner. Any KaZaA Media Desktop can become a SuperNode if it is a modern computer accessing the Internet with a broadband connection. Being a SuperNode does not affect your performance noticeably. Other KaZaA users in your neighbourhood, using the same Internet Service Provider or located in the same region as you, will automatically upload a small list of files they are sharing to you. When they are searching, they will send the search request to you as a SuperNode. The actual download takes place directly between the computer sharing the file and the person downloading it, peer-to-peer.

Retrieval

Connections: every connection costs bandwidth of approximately 0.5k per second. Smart Downloading: smart downloading will retry a given download until it is successful. In other words, if you have tried to retrieve a file from a similar group of files, then LimeWire will try to download from any of these sources until it's successful. It will also auto-resume if interrupted.

Search Considerations: search and response take place over the same route, which is why methodology is so important. The file is then transferred directly using an HTTP interface.

Exclusion: if there are particular IP addresses you wish to ignore (if, for example, a particular IP address was sending you unsolicited results), click under Hosts, where you can enter that IP address into the 'Ignore these hosts' window and click Add. Q: Is there any means for these networks to allow prioritization of files stored on the same network or on a network nearby, so as to minimize the need to travel over the backbone, through multiple peering interfaces etc.?

AG: The Satellite automatically selects the closest user with the file you want, reducing external bandwidth usage. Kazaa Automatically clusters traffic by network topology to provide fastest download speed and minimal load on ISP backbones.

Sharing Incentives. Default sharing: Limewire automatically shares downloaded files. Limewire also allows you to require a minimum number of shared files before allowing a download to a given user. AG: you must first share at least 25 files to be able to increase the number of simultaneous transfers you can have.
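The gate itself is trivially simple; a sketch (the 25-file minimum mirrors the AG figure quoted above, and the function name is invented):

```python
def may_download(requester_shared_files, minimum_required=25):
    """Collective-action gate: refuse uploads to peers sharing fewer
    than a threshold number of files. The threshold is the disciplinary
    instrument; it cannot by itself make searches more efficient."""
    return requester_shared_files >= minimum_required

# a leech sharing nothing is refused; a contributing peer is served
leech_allowed = may_download(0)
sharer_allowed = may_download(30)
```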

distribution - load balancing

existing solutions dedicated servers web caching expanding hard disk memory size

necessary preconditions flat pricing? broadband figures [see TCC fn 9] - take up - proportion of capacity utilised dial up connections

caching server farms mirrors

economic redundancy in uncalibrated approach cost of server collocation cost of memory speed of memory capacity growth

eDonkey is a client-server-based file-sharing network. It also means that the moment you start a download, you may already be uploading the very same file to someone else. This also fixes the freeloader problem, since even if you don't share any files, your bandwidth is always put to good use.

you only tell the donkey how much bandwidth you want to use for uploads and for downloads.

www.sharereactor.com cosmoed2k.da.ru "The VCR is to the American film producer and the American public as the Boston Strangler is to the woman alone."- Jack Valenti, MPAA

Three commercial online music suppliers: Pressplay, MusicNet, Full Audio/Clear Channel

MusicMatch, MusicNet and FullAudio don't permit burning. Pressplay, Emusic, Rhapsody: burning allowed.
http://www.neo-modus.com/?page=News
http://www.neo-modus.com/?page=Help
http://www.climate-dynamics.rl.ac.uk/

FastTrack is the file-trading software being used by Consumer Empowerment, which licenses its technology to Kazaa.com, Grokster and MusicCity.

Fraud. Several prolific warez kiddies figured out how to change their MAC address to bill their service to their neighbors or even to our own router (!). We're still not sure exactly how that happened. Sure, we cut them off and connected their modems to a high voltage source as punishment (our contract allowed it), but how many more are there who we didn't catch?

Billing issues. People who obviously ran up a very high bandwidth bill would call us and complain when they got their statements, asking us to lower their bills. Our position was that it wasn't our responsibility that they couldn't figure out how to close Napster or stop downloading porn. When they paid with credit card we would sometimes lose the dispute, but things were okay when they paid with cash or check.

Expectation of quality. As you know, a cable modem is a shared medium and cable companies are not at fault for your neighbors' downloading habits. However, it was considered a potential legal liability to be providing a service of varying quality.

Picture,200 Kb 40 seconds 2 seconds 2 seconds Music track, 4 Mb 13 min 30 seconds 1 minutes 15 seconds Full-length movie, 400 Mb 22 hours 1 hour 45 minutes 25 minutes Five-minute video clip, 20 Mb 1 hour 6 minutes 2 minutes 200 page Novel, 1 Mb 4 minutes 15 seconds 4 seconds

For example, a T3/DS3 connection has a capacity of 45 Mbps, while a 30 fps stream at 320x240 pixels can run at about 1 Mbps. Under such conditions, only 45 clients can be served a maximum-resolution video stream simultaneously.
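The arithmetic behind both the table and the T3 example is simply size (or stream rate) divided by link capacity. A rough sketch in Python; note that real links deliver less than their nominal rate, which is why the table's figures run higher than these ideal numbers:

```python
def transfer_seconds(size_kbytes, line_kbps):
    """Ideal transfer time: file size in kilobytes, line rate in kilobits/s."""
    return size_kbytes * 8 / line_kbps

def max_streams(link_mbps, stream_mbps):
    """How many full-rate video streams fit through one uplink."""
    return int(link_mbps // stream_mbps)

# A 200 KB picture over a 56 Kbps modem takes about half a minute.
print(round(transfer_seconds(200, 56)))   # ~29 seconds
# A 45 Mbps T3 carrying 1 Mbps video streams serves at most:
print(max_streams(45, 1))                 # 45 clients
```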

A Haclab in the Zapatista Rebel Secondary School

El Primero de Enero "Escuela Secondaria Rebelde Autonomo Zapatista" is the first autonomous secondary school in Chiapas. It is located in the educational and cultural centre of the Zapatista movement in Oventic Aguascalientes II. There are about 150 students and 25 teachers. In the cultural centre there is neither telephone nor internet. But there are some computers in the school, and from there we're going to start.

Aim of the Project. In the first place we're going to get to know a community, to learn their way of constituting spaces for life and for social and cultural autonomy. Then we're going to assist them, in so far as we can. We will install Linux, as they themselves have requested, on the PCs they already have in their laboratory, and we will teach them to install and use it, so that they can then teach others. To be autonomous is also to be independent of the monopolies on knowledge. We will assemble a local network and install some basic servers (web server, mail server etc.), all the while explaining to them what we're doing and how it is done. We'll teach them how to make a website. Each phase will be conducted using the workshop method, or if you prefer, learning while doing, paraphrasing the Zapatista slogan "asking while walking".

A Project in several steps. The project will not finish with this first trip. We will work so that we can carry on the collaboration, setting further objectives. With each trip we will build a new tool, singling out from time to time what is most useful, together with the community. One of the objectives is certainly a connection to the internet. We will try to establish collaborations with other groups and organisations that are working on solidarity projects with the communities in Chiapas, so as to manage to connect them to the net. In this first phase we can think of a portable machine configured at least as a mail server. In this way, once a day someone could go to the closest city (San Cristobal is about an hour's drive from Oventic), connect the notebook to the network, and thus send and receive mail from and for the school.

The Meeting between Communities. We hope that this will be just the first moment of meeting between the Italian community that is drawn together around the organisation of the annual hackmeeting [http://www.hackmeeting.org] and the community of Aguascalientes in Chiapas. Let's be quite clear: this project ought not be closed to others' participation, but it seems to us that the attitude of the Italian hackers towards building tools and sharing knowledge "without founding powers" goes very well with the Zapatistas' struggle for their own autonomy and for the right to choose their own destiny. This is a working proposal that we address to all those who are engaged in activities related to the conscious use of technology; to whoever sees in the use of digital instruments a possibility to get away from the commercial logic of the technology and media multinationals. In this sense we will also try to involve Indymedia Chiapas, which does great work in San Cristobal, in this process.

How you can help us. We need support at various levels: hardware [the computers in the labs are not in good condition], money to buy some equipment on the spot and to guarantee the maintenance of the lab, and ideas and advice, which are always most appreciated.

Hardware: UTP Cat 5 cable, PCI and/or ISA network cards, 10/100 hubs, working hard disks, RAM, video cards, CD-ROM readers, portables, scanners, printers.

Money You can contribute your cash by payment to C.C.P. 61804001 made out to Radio Onda Rossa, Via dei Volsci 56 00185 Roma, specifying as description of payment "Hacklab Project in Oventic"

Ideas and Advice ezr@autistici

If, that is, there is a single reader of this turgid notebook who still actually lives there. Otherwise the details recounted within are only fuel for the exiles' schadenfreude.

Irish Internet Users group

the spirit of the time, growing slowly and quietly ripe for the new form it is to assume, disintegrates one fragment after another of the structure of its previous world. That it is tottering to its fall is indicated only by symptoms here and there. Frivolity and again ennui, which are spreading in the established order of things, the undefined foreboding of something unknown - all these betoken that there is something else approaching. This gradual crumbling to pieces, which did not alter the general look and aspect of the whole, is interrupted by the sunrise, which, in a flash and at a single stroke, brings to view the form and structure of the new world.

Hegel, Preface to the Phenomenology of Mind

Some thoughts on media criticism.

The Decline and Fall of Public Broadcasting, David Barsamian.

Rich Media, Poor Democracy, Communications Politics in Dubious Times, Robert W. McChesney

As a stalwart veteran of alternative media production through his radio work based out of Boulder, Colorado, and his accessible interviews with Said and Chomsky published in print by South End Press, one would expect Barsamian to have an insightful critique of the contemporary media landscape. But despite its many qualities, this book proved something of a disappointment. Essentially the text falls victim to a series of flaws, several of which have come to characterize leftist attempts at media critique.

The first of these is a simplistic revision of the history of the FCC which would lead one to believe that its purpose, originally honorable, has now been perverted. Thus Barsamian writes:

"The New Deal-era politicians who created the FCC in 1934 viewed the communications infrastructure as a vital public resource, like utilities or the railroads. The agency was originally charged with protecting consumers from industry monopolies and price gouging. But a September-October 2001 Mother Jones article, "Losing Signal" by Brendan L. Koerner, exposes an emergent corporate-friendly FCC, as embodied by Powell."

First off, this is a strange way to describe an agency established by the 1934 Communications Act as originally having had a consumer-protection design. In fact, the 1934 law was the final act in a government operation, undertaken in collaboration with commercial broadcasters, to marginalise non-commercial broadcasters and create the circumstances which would allow an advertising-driven conception of programming to develop a stranglehold over the US audience. This process had begun in 1927 with the Radio Act that created the Federal Radio Commission, an institution that rendered non-commercial broadcasters unsustainable through the imposition of costly technical requirements (on this point see McChesney, telecommunication... and democracy). The squashing of FM radio in the 1940s and of cable in the 1960s provides further examples: historically the FCC has been the vehicle through which incumbent players have held at bay innovative technologies threatening their position. The inventor of FM was so dispirited by this that he threw himself out the window of his apartment in 1954. The most recent instantiation of this pattern is the thwarting of Low Power radio by commercial broadcast interests in combination with the doyen of public broadcasting, NPR, which was among the final acts of the Clinton administration in December 2000 (see Hazlett p. 46-90). On the movement for and potential of Low Power radio, see Microradio & Democracy: (Low) Power to the People, Greg Ruggiero, Seven Stories Press, 1999.

A sad reflection on the poverty of leftist critique is that a critical history of the FCC is produced with far more accuracy and panache by the libertarian right, as exemplified by the Brookings Institution's Tom Hazlett (himself a former FCC economist) in his paper 'The Wireless Craze, The Unlimited Bandwidth Myth, The Spectrum Auction Faux Pas and the Punchline to Ronald Coase's "Big Joke": an essay on airwave allocation policy'. Obviously Hazlett has other aims, namely the promotion of a policy built on maximal property rights and market mechanisms, yet perversely he also succeeds in producing something useful to left observers. Hazlett provides a sobering narrative of the origins of radio regulation (see especially his reconstruction of the period from 1926-27, which saw the 'breakdown of the law', at p. 97-100) and a skeptical outline of the protean content of both the 'public interest' and the 'fairness doctrine', two foundational legal doctrines upon which leftists would build their 'progressivist' new Jerusalem, not knowing that this is a base of sand rather than stone. Notably, Hazlett has no compunction in using technical data supplied by Rappaport et al. from the media foundation, and essentially lifts important sections of his historical analysis from McChesney. His section on LPFM in particular could easily be inserted into the contents of a Z Magazine issue without drawing controversial comment.

Of course there are moments which might arouse the more attentive reader's suspicions. Political economy and the corruption of institutions are not things to which Hazlett is blind; thus he argues for market processes as a means of diminishing their influence. For Hazlett, markets are indeed synonymous with the only type of 'democracy' he conceives as practicable.

Secondly, in order that the value of spectrum be maximized, he advocates maximal property rights. For leftists, of course, this is high blasphemy: the 'public' are the 'owners' of the spectrum, and actually selling it would constitute the abandonment of the public trust and a handover of the media to corporate interests on a massive scale. The hard truth is that leftists are so demoralized that the only means they see to advance their political objectives is the ideological instrumentalising of the state to impose order on capital by authoritarian means. Yet their public-ownership-of-the-airwaves dream-sequence is just that, a daydream based on hopelessness. Broadcasters understand their relationship with politicians very well, and vice versa: they are each other's very oxygen. Broadcasters provide politicians with publicity, frame issues so as to exclude the merest possibility of popular self-government, and more specifically supply the cash to finance their political campaigns. In return, politicians grant them all the laws they desire to copperfasten their control over the public's attention, through extension of intellectual property laws and spectrum giveaways (such as the estimated 60 billion dollars of spectrum given to television network incumbents under the 1996 Telecommunications Act). Thus, on a fair reading of the evidence, Hazlett's claim stands, even if one differs with his concept of the preferential outcomes a property regime would produce. The wisdom of exclusive rights now forms the crux of the polemic between those with the stomach to involve themselves in the opaque world of spectrum licensing; influential critiques have been made by Noam (on the basis of price efficiency), Lessig (on the basis of abundance) and Benkler (on the basis of abundance and the potential for a democratic decentralised model).

To counter Hazlett at root requires a serious examination of two aspects of markets that his utopianism neglects to scrutinize. The first is the question of externalities: the byproducts, positive or negative, of a process whose costs and benefits are not taken into account in the price-setting mechanism. In the area of information, a positive externality is its effect of informing social actors and equipping them with the knowledge to make informed decisions about their lives; the same information may also help to encourage technical innovation, or can have a negative effect through phenomena such as desensitization to violence. In the area of information goods, one form of externality that has received considerable attention in recent years is that of network effects: contexts where each additional user of a specific standard of equipment or software platform, for example, adds value to the other users' experience or capability. This issue lies at the heart of the controversy over Microsoft's antics to maintain and extend its dominance.

The other critique which goes to the heart of market optimists' claims is the fact that most markets do not work along a simple supply/demand matrix, principally because of inequalities in bargaining power between market actors and the ability of deep-pocketed organizations to manufacture demand through marketing. This is the context in which a reading of Manuel DeLanda's 'Markets and Anti-Markets' is very useful, turning as it does the market critique in on itself.

Spectrum allocation remains basically void of social critics, with the exception of critical legal commentators and Robert McChesney, who, for reasons elaborated later, spoils his performance by ignoring the impact of new technologies and the new centrality of intellectual property law.

The abiding legacy of the 1990s is a paradox: at the very time when media ownership has scaled to what would once have been thought unthinkable levels, the variety of conduits for the dissemination of mass media has diversified enormously. Decentralized protocols such as TCP/IP, which lie at the heart of the Internet, have also enabled the emergence of high-speed wireless networks. The telephone infrastructure is now ripe for the delivery of television programming to the home. Cable lines have basically been opened up for consumers to become producers of information themselves. Satellite systems are trying to get in on the action. At the root of all these things lie packet-switching technology and digitalization. While dystopian curmudgeons take glee in the meltdown of the dotcom boom, they are missing the essence of the change, refracted in their eyes through the yuppie culture that was the period's concubine. Yet radical groups have taken advantage of these tools, and to Barsamian's credit, although he does not appear too closely informed on the matter, he does dedicate a chapter to independent media alternatives, focusing on Z Magazine's transition to a digital portal, the Indymedia network and Democracy Now!. These examples are a pretty reasonable place to start the discussion about what has happened.

Treating the Indymedia network as a whole is a misleading exercise due to the pronounced decentralization behind it. Beyond the trademark layout of the interface and the use of logos of a somewhat similar style, at one point it could be considered of a piece, as each node, despite its editorial independence, ran on the same software, Active. This code was developed in advance of the WTO meeting in Seattle in 1999, and was designed to allow for maximum popular participation in coverage of the protests. To minimize the time between observation and reportage, and to give effect to principles of DIY media and free speech, the defining characteristic of the interface and upload mechanism was to allow for unfiltered open publishing. Nowadays this identity is no more, as at least five different pieces of software (Active, Active X, slashcode, python and java) are used by different groups.

In addition, the notion of an unfiltered newswire has been jettisoned by many nodes, particularly those in Europe, where liberal notions of free speech have been overwhelmed by leftist canons such as anti-fascism; thus all fascist posts are removed on the German and Italian sites, which, given the history of those two countries, is hardly surprising although not uncontroversial.

This diversity in infrastructure has positive and negative elements. The advantage is that at any given time multiple different sets of code are in development, addressing problems in different ways; the correlative disadvantage is that the attentions of politicized coders are divided and the threat of platform incompatibility is ever present.

McChesney’s book has several merits, one amongst them being this corrective lesson in the history of the FCC. Elsewhere, one of his key concerns is a dissection of the claim that the advent of the Internet will lead to a democratization of economic life and the media in particular. The concession of control to commercial forces and forms of private governance is fingered as the main reason why it is unlikely for ‘the Internet to set us free.’ This section of the book combines a healthy skepticism of the ‘democratic’ nature of the free market and a cynical eye on its more populist exponents, a little a la Thomas Frank and the Baffler. More importantly perhaps, he highlights the key role played by the state in the development of the technology and the regulatory influence exerted over markets in the form of monopoly grants, such as in the case of cable systems. This point is even more relevant to the discussion about intellectual property, and it’s an observation made elsewhere and with great insight by James Boyle. Yet there is an infuriating lack of substantive explanation; thus, when he claims that:

"Business interests are coming to play the dominant role in establishing web technical standards, which are crucial to expediting electronic commerce."

Now this is obviously true, and there is ample documentary evidence on the point, but the problem is that the evidence is absent from McChesney's text, and one needs to look elsewhere, such as to the works of Larry Lessig, to find a useful articulation of the point.

Another aspect of the picture neglected by McChesney is the real tension that exists between different players in the online environment. According to his theory, modern capitalism is defined by the attempts of large conglomerates to acquire an oligopolistic position in the market and milk the rent available to the maximum degree through means other than price competition. Thus the author traces a seemingly unending taxonomy of strategic partnerships, joint ventures and cross-investments between the major industry players in recent years, agents that according to the market prescription ought to be practically cutting one another's throats with competitive enthusiasm. The problem is that this ignores some of the technological changes that really have disintegrated some of the walls between markets. For example, wireless networking through components built into hardware can erode the power of the owners of the telecommunications infrastructure: thus Apple builds AirPort into their systems and Nokia integrates networking protocols into its wireless devices. While this example exists at the level of the delivery conduit, similar processes have been at work over the years in content - think about the fact that the telecommunications companies make money when you download 'free' MP3s. Likewise Sony litigated for nearly seven years to fend off the motion picture studios' attempts to outlaw the video recorder, in a case that foreshadowed many of today's conflicts between consumer electronics companies and media proprietors. Of course Sony is now in the 'content' business too, but that doesn't stop Apple, Diamond etc. from building machines designed to capitalize on mass practices of copyright infringement. McChesney perceives many of these companies to represent the same agenda, 'the commercialization of the Internet', when there are in fact important divergences in their interests, divergences which create space in which other processes can develop.
The now classic example of this is the Music Industry and MP3s, where because the technology had enough time to develop, there is now a legacy of millions of MP3s that simply can’t be put back in the bottle, irrespective of what happens to the next generation of devices, many of which will probably integrate copyright management systems designed to thwart ‘piracy’.

The other force deflating the liberatory potential of the network in McChesney's eyes is the centrality of the e-commerce portals, as defined principally by AOL, and to a lesser degree by the likes of MSN and Yahoo.

“… or an inside the beltway public interest lobbyist so used to being ignored that just seeing the words ‘public interest’ in a government report was a life-defining event. (158)

The greatest blind spot in McChesney's analysis, however, concerns his silence on the issue of intellectual property. He devotes a section of his internet chapter to examining the role played by traditional media manufacturers in determining the contours of the new landscape - their advertising forecasts, their partnerships for the distribution of music, their ownership of high-profile brands etc. - without so much as mentioning the important evolution taking place in file-sharing technology, which is revolutionizing media distribution. What began as a basically centralized model vulnerable to legal attack (Napster) has evolved through at least two further generations. The Gnutella network (Bearshare/Limewire) represents the first: a decentralized application in which every node acts as both client and server. This allows a much more robust network, in the sense that connectivity is not dependent on the legal health of a single operator. The trade-off is inefficiency in locating files and the problem of free-riding users, who actually impede the functionality of the system beyond simply failing to contribute material. Limewire addresses this problem to some degree by providing the option to refuse downloads to users who do not share a threshold number of files. Unfortunately this cannot attenuate the problem of inefficient searches per se, merely offering a disciplinary instrument to force users to contribute. In order to sharpen search capacities in the context of a problematic network design, these networks have taken recourse to nominating certain nodes as super-peers, by virtue of the large number of files they serve themselves. While essentially efficacious, the consequence is to undermine the legal robustness of the network. The threat is made clear in a paper published last year by researchers at Xerox PARC, which analyzed traffic patterns over the Gnutella network and found that one per cent of nodes were supplying over ninety per cent of the files.
These users are vulnerable to criminal prosecution under the No Electronic Theft Act and the Digital Millennium Copyright Act. The music industry has been reluctant to invoke this form of action thus far, principally because of its confidence that the scaling problems of the Gnutella community limit the commercial harm it can inflict. As super-peering etc. becomes more effective, this may change.
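The Xerox PARC finding amounts to a simple concentration measure over the number of files each node shares. A sketch in Python with made-up numbers (the node counts here are hypothetical, chosen only to reproduce the shape of the statistic):

```python
def top_share(files_per_node, top_fraction=0.01):
    """Fraction of all shared files supplied by the top `top_fraction` of nodes."""
    counts = sorted(files_per_node, reverse=True)
    k = max(1, int(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Hypothetical 100-node network: one super-peer, 99 near-freeloaders.
nodes = [900] + [1] * 99
print(round(top_share(nodes), 2))  # 0.9: the top 1% supplies ~90% of files
```

The same function run over real crawl data would identify exactly the super-peers the text describes as legally exposed.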

Another interesting attribute of the Limewire system is the option it provides to set up virtual private networks, so that users can establish perimetered communities based upon their own social affinities. This is the nightmare of the IP police.

Third-generation file-sharing systems begin with the Freenet architecture outlined by Ian Clarke in 1999. Although the Freenet network has not achieved anything like the same adoption scale as other systems, its design characteristics set the standard which has been emulated by others, specifically those built on top of the 'FastTrack' system. The crux of Freenet's genius is its adoption of 'small world' organization. This refers to the experiment carried out by Milgram in the 1960s, in which 160 people throughout the United States were given letters to be delivered to stockbrokers and asked to pass them only through people that they knew to get them to their final destination. 42 of the letters arrived, using an average of 5.5 intermediaries. The purpose was to illustrate the level of social interconnectivity, and it is an experience with which most of us are familiar, as when one meets a stranger from a distant clime and discovers that you know someone in common. It is not that everyone has such an expansive social sphere, but rather that there are individuals whose circle of acquaintance cuts across a wide range of social groups. Freenet utilizes this principle by giving its software a feature which allows it to retain knowledge of the content available on other nodes; this information is retained between sessions. The result is an extremely effective search, storage and retrieval system, and the feature has accordingly been emulated by systems such as Audio Galaxy and Kazaa.
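The small-world intuition can be sketched as greedy routing on a ring: each node knows only its immediate neighbours plus a couple of long-range acquaintances, yet a request reaches a distant key in a handful of hops by always forwarding to whichever known node is closest to the target. This is a toy model of the principle, not Freenet's actual routing algorithm:

```python
N = 100  # nodes arranged on a ring, keyed 0..99

def ring_dist(a, b):
    """Shortest distance between two keys on the ring."""
    d = abs(a - b) % N
    return min(d, N - d)

def neighbours(node):
    # Immediate neighbours plus two long-range "acquaintance" links.
    return [(node + off) % N for off in (1, -1, 10, 25)]

def route(src, target, max_hops=50):
    """Greedily forward towards the known node closest to the target key."""
    path = [src]
    while path[-1] != target and len(path) <= max_hops:
        nxt = min(neighbours(path[-1]), key=lambda n: ring_dist(n, target))
        path.append(nxt)
    return path

print(route(0, 50))  # [0, 25, 50] -- two hops across half the ring
```

Even with only four links per node, the long-range links let requests cross the network in far fewer hops than walking neighbour to neighbour, which is the property Milgram's letters demonstrated socially.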

A crucial point in all of this is that both Gnutella and Freenet are open source/free software, thus allowing non-commercially motivated individuals and groups to take up the baton as the main players progressively move towards a rapprochement with industry. Napster has died attempting to placate its erstwhile enemies, whilst Kazaa will not allow downloads above 128 kilobytes per second in an attempt to appease the same industry, with whose representatives it is currently negotiating a license to move to a full commercial platform. Both of these are proprietary technologies, so that they can exclude any rivalrous non-compliant competitors. Audio Galaxy, however, is under the General Public License. AG deals with the 'tragedy of the commons' in a more determined manner(!): it only allows the user to transfer more than one file at a time if they are sharing a minimum of 25 files. Likewise, there is no option not to share; the only means of not sharing is to exit AG, which means of course that the user cannot download files either.
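AG's discipline reduces to a one-line admission rule. A sketch (the function name and the upper cap of four transfers are hypothetical; only the 25-file threshold and the no-opt-out behaviour come from the text above):

```python
MIN_SHARED = 25  # threshold reported for Audio Galaxy

def concurrent_transfer_limit(shared_files, connected=True):
    """Sketch of the AG policy: you must be connected (and hence sharing)
    to transfer at all; more than one simultaneous transfer requires
    sharing at least MIN_SHARED files."""
    if not connected:
        return 0          # exiting AG is the only way not to share
    if shared_files >= MIN_SHARED:
        return 4          # illustrative cap on simultaneous transfers
    return 1              # under-sharers are throttled to one at a time

print(concurrent_transfer_limit(3))                    # 1
print(concurrent_transfer_limit(30))                   # 4
print(concurrent_transfer_limit(30, connected=False))  # 0
```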

Similar systems are now being offered by these companies to commercial media distributors, such as Cloudcast (FastTrack) and Swarmcast, using technical devices to allow distributed downloads that automate transfer from other nodes when one user logs off. The intention here is clearly the development of software-based alternatives to the hardware offered by Akamai, the principal player in delivering accelerated downloads, used by CNN, Apple and ABC amongst others.
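The 'automate transfer from other nodes when one user logs off' idea can be sketched as chunk bookkeeping: when a source peer departs, its unfinished chunks are simply redistributed among the remaining holders. All names below are hypothetical; this is the general swarming pattern, not Swarmcast's real protocol:

```python
def assign_chunks(chunks, peers):
    """Round-robin the outstanding chunks across the available source peers."""
    plan = {p: [] for p in peers}
    for i, chunk in enumerate(chunks):
        plan[peers[i % len(peers)]].append(chunk)
    return plan

outstanding = ["c0", "c1", "c2", "c3", "c4", "c5"]
peers = ["peerA", "peerB", "peerC"]
plan = assign_chunks(outstanding, peers)

# peerC logs off mid-download: its pending chunks move to the survivors.
pending = plan.pop("peerC")
replan = assign_chunks(pending, ["peerA", "peerB"])
print(replan)  # {'peerA': ['c2'], 'peerB': ['c5']}
```

The download completes without the departed peer, which is what makes a swarm of transient home machines a plausible substitute for Akamai's dedicated servers.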

The point of all this is that there is a distribution system available now that can allow the global distribution of critical media. This network is not globally inclusive and is predicated upon access to a telephone line, a computer and (preferably) a high-speed network connection, but other more powerful economic forces are driving the permeation of all these technologies, so this is a problem which will be progressively mitigated. In any case, exclusion is a fact of all media, whether one considers literacy (print) or purchase capacity (television/satellite). Radio is probably the most fundamentally democratic medium in an ideal sense, since the cost of acquiring a receiver is relatively low, and the range of languages in the content available is basically quite comprehensive.

An unfortunate fact, on the other hand, is that in the affluent west, where the technology is accessible, it has not been taken up. A cursory search over Gnutella and Kazaa for files tagged indymedia (for example) turned up a grand total of one result: an audio track from a Ralph Nader rally. On Audio Galaxy, one file was found: an audio recording of a lecture by Richard Stallman! This represents failure, a deep failure of ambition. But behind this there is another problem, namely that most producers of critical audio-visual programming continue to refuse to make their works available digitally, due to an outmoded belief in the copyright orthodoxy: they simply cannot conceive of any other way of receiving remuneration for their work, and thus they decide to limit dissemination to physical-world networks where they can leverage distribution in exchange for fixed-price payment. (insert note on italy indy and software which allows you to break up large files)

Also neglected are the enormous resources devoted to fast-tracking legislation expanding and reinforcing copyright, trademark and patent monopoly rights. This appears anomalous when set beside the attention given to the 1996 Telecommunications Act and the symbiotic nature of the relationship between politicians and corporate media generally in the book. In total, apart from a passing reference to the passage of the 1996 WIPO Copyright Treaty, only one paragraph is allotted to the issue (179). In the mid-term, such a lacuna is even more striking, as in a very real way this legislation brings both state and private forces of control and enforcement into the domestic sphere of everyday America.

Despite these weaknesses, Rich Media… is an important historical contribution, and especially noteworthy in this respect is the attention paid to the phony Gore Commission investigation into public interest standards for broadcasters in the aftermath of their successful customization of the 1996 Telecommunications Act and the now infamous HDTV giveaway. The Gore Commission effectively functioned as a mechanism for laundering the ill-gotten gains under a rhetoric of public interest which mandated literally nothing, and, coming after the grant of the licenses to the broadcasters by the FCC, could not have been anything but impotent irrespective of the nature of its recommendations. In addition, despite a fundamental disagreement with his optimism about state-orchestrated reform, McChesney's concluding chapter conveys a confidence in ordinary people, and a healthy disrespect for paternalistic liberals, in a genuinely populist form that is unfortunately rare. His commitment to democracy has substance beyond the promotion of ideological agendas.

At 15,000 words for $8.00, Barsamian and South End Press certainly put a high price on what they do.

"In Porto Alegre, we smelt the tempest in the rooms of the forum, and in the street we breathed hope. A hope that clashes with the rhetoric of the debates and the official positions.

The black bloc doesn't love all this rhetoric. For example, a river of words was spent on the poor countries and poverty. The poverty they were talking about with such rhetoric is the poverty of others, the poverty which does not belong to themselves: the poverty of the poor. Each one of them who spoke about poverty meant to speak of others' poverty; you could tell from a thousand miles that they thought themselves rich. To stress the reasons for others' poverty is to exalt your own wealth, speaking of it with the benign idiocy with which the rich give alms. This process of discourse is one of the reasons for poverty. Whoever hates poverty must begin by destroying their own poverty: that poverty of relations, of affections, of a stupid existence that no money can substitute for. This whole attachment to one's own position, one's own group, one's own career, one's own country, one's own power: is this not the most pure and radical form of poverty?"

*******

"Only starting from one's own misery can one destroy poverty."

******

.......

A very strange book. Eager to set out a defining sensibility for the black bloc, immersed in its poetry of acts. Determined and unfaltering in rejecting violence against people, enthusiastic and unapologetic in endorsing attacks against things, property and especially images. One had the feeling that breaking the spell had run its course, but the success of this book suggests otherwise. Elsewhere the author is keen to dismiss the controversy over violence - embracing this term over the more neutral and contestational 'property damage' - with the simple claim that the unorganised nature of it all renders such theoretical polemic superfluous. The source of the dispute is ascribed to the political struggle within the movement, and to the fear of criminalisation. The former is dismissed as futile, as the bb does not seek to establish any form of hegemony and thus has little reason to enter into a discourse. The second is treated more resignedly; the danger of the emergence of organised armed groups is noted, and indeed the existence of the bb is posed as one of the means by which such an evolution may be pre-empted.

******

Other odd aspects are the light touch extended to the self-appointed leadership and the lack of criticism of the ideological components vying for superiority within the circuits of agitation.

*******

For a text that disclaims ideology, there is a strange dissonance with the ideology it is itself the vehicle for. The clear repudiation of violence against individuals imputed to the bb flies in the face of any experience with the German Autonomen or the area of autonomy in France. Not to mention the Basque country. Not that I have any problem with the expression of unabashed subjectivity, nor in several cases with the author's own view. The issue is the elevation of a personal perspective to the status of rule, and the failure to explain at least that these are contested claims. There is a name for this, and it is ideology.

********

Once more we encounter the unfortunate Seattle effect, that is the premise that the WTO meeting in December 1999 constitutes year zero for anti-capitalism and is the appropriate platform from which to construct all anew. Materially such scars are visible in the text: of the four ancillary accounts provided as a form of appendix, two are descriptions of Genoa by Americans, one is by another US group (Crimethinc) and one is French. Don't mistake this criticism for eurochauvinism, but understanding the specificity of locality and territory, which is key to understanding the nature and properties of any given BB, requires a certain amount of native nous. Other accounts by Italians would have been more illuminating. There is considerable irony in the intense repudiation of the trademarks proliferated by the mass media alongside the prostration before the mass media's rendering of the history and attributes of the bb itself (viz. Seattle). The absence of any Spanish, German, Nordic and especially (for me personally) Eastern European interpretations and voices is a deep weakness of the book, given their heavy numerical engagement and, particularly in the latter instance, the very different socio-economic context.

In New York we meet with minds and egos, private personalities in an exchange of egoboo, mutual flattering. Occasionally we hurl our bodies into one another, but the greatest intimacy there is the product of a social alienation yearning to be healed. Our political meetings are full of frayed, neurotic spasms, exchanges of gossip and factoids, peacocks spreading their wings through the description of projects. The apogee of social engineering is but friendship. No line of inquiry is ever exhausted, only gestured at with a flourish. To the loss of territoriality is added a loss of mental stability, a fragmentation and globalisation of the mind, held in thrall to the relentless storm of information which circulates. Loss of self-definition other than that of homo economicus, and an incessant parade of made-to-wear personality types.

"Glory, or internal gloriation or triumph of the mind, is the passion which proceedeth from the imagination or conception of our own power above the power of him that contendeth with us." Hobbes, MHH 84 (1)

"...hydrarchy...: the organization of the maritime state from above, and the self-organization of sailors from below." 144 MHH

Contrasts

Ideas swirled in confusing currents this week, and the violence of their juxtaposition helped generate clarity.

On Wednesday John Zerzan spoke at the university La Sapienza as part of his Italian tour, accompanied by a younger fellow from Eugene. His 'Future Primitive' has been quite a popular book in anarchist circles for some years now, and forms part of a broader culture comprising Fredy Perlman (Against His-story, Against Leviathan!), journals such as Fifth Estate and, to an increasing degree, 'Anarchy: A Journal of Desire Armed'.

Zerzan critiques civilization and domestication, exalts 'wildness' and unmediated interaction, and has sought to attack technology as the toolkit for domination. For Zerzan technology is not limited to the steam engine, railway track or microchip, but also includes instruments of communication and, most fundamentally, symbolic thinking and language. Given the extent of contemporary ecological devastation and the ostentatious level of human alienation (expressed in phenomena such as delinquent consumerism, suicide, homicidal pathology, and the proliferation of assorted forms of charlatanism), there is understandably an audience for a writer who addresses such themes explicitly.

Fundamental flaws:

(a) determinism: a common form of vulgar Marxism adopted by neophytes posits a simplified economics as a crude explanation for all social phenomena. A residuum of this complex sits at the heart of the base/superstructure analogy to which some more advanced (at least in years) Marxists continue to cling. The primitivist critique has a similar form. If one swaps the word technology for economy, we arrive at a pretty concise and accurate understanding of their vision. Questions of agency are given short shrift. Problems relating to class composition and its consequent social and cultural effects are similarly side-lined. Some writers indulge in this form of vulgarisation for rhetorical effect, to catalyse a discourse or pull an existing discussion in a direction deemed important. Zerzan's adepts are either missing a subtlety which may be there, or may simply be swallowing a poorly digested teleology.

(b) The search for purity: the force of the primitivist polemic relies heavily on the invocation of a harmonious past, where people lived as hunters and gatherers without government or state, in absolute harmony with the natural environment and relative harmony with one another. That the duration of that society was an order of magnitude greater than our own is repeatedly stressed, so as to underline the fleeting temporality of current conditions and to demonstrate their contingency.

Seamus Deane describes this as 'rewriting Utopia as Eden', and the trope is familiar to Irish readers who recall how Connolly critiqued British capitalism by evoking the brehon chieftain-based system, reliant on a moral economy, which preceded it. Zerzan's call to the past travels back way beyond the sixteenth century, however, to a barely imaginable past. The positing of an uncorrupted state of being, untouched by alienation, simply has no meaning for those of us who have grown up in modern conditions. There is no outside to step into which could cleanse the mind and body of the multiple alienations of our cultural, social and economic fabric. For better or worse we must contend with current social conditions rather than summoning up a dream of the untarnished life, which can only be illusory; in other words, a mystification.

The dreadful impossibility of this position is easily illustrated. Asked about the importance of violent attacks on property, his companion stressed that breaking a bank window with a brick was first of all an act of symbolic importance, as it communicated to others that banks could be destroyed. Thus both the act and the response it seeks to provoke are instances of symbolic thinking. Of course only a mystic could find in our lexicon words and acts which transcend such parameters, and they know this. But to criticise the discourse implies having some alternative philosophical and ethical tenets to oppose to the current givens. Primitivism finds itself in this corner only because of the blinkered vision of the content of its critique. When dealing with a dogma there is no pettiness in underlining inconsistency, as it goes to the heart of the thought itself.

(c) Diversionary propaganda: this emphasis on the golden era of 10,000 years ago can only have a useful effect for the guardians of the current world, as it distracts attention from consideration of the possibilities for alternative social organisation immanent to the world as it is now. How to dispose of the technological and productive capacities available is the political question of our time. Far from being utopian, the demands for an abolition of waged labour, money etc. are entirely feasible. '...the situationist project, at least at the start, was to equip a technological society with the means to "imagine what can be done with" its technology.' (SIA, 87) AJ 138. The poisoned chalice proffered by Zerzan breaks the modern dogma that only the capitalist way of life is possible, only to say that without a rejection of technology, only lives immiserated by domination are possible.

Many readers of Zerzan may warm to him because they perceive him as a determined opponent of the ideological slogan called 'progress', a concept based upon the refinement of techniques of capitalist accumulation and management rather than a contribution to the better living of humanity in terms of either pleasure or sustainability. Like his dismantling of the claim that no other way of life is possible, this is a welcome consequence of his work. Read to the letter, however, Zerzan offers nothing other than a fatal resignation to fighting a dying society where the war is over before the battle has concluded, because from the beginning we are touched by the original sin of civilisation and alienation.

II

Derive Approdi held a conference on their collection Futuro Anteriore at the weekend. The book comprises five chapters treating different questions arising from the operaista tradition, the contents being a collage assembled through interviews with thirty-nine Italian thinkers from across four generations of radicals born between 1925 and 1950.

Operaismo developed out of the discovery of Marx's Grundrisse, written in preparation for Das Kapital. An important section of the Grundrisse analyses the development of machinery in capitalist production and posits automation as a consequence of workers' struggles inside the factory context. Thus worker militancy leads to increased investment in fixed capital over labour. Whilst devised as an attack on labour power, automation in fact shifts the site of political conflict from labour conditions to the distribution of the social benefits. The worker is liberated from the drudgery of production, but in order not to be enslaved at the hands of another capitalist must organise politically to impose the terms of automation.

Radical struggle thus seeks to influence the direction of development and the power relation determining that development, rather than the development of the forces of production as such.

Noteworthy was the prevalence of the term multitude and the absence of that of proletariat. Another idea which arose repeatedly was that of the communist presence in capitalism, or the harnessing of communist creativity and concepts in the existing production process.

Dismissal of representative democracy as unable to reflect the complexity of social life was treated as a given. And rightly so. But the emphasis on horizontal organisation of the multitude and its relationship with ossified power was not addressed. Anarchist concepts were in the air, but for the participants they appeared unutterable. 'Empire' leaves this same taste in the mouth of the reader.

III

Where the withering attack on extreme alienation formed the kernel of Zerzan's claims, it arose rarely amongst the operaisti. Only Bifo spoke of the massive increase in mental illness and pathology resulting from the dissolution of the traditional world of work, and even then treated it as a byproduct of the change in labour relations. What struck me as curious about this is the blindness to the phenomenon, quite apparent in the United States, Ireland and the UK (and before that in Germany, Switzerland and the Netherlands), of revolt against the increasing commodification of everyday life and the social glacialisation inherent to it. The success of Klein's book 'No Logo' owes more to this aspect than any other, certainly more than to the fact that major brands are producing their commodities in third-world sweatshops where labour and environmental regulation is non-existent.

This revolt against the invasion of exchange value and privatisation into every corner of existence has been pivotal to the emergence of the carnivalesque rebellion epitomised by groups of a pink-black complexion such as Reclaim the Streets, which constituted a genuine innovation of the nineties. RTS broke with the tradition of nihilistic carnival incarnated in events like Hannover's Chaostage by advancing concrete values and creating spontaneous street situations to produce pleasurable antagonistic experiences that combined individual hedonism with collective agency and the social reappropriation of space. This reappropriation neatly constituted the only media required, a fact which should be remembered when considering the disgusting prostration of most political groups before the forces of information.

The greatest loss of the recent cycle of struggles derives from the marginalisation of this innovation. Historically this occurred through the emphasis on Seattle as opposed to the Carnival Against Capitalism in London. More immediately it has been smothered in the sterile debate between violence and non-violence (including those who are just ambiguous about violence, such as the Disobbedienti).

"Innovation makes enemies of all those who prospered under the old regime, and only lukewarm support is forthcoming from those who would prosper under the new. Their support is indifferent partly from fear and partly because they are generally incredulous, never really trusting new things unless they have tested them by experience."

Machiavelli

Some innovations just need to be reinvented.
