Posts Tagged ‘Accelerated reality’

Views From an Accelerated Reality # 1: Vernor Vinge's Technological Singularity

Sunday, May 22nd, 2011

In preparation for the 1993 Vision-21 symposium held in Cleveland, Ohio, USA, NASA's Lewis Research Center issued a small press release. In it they explained:

Cyberspace, a metaphorical universe that people enter when they use computers, is the centrepiece for the symposium entitled “the Vision 21 Symposium on Interdisciplinary Science and Engineering in the Era of Cyberspace.” The Symposium will feature some remarkable visions of the future.[1]

Looking back, it's probably difficult to imagine the sort of excitement that surrounded symposiums built around this theme. Today our notions of a digitised reality centre on ideas of the social network and connectedness, in which one is either online or off. The Internet augments and points back to a reality we may or may not be engaged in, but it doesn't offer an alternative reality that isn't governed by the same rules as our own. The concept of cyberspace, or an immersive "virtual" reality in which the physical laws of our own do not apply, has all but disappeared from the popular consciousness. These days any talk of immersive digital worlds conjures up visions of social misfits playing non-stop sessions of World of Warcraft, or living out fantastic realities in Second Life. In 1993 cyber-hysteria was probably at its peak. The previous year Stephen King's virtual-reality nightmare The Lawnmower Man was released in cinemas, grossing one hundred and fifty million dollars worldwide[2]. Virtual reality gaming systems, complete with VR helmet and gloves, were appearing in arcades everywhere (though they never seemed to work), the Cyberdog clothing franchise was growing exponentially and even the cartoon punk-rocker Billy Idol jumped on the bandwagon with his 1993 album Cyberpunk. Vision-21 was probably right on the money, dangling cyberspace as a carrot to draw big-name academics dying to share their research on 'speculative concepts and advanced thinking in science and technology'[3].

Amongst the collection of scientists and academics, who I imagine paid their participation fees to deliver papers with titles like Artificial Realities: The Benefits of a Cybersensory Realm, one participant sat quietly, waiting to drop a theoretical bomb. Vernor Vinge (pronounced vin-jee), science fiction writer, computer scientist and former professor of mathematics at San Diego State University, was there to read from his paper entitled The Coming Technological Singularity: How to Survive in the Post-Human Era. You can almost picture the audience's discomfort as Vinge read out:


Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.[4]


The crux of Vinge's argument, summarised for sensational effect in the two sentences above, was that the rapid progress of computer technology and information processing ran parallel to the decline of a dominant human sapience. Technologies built to augment and increase humanity's intellectual and physical capabilities would eventually develop a consciousness of their own and an awareness that our presence on earth was negligible. This series of events and the resulting set of consequences are what Vinge referred to as the Technological Singularity.

This dystopic future narrative, foretelling a kind of sinister digital sentience, had already been played out on the big screen in Stanley Kubrick's 2001: A Space Odyssey and James Cameron's The Terminator (featuring Arnold Schwarzenegger's career-defining role as the 'Micro-processor controlled, hyper-alloy combat chassis'[5], or cyborg for short). What rescued Vinge's thesis from the familiar terrain of dystopic cyber-plot lines, and from a hail of academic derision, was the insertion of a second and more plausible path towards a post-human era. The traditional sci-fi route to the post-human condition has the sudden self-consciousness of superhumanly intelligent machines as its root cause. This formed part of Vinge's initial argument.


If the technological singularity can happen, it will. Even if all the governments of the world were to understand the “threat” and be in deadly fear of it, progress toward the goal would continue. In fiction, there have been stories of laws passed forbidding the construction of a “machine in the likeness of the human mind”. In fact, the competitive advantage – economic, military, even artistic – of every advance in automation is so compelling that passing laws, or having customs, that forbid such things merely assures that someone else will get there first.[6]


Still, Vinge must have known that the creation of a superhumanly intelligent, sentient computer was a bit of a long shot. Artificial intelligence machines had still not managed to pass Alan Turing's test, introduced in 1950, and Japanese electronics seemed primarily concerned with teaching robots to dance. So, in order to shore up this rather shaky portion of his post-human hypothesis, Vinge introduced another pathway to the technological singularity called Intelligence Amplification (IA). What the expression refers to is a process in which normal human intelligence is boosted by information processing apparatus. Vinge explains:


IA is something that is proceeding very naturally, in most cases not even recognized by its developers for what it is. But every time our ability to access information and to communicate it to others is improved, in some sense we have achieved an increase over natural intelligence. Even now, the team of a PhD human and good computer workstation could probably max any written intelligence test in existence.[7]


What Vinge sketches out above is the kind of hypothetical example in which chess grandmaster Garry Kasparov and Deep Blue, the computer that beat him at his own game in 1997, would have joined forces to become a superhumanly intelligent, post-human chess player. It's the clunky combination of a desktop computer and a PhD student that makes the prospect of a superhuman chess-God so unthreatening. Even in 1993, nobody at the Vision-21 symposium would have possessed a computer small and unobtrusive enough to amplify his own intelligence levels without everyone else in the room knowing about it. Today that's a different story. What Vinge knew then was that, at the accelerated speed with which reductions in computer hardware size (and the concomitant increases in processing power) were taking place, it would only be a matter of years before powerful information processing engines could fit in the palms of our hands or even, further down the line, become interlaced with our brain's axons and dendrites. He knew that the scientists and academics sitting in that room knew it too.

At its most basic, IA takes place when you check a digital watch or solve a difficult mathematical problem with a calculator. Today the amplification of intelligence is happening on nearly every street corner, in every major city in the world, courtesy of smartphones and instant portable access to the Internet. The speed with which developments in computer technology led to this newfound portability is unprecedented and shows no signs of abating. If anything, developments are probably getting faster. Viewing social, political and cultural life through the lens of IA, there's a pretty strong case for Vinge's technological singularity and the idea that we are living through its latter stages.

But what’s so bad about progress? Wouldn’t it be cool if everyone was walking around with superhumanly amplified intelligence levels? Maybe so, but implicit in Vinge’s theory is an existence many of us would struggle to define as human:


The post-singularity world will involve extremely high-bandwidth networking. A central feature of strongly superhuman entities will likely be their ability to communicate at variable bandwidths, including ones far higher than speech or written messages. What happens when pieces of ego can be copied and merged, when the size of self-awareness can grow or shrink to fit the nature of the problems under consideration? These are essential features of strong superhumanity and the singularity. Thinking about them, one begins to feel how essentially strange and different the post-human era will be, no matter how cleverly or benignly it is brought to be.[8]


The question of access to this superhuman capacity is also a cause for concern. As the possession of advanced technological apparatus is reserved for those who can afford it, will we begin to see the emergence of an underclass of sub-humans, stuck on average levels of intelligence? And what happens when the first instance of computer/human symbiosis takes place? Will the first fully awakened, integrated superhuman man/machine see his or her own flesh as the negligible half of that pairing? We're heading dangerously into Terminator territory again, but as fantastic as these questions sound, they are entirely plausible. Whatever the case may be, as humankind hurtles towards its own obsolescence, accelerated reality is a disorienting place to be.


[1] http://www.nasa.gov/centers/glenn/news/pressrel/1993/93_17.html

[2] http://www.imdb.com/title/tt0104692/

[3] http://www.nasa.gov/centers/glenn/news/pressrel/1993/93_17.html

[4] VINGE, Vernor, The Coming Technological Singularity: How to Survive in the Post-Human Era, 1993

[5] CAMERON, James and HURD, Gale Anne, The Terminator, Screenplay, 1983

[6] VINGE, Vernor, (as above), 1993

[7] Ibid.

[8] Ibid.


Views From an Accelerated Reality # 2: Aphex Twin

Wednesday, March 9th, 2011

In the September 1997 issue of Sound on Sound magazine, Martin Russ described the newly released Yamaha QY70 as

“A deceptively capable workstation-like device, comprising a powerful sequencer, sophisticated auto-accompaniment and a GM/XG expander”

Yamaha QY70

The QY70 was Yamaha's most recent contribution to the line of QY models developed since 1991. The idea behind the series was to produce portable compositional tools: small bits of hardware, no larger than a paperback, that would allow the amateur, enthusiast or professional musician to compose music on the go. The QY10, Yamaha's first release, featured the prototypical design that all subsequent models would try to update and improve on. About the size of a videocassette, the QY10 was a battery-powered 8-track sequencer and crude sound generator. Being the first of its kind, the 10 sparked the development of a large number of copycat designs (including Roland's cumbersome Groovebox series). Yamaha cornered the market through saturation, introducing newer models that seemed to pack more and more functionality into the small box; at an almost yearly rate the QY20, 22, 300 and 700 all appeared en route to the QY70 model.

Tricky using a QY10

Thom Yorke using a QY70

What was unique about the QY70 in 1997 was the amount of attention Yamaha paid to demystifying the creative process, or removing the blank canvas effect of having to actually think of new melodies or drum patterns. This was achieved through the customisation of a neglected feature known as auto-accompaniment (AA).

If you, or anyone in your family, has ever owned a domestic keyboard you'll be familiar with the standard bank of about 50 song styles that you can select and play along to. The idea is that you can practise your piano playing or vocal skills without having to employ a backing band. That's the concept of auto-accompaniment. Typical AA styles are usually dominated by musical clichés like country, 16-beat, rock or pop. The options open to you consist of speeding up or slowing down the tempo of any one pattern, but further editing is barred. In the QY70, Yamaha not only allowed the user to isolate and edit particular tracks; you could also mix and match styles, or use them as building blocks for your own compositions. They also included a feature called Chord template. This allowed you to sift through a number of chord progressions, pre-programmed to fit musical styles like ballads, blues and jazz. Included in this set of pre-prepared chord structures was an option titled Cliché, which, as you'd imagine, gave access to a collection of instantly clichéd chord sequences.
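
To give a rough sense of what a chord template amounts to in data terms, here is a toy sketch in Python. The style names, progressions and the crude major-key simplification are invented for illustration and have no relation to Yamaha's actual implementation; the point is simply that a named style maps to a pre-programmed progression, which can then be rendered as notes for an accompaniment engine to play.

```python
# Toy illustration of a "chord template": a named style mapped to a
# pre-programmed progression. All names and progressions are invented
# for illustration; this is not Yamaha's data or format.
STYLE_TEMPLATES = {
    "ballad": ["I", "vi", "IV", "V"],
    "blues":  ["I", "I", "I", "I", "IV", "IV", "I", "I", "V", "IV", "I", "V"],
    "cliche": ["I", "V", "vi", "IV"],
}

# Semitone offset of each degree's root above the tonic (major key).
DEGREE_OFFSETS = {"I": 0, "ii": 2, "iii": 4, "IV": 5, "V": 7, "vi": 9}

def template_to_roots(style, tonic_midi=60):
    """Return a MIDI root note for each chord in the template (C4 = 60)."""
    return [tonic_midi + DEGREE_OFFSETS[degree] for degree in STYLE_TEMPLATES[style]]

print(template_to_roots("cliche"))  # [60, 67, 69, 65] -> C, G, A, F
```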

To complement these innovations in automated song generation, Yamaha included a number of unique, contemporary AA options, including styles like Drum n Bass, Jungle and Euro Techno. They also included an option named EJ: a rather cryptic reference to a burgeoning underground genre known as intelligent dance music, or IDM for short. Possibly named after a mailing list created in 1993 to discuss the music of Aphex Twin and other Warp Records artists, IDM was labelled intelligent because of its lineage, innovation and compositional difficulty. Alongside Detroit techno and Chicago acid house, IDM artists name-checked Karlheinz Stockhausen, Luciano Berio, Edgard Varèse, Iannis Xenakis and a whole host of other composers known for their esoteric musical output. At the forefront of this new breed of dance music producer was Richard D. James, aka Aphex Twin.

Aphex Twin, Richard D. James Album (1996)

What Yamaha's EJ preset did was reproduce and automate the stylistic flourishes and technical innovations associated with Aphex Twin's music. Approximately one year on from the release of Aphex's Richard D. James Album, Yamaha Japan charged its team of engineers with creating an auto-accompaniment preset that would give amateurs, enthusiasts and professionals around the world the chance to write like Aphex Twin. What they created was an almost exact copy of 4, a track from that album.

Usually the automation of song styles is confined to what might be called traditional compositions: stereotypical song structures we associate with genres like country, blues, reggae and so on. We recognise them because they are pulled from songs that have pre-ordained structures. With IDM, or rather Aphex Twin, Yamaha's engineers sought to copy the instantly recognisable, individual compositional style of a living artist. By placing the ability to write like Aphex in anybody's hands, they simultaneously demystified and, to an extent, neutered his artistic gesture. Not only could users play around with this tool on their portable boxes, they could transfer the MIDI data to their computers and use it to create compositions of releasable quality. In other words, they could feasibly profit from corporate plagiarism.

As a result, Yamaha's quest to simplify the process of songwriting through automation arguably contributed to the premature obsolescence of Aphex Twin's signature style, and of the particular incarnation of IDM he helped pioneer. Although IDM remains a vital genre of alternative electronic music, the increased ubiquity of Aphex's style saw aspiring composers – and Aphex himself – retreating from it as quickly as their precursors had adopted it. In the QY70's wake, software packages incorporated Aphex-style features, and with each subsequent release the learning curve shortened: what once demanded the relative difficulty of Cycling '74's Max/MSP could soon be done at the touch of a button in Ableton Live. The casualty of this 'democratic' levelling of the playing field is the original artistic gesture.

Accelerating accidents

Monday, February 21st, 2011

…malfunction and failure are not signs of improper production. On the contrary, they indicate the active production of the ‘accidental potential’ in any product. The invention of the ship implies its wreckage, the steam engine and the locomotive discover the derailment (Paul Virilio as quoted in The Tipping Point of Failure, Rosa Menkman, catalogue essay, 2010)

Incorporating the accidental, the indeterminate effect, has a long and familiar history in twentieth century art practice: from Duchamp to Cage to Fluxus, and so on. Following on from Cage's fascination with chance and the I Ching, consulting an oracle became a domesticated methodology for Brian Eno and Peter Schmidt with their Oblique Strategies (1975) set of cards, in which randomly selected cards were intended to offer a strategic route out of creative deadlock. Were it not such a pompously worded piece of pseudo-crypticism, one of the instructions, to "honour thy error as a hidden intention", could almost be a motto for the glitch practitioner. However, glitch as a phenomenon and a genre moves beyond honouring error; rather, it mobilizes error: indeterminacy becomes failure, which becomes instrumental. The efficacy of error and failure provides a methodological basis materially integrated with media technology, which for the last fifteen years or so has meant digital media. Glitch has helped to introduce the notion of materiality to digital media-based art forms, in what I have characterised as a digital materialist practice, not without some irony for a media technology usually understood to be lacking physicality.

My accidental discovery of digital materiality came in the late nineties when, while experimenting with digital sound and image, I found that it was possible to open a sound file in Adobe Photoshop. This was clearly an unintended use of the application and it struggled with the operation, but by trial and error I was soon able to open the sound as an image, parsed, largely, as highly saturated, noisy, colourful abstractions, like this:

Alas, subsequent versions of the software simply refuse to recognize sound files at all. I went on to experiment with combinations of sound as image, image as sound, parsing both as text, back to image, and so on. The manifestations of the media's mutability through digital noise that these processes produced were incorporated into a number of my video works, such as Sevenths Synthesis, Local Authority (both 2001) and Metalogue (2003).
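
For anyone curious about the mechanics rather than the aesthetics, the same basic move can be sketched in a few lines of code: treat the raw bytes of a sound file as pixel values and write them out as a picture. The sketch below is only an illustration of the principle, in Python, with an arbitrary file name, image width and RGB packing; it is not a reconstruction of the Photoshop workflow described above.

```python
# Databending sketch: reinterpret a sound file's raw bytes as image data.
# "input.wav", the 512-pixel width and the RGB packing are arbitrary
# illustrative choices; the result depends entirely on the source file.
import numpy as np
from PIL import Image

raw = np.fromfile("input.wav", dtype=np.uint8)    # every byte, header included, read as data

width = 512
usable = (len(raw) // (width * 3)) * (width * 3)  # trim to a whole number of RGB rows
pixels = raw[:usable].reshape(-1, width, 3)       # rows of (R, G, B) byte triplets

Image.fromarray(pixels, mode="RGB").save("sound_as_image.png")
```

Going the other way, image as sound, is the same trick in reverse: dump the pixel bytes to disk and import them into an audio editor as raw data.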

The wilful incorporation of digital materialist mutated media is very much part of the impulse to glitch, as it symbolises and demonstrates the material substratum of a medium designed to remain transparent. My particular interest was in this question of materiality, which I related to the materialism of an earlier experimental film practice. The paradoxical question of the apparent lack of digital physicality and indexicality was interesting and problematic. The conditions that make such materialist explorations possible result from the subversion of the normal functions of the digital apparatus; whether this is 'actual' physical material becomes a moot point: there is a (symbolic or otherwise) representation of the materiality of the media, revealed through that which is in excess of its transparency, through the production of artefactual objects.

What precedents were there for this in electronic media? A materialist film practice ('materialist' both physically and dialectically) was well established, but in the nineties there was little precedent in digital moving image; it was a 'new media'. Analogue electronic media-based art tended to be concerned primarily with video as semiotic and pragmatic, while some artists, such as Peter Donebauer and Steina & Woody Vasulka, were concerned with abstract synthetic electronic properties. But with the exception of the Vasulkas, in works such as Noisefields and Soundgated Images (both 1974), and work made at the Experimental TV Center in New York, few seemed to have explored the possibilities and implications of the material mutability of the electronic signal beyond its manifestation as mystic symbolism or psychedelic immersive properties. Later, Malcolm Le Grice's Digital Still Life (1986) and Arbitrary Logic (1989) explored more programmatic relationships between digital sound and image.

There was however a much more concerted and identifiable practice developing in music. Yasunao Tone, who had been involved with Fluxus and was very much in the milieu of conceptual indeterminate operations, produced works such as Musica Iconologos (1993), which 'translates' digital images of characters into sound (http://lovely.com/soundfiles/452953041_1.mp3), and Symphony for Wounded CD (1997), which consists literally of the sound glitches produced by a damaged compact disc. Glitch became the aesthetic language of the 'clicks and cuts' musicians, in a computer-based development of the circuit bending practices of the post-punk DIY scene, popularized in the late nineties by the likes of Oval, Autechre, Aphex Twin, etc. Parallel to this, early net art pioneers jodi were exploring the aesthetics and mutability of raw code out of control, wreaking uncontrollable havoc in your browser window.

Around the turn of the century a number of video makers started to embrace the glitch noise imperative, such as the Austrian duo reMI and the Dutch artist Bas van Koolwijk. Much of this work is realized in collaboration and in performance.

The harsh noise and busy, flickering digital abstraction suggested a new formalism without the idealism of modernism; new possibilities emanated from the heart of the code itself, while the visceral experience of viewing and hearing was similar to the post-individualist dissonant jouissance of noise music. However there seemed to me to be little potential for the form beyond ever more intensified neo-psychedelic immersion. Notwithstanding attempts at formalizing the practice in events such as Abstraction Now (Vienna, 2003), or Simon Yuill's interesting theorizing of digital materialism, analogous to modernist architecture, as Code Art Brutalism, the law of diminishing returns set in: either up the ante or watch effectiveness dwindle through familiarity, as innovative technique shaded into predictable trope. My last glitch-inflected digital video was The War on Television (2004), which set out to explore and expose the mutability of the then newly ubiquitous televisual digital media as a dialectical opposition to media transparency, both in terms of the ostensible quality of the interference-free image that digital TV claimed to provide and the authority of the news it now carried twenty-four hours a day. By 2004 digital materialism was no longer enough in itself; for me there was an imperative for it to reflect a position in relation to the political and cultural effects of what was happening with the media, in the media, in the world.

The effect of datamoshing in the hands of Takeshi Murata became a post-materialist abstracted psychodrama and, with its promo-video-friendly soft glitchy manipulation, a refined aesthetic which can be absorbed to satisfy conventional notions of pleasing abstraction, much in the way that glitch in the music of the likes of Oval and Christian Fennesz became integral to conventions of melodic structure and ambient musical atmospherics. My disaffection was confirmed as glitch and compression artifacts became co-opted into such conventional form. As a dialectical formal strategy glitch is an essential object, its qualities setting it in opposition, but by the end of the decade its once radical potential seemed to have been exhausted and domesticated. I was thinking that perhaps, after all of this, the idea of digital materiality might return to the physical object to gain some tactical contingency; could the process begun by Yasunao Tone's Wounded CD… find its apogee in the work of Jin Sangtae, whose 2008 Extensity Of Hard Disk Drive leaves data behind altogether to pay attention to the physical materiality of the hard drive itself?

However, Rosa Menkman has now revivified the practice, published her Glitch Studies Manifesto, and is rigorously collecting data and theorizing the phenomenon in a way that has not so far been done. This has suggested to me a new traction for glitch, digital materiality and medium specificity, and in future posts I intend to explore and expand on this, speculating on their renewed currency, efficacy and implications across contemporary practice, and perhaps beyond.

Views From an Accelerated Reality # 4: Tookie

Tuesday, February 15th, 2011

The Oxford English Dictionary defines hypertrophy as 'the enlargement of an organ or tissue from the increase in size of its cells.' In body-builder parlance hypertrophy, or muscle-cell hypertrophy, is the state attained through regular sessions of applied force: physical exertion, weight training, sets and reps. For Stanley 'Tookie' Williams, former leader and founder member (alongside Raymond Washington) of Los Angeles street gang the Crips, the principle of hypertrophy by force extended far beyond the makeshift garage gyms and front-lawn bench-press sessions of his South Central neighbourhood.

Born on December 29, 1953, Tookie began his rise to Crip leadership in the spring of 1971, 34 years before his death by lethal injection in California's San Quentin prison. After the last embers of the Black Panther Party, and their dashiki-wearing rivals Us, were snuffed out in the late 1960s, a political hole was left in the African American community of Los Angeles. This gap in the political consciousness was quickly filled by the misogynistic images of Blaxploitation cinema and a new capitalist individualism, embodied by the likes of Youngblood Priest, the coke-dealing, whitey-hating central character of Gordon Parks Jr's Superfly. In the 2006 documentary Bastards of the Party, Chili, a former member of pre-Panther LA gang the Gladiators, explained the situation: 'guys that I knew who were starch in the iron revolutionaries and would die for the movement, when I got out [of prison] they was telling me "it aint happening homeboy"'1. Despite this apathetic migration into the zone of self-centeredness by older males, LA's youth carried a memory of the organisation and brotherhood delivered by the Black Panthers. Stepping into the breach were the gangs, most notably Tookie's Crips and their bitter rivals the Bloods. By Tookie's account the Crips came into focus that spring when he and Raymond Washington joined forces and the Crips name was coined in a high school cafeteria. In Tookie's own words, 'life seemed to accelerate after I met Raymond'2.

Stanley 'Tookie' Williams in the exercise yard at San Quentin, circa 1985

Tookie's experience of acceleration was characterised by a disturbing, even super-human, level of physical and psychic excess. Extreme bouts of violence, group sex, larceny, armed robbery and murder, if not all perpetrated by him, were certainly part of his everyday life. Underlying this vicious momentum was the perpetual quest for hypertrophy and the continual growth of muscle mass. Tookie's path towards invincibility remained unchallenged until 1976, when he was hospitalised by a number of .45 calibre bullets that fractured a calf and shattered the bones in both his feet and ankles. Tookie's road to recovery required something that had been alien to him for a long time: slowness.

Though Tookie's physique was as muscular as ever, the limp that slowed his gait significantly decreased his effectiveness as a fearsome individual. By his own admission, 'the years 1977 to 1979 were the lowest point of my life'3. Tookie's wilderness years saw his hyper-speed reality exchanged for a phencyclidine-fuelled haze (the drug also known as PCP, or Sherm). It is here, in a drug den, that Tookie experienced a different type of acceleration, acceleration by default:

‘When my stepbrother Wayne was floating on Sherm, he sometimes moved in super-slow motion as if he was in another dimension. Once while I was getting my hair braided by my stepsister Demetri, Wayne was high on Sherm and got up in super-slow motion to creep over to where Demetri’s friend’s purse was. He took out the wallet and placed it under his jacket, then returned to his seat as if nothing had ever happened. I shook him out of his trance, and he wasn’t even aware of what he had done. Though we all laughed, it was one of the most bizarre things I had ever seen. In time I would do stranger things.’4



1 Bastards of the Party, Cle Shaheed Sloan, Fuqua Pictures, USA, 2006
2 Stanley 'Tookie' Williams, Redemption, Milo Books, England, 2004, p. 82
3 Stanley 'Tookie' Williams, Redemption, Milo Books, England, 2004, p. 173
4 Ibid., pp. 175-176