By Adam Kleinman, Orit Gat

Adam Kleinman (New York City) is Chief Editor of Witte de With's online platform WdW Review. He has worked at Witte de With since the end of 2012. He is a writer and curator and former dOCUMENTA (13) Agent for Public Programming. Kleinman was curator at Lower Manhattan Cultural Council, where he created the interpretative humanities program "Access Restricted." Kleinman also developed LentSpace, a cultural venue and garden design by Interboro Partners, which repurposed an entire vacant Manhattan block. There, Kleinman curated "Avenue of the Americas" (2010) and "Points & Lines" (2009). Kleinman is a frequent contributor to multiple exhibition catalogues and magazines including Agenda, Artforum, e-flux journal, Frieze, Mousse, and Texte zur Kunst.

Orit Gat is the Managing Editor of WdW Review. Her writing appears regularly on Rhizome, where she is also the Features Editor, and has been published in a variety of magazines, including frieze, ArtReview, The White Review, Art-Agenda, and The Art Newspaper.
The Net, 1995.

When Yahoo! was sold to Verizon for $4.8 billion this past July, it was the beginning of the end of the long reach of 1995. First imagined as Jerry and David’s Guide to the World Wide Web in 1994, Yahoo! was incorporated in Sunnyvale, California as a directory of websites organized hierarchically rather than as a searchable index (which was what Google, incorporated in 1998, did with PageRank). In the period we now call the “early internet,” the foundation of Yahoo!, alongside other 1995 landmarks—the introduction of eBay, the Netscape IPO—was the moment that signaled to the world outside of Silicon Valley what the internet could be.

1995 was a time of optimism, but also fear, as portrayed across pop culture: in The Net, Sandra Bullock plays Angela Bennett, a brilliant, socially isolated systems analyst who, after discovering a serious security breach, has her entire record—social security number, credit cards, medical history, and so on—erased by hackers; in order to undo the erasure of her identity she needs to hand over proof of cyberterrorism to the FBI. In Hackers, Angelina Jolie and her posse of teenage computer whizzes get into trouble when they discover a scheme to steal $25 million. They quote the 1986 Hacker Manifesto: “Yes, I am a criminal. My crime is that of curiosity.”

These fears and excitements are representative of a year marked by many such paired events; the Bosnian Wars raged while the Schengen Agreement for free travel in the European Union was implemented. It was the year of the Oklahoma City bombing and the indefinite extension of the Nuclear Nonproliferation Treaty—just before the fiftieth anniversary of the dropping of the atomic bombs on Hiroshima and Nagasaki. 1995 also witnessed the only time the nuclear briefcase (Cheget) was ever opened, when the radar detection of a Norwegian research rocket by Russian early warning systems was misinterpreted as an American-led attack. Rising to the forefront, the O. J. Simpson trial unfolded, live on television, with coverage greater than the Bosnian Wars and the Oklahoma City bombing combined, finally ebbing when the former athlete and media star was found not guilty in October. That month also saw the Million Man March, in which African-American leaders organized a mass protest and summit on Washington's National Mall to address the economic and social ills plaguing black communities, something that echoes strongly today as American police forces increasingly target African-American men across the country, while #BLM calls for greater representation and respect for the same disadvantaged communities. On the other side of the world, the assassination of Israeli prime minister Yitzhak Rabin chilled the Oslo Accords, and thus derailed hope for resolution in the Israel–Palestine conflict.

In 2005, writer Kevin Kelly published a text called "We Are the Web" in the magazine he helped found, Wired. 1Kevin Kelly, "We Are the Web," Wired (1 August 2005), http://www.wired.com/2005/08/tech/ (accessed 8 September 2016). It celebrated the ten-year anniversary of Netscape's initial public offering as a landmark in the history of the internet: "The Netscape IPO wasn't really about dot-commerce. At its heart was a new cultural force based on mass collaboration. Blogs, Wikipedia, open source, peer-to-peer—behold the power of the people." Kelly writes about what he imagined in 1995, what he sees in 2005, and projections for 2015. It is incredibly optimistic: "This view is spookily godlike. You can switch your gaze of a spot in the world from map to satellite to 3-D just by clicking. Recall the past? It's there. Or listen to the daily complaints and travails of almost anyone who blogs (and doesn't everyone?). I doubt angels have a better view of humanity."

The Clinton presidency saw a balanced United States budget for the last time and the Dow Jones closed over 4,000 for the first time in February 1995, then over 5,000 in November, surpassing benchmarks twice in one year. The boundless economic growth in the United States and Western Europe in the 'roaring nineties' is attributed to the growth of the technology sector, which exploded—we now call it the "dot-com bubble"—as we stepped into the 2000s. 1995 is the midpoint of a decade whose legacy stretched deep into the next millennium in terms of geopolitical and financial effects. As the European Union handles the fallout of Brexit, the wars in the Middle East create the largest refugee crisis since World War II, and the entire world battles with post-recession economics, the boundless optimism of 1995 seems almost quaint: as Kelly says, "if we have learned anything in the past decade, it is the plausibility of the impossible."

—Sediments

Start Getting Real: Irony, identity, and implosion in The Real World 1995

By Elvia Wilk
Elvia Wilk is a writer and editor living in Berlin. She writes for publications including Artforum, frieze, Art in America, art agenda, e-flux journal, Texte zur Kunst, Dazed, and The Architectural Review. She often collaborates to make texts for exhibitions and performances, and writes poetry and fiction. She is a contributing editor to Rhizome, was a founding editor of uncube magazine from 2012–2016, and since 2015 has been the publications editor for transmediale, festival for arts and digital culture. She has participated in events at venues including the Moscow International Biennale, Johann Koenig Gallery, Kiasma Museum of Contemporary Art, the Institute for Contemporary Arts, Tensta Konsthall, EME3 Barcelona, and Humboldt University.
The cast of MTV’s "The Real World," season 4, 1995.

For television’s whole raison is reflecting what people want to see. It’s a mirror. Not the Stendhalian mirror reflecting the blue sky and mud puddle. More like the overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile.

David Foster Wallace

 

Season four of MTV’s The Real World was the first to take place off the American continent. The “true story of seven strangers”—as the now-cliché introductory voiceover goes—featured an international bunch, sent to London to “live in a [town]house” in Notting Hill and “have their lives taped.” The seamless translation of the still-nascent reality-TV formula to a foreign context showcased just how effectively not only MTV culture, but the uniquely American export of reality TV itself, had already been absorbed into global teenage consciousness by 1995.

As in any TV series with human subjects, the first episode introduces the cast. We start with nineteen-year-old Kat, a bright-eyed, bushy-tailed NYU student excited to explore the world (and its older men). She has apprehensions, though: "I have a stereotype that Europeans have a stereotype of us as rude," she worries. Next we meet Mike, twenty-one, an amateur race car driver from Missouri; we watch his face contort with panic when his dad warns him that London is an "international community where they have gypsies and bombers and terrorists." Our first Brit, Neil, twenty-four, is an Oxford psychology dropout-cum-rock star, who describes himself as "both misanthropic and xenophobic" and displays his double nipple piercings to the camera. Jay from Portland, just shy of twenty, is a soft-spoken, bespectacled, aspiring playwright who is mainly nervous to leave his high-school girlfriend behind. Then there is the glamorous Australian supermodel, blasé twenty-two-year-old Jacinda, who is jaded from feeling like a clothes hanger for the male gaze to rest upon. Our requisite Person of Color, Sharon, is a bubbly singer-songwriter from Essex, and though she is only twenty, she is the mature, non-partying type. Lastly there is Lars, twenty-four, the chain-smoking DJ from Berlin.

Though the whole gang seems equally capable of owning up to their respective assigned roles, the Americans are markedly younger, in reality and in appearance, than the non-Americans, suggesting an imbalance of TV-ready growth hormones across the continents. The Americans, it appears, are ripe and ready at an early age to exhibit themselves, having been raised fully on a diet of exposure.

Real World hype was high in 1995, following the previous year's season, which had reached peak drama with a string of historical firsts for TV broadcasting. Filmed in San Francisco, the 1994 season revolved around roommate Pedro Zamora, a young Cuban-American living with AIDS. The candid footage depicted his physical deterioration from illness, his harassment by a homophobic cast member, his blossoming relationship with his boyfriend, and their eventual commitment ceremony on TV. Zamora became a figurehead for LGBT rights (the "Q" was not added until 1996) and HIV/AIDS awareness, lauded by the likes of Bill Clinton. He died shortly after the show aired, compounding the tragedy.

The choice to include cast members like Zamora allowed MTV to make claims about the show's cultural-political value. Co-producer Jonathan Murray said in a 2014 interview with Rolling Stone: "I like to think that maybe The Real World had a part in making this the most tolerant, open-minded generation ever." And indeed the show was commended by some at the time for its diverse casting, inclusive of demographics typically underrepresented in mass media. As hard as it is to imagine reality TV as a public service today, for a time MTV framed it as a consciousness-raising experiment.

Yet, as became increasingly clear through the many seasons of The Real World—thirty-two and counting—and as viewers of season four may have already begun to suspect, in order to preserve the very 'different'-ness of the participants, their portrayals ended up consistently flattening their identities into prepackaged one-liners so that they could be pitted against each other to create a semblance of a storyline. The gay activist and the homophobe. The Southern belle and the diva with the Afro. Or, in the case of the London season, the race car driver from Missouri, who has a meltdown in the supermarket when he realizes the Brits do not sell ranch dressing, and the high-minded, Oxford-educated rock musician, who is endlessly offended by American bad taste. This tendency toward parodic, one-dimensional behavior only increased as a set of personality tropes became established over the years.

In the classic inversion of so many ‘awareness’ campaigns, bringing the lives of minorities to light wound up further marginalizing them. Identity boundaries that were supposed to be transcended were instead reinforced, partially to pander to supposed viewer expectations, but also because of the inherent structure of an ‘unscripted’ show. In order to engineer a semi-organic plot structure, the characters have to be molded into predictable actors who, when put in motion, will be guaranteed to set off certain chemical reactions. Amanda Ann Klein wrote in a 2015 New Yorker article titled “Thirty Seasons of The Real World” that the very structure of the show “implies a belief that bearing witness to difference somehow creates tolerance. And the show depicts intolerance as stemming directly from identity. One is racist because one is from the South. One is sexist because one is a male jock. Just mix, shake, and film.”

Mike, the race car driver from Missouri, has a meltdown in the supermarket when he realizes the Brits don't sell ranch dressing, in "The Real World," season 4, 1995.

Bright-eyed expectations, or at least justifications, for what reality TV could do—raise awareness, abolish difference, create communities and solidarity—as opposed to what it did do, which was to reestablish stereotypes according to market target groups, are not at all dissimilar from the dire trajectory the internet took in the early 2000s. In 1995, the internet's era of counter-cultural optimism was on the rise just as TV's was on the wane. Culture critic Mark Greif lamented the lost potential of broadcast media in an essay for n+1 (the magazine he then edited), writing, "[t]he utopia of television nearly came within reach in 1992, on the day cable providers announced that cable boxes would expand to 500 channels. The promise of the 500 channels went to waste. The techno-utopians' fantasies shifted to the internet. Nothing like the paradise we hoped for came to fruition on TV, that's for sure. Instead we got reality TV."

In the MTV era, counterculture was sold back to itself, becoming over-the-counter culture. And, as David Foster Wallace famously described, in the early 1990s irony's capacity as a subversive political device—a role it had played for at least a few hundred years—was subsumed by mass culture and rendered impotent. Foster Wallace traced a twin development in literature of that era: a post-ironic flattening of affect that lost all potency in its attempt to mimic life with no refraction or distance—ending up as neither irony nor its supposed opposite, earnest engagement. In other words, when placing oneself at an (ironic) distance from reality was no longer seen as a primary method of critiquing it, critique attempted the opposite: to get as close to 'reality' as possible, to meld with it, to become it. (The issue of proximity to the subject of critique has plagued artistic forms ever since; endless debates revolve around whether or not an artwork is 'reproducing' the reality it purports to critique because it is too close, or whether it is so far from its subject it offers no real intervention.) In this way, the draining of irony's political capabilities led not only to mass media but also to high-culture forms (cinema and literature included) that sold themselves as hyperreal.

Incidentally, one reason for reality TV's proliferation, which began a few years after The Real World launched (Survivor premiered in 1997; Big Brother in 1999), was that no one wanted to pay creative writers. When producers Mary-Ellis Bunim and Murray originally conceived The Real World in 1992, it was intended as a scripted drama à la Beverly Hills 90210 or Melrose Place—but MTV did not have the budget to pay for the screenplay or any A-list actors. Bunim and Murray's ingenious solution was to outsource the dramaturgy to real people at less than minimum wage (besides room and board, Real World participants back in 1995 were paid around $2,500 for their cooperation, or $500 per month), guided by an army of emotionally manipulative producers. 2This is assuming payment had not risen since the first season in 1992, when contestants were paid $2,500. Divvied up per month, the payment is well below minimum wage, which, according to the United States Department of Labor, would have been $680 a month in 1995 (based on a forty-hour workweek). Reality TV has, since then, remained very cheap to produce; however high-grossing, participants are rarely paid well. Moreover, series are perpetually spin-off-able in new permutations (1995 was also the birth of Road Rules, a gamification of The Real World wherein contestants traveled the country competing in challenges). Naturally there are surges in reality programming following writers' and actors' strikes.

Though the concept arose semi-accidentally, Bunim and Murray cite as their inspiration the reality predecessor An American Family, a unique and widely popular PBS miniseries that aired in 1973 and depicted the disintegration of a real family, the Louds, over seven months in 1971. Despite its success, the reality form remained a novelty; a show of that kind was not reproduced until two decades later when Bunim and Murray stumbled onto the format.

If reality TV can be traced through the history of candid-camera or documentary filmmaking, it can also be understood as an outgrowth of public fascination with social psychology experiments. The Real World creates a microcosm in which to observe human behavior, mimicking the basics of any psychology study (the sadism and schadenfreude duly amplified). But the experiment does not focus on any one behavior; the psychological phenomenon under scrutiny on reality TV is and always will be what psychologists call the “observer effect,” also known as the Hawthorne Effect: the effect of being watched. Not accounted for by the original observer effect was the way that watching someone being watched changes the watcher, in a cycle of increasing self-awareness. “The reality of reality television is that it is the one place that, first, shows our fellow citizens to us and, then, shows that they have been changed by television,” writes Greif. This awareness of self-as-viewer-as-performer is already highly evident in season four of The Real World. All the roommates understand the meta-level (watching themselves) of the semi-scripted lives that they are leading—when Sharon theatrically burns a cake in the first episode she jokes, “this is the sort of thing that happens in sitcoms!”—and yet no amount of self-awareness can lead one out of the trap of performing oneself.

Women in the Relay Assembly Test Room of Western Electric Hawthorne Works, where the "Hawthorne Test" was first realized, ca. 1930. Western Electric Company Hawthorne Studies Collection, © 2007 President and Fellows of Harvard College; all rights reserved.

Even by 1995, most reality TV participants had already seen enough TV to know how they were supposed to act; but what may have still been unclear to them, and to audiences, at that point, was the extent to which their performances would be engineered on the production side. Today (thanks to a stream of journalism and popular media 'uncovering' what goes on behind the scenes, not to mention TV series that do the same), the instruction and manipulation of participants is common knowledge. The process of building the story is collaborative at best, exploitative at worst. But as many have written, this has made no difference to our fascination; it may even have exacerbated it. For instance, in an article titled "Doll Parts," the New Yorker's TV critic Emily Nussbaum put it this way: "It doesn't help that I know, as we all do, that some proportion of the show is scripted—that simply helps us enjoy the humiliation without guilt." Today, cast members refer ironically to the scripted nature of their performances, and in the same breath earnestly insist that they are on the show for the right reasons. Our loop of self-awareness has spiraled far beyond what Foster Wallace pinpointed as the death of irony in the 1990s.

The main thing that stands out when watching the 1995 season of The Real World today is its remarkable lack of controversial material. It essentially steers clear of the political stakes one might have expected, especially after the previous season, instead rolling out hour after hour of banal interpersonal drama. With ‘culture clash’ as its intended backbone, the twenty-three episodes amount to little more than a spineless record of petty jealousies and roommate bickering over a messy kitchen. The apex of drama, in episode six, when rock star Neil tries to kiss a heckler at a concert and gets his tongue nearly bitten off, is treated as a plot point only insofar as it affects his relationship with his on-again-off-again girlfriend. At the end of the season, in an attempt to spice things up, the producers send everyone on safari in Kenya, leading to a round of sentimental bonding and a polite discussion about the ethics of eating meat. Admittedly, it is entertaining, but it is decidedly mild.

The mildness of the material's presentation is precisely what highlights the violence of the show's format. This violence can be hard to identify when watching the Real Housewives of today, because it has been buried under so many stacks of self-awareness. But it comes across loud and clear in the early Real World: the personalization and therefore the depoliticization of structural difference through the individuation of behavior. In other words, any actual or potential political conflict is passed off as the result of personality difference. For example, Sharon, the unskinny black roommate, is uncomfortable when goaded to try on the clothes of the size-0 white Australian supermodel; this is surely because she is uptight and uneducated in fashion—not because of any larger societal discrepancies between them. With individuals isolated from their cultural contexts and subjected to the observer effect, conflicts that undeniably stem from race and class are consistently framed as interpersonal issues resulting from individual pathologies. The pettier the conflicts, the more their underlying causes are neutralized. In season four, we witness the quiet implosion of identity politics.

The actual formal devices of reality TV pioneered by The Real World lend themselves beautifully to this end. The producers piloted those first-person, confessional interviews interspersed throughout documentary footage that have by now become inseparable from the reality genre as a whole, where a scene is followed by someone’s meta-commentary explaining his or her behavior in the scene, followed by a scene demonstrating what has just been explained. This structure performs a constant reinforcement of subject position such that it becomes the root of all behavior.

The erasing of the structural by the personal that began in the 1990s can tell us a lot about what has happened, and is still happening, with internet culture. As Foster Wallace wrote in “E Unibus Pluram” in 1992, “Television […] has become able to capture and neutralize any attempt to change or even protest the attitudes of passive unease and cynicism that television requires of Audience in order to be commercially and psychologically viable at doses of several hours per day.” 3Foster Wallace, “E Unibus Pluram”: 171. Replace “television” with “internet” and “audience” with “user” and you have a formula to describe why we are all on Facebook for an average of fifty minutes a day.

Reality TV now makes up about 65 percent of all United States-based programming. As broadcast TV wanes and online streaming services take over the domain, the two media—reality and internet—are fusing together, making this a good moment to look back at the lost utopias of both. If reality TV erased structural violence with petty personality conflict, and if the internet rendered all of our personalities public, subjecting us to universal observer effect, the best way forward may be through turning our observations away from each other and toward the underlying structures engineering—producing—our behavior. Trying to ‘really’ see each other is laudable, but it is worth remembering that there is no such thing as verbatim reality: it’s all created. Every episode has a producer.

—Sediments

To the Next Level

By Clemens Jahn
Clemens Jahn is a Berlin-based designer and researcher. He has worked with a range of international institutions and partners, such as the MAD Museum of Arts and Design, New York, SCHIRN Kunsthalle, Frankfurt, ZKM | Center for Art and Media, Karlsruhe, Kunsthalle Bern, and Sternberg Press. He supports Karlsruhe University of Arts and Design as an advisor and strategist. His writing has been published in Texte zur Kunst and by Spector Books and Kulturverlag Kadmos.
Image realism comparison from the video game Crysis, developed by Crytek and published by Electronic Arts, first released in 2007.

Video games took a wrong turn at the end of the 1990s. In the past two decades, the graphics of video games have become increasingly realistic, their game play smoother, more complex, and more immersive than ever before. The scale of some of today's game productions can easily compete with that of blockbuster movies, but not unlike twenty-first century Hollywood, large parts of the game industry remain caught in a loop of repetition and self-replication. The most elaborate game mechanics, audiovisual atmospheres, and virtual reward and achievement systems keep millions of players hooked, but plotlines are often shallow and highly predictable, and the casts stereotypical and normative—in some cases inhumane, sexist, and/or misogynist. 4On the Feminist Frequency website—which started as a blog and is now a not-for-profit educational organization—Canadian cultural critic Anita Sarkeesian has published a series of videos: "Tropes vs. Women in Videogames," Feminist Frequency website, https://feministfrequency.com/series/tropes-vs-women-in-video-games/ (accessed 23 February 2017). Immense production budgets, tight schedules, and pressure to secure sales and returns on investment restrict the willingness of game companies to experiment and take conceptual risks. Innovation is still often limited to aesthetic embellishment, and gaming experiences are defined by the corporate interests of an industry deeply rooted in the dynamics of globalized network capitalism. In contrast, looking back at the mid-1990s allows a glimpse into the potential of empowering, politically and conceptually progressive gaming experiences, just before the experimental and utopian aspects of early online gaming and internet culture began to attract the attention of commercial applications.

***

The Closet is a dark, cramped space. It appears to be very crowded in here; you keep bumping into what feels like coats, boots and other people (apparently sleeping). One useful thing that you’ve discovered in your bumbling about is a metal doorknob set at waist level into what might be a door. 5Excerpt from the first text a user encounters when entering famous MUD LambdaMOO. From: Sherry Turkle, Life on Screen (New York: Simon & Schuster, 1995), 182.

In 1995 about 60,000 people regularly visited Multiuser Dungeons (MUDs for short): 6Ilsa Godlovitch, "Jackal takes Dragonfly to be his bride," Independent, 28 August 1995, http://www.independent.co.uk/life-style/jackal-takes-dragonfly-to-be-his-bride-1598406.html (accessed 8 October 2016). online, multiplayer virtual worlds, usually maze-like and text-based, in which users interacted in real time. 7Alternatively "Multi-User Dimensions" or "Domains" for those without a hack-and-slash focus. Today some MUDs are technically still ongoing, but they are no longer widely played. The website mudstats.com provides an extensive list of still active and inactive MUDs, and their current online user numbers. Originally based on the fantasy tabletop role-playing game Dungeons & Dragons (1974) and in the tradition of early fantasy computer games such as the Zork series (1977–82), MUDs combined elements of traditional role-playing games, hack-and-slash, world building, collective writing, and online socializing.

Created in 1996, Aardwolf is a MUD still managed and played online today.

MUD1 (1980), created by Roy Trubshaw and Richard Bartle, at the time students at Essex University, is considered the first of its kind and was followed by a growing number of increasingly popular successors. In the mid-1990s, MUDs had reached the height of their popularity: there were around 600 active games, covering a wide range of different settings from fantasy (historical, dark, or medieval) to contemporary, sci-fi, and cyberpunk, and involving communities like furry, “adult,” and educational groups.

Although they were built on the metaphor of three-dimensional physical environments, most MUDs consisted of text only. Users navigated through virtual space by typing directional words such as "go," "look at," and "open"; commands like "say," "tell," and "whisper" to interact with other human or non-human actors; or "bash," "punch," and "kill" when in combat inside a hack-and-slash MUD. With TinyMUD (1988), which focused on world building and socializing with other users, a range of mainly socially themed MUDs began to emerge. There were also MUDs dedicated to TinySex, mudsex, or netsex—different names for cybersex within the platform—such as the still-active Tapestries MUCK (1991), "an adult social/roleplay muck with the theme of sexual exploration within a Furry Theme." 8"Tapestries MUCK," MUDStats.Com website, http://www.mudstats.com/World/TapestriesMUCK (accessed 23 February 2017). "MUCK" and "MUSH" are puns on the word MUD, although backronyms like "Multi-User Shared Hallucination," "Multi-User Shared Hack," "Habitat," "Holodeck," "Multi-User Chat/Created/Computer/Character/Carnal Kingdom" and "Multi-User Construction Kit" can be found. These text-based, highly abstract, anonymous, and imagination-based multiuser spaces provided an ideal environment for their inhabitants to freely switch between different genders, ages, and appearances.
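To make that mechanic concrete, the sketch below is a minimal, hypothetical illustration in Python—not code from LambdaMOO or any actual MUD server—of the kind of command loop such games were built around: the world is a small graph of rooms, and typed verbs like "go," "look," and "say" are resolved against the player's current room.

# A minimal, illustrative sketch (not code from any actual MUD) of a
# text-based command loop: the world is a graph of rooms, and players
# type verbs that the server resolves against their current location.

rooms = {
    "closet": {
        "description": "A dark, cramped space full of coats and boots.",
        "exits": {"out": "living room"},
    },
    "living room": {
        "description": "A bright room with a doorway back to the closet.",
        "exits": {"closet": "closet"},
    },
}

def handle_command(player, line):
    """Resolve one typed command for a player dict with a 'location' key."""
    verb, _, rest = line.strip().partition(" ")
    room = rooms[player["location"]]

    if verb == "look":
        return room["description"]
    if verb == "go":
        destination = room["exits"].get(rest)
        if destination is None:
            return f"You can't go '{rest}' from here."
        player["location"] = destination
        return rooms[destination]["description"]
    if verb == "say":
        # A real multiuser server would broadcast this to everyone sharing
        # the room; here we simply echo it back to the speaker.
        return f'You say, "{rest}"'
    return f"Unknown command: {verb}"

player = {"location": "closet"}
for line in ["look", "go out", "say hello"]:
    print(f"> {line}")
    print(handle_command(player, line))

Actual MUDs ran this loop on a shared server for dozens of simultaneous users, which is what turned a simple parser like this into a social space.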

Despite various accounts of net.sleazing, mud rape, and other kinds of violent action among users, as well as reports of broken trust and broken hearts caused by advanced flirtatious "chatterbots," MUD culture is a perfect example of the novel heterotopia that the internet was lived as—and perceived as—by many in the mid-1990s. With people from all around the world participating, the mostly noncommercial MUDs seemed to make the dream of a limitless, heterarchical, global, and social online community tangible, at least for a small fraction of the world population who not only had a computer with internet access at that time but could also afford to spend half the day online, when surfing the Web was still charged by the hour (it will thus come as no surprise that the concept of MUDs was invented at universities where online access was free). The ability to cycle through parallel worlds, different sexes, and multiple characters provided a perfect alternative to the rigid constructions of real-life gender and identity. Users' shared authorship—the ability to cocreate, modify, and expand every aspect of their virtual environments—furthered the fictional quest narratives of many MUDs, giving them a new experience of world building and virtual boundlessness.

By 1995 a few commercial game studios had already begun to publish the source code of their games, allowing players to create their own levels, maps, textures, characters, sounds, and animations. Among the first larger firms aware of the potential of game modifications (or mods) was Texas-based developer id Software. Publishing the source code of their successful first-person shooters Wolfenstein 3D (1992) and Doom (1993) for individual modification created a large international mod-making community. Fans of the games made thousands of new levels with custom designs, whether in the style of the original game or referencing other pop cultural franchises such as Aliens, Sailor Moon, and Sonic the Hedgehog. In the following years, several successful Doom mods were published commercially, and id Software even recruited talented mod-makers to work for the company, like Tim Willits, who is now id Software's creative director. Some of the most popular online games started as mods, including Team Fortress (1996, based on Quake), Counter-Strike (1999, based on Half-Life), and Defense of the Ancients (2003, based on Warcraft III). Game mods furthered the concept of allowing players to coauthor and customize their game environments, albeit within the limits of their inevitable corporate affiliation, regulation, and occasional re-commodification.

Compared to MUDs, which were exemplary for pioneering decentralized and nonlinear online gaming, most of what the commercial video game industry produced in the mid-1990s seems rather underwhelming. Rapid advances in computer technology at the time made more and more developers shift their focus to state-of-the-art 3-D graphics. “You want slow, flat games?” a TV ad for the Atari Jaguar console asked. “No. Slow is for turtles, flat is for boards. Fast and 3D is for games.” 9Quoted from: “E3 1995 – first show ever! Full length documentary. Great history!,” YouTube video, posted by QLvsJAGUAR, https://www.youtube.com/watch?v=fC9ZJWHFjhc (accessed 23 February 2017). But up-to-date visuals were often detrimental to the conceptual qualities and imaginative potential of games. Technologically less sophisticated 1995 classics such as Command & Conquer, Rayman, and Street Fighter Alpha were based on simple concepts and a set of already known formulas—all well made with attention to detail, addictive and entertaining, but highly predictable with shallow plots and linear game play.

By 1995 the internet was largely privatized and was shaping up to become a vast global infrastructure. 10"A Brief History of NSF and the Internet," National Science Foundation website, 13 August 2003, https://www.nsf.gov/news/news_summ.jsp?cntn_id=103050 (accessed 23 February 2017). The hints of an alternative, postmodern online social reality one could briefly experience in the 1980s and early 1990s had slowly drawn to a close and begun to segue into the normative mainstream version of network culture that most of us know today. For the video game industry, the mid-1990s marked a tipping point because of new technological infrastructures creating possibilities of combining the high-end graphics of offline gaming with online multiplayer experiences. Massively Multiplayer Online Role-Playing Games (MMORPGs), like Neverwinter Nights (1991) and Ultima Online (1997), were built on a tradition of fantasy hack-and-slash MUDs with a social component. However, they were graphical and made to please a large market; their participant numbers grew rapidly along with the massive spread of the internet. Similarly, Linden Lab's Second Life (2003)—which the company insists is not a game—might be considered a direct successor of socially themed MUDs, where the users' primary goal is not to beat a game, but to experience or inhabit a virtual world. From 2004 onward, MMORPG World of Warcraft took a globally networked gaming experience to yet another level, with player numbers quickly reaching millions.

Menu screen for the video game Neverwinter Nights, 1991 release.

***

We control the game. But a game also grips us: we play, and are played by the game as well. 11Johannes Kirsten in: Olaf Nicolai and Jan Wenzel, Four Times Through the Labyrinth (Leipzig and Zurich: Spector Books, 2012), 149.

In 2015 the game industry grossed $23.5 billion in digital game sales in the United States alone, 12Chris Morris, "Level up! Video Game Industry Revenues Soar in 2015," Fortune (16 February 2016), http://fortune.com/2016/02/16/video-game-industry-revenues-2015/ (accessed 23 February 2017). compared with $7 billion in 1994. 13John Markoff, "Sony Starts a Division To Sell Game Machines," The New York Times, 19 May 1994, http://www.nytimes.com/1994/05/19/business/company-news-sony-starts-a-division-to-sell-game-machines.html (accessed 23 February 2017). According to The Entertainment Software Association, in more than 60 percent of American households there is at least one person playing video games for more than three hours per week, with a gender ratio of 59 percent male and 41 percent female. 14Essential Facts About the Computer and Video Game Industry (Washington, DC: Entertainment Software Association, 2016), PDF e-book, http://essentialfacts.theesa.com/Essential-Facts-2016.pdf (accessed 23 February 2017). The increasingly vast distribution of video games supports game designer and theorist Ian Bogost's hypothesis about the exceptional persuasive power of video games: "In addition to becoming instrumental tools for institutional goals, videogames can also disrupt and change fundamental attitudes and beliefs about the world, leading to potentially significant long-term social change." 15Ian Bogost, Persuasive Games: The Expressive Power of Videogames (Cambridge, MA: MIT Press, 2007), ix. But the number of games that are noticeably aware of this social agency, aiming at anything but putting commercial success first, remains comparatively low and barely visible—a problem that the New York-based nonprofit Games for Change tries to counteract. According to Asi Burak, chairperson of Games for Change, "Opposition to sophisticated critique of videogames tends to come from within the gaming industry itself […]. It's the nature of gaming to be edgy and anti-establishment. It's a young industry. It saw rapid commercial success [and it is] historically an underground kind of field, not used to a spotlight that could reveal flaws alongside beauty." 16Jessica Conditt, "Video games can drive social change, if they grow up first," Engadget (22 April 2015), https://www.engadget.com/2015/04/22/games-for-change-asi-burak/ (accessed 23 February 2017).

Video games are still often stigmatized as "low culture" and are consequently absent from most "high culture" contexts, such as the art world or the scope of critical academic discourse. The technical complexity of producing up-to-date video games, along with widespread unfamiliarity with the workings of the medium, could also explain its still-narrow employment. Looking back at the MUD era, however, demonstrates how significant innovation can emerge almost from scratch—albeit perhaps in the safe space of a university—and that a mammoth-like industry does not necessarily have to be its predestined birthplace. Ultimately, combining the noncorporate freedom and open spirit of early online gaming with today's advanced digital infrastructures, distributed technology, and experience could lead video games to the next level—as emergent possibility spaces, and as unconditional media for collaborative experiment and creation.