Hermeneutics and analytics

“Language is easy to capture but difficult to read”, in the words of the poet and media researcher John Cayley.1 Cayley wrote this sentence merely as a footnote to an essay on his “terms of reference”, yet it sums up the whole dilemma of so-called “big data” processing. Data “analytics” deals with the same structural problem that the oracle priests of Delphi tried to solve: how to make sense out of an endless stream of (drug-induced) gibberish? Delphi became one of the birthplaces of hermeneutics, the theological-philological discipline of exegesis: without expert interpretation, first through priests, later through philologists, gibberish would have remained gibberish. Literary studies secularized hermeneutics in the 19th century, and Freud’s psychoanalysis – the close reading of the gibberish captured from a patient’s subconscious – made it a medical and thus an applied science. Intelligence agencies, investment banks and internet companies turned analysis into analytics.2 In order to quickly make sense of captured data, computer analytics had to take shortcuts on the way from capturing to reading: jumping from syntax to pragmatics, operationalizing and thus simplifying semantic interpretation along the way.

Computational analytics – whether performed by intelligence services, on stock markets or on web server logs – is limited to what can be expressed as quantitative-syntactical operations to be performed by algorithms. This conversely changes the perspective on the gibberish. Rather than a narrative in need of exegesis, it is now a data set in need of statistics. As Johanna Drucker pointed out,

“the abandonment of interpretation in favor of a naïve approach to statistical [analysis] certainly skews the game from the outset in favor of a belief that data is intrinsically quantitative – self-evident, value neutral, and observer-independent. This belief excludes the possibilities of conceiving data as qualitative, co-dependently constituted”.3

Yet it could be argued that data is always qualitative, even when its processing is quantitative: this is why algorithms and analytics discriminate (see Steyerl, Chun in this volume).

Crisis Computing4

A staple of Fluxus festivals in the 1960s was Emmett Williams’ Counting Songs (1962), which consisted of the artists on stage counting the audience members one by one. Aside from being early pieces of performance art and poetry, minimal music and concept art, they also served the pragmatic purpose of obtaining “an exact head count to make sure that the management [of the festival venues] wasn’t cheating us”.5 With the same shortcut from instruction to pragmatics as in today’s computer analytics, Williams’ score was thus a simple data-mining algorithm. The semantic interpretation of the piece was left to the audience, which in the 1960s was likely to have read the piece as absurd theater in the tradition of Ionesco and Beckett, rather than as a musical-poetic performance in the tradition of John Cage’s and La Monte Young’s event scores. Today’s audiences might be inclined to associate the Counting Songs with the counting of individuals in other confined spaces such as kindergartens, aircraft and refugee camps. Like other Fluxus pieces, the Counting Songs have commonly been read as participatory artworks, since they cannot exist by themselves but are structurally dependent on their audience. Yet they effectively establish and reinforce the various divides between the artist-composer, the performers who execute the score instructions, and the audience upon whom the score is performed. As data processing, the piece thus contains the hierarchy of programmer, program and data, while selling the same illusion of participation and interaction with which “interactive systems”, from computer games to social networking platforms, are being sold today. With their instruction code and performance, however, the Counting Songs openly expose this manipulation, like a Brechtian theater of the algorithm. (The Fluxus artist who most consistently worked in the medium of minimalist instruction scores coincidentally adopted the name George Brecht. Born George MacDiarmid, he had previously worked as a chemist conducting research and development on tampons at Johnson & Johnson.)
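To make the analogy concrete, the score can be rendered as actual program code. The following is a minimal sketch (my illustration, not a Fluxus document): the hierarchy of programmer (score author), program (performers executing instructions) and data (the audience) is literal in the code, and the head count is hypothetical.

```python
# A minimal sketch of the Counting Songs as a data-mining algorithm:
# the score is the program, the performers execute it, the audience is data.

def counting_song(audience):
    """Count audience members one by one, speaking each number aloud."""
    count = 0
    for member in audience:       # the audience reduced to countable items
        count += 1                # the performed gesture: one by one
        print(count)              # the "song": numbers spoken on stage
    return count                  # the pragmatic payload: a head count

# Hypothetical house of 87 spectators, checked against the gate receipts.
assert counting_song(["spectator"] * 87) == 87
```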

On the level of their pragmatics, the Counting Songs may be interpreted as an early piece of crisis computing. Williams recalls that

“[s]ometimes, there were more performers than spectators at these ‘public performances’. And sometimes, when the audience outnumbered the performers, the spectators took advantage of the situation. One night, students climbed up onto the stage, harried the performers, and tried to set fire to the score of my Opera. And once, during a performance, in Amsterdam, a girl tried to set Dick Higgins on fire”.6

The suspicion that managers tried to cheat the artists proved true, since “our share of the gate on the first night of the festival had been considerably smaller than the standing-room-only crowd had led us to expect”.7 As crisis computing, the Counting Songs thus enact the notion of “crisis” in its original Greek meaning (decision) as well as in its contemporary sense (state of exception). The songs perform decision-making through computing, with the purpose of regaining control in a state of exception. An inherent issue of the Counting Songs, however, is that as a fixed data-mining algorithm for computational analytics they must always anticipate the state of exception. They could only react to a crisis scenario with which the Fluxus artists were already familiar, and which predictably repeated itself at each new festival location. But how can a state of exception live up to its name when it has become predictable? How would the Counting Songs deal, for example, with an overnight Brexit in which the Fluxus artists would lose their permit to commercially perform as foreigners? How would the Songs deal with a sudden monetary crash that invalidates all cash, leaving people only with the possibility to pay for online services through crypto-currencies? How would they deal with non-paying refugees seeking shelter in a festival venue?

The reduction of audience members to countable numbers – data sets, indices – is thus a self-fulfilling prophecy of stability. Its production of numbers would remain perfectly self-referential, even if the counting instructions were riddled with bugs, or were combined with instructions from other scores (such as Takehisa Kosugi’s Music for a Revolution, which requires the performer to “Scoop out one of your eyes 5 years from now and do the same with the other eye 5 years later”8) in such a way as to result in interferences and unpredictable system behavior. Today, such complexity nightmares have become everyday phenomena, from computer crashes to Y2K bugs, and popular fiction such as the RoboCop character (in Paul Verhoeven’s original 1987 film) whose circuits simply shut down when his programmed instruction to arrest criminals conflicts with another programmed instruction to never arrest board members of Omni Consumer Products, the company that constructed him and that runs Detroit’s privatized city administration and police force.9
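The RoboCop example can be stated in a few lines. The toy sketch below is an invented illustration (not code from the film or any cited source) of two individually executable directives that become jointly unsatisfiable for one class of input, at which point the only remaining behavior is shutdown:

```python
# Two hard-coded directives that conflict for one class of input.

def directive_arrest(suspect):
    return "arrest"                        # arrest criminals

def directive_protect_ocp(suspect):
    if suspect["ocp_board_member"]:
        return "never arrest"              # the classified counter-directive
    return None

def act(suspect):
    orders = {directive_arrest(suspect), directive_protect_ocp(suspect)} - {None}
    if len(orders) > 1:                    # contradictory instructions
        raise SystemExit("directive conflict: circuits shut down")
    return orders.pop()

act({"ocp_board_member": True})            # terminates, like RoboCop's circuits
```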

Common wisdom in crisis computing is to increase the complexity of algorithms so that systems can cope with the complex realities they encounter. The instruction set for Williams’ Counting Songs could be extended to also include behavioral rules for Brexit and other states of exception, or to cope with a fascist regime under which counting people has become the privilege of private warfare contractors. What becomes of performance art, with its implicit program of disrupting static social situations, when it has to operate in situations of maximum social disruption? How could a Fluxus score be performed in a territory overwhelmed by drone warfare or controlled by gangland criminality?

The popular narratives for these scenarios are, of course, not to be found in Fluxus. From 2005 to 2010, CBS television broadcast the series NUMB3RS with plots revolving around modern mathematics being applied to solve crimes.10 The show’s two main characters were an FBI agent and his brother, a professor of applied mathematics who becomes drawn toward police work through his tireless invention of algorithms that predict behavioral patterns of crime suspects and the probability of future crime scenes. When the show first aired, the term “big data” had not yet been coined. There were, however, historical precursors to algorithmic law enforcement. When the bombings and kidnappings of the extreme-left Baader-Meinhof group reached a climax in West Germany in 1977, Federal Criminal Police director Horst Herold ran population databases through mainframe computers in order to narrow down the list of terrorist suspects. The Hamburg-based punk band Abwärts (“Downward”) reacted to this in 1980 with their song “Computerstaat” (“Computer State”) whose lyrics, translated into English, read as follows:

Monday, someone knocks on your door
Arafat stands on the floor
Tuesday, test alarm
Paranoia in the tram
Wednesday, war’s become very cool
Brezhnev lurks in your swimming pool
Thursday, you knew it
A thousand spies in the sewerage
Friday’s owned by the mafia
Your ravioli comes from Florida
Saturday night, madness is loose
The KGB in the German woods
Sunday, all is toast
World War’s begun on Mallorca’s coast
Stalingrad, Stalingrad
Germany catastrophe state
We live in the computer state
We live in the computer state
We live in the computer state.11

The LP on which the song was released ends with a sound sample of Horst Herold warning Baader-Meinhof members that they would eventually crack under the pressure of the police manhunt against them. The final statement of his speech, “wir kriegen sie alle” – “we’ll get them all” – is pressed into an endlessly repeating lock groove on the record. This way, the analog audio medium emulates the cybernetic feedback loop of a computerized dragnet search.

Not much seems to have changed between 1977 and 2017 in the use of technology and the state of world affairs, if one replaces Arafat with the Islamic State of Iraq and Syria (ISIS), Brezhnev with Putin, the KGB with the FSB and perhaps Stalingrad with 9/11. Predictive policing had already been imagined much earlier, notably in Philip K. Dick’s 1956 short story Minority Report. The story’s film adaptation by Steven Spielberg in 2002 featured three-dimensional computer interfaces which likely paved the way for the visual aesthetics and mainstream television success of NUMB3RS in 2005. On the surface, NUMB3RS might have seemed no more than an updated version of the 1950s radio and television show Dragnet; the police method featured in Dragnet, of searching for criminals by gradually narrowing down lists of suspects, was itself updated in real life in 1970s Germany using mainframe computers for dragnet searches, a method strongly advocated by Horst Herold and reflected in Abwärts’ song Computerstaat. In Minority Report, predictive policing was pure science fiction with no basis in real technology. But NUMB3RS for the first time presented modern computer-based analytics in each of its episodes. The formulas, statistics and algorithms in NUMB3RS were neither old-school database searches, nor Hollywood smoke and mirrors, but genuine mathematics and fairly realistic cases of modern “big data” analytics. Wolfram Research, the developers of the Mathematica software package and the Wolfram Alpha search engine, were employed as the show’s scientific consultants to make sure that all the mathematics presented in the episodes was real and that the algorithms and visualizations could work. The producers of the series were the brothers Ridley and Tony Scott, whose feature films Black Hawk Down (2001) and Top Gun (1985) were about modern warfare and had been produced with direct support from the U.S. military (and in the case of Top Gun, also with financial support from the U.S. Department of Defense); conversely, Tony Scott’s 1998 film Enemy of the State presented a dystopian, technologically realistic scenario of NSA communication surveillance.12

Whether or not NUMB3RS should be read as an early 2000s military-industrial sales pitch for 2010s big data and predictive policing technology, the analytics of each episode lends itself perfectly to critical review by civil rights activists as well as digital humanities scholars. Today, it is a widely reported fact that data sets and algorithms, or the combination of both, can and do discriminate. In 2016, an op-ed piece in the New York Times called for the need to “Make Algorithms Accountable” in relation to algorithmically computed “risk scores” for borrowers and prospective criminals.13 In an article for the same newspaper, Kate Crawford, researcher at Microsoft, referred to this as “A.I.’s [= artificial intelligence’s] White Guy Problem”:

“Sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many ‘intelligent’ systems that shape how we are categorized and advertised to.

Take a small example from last year: Users discovered that Google’s photo app, which applies automatic labels to pictures in digital photo albums, was classifying images of black people as gorillas. Google apologized; it was unintentional.

But similar errors have emerged in Nikon’s camera software, which misread images of Asian people as blinking, and in Hewlett-Packard’s web camera software, which had difficulty recognizing people with dark skin tones.”14

Crawford also mentions predictive policing as problematic, since “software analyses of large sets of historical crime data are used to forecast where crime hot spots are most likely to emerge”, thus “perpetuating an already vicious cycle” with “more surveillance in traditionally poorer, nonwhite neighborhoods, while wealthy, whiter neighborhoods are scrutinized even less”.15
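The “vicious cycle” Crawford describes can be simulated in a few lines. The sketch below is a deliberately crude toy model (all numbers and the quadratic patrol allocation are my assumptions, not any vendor’s actual system): patrols are allocated where crime has been recorded, but crime only enters the record where patrols are present, so an initial disparity in the data amplifies even though the underlying rates never differ.

```python
# Toy model of the predictive policing feedback loop (hypothetical numbers).

true_rate = {"A": 10.0, "B": 10.0}   # identical true crime rates
recorded  = {"A": 12.0, "B": 8.0}    # skewed historical records

for year in range(5):
    # "hotspot" allocation: patrols concentrate where records are densest
    weight = {n: recorded[n] ** 2 for n in recorded}
    total = sum(weight.values())
    patrols = {n: weight[n] / total for n in weight}
    # next year's records depend on where police were looking
    recorded = {n: true_rate[n] * patrols[n] * 2 for n in recorded}
    print(year, {n: round(v, 1) for n, v in recorded.items()})

# After a few iterations nearly all recorded crime (and patrolling) sits in
# neighborhood "A", although "A" and "B" were never actually different.
```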

When in 2005 the pilot episode of NUMB3RS featured crime hotspot mapping through mathematical formulas implemented into computer algorithms, this was presented as the convergence of police work and clean-room lab science. The reality of the technology, however, is not quite as spotless. In 2016, the American non-profit investigative journalism platform ProPublica found that “[t]here’s software used across the country to predict future criminals. And it’s biased against blacks”.16 Surveying the algorithmically computed “risk scores” of more than 7,000 people arrested in Broward County, Florida in 2013 and 2014, ProPublica concluded that the “score proved remarkably unreliable in forecasting violent crime: Only 20 percent of the people predicted to commit violent crimes actually went on to do so.” The algorithm “was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. […] White defendants were mislabeled as low risk more often than black defendants”.17 Furthermore, the algorithm that assessed the risk score was not developed by the police or by any other government agency, nor was it published; rather, it was developed and kept as a trade secret by the private company Northpointe (a subsidiary of the Canadian Volaris Group) whose stated mission is “to improve correctional decision making at the level of individual offender case decisions, and at the level of system-wide policy, planning, and program evaluation”.18
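The asymmetry ProPublica found comes down to two simple ratios computed separately per group: the false positive rate (labeled high risk but did not reoffend) and the false negative rate (labeled low risk but did reoffend). The sketch below uses invented confusion-matrix counts, chosen only to have roughly the shape the article reports, not ProPublica’s actual data:

```python
# Group-wise error rates of a risk score (hypothetical counts).

def error_rates(high_risk_no_reoffense, all_no_reoffense,
                low_risk_reoffense, all_reoffense):
    fpr = high_risk_no_reoffense / all_no_reoffense   # falsely flagged high risk
    fnr = low_risk_reoffense / all_reoffense          # wrongly rated low risk
    return fpr, fnr

black_fpr, black_fnr = error_rates(805, 1795, 532, 1903)   # invented counts
white_fpr, white_fnr = error_rates(349, 1488, 461, 966)    # invented counts

print(f"black defendants: FPR {black_fpr:.0%}, FNR {black_fnr:.0%}")
print(f"white defendants: FPR {white_fpr:.0%}, FNR {white_fnr:.0%}")
# With counts of this shape, black defendants are falsely flagged at roughly
# twice the white rate, while white defendants are more often mislabeled low
# risk -- the asymmetry the investigation describes. A score can show this
# pattern even when its overall accuracy is equal across groups.
```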

In practice, predictive policing programs extend to a principle of tightly policing neighborhoods identified through analytics as crime hotspots. In 2014, a spokesperson for the American Civil Liberties Union called this principle “guilt by association”: “Because you live in a certain neighborhood or hang out with certain people, we are now going to be suspicious of you and treat you differently, not because you have committed a crime or because we have information that allows us to arrest you, but because our predictive tool shows us you might commit a crime at some point in the future.”19

Positivism dispute redux

The MIT Technology Review, a periodical whose overall perspective on technology tends to be optimistic and trusting, published in 2016 an article on how artificial intelligence analytics “Reveals the Hidden Sexism of Language”.20 A neural network trained with mainstream news media articles as its data set would answer the question “father : doctor :: mother : x” with “x = nurse” and “man : computer programmer :: woman : x” with “x = homemaker”.21 The problem is not only in the semantic bias of the data set, but also in the design of the algorithm that treats the data as unbiased fact, and finally in the users of the computer program who believe in its scientific objectivity.
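The mechanism behind such analogy answers is plain vector arithmetic: the model returns the vocabulary word whose vector lies closest to vec(“programmer”) − vec(“man”) + vec(“woman”). The sketch below uses made-up two-dimensional vectors, not the trained news embedding from the study, to show how a biased geometry produces the biased answer:

```python
# Toy word-embedding analogy: nearest neighbor to b - a + c (invented vectors).
import numpy as np

# Hypothetical embedding: axis 0 encodes the gender skew, axis 1 "profession".
vocab = {
    "man":        np.array([ 1.0, 0.0]),
    "woman":      np.array([-1.0, 0.0]),
    "programmer": np.array([ 0.9, 1.0]),   # placed near the "man" pole
    "homemaker":  np.array([-0.9, 1.0]),   # placed near the "woman" pole
    "nurse":      np.array([-0.5, 0.9]),
}

def analogy(a, b, c):
    """Solve 'a : b :: c : x' by cosine similarity to vec(b) - vec(a) + vec(c)."""
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: np.dot(candidates[w], target) /
               (np.linalg.norm(candidates[w]) * np.linalg.norm(target)))

print(analogy("man", "programmer", "woman"))   # -> "homemaker"
```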

The issue of discrimination against, and even the killing of, people based on hidden biases in computing is nothing new. The 1982 book The Network Revolution by the computer scientist Jacques Vallee begins with the following account:

“On Friday, 9 November 1979, at 10 p.m., three young men driving on Highway 20 stopped at a gas station in Etampes, near Paris. […] Mr. Nicolas, the service station operator, took a dim view of the tattered blue jeans, the leather jackets, the license number which did not look right because it was patched up with bits of black tape. […] Nicolas […] called the police to report the ‘suspicious’ car and its even more disreputable occupants. In Etampes, police officers went to the computer terminal linking them with the central file of the Interior Ministry, in Paris, a file whose very existence had recently been denied by a Cabinet member. In response to a brief flurry of commands, the police entered the car’s license number into the computer’s memory for checking against its data bank. The system soon flashed its verdict: the vehicle was stolen. […] A special night brigade was dispatched. The white and black police Renault intercepted the Peugeot driven by Francois at a red light. […] The only police officer in uniform stayed inside the Renault: the other two, in civilian clothes, got out. One of them covered the Peugeot with his machine gun at the ready. The other stood in front of the suspect’s car and armed his .357 Magnum. […] A moment later, a shot rang out. The bullet went through the windshield and hit Claude’s face just under the nose. […] Subsequent investigation disclosed that the car belonged to Francois, who had bought it, legally, ten days before. It had indeed been stolen in 1976, but it was soon recovered by the insurance company, which sold it to the garage where Francois bought it. The computer file had never been updated to reflect the change in the status of the car. The central police records still regarded it as stolen property.”22

Compared with 1970s/1980s database dragnets, contemporary big data analytics have become even more speculative, since their focus is no longer on drawing conclusions for the present from the past, but on guessing the future, and since they no longer target people based on the fact that their data matches other database records, but on more speculative statistical probabilities of environmental factors and behavioral patterns. Whether human-created (and hence human-tainted) data is to be blamed for discrimination, or the hidden assumptions hard-coded into the algorithms employed for processing this data – or whether machine-generated data can even be biased – all these cases confirm Cayley’s observation that language is “easy to capture but difficult to read”; that each operation of automated analytics involves shortcuts from capturing to execution, from syntax to pragmatics, leaving behind semantics and thorough critical interpretation as their collateral damage. This is as much illustrated by the news story mentioned above as by each episode of NUMB3RS, which in forty-five minutes covers, besides a crime and its resolution, the finding of a mathematical model for a particular crime and the translation of that model into an algorithm and computer program (alongside such trivia as the brothers’ conflicts with each other and with their father, and one brother’s relationship with his grad student).

Critical discussions of data analytics, such as in the present publication, inevitably re-enact the positivism dispute of 1960s continental European social sciences.23 Its two main adversaries were the Frankfurt School, with its orientation towards the hermeneutic humanities, and Karl Popper, who argued in favor of a common methodological orientation of the social and natural sciences towards problem-solving.24 Popper, however, still distanced his position from pure quantitative science by insisting that “insight neither begins with perceptions or observations, nor with collection of data or facts, but it departs from problems”.25 In light of this dispute, the 21st-century shift from interpretation towards analytics, and from problems towards data, amounts to a much more radical positivism than either Adorno or Popper imagined. Arguing against Popper and empirical sociology, Habermas stated in 1963 that

“[t]he analytical-empirical modes of procedure tolerate only one type of experience which they themselves define. Only the controlled observation of physical behaviour, which is set up in an isolated field under reproducible conditions by subjects interchangeable at will, seems to permit intersubjectively valid judgments of perception.”26

From this perspective, the issues that Crawford and others observed in big data and artificial intelligence analytics are not limited to biases and skewed parameters within empirical “controlled observation” – for which the authors of the MIT Technology Review article propose, in all seriousness, a de-skewing algorithm.27 Rather, the bias lies in the setup as such, the “experience which they themselves define” (to again quote Habermas), which therefore involves a priori choices and decisions as well as unacknowledged biases. Interpretation hence constitutes the setup, while at the same time being disclaimed by the analysts. Hermeneutics, in other words, is always at work in analytics, though it is rarely acknowledged as such. The art theoretician Boris Groys identifies the internet corporations’ business model of collecting their users’ personal information – including “interests, desires and needs” – as a “monetization of classical hermeneutics” in which “hermeneutic value” becomes a “surplus value”.28 Groys effectively blends the Frankfurt School’s 1940s critique of the culture industry with its 1960s critique of positivism, reflecting the early 21st-century status quo in which Silicon Valley has replaced Hollywood as the epitome of the creative industries, with analytics of user-generated content rather than content production as its (multi-billion dollar) business model.
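For what it is worth, the “de-skewing” proposal mentioned above roughly amounts to a geometric operation (this is a simplified reading, not the authors’ exact algorithm): estimate a “gender direction” from word pairs such as he/she, then subtract each word vector’s component along it. The sketch also makes visible that the repair is itself interpretative: someone must decide which word pairs define the bias.

```python
# Simplified embedding "de-skewing": project out an estimated bias direction.
import numpy as np

def debias(vectors, pole_a, pole_b):
    """Remove each vector's component along the (pole_a - pole_b) axis."""
    g = vectors[pole_a] - vectors[pole_b]
    g = g / np.linalg.norm(g)                     # unit "gender direction"
    return {w: v - np.dot(v, g) * g for w, v in vectors.items()}

vectors = {
    "he":         np.array([ 1.0, 0.1]),
    "she":        np.array([-1.0, 0.1]),
    "programmer": np.array([ 0.6, 0.8]),          # skewed toward "he"
}
clean = debias(vectors, "he", "she")
print(clean["programmer"])                        # -> [0.  0.8], skew removed
```

The choice of the pole words – an a priori decision about what counts as bias – is exactly the kind of interpretation that, as argued above, constitutes the setup.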

Since an objective analytics, devoid of any interpretation and thus of any bias, does not exist, hermeneutics creeps in through the back door of analytics. This already begins at the point where data is captured, since almost any type of data acquisition requires subjective decision-making (for example, concerning the digital representation of color in scanned images).29 Such technical-operational decisions become political when, for example, they concern the accuracy of skin-tone reproduction – a problem that is not new but already existed in the days of analog film, when filmmakers (including Jean-Luc Godard) boycotted Kodak because the company’s color and dynamic range calibration of film stocks was optimized for the reproduction of white skin and left black actors’ faces underexposed.30 In addition, data acquisition introduces its own artifacts – such as lens and microphone distortion, video and audio noise – whose retroactive filtering requires interpretative, often aesthetic decisions. Operators are interpreters. Though the interpretation of data – like a musician’s interpretation of sheet music – may be more confined than, for example, the interpretative reading of a novel, it is structurally no less hermeneutic.

From capturing to reading data, interpretation and hermeneutics thus creep into all levels of analytics. Biases and discrimination are only the extreme cases that make this mechanism most clearly visible. Interpretation thus becomes a bug, a perceived system failure, rather than a feature or virtue. As such, it exposes the fragility and vulnerabilities of data analytics. Analytics and hermeneutics thus relate to each other like the visible front-end and the invisible back door in a piece of software, i.e. the kind of “backdoors” that remote attackers can exploit in order to gain control of a system. Hermeneutics also becomes a back-door practice in a libidinous sense. Not only does any network interface, as Chun (2016) pointed out, “act promiscuously”, and not only does the internet leak by design;31 the fact that this promiscuity occurs on the level of technical automation (of network hardware as well as software) conversely obscures interpretative agency, including that of the intelligence and law enforcement agencies and intellectual property law firms that intercept and judicially interpret the network communications of surveilled individuals. Since this promiscuity does not happen on the front-ends but on the back-ends, through the back doors and sometimes in the darkrooms of the internet, it is clandestine promiscuity and stigmatized hermeneutics; its practitioners will rarely come out of the closet the way Edward Snowden did. Historically, there may never have been as much interpretation going on as there is in the age of analytics, yet this paradoxically coincides with a blindness to the subjective viewpoints involved.

Drucker, too, insists on the crucial role of interpretation in the analysis (and visualization) of data, except that she is more optimistic, treating the humanities perspective as a necessity rather than as some repressed back-door expression. She argues that the

“[…] natural world and its cultural corollary exist, but the humanistic concept of knowledge depends upon the interplay between a situated and circumstantial viewer and the objects or experiences under examination and interpretation. That is the basic definition of humanistic knowledge, and its graphical display must be specific to this definition in its very foundational principles. The challenge is enormous, but essential, if the humanistic worldview, grounded in the recognition of the interpretative nature of knowledge, is to be part of the graphical expressions that come into play in the digital environment.”32

The paradox of big data is that it both affirms and denies this “interpretative nature of knowledge”. Just like the Oracle of Delphi, it is dependent on interpretation. But unlike the oracle priests, its interpretative capability is limited by algorithmics – so that the limitations of the tool (and, ultimately, of using mathematics to process meaning) end up defining the limits of interpretation. Similarly to Habermas, Drucker sees the danger of “ceding the territory of interpretation to the ruling authority of certainty established on the false claims of observer-independent objectivity”.33 This applies to her example of the visual perspective in which the graph of an epidemic is drawn just as much as to the interpretation of criminological data in alleged “hotspot” neighborhoods.

The territory of interpretation thus becomes a battleground between quantitative analytics and critical theory humanities – the latter’s mode of operation being always hermeneutic in the broad sense of being interpretative, discursive and not privileging quantitative methodology, regardless of whether it sails under hermeneutic, structuralist or materialist, humanist or post-humanist flags, and regardless of the debates between these schools. The question as to whether there is any qualitative difference between analytics and interpretation ultimately addresses the viability of artificial intelligence. If analytics can, hypothetically, render interpretation obsolete, then algorithms should ultimately be able to replace most sociologists, critics and humanities scholars – or, at least, to render obsolete their hands-on interpretative work and shift their profession towards research and development of data analytics algorithms.

The crapularity is here

Leaving aside all philosophical debates on artificial intelligence, current big data applications show that the viability of A.I. is not so much an epistemological issue as one of pure pragmatics. Whether or not A.I., or some types of A.I., are fundamentally flawed and unfit for their purpose, they will nevertheless be developed and used when they seem to get things done and when they deliver, most importantly, quantifiable results – such as a decrease in crime statistics (no matter the social and political side effects) or a cut in labor costs.

To put it in the words of one of A.I.’s most popular evangelists: The Singularity Is Near.34 But if the “singularity” is indeed near, this is not because machines or algorithms are becoming more intelligent (or just smarter, which is not the same thing), but because society is dumbing itself down in order to level the playing field between humans and computers. In order to close the gap between the ease of capturing and the difficulties of reading, culture and society must make themselves perfectly computer-readable. When autonomous cars cause lethal highway accidents because their computer vision mistakes a white truck for a street sign – which is what happened to the A.I. autopilot of a Tesla car on May 7, 2016 in Williston, Florida – then this almost exactly fulfills the “Don’t Drive Evil-ularity” scenario sketched in 2011 by the Postnormal Times researcher John A. Sweeney:

“Crash of Google-controlled robot car drives S&P to lower credit rating of USA, sending car loan rates and insurance premiums through the roof. Police suspect robot was watching Transcendent Man while driving”.35

In the case of the crashed Tesla car, it was actually the human driver who was watching a Harry Potter movie.36 The long-term solution is not to improve the pattern recognition algorithms of cars, an endeavor which is as prone to over-complexity and systemic failure as the extension of the Fluxus Counting Songs to crisis and catastrophe scenarios. Instead, all cars and highways could be redesigned and rebuilt in such a way as to make them failure-proof for computer vision and autopilots: by painting all cars in the same specific colors and fitting them with computer-readable barcode identifiers on all four sides, by designing their bodies within tightly predefined shape parameters to eliminate the risk of confusion with other objects, by redesigning all road signs with QR codes and OCR-readable characters and building in redundancies to eliminate misreading risks for computer vision systems, by straightening motorways to make them perfectly linear and moving cities to fit them, and by redesigning and rebuilding all cities to make them safe for inner-city autonomous car traffic.37 In addition, all buildings – residences, offices, factories, hotels, stations, airports – could be redesigned so they can be fully serviced (cleaned, maintained and front-desk-clerked) by robots; a much more realistic scenario than speculating on breakthroughs in artificial intelligence systems such as computer vision and robotics that would, sometime in the future, make robots fit for servicing existing buildings. (This scenario has countless precursors in popular science fiction, including Stuart Gordon’s 1996 movie Space Truckers, in which the protagonists transport square pigs that have been genetically modified to make more efficient use of limited spaceship cargo capacity.38)

Conversely, “legacy” buildings that cannot easily be serviced by robots would likely become a surcharge luxury of the rich who can still afford human services. The singularity scenario would further entail, for example, a redesign of all education as automated online courses with computerized tests and certificates, leaving brick-and-mortar schools only for those who can still afford the higher tuition. The “social credit” system, which China announced for its citizens in 2015, could become a worldwide model: each person’s online activities receive positive or negative scores based on their supposed social productivity (in China: support of Communist Party politics), with access to – for instance – higher education and mortgage loans becoming dependent on a good credit score.39 Globally implemented, all automata that provide services or goods could accept “social credit” as payment, so that this system could eventually replace traditional currencies. The “sharing economies” now provided by companies such as Uber and Airbnb could be scaled up to make them all-pervasive, allowing one to rent out all of one’s belongings, even for the shortest periods of non-use, as well as potential labor services. This would not so much be a means to generate surplus income as a socio-ecological austerity measure and a necessity for everyone (except the rich) to make ends meet. Such systems could, after all, be introduced by liberal politicians as ostensible measures against nationalist, racist and fascist backlashes in public opinion, promising liberal voters to fight prejudice and class or race privilege with a universal meritocracy based on objective (and thus fair) quantitative measurements.

The “singularity” described above could be achieved using today’s technology. It would not even require any further fundamental research in the field of machine cognition, or any algorithms and chips that do not yet exist. Software and hardware research could even be stopped in order to yield the additional benefit of standardization based on a few optimized machine designs mass-produced at lower cost, which would conversely allow for a greater number of chips to be included in everyday devices.

In his contribution to a 2011 collaborative document on Alternatives to the Singularity, the technology anthropologist Justin Pickard characterized the corresponding present state of affairs as the “crapularity”:

“3D printing + spam + micropayments = tribbles that you get billed for, as it replicates wildly out of control. 90% of everything is rubbish, and it’s all in your spare room – or someone else’s spare room, which you’re forced to rent through AirBnB”.40

The degree to which this dystopia has become our present-day reality can be monitored through the popular Twitter feed “Internet of Shit”, which currently has 125,000 followers.41 Under the motto “The Internet of Shitty Things is here. Have all of your best home appliances ruined by putting the internet in them!”, the microblog publishes – for example – Windows “blue screens of death” in elevators, ransomware messages on train station displays, and a car performing a software update on its central computer console while it is being driven.42

Whether crapularity or singularity, the differentiation of systems into such subcategories as “internet”, “artificial intelligence”, “machine vision” and “pattern recognition”, “big data”, “smart cities” and “internet of things” will likely soon become a thing of the past. These systems are converging in the same way in which Hans Magnus Enzensberger, in 1970, predicted the convergence of communication media – “news satellites, color television, cable relay television, cassettes, videotape, videotape recorders, video-phones, stereophony, laser techniques, electrostatic reproduction processes, electronic high-speed printing, composing and learning machines, microfiches with electronic access, printing by radio, time-sharing computers, data banks” – into “a universal system”.43 What sounded monumental then has now become banal, as could eventually be the case with the future convergence of analytics systems. Besides rendering obsolete such differentiations as those between big data, A.I. and smart cities, it is also likely to render obsolete the term “media” itself. The issue that information ceases to be a “difference which makes a difference”44 within technology is as old as McLuhan’s definition of media as “extensions of man”,45 which lacks any meaningful differentiation between “media” and other types of technology.

Fluxus showed, in 1962, how social network analytics and “social credit” can be computed using almost any technology, including the cheapest computational device of manual counting. In a 2004 interview with the curator Hans-Ulrich Obrist, Emmett Williams recalled how the artists disguised their control device as a friendly game. Counting audience members, Williams explains, meant that “[y]ou could touch them; you could have them write their names on the program, put a candy in everybody’s mouth. This way you had contact with the audience and at the same time could work out exactly how many people were there and demand our fair share of the money.”46 Contemporary design calls this “gamification”, and it has become a widely practiced method for creating “soft” or “nudging” control measures in public and private spaces.47

A Fluxus score written a year before the first Fluxus festival, La Monte Young’s Compositions 1961, consisted only of the instruction to “Draw a straight line and follow it”,48 thus anticipating the singularity of a society whose architectures and processes have been streamlined and simplified, even “zombified”, in order to be fully readable and serviceable by dumb bots. If machine-readability and human-readability, capture and analytics (as opposed to perception and interpretation), mark the difference between the “humanistic concept of knowledge” (Drucker, 2011) and A.I., then this difference reveals a fundamental problem of A.I.: its very concept is, to use a term from speculative realist philosophy, correlationist, since the word “artificial” dialectically references “natural”. The quality standard for A.I., and the “singularity” as predicted by its advocates, is how convincingly it measures up against natural (i.e. human) intelligence.

Since there is no firm definition or universally agreed-upon scientific theory of “intelligence”, one could just as well define intelligence as the capability to solve mathematical equations. Then the singularity would already have been reached with pocket calculators, or even with some mechanical entrance gate device that would have counted Fluxus festival visitors more efficiently than the Counting Songs did.49 The Faroese musician and artist Goodiepal, who from 2004 to 2008 taught his students at the Danish Institute for Electroacoustic Music (DIEM) to compose music for alien and artificial intelligences,50 therefore proposes to read A.I. as an acronym not for “artificial” but for “alternative intelligence”. If machine intelligence is indeed a different form of intelligence, then it can be observed and judged on the basis of its own merits, as opposed to a messianic waiting for a moment when it might equal or eclipse (weakly defined) human intelligence. This would even render obsolete the question as to whether or not machines can think – a question which in itself willfully glosses over the corresponding opposite question, “can humans think?”, posed by the former Fluxus artist (and Emmett Williams’ collaborator) Tomas Schmit in the year 2000.51

The singularity is here, but it is in fact a crapularity. Its crappiness (which includes crappy big data “analytics”) could be celebrated and enjoyed like other crappy culture, including television shows such as NUMB3RS and B movies such as Space Truckers. The problem, however, is that the crapularity is not a movie but has become daily life, and that its worst jokes are actually deadly.

Negative theologies of the subject

While A.I. has become “alternative intelligence”, the critical theory that Habermas defended against empirical positivism no longer seems to embody the human “alternative intelligence” that it was in the 20th century. Why, Hito Steyerl asked the author of the present text, can one analyze fascism all day long and no one cares?52 No one cares, it should be added, whether such analysis happens under Marxist or post-Marxist, feminist, postcolonial, post-structuralist, fundamental-ontological or object-oriented ontological, media-theoretical, speculative-realist, humanist or post-humanist denominations, since positivism boils all of these down to one undifferentiated “continental”, “non-empirical” and “speculative” discourse.

In the crapularity, “subjectivity” gains a renewed significance as soon as this subjectivity is no longer an issue of metaphysical vs. ontological thinking, but more generally of criticism vs. positivism. With her insistence on the “graphical expression of humanistic interpretation” as distinct from “the visual display of quantitative information as a close reading of a poem is from the chart of an eye tracker following movements across a printed page”,53 Drucker shows how the word “humanistic” can be salvaged even for those kinds of cultural and media studies that have been thoroughly informed by post-structuralism and subsequent schools of anti-metaphysical thinking.

Before the crapularity, any inclusion of “subjectivity” in “terms of media”, or more precisely: in information technology, seemed to be an oxymoron, since rejection (or at least criticism) of the humanist subject has been a common denominator of cybernetics, post-structuralism and most schools of materialism and feminism.54 The focus of media theory on technologies, rather than on their human creators, may in itself be seen as an anti-humanist statement. Terry Eagleton’s characterization of structuralism thus broadly applies to most media theory: it “is ‘anti-humanist’, which means not that its devotees rob children of their sweets but that they reject the myth that meaning begins and ends in the individual’s ‘experience’”.55 This intellectual tradition began with Darwin’s and Freud’s shattering of the subject’s autonomy and continued after the Second World War with cybernetics. In its close relatedness to psychological behaviorism, cybernetics understood human behavior as situated within control systems. In 1946, Heidegger – who was in the process of “turning”56 his fundamental ontology into a philosophy of technology – stated that “every humanism remains metaphysical” and as such obstructs ontological inquiry, even of humanity itself.57 What was primarily meant as a clarification of Heidegger’s philosophy in opposition to Sartre and his humanist misreading of Heidegger’s existential philosophy58 had a lasting impact on French post-structuralism and the media theory that subsequently borrowed from it.

When Michel Foucault declared the “death of man” in The Order of Things (1966),59 the death of God did not mean, as it did for humanism, his replacement by the human subject, but rather the death of the Christian god as well as of the humanist god-like subject. Kittler’s lifelong “exorcism of humanism from the humanities”, in which technology took the place of the historical subject, built upon Foucault while battling the remains of 19th-century idealism in continental European humanities.60 Anti-humanism became post-humanism when post-structuralist dystopias turned into cyber-utopias. Donna Haraway’s Cyborg Manifesto and N. Katherine Hayles’ How We Became Posthuman examined A.I. and Silicon Valley culture from a critical theory angle. Post-humanism turned what once had been negative theology into new utopias and new forms of gnosis.61 Contemporary critiques of correlationism62 and debates on the Anthropocene amount to a contemporary comeback of post-humanism, with a systems thinking that has shifted from 1990s cyber-utopias to 21st-century ecological dystopias.63

Did the anti-theologies of “the subject” simply create new theologies of “the system”? The post-structuralist critique of subjectivity was more differentiated than it is often given credit for. In What Is an Author?, Michel Foucault states that “suspicions arise concerning the absolute nature and creative role of the subject” while also insisting that “the subject should not be entirely abandoned. It should be reconsidered, not to restore the theme of an originating subject, but to seize its functions, its intervention in discourse, and its system of dependencies”.64 Subjectivity, in other words, is relative rather than absolute (as it was previously in humanism and romanticism). In speculative realism, objects conversely become independent from the human perspective; they are no longer “correlationist”. Slavoj Žižek criticizes this position, arguing that “the true problem is not to think pre-subjective reality, but to think how something like a subject could have emerged within it; without this (properly Hegelian) gesture, any objectivism will remain correlationist in a hidden way – its image of ‘reality in itself’ remains correlated (even if in a negative way) with subjectivity”.65 Like Drucker, Žižek insists on the human perspective when he states (referring to Lacan and Hegel) that “their problem is not ‘how to reach objective reality which is independent of (its correlation to) subjectivity,’ but how subjectivity is already inscribed into reality – to quote Lacan again, not only is the picture in my eye, but I am also in the picture”.66

The inscription of subjectivity into media – of perspective and pictures, whether or not machine vision is involved – needs no explanation when algorithmic processes produce racial, social and other biases. Most engineers might consider these an optimization problem, an issue of the platonic ideal of singularity vs. its crapularity in real life. Yet everyone who has ever coded a computer program, programmed a database or marked up a document knows that this constantly involves subjective decisions:67 for example, the criteria according to which input data is classified, sorted and categorized, including the multiple choice values for a person’s gender in an address database, or the interpretation of italic type as either “emphasis” (“<em>”) or “citation” (“<cite>”) when transcribing text from print to HTML. No algorithmic analytics can sensibly accomplish the latter; it will only be able to compute and heuristically apply the statistical norm. “If you want a vision of the future, imagine the past (artificially) extended forever” – this line from the 1986 zine SMILE (which had the unusual characteristic that anyone could publish a zine under the name SMILE), written by the artist and later internet entrepreneur John Berndt under the multiple-use pseudonym Karen Eliot, is a precognitive summary of the crapularity and its analytics.68
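The <em>/<cite> example can be stated as a one-line program. The sketch below (with invented corpus frequencies) shows what “heuristically applying the statistical norm” means in practice: the system ignores meaning entirely and emits whichever tag is most frequent in its training data.

```python
# Transcribing italics to HTML by statistical norm (invented frequencies).

CORPUS_FREQUENCY = {"em": 0.73, "cite": 0.27}   # hypothetical training data

def tag_italics(span: str) -> str:
    """Ignore the span's meaning; always emit the majority tag."""
    return max(CORPUS_FREQUENCY, key=CORPUS_FREQUENCY.get)

for span in ["really", "Moby-Dick"]:
    tag = tag_italics(span)
    print(f"<{tag}>{span}</{tag}>")
# Prints <em>really</em> (plausible) but also <em>Moby-Dick</em>, where a
# human transcriber, reading for meaning, would have chosen <cite>.
```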

However, programmed systems also help to define more precisely what differentiates “semantics” from “syntax” and interpretation from formal analysis. They thus bring to hermeneutics and structuralism, which only had vague definitions of these terms, an understanding of what these words really mean. Figures of speech, for example, can now be clearly understood as being subject to an interpretation that is difficult or impossible to formalize. Ambiguity and figurative speech mark the limits of what computer algorithms can analyze. Are Abwärts’ “Computerstaat” lyrics an affirmative, oppositional or cynical political statement? Even A.I. algorithms that determine the degree to which a statement is ironic based on semantic context would be thrown off track.69 Abwärts’ enumeration of the days from Monday to Sunday may thus provide data for Fluxus Counting Songs, but would be quite useless for political opinion polling.
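A toy polarity count makes the problem tangible. The sketch below (an invented lexicon, not a real opinion-mining system) can tally the ominous vocabulary of “Computerstaat”, but the resulting number is identical whether the song is meant affirmatively, oppositionally or cynically; the figurative register never enters the computation.

```python
# Naive lexicon-based polarity scoring of (translated) "Computerstaat" lines.

LEXICON = {"paranoia": -1, "spies": -1, "madness": -1, "war": -1,
           "catastrophe": -1, "mafia": -1, "cool": +1}

def polarity(text: str) -> int:
    return sum(LEXICON.get(word.strip(".,'?!"), 0)
               for word in text.lower().split())

line = "Paranoia in the tram. A thousand spies in the sewerage. Madness is loose."
print(polarity(line))   # -3: a score, but is it irony, protest or assent?
```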

20th-century structuralists such as Roman Jakobson still thought of figures of speech as a formal aspect of language, since they could be structurally described; a metaphor, for example, could be understood as a linguistic operation based on the principle of similarity.70 Metaphor was classified as “formal” because it could be made part of a systematics. Jakobson, and later 20th-century anti-humanism, thus maintained the romanticist notion of “subjectivity” as being antithetical to systems, discourses and apparatuses. In the crapularity, however, subjectivity needs to be de-romanticized. It can simply be defined as the agency and decisions – in other words: the politics – that make up these systems, discourses and apparatuses. To deny that these politics exist would be an extremist, if not fascist, form of post-humanism advocating post-politics and post-democracy.71

The invisible hand of openness

If the Fluxus Counting Songs were performed by machine, running forever as an autonomous, unobserved process, this wouldn’t take away the human agency and politics that went into their design. But their potential automation illustrates, perhaps counter-intuitively, the degree to which they are an open process – or, to use Umberto Eco’s term, an “open work” characterized by an internal “dialectics between work and openness”.72 For Eco, this dialectic is one of the traditional material characteristics of an artwork, which it still retains in order to remain dialectical, as opposed to the processuality of modern art, for example in action painting.73 In the case of the Counting Songs, this would be the dialectics between (fixed) notation and (open) performance. Yet, as previously discussed, the closure (in the sense of non-openness) of the Counting Songs lies in their implicit assumptions about the situation – the kind of closure that would make a crapularity bot stoically perform the Counting Songs in a heap of post-nuclear ruins, counting people while they are being shot dead by drones, rendering the sum outdated even as it is being computed. Big data and network analytics, data mining and pattern recognition all suffer from this issue, since they apply predefined formulas to an alleged mass – and mess – of contingent phenomena and information, regardless of whether this information happens to be airport surveillance camera images, petabytes of intercepted e-mails, the sensor data of a “smart city” or the visitors of a Fluxus festival.

But whatever the type of analytics or interpretation may be, it relies on operating upon the open, the “great outdoors”,74 regardless of its technical limitation to seeing only what pre-inscribed search and correlation methods will tell it to see – and regardless of the risk of a crapularity bot data-mining minefields even as they are blowing up. “Openness” is where analytics and hermeneutics meet: “open data”, the sibling of big data, and the “open work” both imply an anti-scholastics that rejects pre-categorized and pre-hierarchized knowledge. When hermeneutics was still a theological discipline, its mere existence implied that the meaning of the scripture (whether Torah, Bible or Qur’an) was not literal and fixed, as the orthodoxies and fundamentalisms of the monotheistic religions hold, but rather subject to interpretation and, over the course of time, re-interpretation. This process secularized not only scripture but also hermeneutics itself, so that by the 19th century it had mutated into literary criticism.75

In the 1960s and 1970s, Eco was not the only literary theoretician to modernize hermeneutics and literary criticism, and to make “openness” (in the sense of open work as well as open interpretation) the key factor in this modernization. Roland Barthes advocated the “networks” and “galaxy of signifiers” in the “indeterminable” codes of the “writerly” text,76 while Wolfgang Iser and Hans-Robert Jauß (building on previous work by Roman Ingarden) proposed a reader-response hermeneutics that focused on the gaps that artworks leave for their readers’ imagination.77 While these theories only addressed the aesthetics – the perception – rather than the media technology of text, they were nevertheless misread as technology blueprints in the hypertext literary criticism that to some extent preceded, and further accompanied, the emergence of the World Wide Web in the early 1990s.78 Around the same time, activism to make and keep the internet an “open” medium began in grassroots initiatives such as the Electronic Frontier Foundation. By the late 1990s and early 2000s, the concept of openness was extended to software (open source) and other media (open content), as well as academic publishing (open access) and data (open data).

From Eco’s Open Work in 1962 to the Open Government Data definition in 2007,79 “open” thus always meant “good”, or at least: “more interesting”. Openness provides more value for interpretation, whether for literary philologists or real-estate app developers using open government data to assess the potential market value of a neighborhood. For philologists as well as app developers, interpretative value translates into economic value as it helps keep them in business. (In this light, claims of the imminent “end of work” seem exaggerated.80) Both “open work” hermeneutics and open data analytics presuppose a culture and society that enables them while preventing closure (non-openness) through orthodoxy. They are thus close cousins to Popper’s general concept of the “open society”.81 Projected from science onto politics, Popper’s principle of falsification turns this open society into a market of competing ideas that are given the opportunity to prove each other wrong. On a dystopian level, this also creates a business model for the age of crapularity. Since falsification never ends (as opposed to Hegel’s and Marx’s historical dialectics), it amounts to an infinite license for the crapularity to carry on with crap analytics, crap results and crappy technology, keeping culture and society in a state of permanent system updates, error messages and software dependency hells where doors stop working because their remote control apps are no longer being maintained and where two bugs are fixed by introducing ten new ones.82

Popper’s open society, however, is not radically open, since it still differentiates between itself and “its enemies”: fascism, soviet communism, and their alleged precursors in political-philosophical utopias. It is, in other words, only open to the degree to which its systemic principle is not challenged. Openness thus only exists on the object level of what is being observed, not on the meta-level of the observation, where the organizing principle, “open society”, remains as fixed as the scores of “open works” such as the Counting Songs. Whenever “open” is used as a term to describe media – most typically as a prefix, as in open standards, open networks, open source, open access, open data – the same logic of immutability remains at work. Openness is standardized in policy documents (such as the Open Source Definition,83 the eight criteria of Open Government Data,84 the Open Content Definition,85 “gold” and “green” open access, and the comprehensive Open Definition86), making all these “terms of media” compliant with, and cybernetic heirs of, the Popperian liberal politics equation of open science, open markets and open society.

The myth underlying both these politics and the overall concept of open systems is their inherent self-regulation: their tendency towards “thermodynamic equilibrium” and their “equifinality” towards a “steady state” of the system, to quote Popper’s collaborator, the biologist and founder of General Systems Theory Ludwig von Bertalanffy.87 For Popper and Bertalanffy, these principles amounted to a general model of science, nature and politics in the cold war period. Ultimately, they are riffs on Adam Smith’s “invisible hand”. In a founding manifesto for the open source movement, the software developer Eric S. Raymond summed up this ideology as follows: “The Linux world behaves in many respects like a free market or an ecology, a collection of selfish agents attempting to maximize utility which in the process produces a self-correcting spontaneous order more elaborate and efficient than any amount of central planning could have achieved”.88 Tuned for equilibrium and self-regulation, the system is thus not open in the sense of being contingent or indeterministic; instead, it is meant to produce a desired outcome, with what one could call “liberal” variations. The same logic applies to “open work”, including aleatory musical composition, action painting, participatory art such as the Counting Songs (with their desired outcome of knowing the number of paying visitors) and contemporary community art and social design.

The people against post-humanism

For Popper’s open society and for open source software, the desired outcomes were a better society and better software respectively, through systemic processes that are by design self-organizing and self-optimizing.89 But just as the Counting Songs cease to produce sensible outcomes in a post-apocalyptic world and end up as no more than a formula running amok, open source software ended up as the technological back-end of the crapularity, with Linux, Apache, MySQL and PHP driving the commercial web and mobile devices (including, to name just a few examples, Google’s search engine, Gmail, YouTube, Facebook’s social networking platforms, Amazon’s online retail store, Android smartphones and tablets, Google’s Chromebooks, the Kindle e-reader and Tesla’s autopilot). The “open society” is now better known under the name coined by Popper’s Mont Pelerin Society collaborator Alexander Rüstow, “neoliberalism”,90 which has historically proven to be able to falsify anything but itself.

This explains the resurgence of fascism and other forms of populism in the context of the crapularity. On the basis of Carl Schmitt’s political theology, populism offers a more honest alternative to the existing regime: against equilibrium promises and crapular reality, the proposed antidote is the state of exception; against invisible hands, the remedy is decision-making as a virtue in itself, what Schmitt referred to as “decisionism”.91 In other words, the states of exception and decisionism that various “systems” (from international political treaties to big data analytics) and post-democratic powers currently conceal seem to become tangible and accountable again through populist re-embodiment. “Populism” could thus literally be read as the will to power against “the system” – not only against one specific system, but against the concept of system as such (including the way in which Popper’s “open society” positions itself). Contemporary populism is an attempt to regain agency for people against post-human ecologies, to literally put up the demos, the body of the people, against crapularities – whether on occupied squares or at fascist campaign rallies.

The tragedy, or farce, of this confrontation is that it often ends up as one form of fascism pitted against another: populist fascism against big data fascism. The algorithm that stigmatizes people of color with a higher crime risk and a lower credit score differs from a white supremacist – or, in continental Europe, “identitarian” – street rally only in its symbolic form, not in its semantics and pragmatics. Both can be based on the same crapularity analytics, since today’s populist street rallies are often the outcome of algorithms that bring like-minded people together in online social media echo chambers. Either way, subjectivity is destined to remain hard-coded into this analytics, even after humanity is literally (and not just figuratively) dead and gone.
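How subjectivity remains hard-coded can be illustrated with a deliberately crude sketch – the features, weights and zip codes below are hypothetical, invented for the demonstration and not taken from any actual scoring product: a risk score that never reads “race” as an input still reproduces racial discrimination, because the choice of features and weights – somebody’s subjectivity, frozen into code – lets place of residence and arrest records function as proxies.

    # Hypothetical risk-scoring sketch; features, weights and zip codes
    # are invented for illustration, not taken from any actual system.

    HIGH_PATROL_ZIPS = {"19121", "60624"}  # stand-ins for heavily policed, segregated areas

    def risk_score(prior_arrests: int, zip_code: str, employed: bool) -> float:
        """Compute a 'risk' score from ostensibly race-neutral inputs."""
        score = 2.0 * prior_arrests       # arrest counts mirror patrol density
        if zip_code in HIGH_PATROL_ZIPS:  # place of residence acts as a racial proxy
            score += 3.0
        if employed:
            score -= 1.0
        return score

    # Identical behavior, different neighborhoods, different scores:
    print(risk_score(prior_arrests=1, zip_code="19121", employed=True))  # 4.0
    print(risk_score(prior_arrests=1, zip_code="19003", employed=True))  # 1.0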

(for Rasheedah, Camae & Ras)


  1. Cayley, John. “Terms of Reference & Vectoralist Transgressions: Situating Certain Literary Transactions over Networked Services.” Amodern. http://amodern.net/article/terms-of-reference-vectoralist-transgressions/#rf21-2020 Web. 27 July 2016.

  2. And it also turned certain analysts into another kind of analyst: some languages, including German, now differentiate between “Analytiker” (a psychotherapeutic, philosophical or mathematical analyst) and “Analyst” (a stock market, business or data analyst).

  3. Drucker, Johanna. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5.1 (2011). http://www.digitalhumanities.org/dhq/vol/5/1/000091/000091.html. Web.

  4. I am reusing a term coined by Linda Hilfling Ritasdatter for the accompanying symposium to her exhibition Bugs in the War Room at Overgarden, Copenhagen, Denmark, May 2016.

  5. Williams, Emmett. My Life in Flux and Vice Versa. Stuttgart, London, Reykjavik: Edition Hansjörg Mayer, 1991. Print. 32

  6. Williams (1991), 32

  7. Ibid.

  8. Sohm, Hanns, Harald Szeemann, and Kölnischer Kunstverein. Happening & Fluxus: Materialien. Köln: Kölnischer Kunstverein, 1970. n.p.

  9. Verhoeven, Paul. RoboCop. 1987. Film.

  10. Scott, Ridley, and Tony Scott. “NUMB3RS.” TV series. Los Angeles: CBS, 2005–2010.

  11. “Monday there’s a knock at the door / And Arafat is standing next to you / Tuesday there’s a test alarm / Paranoia in the streetcar / Wednesday the war is very cold / Brezhnev lurks in the public pool / Thursday, you already know / A thousand agents in the sewers / Friday belongs to the Mafia / The ravioli come from Florida / Saturday night, insane asylum / The KGB in the German forest / Sunday everything is dead / In the Gulf of Mallorca world war looms / Stalingrad, Stalingrad / Germany, catastrophe state / We live in the computer state / We live in the computer state / We live in the computer state” (translated from the German lyrics of “Computerstaat”). Abwärts. Computerstaat. Vinyl 7″. Hamburg: ZickZack, 1980. Audio recording.

  12. Scott, Ridley. Black Hawk Down, 2002. Film. – Scott, Tony. Top Gun, 1986. Film. – Scott, Tony. Enemy of the State, 1998. Film.

  13. Angwin, Julia. “Make Algorithms Accountable.” The New York Times, August 1, 2016. http://www.nytimes.com/2016/08/01/opinion/make-algorithms-accountable.html. Web.

  14. Crawford, Kate. “A.I.’s White Guy Problem.” The New York Times, 26 June 2016. Web.

  15. It would be worthwhile to research possible correlations between the surge in police shootings of black people since 2014 and the introduction of predictive policing programs in the United States (on the other hand, the availability of inexpensive media technology has surely increased the coverage of previously unreported incidents, so that correlations are difficult to draw). In their paper Police Killings of Unarmed Black People: Centering Race and Racism in Human Behavior and the Social Environment Content, the social work researchers Willie F. Tolliver, Bernadette R. Hadden, Fabienne Snowden and Robyn Brown-Manning argue that “the passage of laws like ‘stand your ground’ joined with policing strategies such as ‘broken windows,’ ‘stop and frisk,’ and ‘predictive policing’ (Eligon & Williams, 2015) results in Black and Brown people being exposed to surveillance by police, vigilantes, and the general public.” Tolliver, Willie F. et al. “Police Killings of Unarmed Black People: Centering Race and Racism in Human Behavior and the Social Environment Content.” Journal of Human Behavior in the Social Environment 26.3–4 (2016): 279–286.

  16. Angwin, Julia, Jeff Larson, Surya Mattu, and Lauren Kirchner. “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks.” ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. Web.

  17. Ibid.

  18. Northpointe. “Northpointe – About Us.” Northpointe, 2016. http://www.northpointeinc.com/about-us. Web.

  19. Eligon, John, and Timothy Williams. “Police Program Aims to Pinpoint Those Most Likely to Commit Crimes.” The New York Times 24 Sept. 2015. NYTimes.com. http://www.nytimes.com/2015/09/25/us/police-program-aims-to-pinpoint-those-most-likely-to-commit-crimes.html Web. 30 July 2016.

  20. Emerging Technology from the arXiv. “Neural Networks Are Inadvertently Learning Our Language’s Hidden Gender Biases.” MIT Technology Review. https://www.technologyreview.com/s/602025/how-vector-space-mathematics-reveals-the-hidden-sexism-in-language/. Web. 30 July 2016.

  21. Ibid.

  22. Vallee, Jacques. The Network Revolution: Confessions of a Computer Scientist. Berkeley, Calif.: And/Or Press, 1982. Print. 3-4. [Autobiographical note: the German edition of this book, published in 1984, introduced the author of the present text to network computing and its criticism.]

  23. A similar dispute existed in 1950s American political science over the school of behavioralism (not to be confused with behaviorism), whose advocacy of an empirical approach of “verification”, “quantification” and “pure science” was critiqued and rejected by, among others, Bernard Crick: Crick, Bernard. The American Science of Politics: Its Origins and Conditions. Berkeley: University of California Press, 1959. Print.

  24. Popper, Karl R. “Die Logik der Sozialwissenschaften.” Kölner Zeitschrift für Soziologie und Sozialpsychologie 14, no. 2 (1962): 233–248. Sixth Thesis, (a)

  25. “Knowledge does not begin with perceptions or observations or with the collection of data or facts; rather, it begins with problems” (“Die Erkenntnis beginnt nicht mit Wahrnehmungen oder Beobachtungen oder der Sammlung von Daten oder von Tatsachen, sondern sie beginnt mit Problemen”), Popper (1962), Fourth Thesis.

  26. Habermas, Jürgen. “The Analytical Theory of Science and Dialectics: A Postscript to the Controversy Between Popper and Adorno.” The Positivist Dispute in German Sociology. London: Heinemann, 1976. 134. Print.

  27. arXiv (2016)

  28. Groys, Boris. In the Flow. Verso, 2016. Print. 179-180

  29. Today’s optical sensor technology cannot capture the full range of color information present, for example, in Kodachrome slides and film negatives; therefore digitization requires a decision regarding the color gamut to be captured. After scanning, the captured color range needs to be additionally, and quite heavily, compressed in order to fit the even more limited color space and dynamic range of computer displays.

  30. “Film emulsions could have been designed initially with more sensitivity to the continuum of yellow, brown, and reddish skin tones, but the design process would have had to be motivated by a recognition of the need for an extended dynamic range. At the time film emulsions were developing, the target consumer market would have been ‘Caucasians’ in a segregated political scene”. Roth, Lorna. “Looking at Shirley, the Ultimate Norm: Colour Balance, Image Technologies, and Cognitive Equity.” Canadian Journal of Communication 34, no. 1 (2009): 118.

  31. Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. Cambridge, Mass.: The MIT Press, 2016. Print. 51

  32. Drucker, Johanna. “Humanities Approaches to Graphical Display.” 5.1 (2011): http://www.digitalhumanities.org/dhq/vol/5/1/000091/000091.html. Digital Humanities Quarterly. Web. 30 July 2016.

  33. Ibid.

  34. Kurzweil, Ray. The Singularity Is Near: When Humans Transcend Biology. New York: Viking, 2005.

  35. http://www.scribd.com/doc/62056338/Alternatives-to-the-Singularity, 2012, accessed through the Internet Wayback Machine, https://web.archive.org/web/20120916123714/http://www.scribd.com/doc/62056338/Alternatives-to-the-Singularity

  36. Levin, Sam, and Nicky Woolf. “Tesla Driver Killed While Using Autopilot Was Watching Harry Potter, Witness Says.” The Guardian 1 July 2016. https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter The Guardian. Web. 31 July 2016.

  37. A less rigorous version of this program was carried out in the redesign of Western cities to make them car-friendly after the Second World War.

  38. Gordon, Stuart. Space Truckers. 1997. Film.

  39. BBC News. “China ‘Social Credit’: Beijing Sets up Huge System.” BBC News. http://www.bbc.com/news/world-asia-china-34592186 Web. 30 July 2016.

  40. http://www.scribd.com/doc/62056338/Alternatives-to-the-Singularity, 2011, accessed through the Internet Wayback Machine, https://web.archive.org/web/20120916123714/http://www.scribd.com/doc/62056338/Alternatives-to-the-Singularity

  41. Anonymous. “Internet of Shit (@internetofshit).” Twitter, 2015. https://twitter.com/internetofshit. Web.

  42. Kawaguchi, Kohsuke. “Over the Air Update of a Toyota Car in Progress While the Car Is Driving. Wow!pic.twitter.com/54hMOr27Bj.” Microblog. @kohsukekawa, July 9, 2016. https://twitter.com/kohsukekawa/status/751614148715220992. Web.

  43. Enzensberger, Hans Magnus. “Constituents of a Theory of the Media.” The New Media Reader. Ed. Noah Wardrip-Fruin and Nick Montfort. Cambridge, Mass.: MIT Press, 2003. 261–275. Print.

  44. Bateson, Gregory. Steps to an Ecology of Mind. Chicago: University of Chicago Press, 1972. Print. 459

  45. McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964. Print.

  46. Obrist, Hans-Ulrich, Charles Arsène-Henry, and Shumon Basar. Hans Ulrich Obrist: Interviews. Charta Art, 2010. Print.

  47. A good example is the welcome gift handed out by public service workers to newborn children in some European countries, which also serves as a measure of identity control.

  48. Young, La Monte. “Compositions 1961.” Fluxus No. 1, U.S. Yearbox. New York: Fluxus Editions, 1962. Print.

  49. See Bruno Latour’s related discussion of the doorstop as a non-human actor performing a previously human task, in Latour, Bruno. “From Realpolitik to Dingpolitik or How to Make Things Public.” Making Things Public. Atmospheres of Democracy. Cambridge, Massachusetts: The MIT Press, 2005. 14–41. Print.

  50. “I wanted to teach my students how to make music for an artificial intelligence in the future, but I was told I was not allowed to do that. I said if I cannot do that I will leave. And I will not leave silently. This is academic war!”, Goodiepal in an interview by Yardumian, Aram. “A Gentleman’s War.” Times Quotidian. http://www.timesquotidian.com/2012/03/22/a-gentlemans-war/, 22 Mar. 2012. Web. 31 July 2016.

  51. Schmit, Tomas, Julia Friedrich, Museum Ludwig, and Sammlung Falckenberg. Tomas Schmit: Können Menschen denken? = Are Humans Capable of Thought? Köln: Museum Ludwig; Hamburg: PhoenixArt, Sammlung Falckenberg; Köln: Walther König, 2007. Print. 18-19

  52. Hito Steyerl’s comment to the first draft of this paper.

  53. Drucker (2011)

  54. Braidotti, Rosi. The Posthuman. Cambridge: Polity Press, 2013. Print.

  55. Eagleton, Terry. Literary Theory: An Introduction. Second edition. 1996. Print. 98

  56. German: “Kehre”

  57. “In defining the humanity of the human being, humanism not only does not ask about the relation of being to the essence of the human being; because of its metaphysical origin humanism even impedes the question by neither recognizing nor understanding it”. Heidegger, Martin. Pathmarks. Edited by William McNeill. Cambridge; New York: Cambridge University Press, 1998, 245. Print.

  58. German: “Existenzphilosophie”

  59. “It is no longer possible to think in our day other than in the void left by man’s disappearance. For this void does not create a deficiency; it does not constitute a lacuna that must be filled. It is nothing more, and nothing less, than the unfolding of a space in which it is once more possible to think”, Foucault, Michel. The Order of Things: An Archaeology of the Human Sciences. Psychology Press, 2002. Print. 373

  60. Kittler, Friedrich A, ed. Austreibung des Geistes aus den Geisteswissenschaften: Programme des Poststrukturalismus. Paderborn; München; Wien; Zürich: Schöningh, 1980. Print.

  61. Haraway writes that there is a “utopian tradition of imagining a world without gender”, in: New Media Reader, 516; while Hayles argues that “cybernetics […] should be called a ‘Manichean science’”, Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, 1999. Print. 106

  62. Meillassoux, Quentin. After Finitude: An Essay on the Necessity of Contingency. London: Continuum, 2009. Print.

  63. Braidotti (2013)

  64. Foucault, Michel. “What Is an Author?” The Norton Anthology of Theory and Criticism. Ed. Vincent B Leitch. New York: Norton, 2001. 1622–1636. Print. 1635

  65. Žižek, Slavoj. Less than Nothing: Hegel and the Shadow of Dialectical Materialism. London; New York: Verso, 2012. Print. 643

  66. Ibid.

  67. In 2001, the artist and computer programmer Adrian Ward summed up this issue as follows: “[W]e should be thinking about embedding our own creative subjectivity into automated systems, rather than naively trying to get a robot to have its ‘own’ creative agenda. A lot of us do this day in, day out. We call it programming.” Rhizome mailing list, May 7th 2001

  68. Eliot, Karen. “ANTI‐POST‐ACTUALISM++++++.” A Neoist Research Project. Ed. N.O. Cantsin. London: OpenMute, 2010. n.p. Print.

  69. The algorithm proposed by Amir et al. depends on strong contextual cues from unambiguous (social media) messages; Amir, Silvio, Byron C. Wallace, Hao Lyu, Paula Carvalho, and Mário J. Silva. “Modelling Context with User Embeddings for Sarcasm Detection in Social Media.” arXiv:1607.00976 [cs], July 4, 2016. http://arxiv.org/abs/1607.00976.

  70. Jakobson, Roman. “Two Aspects of Language and Two Types of Aphasic Disturbances.” Fundamentals of Language. The Hague, Paris: Mouton. 115–133. Print.

  71. See Crouch, Colin. Post-Democracy. Malden, MA: Polity, 2004. Print.

  72. Eco, Umberto. The Open Work. Cambridge, Mass.: Harvard University Press, 1989. Print. 104

  73. Eco (1989), 102

  74. “le grand dehors”, Meillassoux (2009)

  75. Schleiermacher, Friedrich. Hermeneutics and Criticism and Other Writings. Ed. Andrew Bowie. Cambridge, U.K.; New York: Cambridge University Press, 1998. Print.

  76. Barthes, Roland. S/Z. Trans. Richard Miller. New York: Hill and Wang, 1974. Print. 5

  77. Iser, Wolfgang. The Act of Reading: A Theory of Aesthetic Response. Baltimore: Johns Hopkins University Press, 1978. Print.

  78. Most prominently in Landow, George P. Hypertext: The Convergence of Contemporary Critical Theory and Technology. Baltimore: Johns Hopkins University Press, 1992. Print.

  79. https://fcw.com/articles/2014/06/09/exec-tech-brief-history-of-open-data.aspx

  80. As opposed to Black, Bob. The Abolition of Work and Other Essays. Port Townsend, WA: Loompanics Unlimited, 1986. Print. and Srnicek, Nick, and Alex Williams. Inventing the Future: Postcapitalism and a World without Work. 2015. Print.

  81. Popper, Karl R. The Open Society and Its Enemies. London: G. Routledge & Sons, Ltd., 1945. Print.

  82. Philip K. Dick anticipated this type of crapularity in his 1969 novel Ubik in which a character struggles with a door in his apartment that refuses to open unless it is paid with coins and ultimately threatens to sue him because he tries to unscrew its lock. Dick, Philip K. Ubik. New York: Vintage Books, 1991. Print. 24

  83. Open Source Initiative. “The Open Source Definition (Annotated).” Open Source Initiative, 1998–2016. https://opensource.org/osd-annotated. Web.

  84. OpenGovData.org. “The 8 Principles of Open Government Data.” OpenGovData.org, 2007. https://opengovdata.org. Web. 1 Aug. 2016.

  85. Wiley, David. “Defining the ‘Open’ in Open Content and Open Educational Resources.” Opencontent.org, 1998. http://www.opencontent.org/definition/. Web.

  86. “Open Definition 2.1 – Open Definition – Defining Open in Open Data, Open Content and Open Knowledge.” Opendefinition.org. Accessed August 1, 2016. http://opendefinition.org/od/2.1/en/. Web.

  87. Bertalanffy, Ludwig von. General System Theory: Foundations, Development, Applications. New York: George Braziller, 1969. Print.

  88. Raymond, Eric S. “The Cathedral and the Bazaar.” Catb.org, 1998. http://www.catb.org/esr/writings/cathedral-bazaar/cathedral-bazaar/ar01s11.html. Web.

  89. According to the Open Source Initiative, “the promise of open source [is]: higher quality, greater reliability, more flexibility, lower cost, and an end to predatory vendor lock-in”. “Open Source Initiative.” Open Source Initiative, 2016. https://opensource.org/. Web. In the past, the organization’s rhetoric showed an even more optimistic attitude, praising, in 2006 and on the same web page, open source for software development “at a speed that, if one is used to the slow pace of conventional software development, seems astonishing”, with a “rapid evolutionary process” that “produces better software than the traditional closed model. […] Open source software is an idea whose time has finally come. […] Now it’s breaking out into the commercial world, and that’s changing all the rules”. “Open Source Initiative OSI,” February 7, 2006. https://web.archive.org/web/20060207222246/http://www.opensource.org/. Web.

  90. Rüstow understood “neoliberalism” as a synonym of “ordoliberalism”, the German (and Northern European) concept of a market liberalism tempered by a strong system of checks and balances enforced by the state, including provisions for public welfare. He eventually left the Mont Pelerin Society in disagreement with proponents of radical free market liberalism. Prollius, Michael von. Herrschaft oder Freiheit: ein Alexander-Rüstow-Brevier. Bern: Hep Ott, 2007. Print.

  91. Schmitt, Carl. Political Theology: Four Chapters on the Concept of Sovereignty. Chicago: University of Chicago Press, 1985. Print. See also Mouffe, Chantal. The Challenge of Carl Schmitt. London: Verso, 1999.