PART I, Chapter 5
No Signal without Symbol: Decoding the Digital Humanities
David M. Berry, M. Beatrice Fazi, Ben Roberts, and Alban Webb
The ubiquity and potency of digital culture in the early twenty-first century have been the subject of sustained interest and comment in both public and academic spheres. The digital, as a computational form, mediates our everyday lives, offering new modalities, surfaces, and processes for (re)presenting the world. The word “automation” was coined in its contemporary sense in the immediate postwar era. It referred to a more extensive and systematic mechanization, closed-loop systems of feedback and control, and the incorporation of discrete electronics. We use the term “computational automation” to refer to the present extension of automation: the incorporation of machine learning and large-scale data analysis into economic production, academia, and everyday life. Computational automation is revolutionizing scientific, cultural, and artistic human endeavors, which are constantly being reorganized into new temporal, spatial, and conceptual structures with different degrees of agency and reflexivity. Due to the speed and scale of these organizing forces, our understanding of computational automated systems, such as self-driving cars, high-frequency trading systems, autonomous military drones, or organized swarms of shelf-stacking robots, falls behind these technological developments. This lag creates what Bernard Stiegler calls “disorientation”: a disjuncture between human temporality and the accelerated technological ordering of time in a hyperindustrial era.[1] Equally, certain deployments of computational automation engender epistemic transformations; that is, changes in the way in which knowledge is justified and understood. These changes could be understood as anti-hermeneutical; they short-circuit critical interpretation in favor of statistical correlation and pragmatic “results.” These epistemic transformations require further interrogation, which is exactly what we intend to do here.
We propose that a critical assessment of computational automation should be one focus of the digital humanities. Via a reading of information theory and Wolfgang Ernst’s arguments about media archaeology, we explore the automated systems enlisted by the digital humanities in terms of signal processing. “Signal processing” is an expression that we take from Ernst and by which we mean ways of analyzing cultural heritage that bypass traditional forms of interpretation and historiography. We argue that signal processing represents an impoverished vision of the possible contribution of the digital humanities to debates about the future of culture. The contours of a new digital public culture are emerging; as digital humanists we need to address the critical implications of the computational state that makes our work possible.
The French usefully deploy the acronym GAFA (Google, Apple, Facebook, Amazon) to identify the wave of Silicon Valley technologies and ideologies that stand in some sense against the European ideal of public goods, shared culture, and the Enlightenment.[2] We can think about their business model as what Rey Chow, following Phil Agre, calls capture. Within the technology sphere, and particularly in computer science, the most frequent use of the term “capture” refers to “a computer system’s (figurative) act of acquiring certain data as input, whether from a human operator or from an electronic or electromechanical device,” Agre explains. A second use of the term, he continues, refers to “a representation scheme’s ability to fully, accurately, or ‘cleanly’ express particular semantic notions or distinctions, without reference to the actual taking in of data” (106). This twofold definition masks the ambiguity of the same word being used to describe an epistemological idea (“acquiring the data”) and an ontological idea (“modeling the reality it reflects”), even as this usage is remarkably common across the technology sector and in computational disciplines (Agre). The “machinic act or event of capture” that interests Chow creates the possibility for further dividing and partitioning—that is, for the generation of copies and images—and produces an ontology that is structured around the copy (4). Because a significant focus of recent software development has been to enable more effective systems of surveillance, new capture systems have proliferated, fueling a political economy supported by shared data and knowledge. These surveillance systems also make the collection of data relatively easy. This ease of collection results in exploding quantities of data and in claims that we are part of a digital economy. Information and data are each seen as a source of profit, if captured in an appropriate way.[3]
The notion of capture also implies derangements in the organization of knowledge, derangements caused by the unprecedented adjacency and comparability or parity that digital computation makes possible. These concepts define computation, which itself works through a logic of formatting, configuration, structuring, and the application of computational ontologies (Berry, Critical Theory; Philosophy of Software). In other words, everything becomes data and, in doing so, becomes equally comparable and calculable. Data are manifested in a form of character-based notation that, while resembling writing, is illegible without the computers that capture, store, and process it.
In fact, this very form of capture—digitization—is the one identified by Andrew McAfee and Erik Brynjolfsson as emblematic of the “Second Machine Age.” They identify it as “an inflection point in the right direction—bounty instead of scarcity, freedom instead of constraint—but one that will bring with it some difficult challenges and choices” (11). Whether or not this “inflection point” does indeed aim in the right direction, it is certainly the case that digitization has generated widespread feelings of anxiety and concern over its implications for the economy and society more generally. Even more, digitization has brought about a crisis in thought. This crisis is manifested, at a sociological level, in a form of anxiety over automated capture—a worry that with atrophied human reason and diminished cognitive faculties, humanity will soon be replaced by intelligent robots.[4] Automation anxiety, in turn, raises key techno-epistemological questions around the assembly and reassembly of mechanisms of signal production, dissemination, and consumption.[5] The crisis in thought also questions the relevance of human subjectivity, while pointing toward the development of a new machine-subject. Capture, automation, and digitalization suggest a cultural complicity between technology and nature, where society and subjects no longer exist except as a cloud of data points. Signal processing therefore links automation anxiety to the crisis in thought.
In this respect, it is possible to argue that we are entering a time of a new unintelligibility, whereby people can no longer make sense of the world due to our increasing reliance on digital technology that writes and reads for us, as a form of algorithmic inscription. This unintelligibility results in new forms of what the philosopher Bernard Stiegler calls “grammatization”; that is, the standardization and discretization of idiomatic human cultures, which now also include human-machine cultures. These new forms of grammatization are symbols and discrete representational units, which are opaque to humans even as they are drawn from human-created devices.
Digital technologies have reconfigured the very texts that humanities and DH scholars take as their research objects. These technologies have encoded texts as fragmentary forms, often realigned and interleaved with fragments from other texts, and placed them into digital archives and computational tools.[6] To be working within the field of digital humanities is thus to already be cognizant of the need to “build” digital systems, to encode, and to be involved in a practice of a highly technical nature. But this practice-based approach remains a contentious point, even within digital humanities. At times, it has been interpreted as a move away from theoretical and hermeneutic concerns and critical engagements (Berry, Philosophy of Software; Cecire). Indeed, Galloway asks, “Is it the role of humanities researchers to redesign their discipline so that it is symmetrical with that [digital] infrastructure?” (126).
We understand the strong technical orientation of the digital humanities through the concept of signal processing as described in Wolfgang Ernst’s “Media Archaeography.” He observes that media archaeology differentiates itself from media history through its concern with media artifacts and apparatus not only as “symbolic acts” (244, emphasis added), but as the “signal processing of culture.” Discussing Milman Parry and Albert Lord’s 1930s recordings of Serbian and Montenegrin epic songs on aluminum disc and electromagnetic wire, Ernst draws attention to the way in which systems of analog recording have allowed the “physical layer below symbolically expressed culture” (244) to be registered. Media archaeologists are thus concerned not only with symbol but also with signal; instead of being preoccupied with the textual or musical content of recordings (symbol), “[the] media-archaeological ear listens to radio in an extreme way: listening to the noise of the transmitting system itself” (250). Of course this does not mean that it is exclusively concerned with signal. Indeed, as Ernst notes, “media-archaeological analysis opens culture to noncultural insights without sacrificing the specific wonders and beauties of culture itself” (245). The point is not to oppose signal to symbol but to think both together.
What are the lessons of signal and symbol that might be applied to the digital humanities? After all, the digital humanities is not often concerned with the physical vibrations of air recorded in the electromagnetic flux of a fragile piece of wire, but much more commonly with the digital encoding and analysis of (primarily) textual data. Our contention here is not that the digital humanities lacks a cultural critique, a point that has been made many times before—particularly persuasively by Alan Liu, for example. Rather, we argue that the more specific problem of the digital humanities is seeing its contribution to public culture as a form of processing of signals, rather than as symbol. Understood as symbol, the technologies of the digital humanities have great implications for humanistic models of research. As Johanna Drucker asks,
So can we engage in the design of digital environments that embody specific theoretical principles drawn from the humanities, not merely work within platforms and protocols created by disciplines whose methodological premises are often at odds with—even hostile to—humanistic values and thought? This question is particularly pressing in light of the absorption of these visualization techniques, since they come entirely from realms outside the humanities—management, social sciences, natural sciences, business, economics, military surveillance, entertainment, gaming, and other fields in which the relativistic and comparative methods of the humanities play, at best, a small and accessory role. (85–86)
It is vital that the deployment and creation of tools drawn from outside the humanities do not simply supersede the theoretical principles that, as Drucker suggests, should inform DH design practice. Yet one complaint about the digital humanities is that, too often, tool making is seen as a substitute for hermeneutics. Indeed, this condition is epitomized by the use of the expression “more hack, less yack” (more computer programming and less theorizing).[7] This represents a very strong trend in those strands of the DH community that have self-identified around the notion of “building things” (i.e., the construction and making of digital systems, archives, interfaces, visualizations). This model of digital humanities might be understood, following Jean-François Lyotard’s suggestion in The Postmodern Condition, as a way of opening the data banks, giving the public access to information in order to resist the corporate and military enclosure of computerized knowledge systems (67; see also Berry, Critical Theory, 178). But the digital humanities needs to go beyond simply placing information in the public domain and making data accessible and manipulable.
As our reading of Ernst suggests, there is a need for the digital humanities to think signal and symbol together. Or, to frame the issue in terms of Shannon and Weaver’s mathematical model of communication: there is no communication without encoding and decoding. One might therefore interpret the digital humanities as an intellectual practice concerned with the transmission of knowledge as messages through channels. It is our contention that this transmission is generally assumed to work on the basis of a “sender-receiver” conception of communication, a conception that could be described as similar to that put forth by Claude Shannon (and popularized in his work with Warren Weaver) in the mid-twentieth century. In Shannon and Weaver’s model, the sender is the originator of the message, while the receiver is the recipient. By virtue of its simplicity and generality, this model has been successfully applied to various communication scenarios. Here, we would like to argue that the knowledge production of the digital humanities (understood in its paradigmatic mode of thought) and its consequent input into public culture might be seen to work as a similarly one-way process of communication, albeit one complicated by the signal processing that takes place on the receiver end. In part, it is the unidirectional dimension implied by the notion of a signal that we are challenging here. However, we also consider the limits of signal processing as a way to understand how the digital humanities might not only critically intervene in but also construct public culture.
In Shannon and Weaver’s model, the sender is the originator of the message, but the message is then encoded by an encoder; this becomes the second step in the transmission. Once encoded, the message becomes a signal that is compatible with the technologies used to transmit it. For instance, in a traditional telephone communication, a sound wave such as a voice saying “Hello, world” is transformed into an electronic signal to be sent via cables, satellites, etc. At the point of reception, a decoder must then convert the signal back into the original message. In our telephone example, the electronic signal is decoded into a sound wave again. The receiver of the decoded message is the final node in the model, to which the signal is directed. “Hello, world” has reached its destination as a message with intelligible meaning.
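To make the analogy concrete, the chain that Shannon and Weaver describe can be sketched in a few lines of code. The sketch below is our own illustration rather than anything in Shannon’s formalism (the function names, the byte-level encoding, and the optional noise step are all assumptions introduced for the example): a message is encoded into a signal, passes through a channel, and only becomes a message again if a decoder exists at the receiving end.

```python
import random

def encode(message: str) -> list[int]:
    """Encoder: turn a human-readable message into a transmissible signal (here, byte values)."""
    return list(message.encode("utf-8"))

def channel(signal: list[int], noise_rate: float = 0.0) -> list[int]:
    """Channel: carry the signal, possibly corrupting it (noise)."""
    out = []
    for byte in signal:
        if random.random() < noise_rate:
            byte ^= 1 << random.randrange(8)  # flip one random bit of this byte
        out.append(byte)
    return out

def decode(signal: list[int]) -> str:
    """Decoder: turn the received signal back into a message the receiver can interpret."""
    return bytes(signal).decode("utf-8", errors="replace")

# Sender -> encoder -> channel -> decoder -> receiver
received = decode(channel(encode("Hello, world"), noise_rate=0.0))
print(received)  # "Hello, world" arrives as an intelligible message only because a decoder exists
```

The point of the sketch lies in the final step: without the decoding function, the receiver holds only a stream of numbers, which is precisely the condition we are describing as signal without symbol.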
Within the digital humanities, we believe, scholars more often than not aim to create strong signals, rather than strong messages, to be given to the receiver.[8] By way of the digitization of archival materials, for instance, or by encoding texts into queryable information, what the digital humanist can be seen to be doing is striving for the release of the purest, strongest, widest signal to reach a public audience. Problems arise, however, when the audience is a receiver with no decoder; when, in other words, there is only signal processing and no way of turning the signal back into a comprehensible message.
We are, of course, greatly simplifying and reducing both the many activities of the digital humanities and Shannon and Weaver’s famous schema. What we are proposing is rather a sort of analogy, meant to highlight an issue that the digital humanities, as a critical field of inquiry, needs to tackle. It is often assumed that the public has the means, and the wish, to turn signals back into messages. This is, however, not always the case. Digital humanists must address the limits of signal processing head-on, a task that becomes even more pressing if we also consider another question brought about by the analogy to Shannon and Weaver’s model of communication. The sender-receiver model describes the transmission of information. The charge of the digital humanities is, instead, the production of knowledge. An uncritical trust in signal processing becomes, from this perspective, quite problematic, insofar as it can confuse information for knowledge, and vice versa.
We can now turn to some examples of the digital humanities understood as signal processing. In his New Republic article, “Technology Is Taking over English Departments,” Adam Kirsch criticizes the impoverished model of reading found in large-scale analysis of literary corpora. Kirsch illustrates this with the case of Franco Moretti’s “distant reading” of novel titles from 1740 to 1850. Moretti’s computer-based quantitative analysis of title lengths in a database of novel titles shows that titles are much longer at the beginning of the period than at the end. Kirsch observes, “The computer can tell you that titles have shrunk . . . but it takes a scholar with a broad knowledge of literary history [i.e., Moretti]—that is, a scholar who has examined the insides and not just the outsides of literary and artistic works—to speculate about the reasons titles shrink, and why it matters.” For Kirsch, unlike Moretti’s rich scholarly account, digital methods often adopt a position of “faux naïveté” about textual history: “their proud innocence of prior historical knowledge and literary interpretation” is, in his view, “partly responsible for the thinness of their findings.” This thinness that Kirsch finds, we would argue, is precisely the result of understanding the work of the digital humanities according to a model of signal processing; the belief that new scholarly insights can emerge simply from the digital manipulation of textual data such as novel titles, without the wider interpretative context that Moretti provides.
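Moretti’s actual study rests on a curated bibliographic database and on his own literary-historical judgment; the sketch below is only a schematic, hypothetical illustration of the kind of “distant” measurement involved (the handful of sample titles and the grouping by decade are our own assumptions). It shows how readily a computer can report that titles shrink, while remaining silent on why they do.

```python
from statistics import mean
from collections import defaultdict

# Hypothetical records: (publication year, title); a real study would draw on a full bibliographic database.
titles = [
    (1742, "The History of the Adventures of Joseph Andrews and of His Friend Mr. Abraham Adams"),
    (1749, "The History of Tom Jones, a Foundling"),
    (1814, "Mansfield Park"),
    (1847, "Jane Eyre"),
]

lengths_by_decade = defaultdict(list)
for year, title in titles:
    decade = (year // 10) * 10
    lengths_by_decade[decade].append(len(title.split()))  # title length in words

for decade in sorted(lengths_by_decade):
    print(decade, round(mean(lengths_by_decade[decade]), 1))
# The output shows mean title length falling across the period, but offers no account of why.
```

Everything that Kirsch values—the reasons for the shrinkage and why it matters—necessarily happens outside such a script.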
Yet, computational analysis is not the only activity of the digital humanities. In “On Ignoring Encoding,” Ryan Cordell argues that Kirsch is describing a model of the digital humanities that is “all coding, no encoding.” For Cordell, Kirsch ignores the benefits brought by the less glamorous work of digitization, and the literal encoding that is required to draw meaning from digitized texts. Cordell uses here the example of the Women Writers Project (WWP), a project dedicated to Text Encoding Initiative (TEI) encoding of early modern women’s writing. He argues that the work of encoding “has not been ‘simply’ the application of computer technology to traditional scholarly functions,” since it depends on scholarly debate about how to encode different entities within the text. His objection to Kirsch is thus not only an argument for the inherent value of digitizing texts, placing them in the public domain, and making them “accessible to a wide audience of teachers, students, scholars, and the general readers,” as the WWP describes it.
For Cordell, TEI encoding is also a model of scholarship in that it involves making interpretive decisions about texts. In other words, encoding is not just a useful application of computer technology but is also an important part of the contribution made by digital humanities as a scholarly activity. As he writes, “You may find encoding or archival metadata development boring or pedantic—certainly some do—but you cannot pretend that encoding is less a part of the digital humanities than coding.” In this view, the activity of digitizing texts and “processing” them according to the TEI standard—adding tags identifying semantic content, for instance—is and should be seen as just as important a contribution to humanities scholarship and to the wider public as big data and textual analysis. For Cordell, the TEI is indeed “one of the best examples of humanistic scholarship applied to computer technology.” He argues that “encoding inherited the stigma of scholarly editing, which has in English Departments long been treated as a lesser activity than critique—though critique depends on careful scholarly editing, as text analysis depends on digitization and encoding.”
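The interpretive character of the encoding Cordell defends is easiest to see in the markup itself. The fragment below is an invented, radically simplified illustration in the spirit of TEI (it is not drawn from the Women Writers Project, and a real TEI document would carry a header, namespace declarations, and far richer markup): deciding that a string of characters should be tagged as a person’s name, a title, or a date is already an act of scholarly judgment, and it is that judgment that later queries silently rely on.

```python
import xml.etree.ElementTree as ET

# A simplified, invented TEI-style fragment: the tags record interpretive decisions about the text.
fragment = """
<p>
  <persName>Margaret Cavendish</persName> published
  <title>The Blazing World</title> in <date when="1666">1666</date>.
</p>
"""

root = ET.fromstring(fragment)

# Once those decisions are encoded, they become queryable "signal":
people = [el.text for el in root.iter("persName")]
dates = [el.get("when") for el in root.iter("date")]
print(people, dates)  # ['Margaret Cavendish'] ['1666']
```

Whether <persName> is the right element, or how an ambiguous attribution should be recorded, are exactly the scholarly debates Cordell points to; the markup fixes one answer in place and makes it computable.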
We believe Cordell’s piece to be a very telling lament, partly because it underlines that, in the comparison between “traditional” and “digital” humanities, what is lost is not scholarly editing, which is preserved and extended in encoding, but rather critique, which disappears in Cordell’s battle to extend the digital humanities to mean not only big data text analysis but also encoding. Cordell’s defense of the digital humanities in no way responds to Kirsch’s central point: neither encoding nor coding (textual analysis) is in fact a substitute for humanistic critique (understood in the broad sense). In Cordell’s formulation of the digital humanities, the space for symbolic, humanistic interpretation is now confined to the same relatively narrow role as scholarly editing. However worthwhile, symbolic interpretation is directed, along with scholarly editing, toward a higher goal of creating open, computer-readable, and universally accessible editions. Here, symbolic interpretation is purely instrumental: it is, as Cordell reminds us, applied humanistic scholarship. For us, therefore, encoding remains a paradigmatic example of digital humanities understood as signal processing: in other words, a humanistic computational effort to create the purest possible digital signal and send it to the scholarly community or interested public.[9]
It is useful here to consider, as a further example, the Folger Shakespeare Library and its creation of extremely detailed TEI XML versions of Shakespeare’s plays.[10] The Folger Library opened in 1932 as a fairly traditional archive and library connected to a strong collection of Shakespeare’s works. Today, it describes itself as an “innovator in the preservation of rare materials, and major new digital initiatives support our leadership role in digital humanities (DH) research.” Since it contains the world’s largest Shakespeare collection, from the sixteenth century to the present day, producing canonical versions of the texts is centrally important to the library. Indeed, the library has “added sophisticated coding that works behind the scenes to make the plays easy to read, search, and index—and lays the groundwork for new features in the future” (Folger). Moreover, implicit editorial information that previously may have been difficult to access or unveil is now encoded directly into the XML files, so that, the library claims, the editorial process is “as nearly transparent as is possible.”
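What encoding the editorial process “transparently” might look like can be suggested with another invented fragment, this time in the style of the TEI’s apparatus for recording alternative readings (the elements <choice>, <sic>, and <corr> follow common TEI practice, but the example is ours and is not taken from the Folger files): the original reading and the editor’s emendation sit side by side in the file, so that software, and in principle readers, can recover which decision was made.

```python
import xml.etree.ElementTree as ET

# Invented fragment recording an editorial decision: the original reading and the editor's correction coexist.
fragment = """
<l>
  To be, or not to be, that is the
  <choice><sic>qvestion</sic><corr>question</corr></choice>
</l>
"""

line = ET.fromstring(fragment)
for choice in line.iter("choice"):
    original = choice.findtext("sic")
    emended = choice.findtext("corr")
    print(f"editor emended '{original}' to '{emended}'")
```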
The values implied by the Folger’s encoding process, which seeks to embed as much information as possible into its digitized texts, demonstrate how strong encoding has become an important part of the justificatory discourses around why a digital edition of a work is created. It is also connected to a wider performativity about the cultural value of computational automation. As Sarah Werner, then digital media strategist at the Folger, once argued, “Right now there’s the catalog over here, and then the digital images over here, and then there’s a blog over here, and then there’s this thing over here. We need something that unites these products, and makes them more usable” (Rosen). The organization and classificatory logic of digital encoding through XML are then also bound up with the calculatory possibilities of making different things equal—that is, a process of standardization via the classificatory encoding system that is woven around the text and metadata of Shakespeare’s plays in those versions.
The encoding process influences and shapes our comprehension of digital objects and the narratives that these inspire. In this respect, the performativity of computation becomes highly important to our understanding of digital public culture. As Dan Edelstein notes, the “mass digitization of books has made research more efficient than ever” (237). Yet, while “quantification allows us to scale up from one book to the many . . . it only displaces, rather than resolves, the problem of figuring out what the numbers mean” (240). Edelstein’s corrective implies a much greater role for DH researchers as human encoders and decoders. They must use their understanding of the limits and biases of computational techniques to critique their application, as, for example, Jeffrey M. Binder does in “Alien Reading.” They must bridge the gap between digital and public culture.
One area we might look to for a less impoverished model of the digital humanities than that of signal processing would be the emerging field of études digitales. This term designates a set of predominantly, but not exclusively, French-speaking academics such as Bernard Stiegler, Antoinette Rouvroy, and Yves Citton, as well as the work collected in the journal Études Digitales, edited by Franck Cormerais and Jacques Athanase Gilbert. As the latter put it in their introduction to the journal, “Why études digitales rather than études numériques? Because the term digital conserves in French the reference to the fingers (digits) whereas numérique refers to machine computation” (Cormerais and Gilbert, our translation). (“Numérique” is the French word more normally used for digital.) Such work would appear to involve a more explicit engagement with the relationship between technology and research method than we tend to find within DH scholarship.
For Stiegler, this might mean understanding automation anxieties, such as those surfaced by the claims of the Folger Library, as part of a “generalized and integrated automation” that encompasses both digital methods and the digital humanities themselves (La Société Automatique, 60). One starting point for Stiegler’s analysis is Chris Anderson’s Wired article, “The End of Theory.” Anderson argues that, in what he calls the “petabyte age,” the need for models and theories is eclipsed by algorithmic analysis of big data. As Anderson puts it, “Petabytes allow us to say: ‘Correlation is enough.’ We can stop looking for models. We can analyze the data without hypotheses about what it might show.” Anderson’s argument might almost seem to be a manifesto for the instrumental data-driven vision of the digital humanities, where data, modeling, and visualization (i.e., what is conceptualized in this chapter as signal) eclipse theory, critique, and interpretation. Indeed, Tom Scheinfeldt’s “Sunset for Ideology, Sunrise for Methodology?” arguably transposes Anderson’s argument about science onto the humanities, envisioning a golden age where methodology and “organizing activities” displace theories and “ideologies” such as “socialism, fascism, existentialism, structuralism [and] poststructuralism” (124). Stiegler, in contrast, builds on the critique of Anderson found in Kevin Kelly’s response, arguing that behind every automated understanding of facts lies a hidden theory. For Stiegler, then, the automation of knowledge, as a form of knowledge that has no need for thinking (La Société Automatique, 96), must be addressed alongside the automation of society and must itself be made the subject of critique. Indeed, the automation of knowledge implicit in Google-like algorithmic analysis of big data that claims to disavow theory is always already theory-laden. In response, we need to sharpen our critiques of big data and what Stiegler calls the “neuroeconomy” (60). Implicitly, this requires a critique of automated or algorithmic knowledge, inasmuch as the latter sees itself as signal processing, bypassing the cultural critique and interpretation of symbols.
Antoinette Rouvroy and Bernard Stiegler stake out the terrain of big data and algorithmic knowledge well in a classic piece of études digitales work, “Le Régime de Vérité Numérique”:
The work of producing big data, or rather raw data, is therefore the work of suppressing all signification, with the goal that these raw data can be calculable and no longer function as signs that signify something in relation to what they represent, but as something that substitutes for signifying reality, making it disappear. In place of signifying reality, a set of asignifying networked data is substituted, which function as signals, meaning that although they have no signification, or rather because of that, they become calculable. In fact, this accords with the definition of signal that Umberto Eco gives: a signal is an element without signification, which signifies nothing, but which, because it truly signifies nothing, becomes all the more calculable. (114, our translation and emphasis added)
The idea that knowledge can be reduced to the analysis of data is, as Rouvroy and Stiegler argue, essentially the ideology of big data, one that rests both on the principle that empirical facts can replace theoretical models and on the false assumption that big data is simply the natural emanation of an empirical reality.[11]
This has immediate implications for our argument about signal and symbol in the digital humanities. The signal-processing model of digital humanities, far from dispensing with ideology or being an essentially pragmatic methodological exercise, collaborates in what Rouvroy and Stiegler identify as the ideological purification of data as signal. In this way, the digital humanities participates not only in the processing (or purification) of signal but also in its essentially ideological suppression of symbol and signification. The digital humanities partakes in what Rouvroy and Stiegler call the “digital truth regime.” That is why it is important for the digital humanities to consider the political, cultural, and legal implications of the technologies it employs.
Thinking signal and symbol together, as we propose to do here, requires digital humanists to consider method in ways that are at odds with the paradigmatic understanding of method in the DH field. In spite of some trenchant critiques, method is still often used to disavow theory. Here it is worth considering, as an example, Tom Scheinfeldt’s well-known position in “Why Digital Humanities Is ‘Nice’”:
Digital humanities is nice because, as I have described in earlier posts, we’re often more concerned with method than we are with theory. Why should a focus on method make us nice? Because methodological debates are often more easily resolved than theoretical ones. Critics approaching an issue with sharply opposed theories may argue endlessly over evidence and interpretation. Practitioners facing a methodological problem may likewise argue over which tool or method to use. Yet at some point in most methodological debates one of two things happens: either one method or another wins out empirically, or the practical needs of our projects require us simply to pick one and move on.
Scheinfeldt sees methodological debates as clearly distinguished from theoretical ones: the latter require endless arguments “over evidence and interpretation,” whereas the former can be resolved quickly, empirically, and pragmatically. According to this view, method has almost no relationship with theory, and methodological debates are oriented around the best way to deal with empirical data. Method, in this context, becomes the pragmatic “cleaning up” (to use Rouvroy and Stiegler’s expression) of data, or the signal suppressing symbol.
We see this disavowal of theory as symptomatic of the model of DH work that we are describing as signal processing. The disavowal of theory indicates the need to address the mutual relationship between signal and symbol that we have outlined in this chapter. In this respect, it is important to stress that our argument about the interdependence of signal and symbol involves contending that method and theory are also equally inseparable. Just as we need to think more critically about the relationship between signal and symbol, so too do we need to think more critically about the relationship between method and theory.[12]
This point is crucial, insofar as the implications of the digital humanities’ initial relegation of theory to a status below method are manifested in the way in which academic knowledge and its dissemination are understood within the field. With an impoverished model of signal/symbol, there is an overemphasis on encoding and coding, and on the clarity of the channel, instead of on the decoding and reception of the symbolic content, which are essential ingredients of both academic and public discourses. We are not proposing that symbol should eradicate signal or that the relationship is necessarily conflictual. On the contrary, we see it as beneficial for the humanities to engage more robustly with signal; for instance, along the lines that Wolfgang Ernst suggests in relation to media archaeology. In this regard, digital methods potentially have a great role to play. By drawing attention to the status of both signal and symbol when actualized in the archives, tools, models, formalizations, and research infrastructures of the digital humanities, we can restore symbol to a position of significance and importance alongside signal. To return to Stiegler’s suggestion, we need a digital hermeneutics within the digital humanities that binds together method and theory, signal and symbol.
Notes
1. Here we are mobilizing a concept of disorientation drawn in part from Stiegler (Technics and Time).
2. In financial markets, the seemingly more threatening term FANG (Facebook, Amazon, Netflix, Google) is often used to denote a special group of technology stocks associated with new models of computational media, such as streaming technologies (Berry, Philosophy of Software).
3. Indeed, data and information were said by Alan Greenspan, the former Chairman of the Federal Reserve of the United States, to be the new “oil” of the digital age (Berry, Copy, Rip, Burn, 41, 56).
4. Stiegler, for instance, has identified this as the emergence of the “automatic society” (la société automatique). This is an era in which “calculation prevails over every other criterion of decision-making, and where algorithmic and mechanical becoming is concretized and materialized as logical automation and automatism” (La Société Automatique, 23, our translation).
5. We are currently running “Automation Anxiety,” an AHRC (UK Arts and Humanities Research Council) project exploring methods for analyzing contemporary cultural anxiety about automation. See http://blogs.sussex.ac.uk/automationanxiety.
6. The interdiscursivity and intertextuality engendered by the digital have, of course, been much remarked on and even used creatively in the writing of new forms of electronic literature.
7. For an alternative etymology of the phrase, see http://dhdebates.gc.cuny.edu/debates/text/58.
8. As Andrew Prescott commented on an earlier version of this chapter, “the idea that there is a distinction between information (signal) and carrier has been fundamental in much infrastructural provision for the humanities, not only in a digital environment but also for example in microfilming of newspapers. This concept is fundamental to XML and TEI. The problem is that for large swathes of humanities scholarship the information is fundamentally bound up with the medium in which it is carried. For historians, it is important to know whether we are talking about a telegram, medieval writ, newspaper, etc. Much of the interest of the digital humanities is precisely in the use of digital technology to explore the materiality of these carriers.”
9. “It is as if, when the order comes down from the funding agencies, university administrations, and other bodies mediating today’s dominant socioeconomic and political beliefs, digital humanists just concentrate on pushing the ‘execute’ button on projects that amass the most data for the greatest number, process that data most efficiently and flexibly (flexible efficiency being the hallmark of postindustrialism), and manage the whole through ever ‘smarter’ standards, protocols, schema, templates, and databases uplifting Frederick Winslow Taylor’s original scientific industrialism into ultraflexible postindustrial content management systems camouflaged as digital editions, libraries, and archives—all without pausing to reflect on the relation of the whole digital juggernaut to the new world order” (Liu, 491).
11. On this point, see also the essay collection edited by Lisa Gitelman, ‘Raw Data’ Is an Oxymoron.
12. As Savage argues, methods demand to be understood as neither simply instrumental (“the practical needs of our projects require us simply to pick one and move on,” as Scheinfeldt has it) nor uncontroversial. Savage contends that we need to resist the “instrumental framing in which [methods] are simply seen to be technically ‘better or worse’ means of doing social research” (5). Far from being merely tools to investigate theoretical questions, methods are now “the very stuff of social life”: “Social networking sites, audit processes, devices to secure ‘transparency’, algorithms for financial transactions, surveys, maps, interviews, databases and classifications can be seen as . . . modes of ‘making up’ society. This move . . . is part of a striking rethinking of the relationship between theory, culture and method which is currently underway in contemporary academic research” (5).
Bibliography
Adorno, Theodor W. Minima Moralia: Reflections from Damaged Life. Translated by E. F. N. Jephcott. London: Verso, 1978.
Agre, Phil. “Surveillance and Capture: Two Models of Privacy.” Information Society 10, no. 2 (April–June 1994): 101–27.
Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired, June 23, 2008, http://www.wired.com/2008/06/pb-theory. Accessed January 15, 2016.
Berry, David M. Copy, Rip, Burn: The Politics of Copyleft and Open Source. London: Pluto Press, 2008.
Berry, David M. Critical Theory and the Digital. New York: Bloomsbury, 2014.
Berry, David M. The Philosophy of Software: Code and Mediation in the Digital Age. London: Palgrave Macmillan, 2011.
Binder, Jeffrey M. “Alien Reading: Text Mining, Language Standardization, and the Humanities.” In Debates in the Digital Humanities 2016, edited by Matthew K. Gold and Lauren F. Klein, 201–217. Minneapolis: University of Minnesota Press, 2016.
Cecire, Natalia. “When Digital Humanities Was in Vogue.” Journal of Digital Humanities 1, no. 1 (2011), http://journalofdigitalhumanities.org/1-1/when-digital-humanities-was-in-vogue-by-natalia-cecire/. Accessed January 15, 2016.
Chow, Rey. Entanglements, or Transmedial Thinking about Capture. Durham, N.C.: Duke University Press, 2012.
Cordell, Ryan. “On Ignoring Encoding.” Ryan Cordell. May 8, 2014, http://ryancordell.org/research/dh/on-ignoring-encoding/. Accessed January 15, 2016.
Cormerais, Franck, and Jacques Athanase Gilbert. “Une nouvelle revue.” http://etudes-digitales.fr/. Accessed April 4, 2017.
Drucker, Johanna. “Humanistic Theory and Digital Scholarship.” In Debates in the Digital Humanities, edited by Matthew K. Gold, 85–95. Minneapolis: University of Minnesota Press, 2012.
Edelstein, Dan. “Intellectual History and Digital Humanities.” Modern Intellectual History 13, no. 1 (2015): 237–46.
Ernst, Wolfgang. “Media Archaeography: Method and Machine versus History and Narrative of Media.” In Media Archaeology: Approaches, Applications, and Implications, edited by E. Huhtamo and J. Parikka, 239–55. Berkeley: University of California Press, 2011.
Folger. “Folger Digital Texts.” https://folgerpedia.folger.edu/Folger_Digital_Texts. Accessed March 29, 2018.
Galloway, Alexander R. “The Cybernetic Hypothesis.” Differences 25, no. 1 (2014): 107–29.
Gitelman, Lisa, ed. ‘Raw Data’ Is an Oxymoron. Cambridge, Mass.: MIT Press, 2013.
Kelly, Kevin. “On Chris Anderson’s ‘The End of Theory.’” Edge: The Reality Club, June 30, 2008, http://edge.org/discourse/the_end_of_theory.html/. Accessed January 15, 2016.
Kirsch, Adam. “Technology Is Taking over English Departments: The False Promise of the Digital Humanities.” New Republic, May 2, 2014, https://newrepublic.com/article/117428/limits-digital-humanities-adam-kirsch/. Accessed January 15, 2016.
Liu, Alan. “Where Is Cultural Criticism in the Digital Humanities?” In Debates in the Digital Humanities, edited by Matthew K. Gold, 490–509. Minneapolis: University of Minnesota Press, 2012.
Lyotard, Jean-François. The Postmodern Condition: A Report on Knowledge. Manchester, UK: Manchester University Press, 1984.
McAfee, Andrew, and Erik Brynjolfsson. The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. New York: W. W. Norton, 2014.
Rosen, Rebecca J. “A Brief Tour of the Folger Shakespeare Library’s Digital Treasures.” The Atlantic, October 2, 2013, http://www.theatlantic.com/technology/archive/2013/10/a-brief-tour-of-the-folger-shakespeare-librarys-digital-treasures/280039/. Accessed January 15, 2016.
Rouvroy, Antoinette, and Thomas Berns. “Gouvernementalité Algorithmique et Perspectives d’Émancipation.” Réseaux 177 (2013): 163–96.
Rouvroy, Antoinette, and Bernard Stiegler. “Le Régime de Vérité Numérique: De la Gouvernementalité Algorithmique à un Nouvel État de Droit.” Socio 4 (2015): 113–40.
Savage, Mike. “The ‘Social Life of Methods’: A Critical Introduction.” Theory, Culture & Society 30, no. 4 (2013): 3–21.
Scheinfeldt, Tom. “Sunset for Ideology, Sunrise for Methodology?” In Debates in the Digital Humanities, edited by Matthew K. Gold, 124–26. Minneapolis: University of Minnesota Press, 2012.
Scheinfeldt, Tom. “Why Digital Humanities Is ‘Nice.’” In Debates in the Digital Humanities, edited by Matthew K. Gold, 59–60. Minneapolis: University of Minnesota Press, 2012.
Shannon, Claude E., and Warren Weaver. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1963.
Stiegler, Bernard. La Société Automatique. 1. L’Avenir du travail. Paris: Fayard, 2015.
Stiegler, Bernard. Technics and Time: 2. Disorientation. Translated by Stephen Barker. Stanford, Calif.: Stanford University Press, 2008.