PART IV ][ Chapter 27
A Conversation on Digital Art History
Johanna Drucker and Claire Bishop
Note: The following conversation about the meaning and value of digital art history took place via email between June 23, 2017, and July 12, 2017.
Johanna Drucker (JD): 6/23/2017 The familiar line of criticism against digital humanities is that computational processing is reductive because it performs statistical analyses on complex artifacts (by contrast to the good or neutral availability of digitized materials in online repositories) and that because it is statistical it is necessarily an instrument of neoliberalism. This condition is characterized as a recent development, and the implication is that if only the bad cuckoo of digital scholarship had not come into the nest, the good humanists would have continued to enjoy a rich support stream for their idealistically driven work.
I think we need to take each of these pieces apart, and, in particular, undo the narrative that makes the links in this line of reasoning seem to have a kind of inevitable connection. I think all are based on mistaken and unexamined assumptions, historical ignorance, theoretical naïveté, and a degree of defensiveness with questionable motivations.
1. The reductiveness of statistical processing and the contrast with the “merely” digitized, and the imagined threats to traditional engagements with cultural artifacts
2. The link between statistical methods and neoliberalism (in relation to the history of “political arithmetik,” managed culture, bureaucratic controls)
3. The myth of the goddess university and the ideal humanities, including the stigmatization of entrepreneurialism and blindness to current conditions
Then I suggest that we discuss the ways the intellectual labor of digital work actually contributes to and changes the cultural conditions of humanistic activity, how and why we are all complicit, and what the future development of a digital component of scholarship and research might look like—benefits, costs, and risks.
Other issues you might want to discuss?
Claire Bishop (CB): 6/23/2017 Wow! I guess I’m still a believer in a lot of those myths, largely because I haven’t been thinking about them as long as you. I’m all in favor of debunking clichés and don’t regard myself as a Luddite, but I confess I remain skeptical about your critique of the critique because I haven’t yet been exposed to a digital humanities project that has transformed my intellectual horizons in the way that more conventional humanities scholarship manages. Indeed, the critique of the digital humanities seems to be a richer and more challenging seam of thinking than most of the DH projects I have seen.
I fully admit that the DH projects I have looked at have been within my own discipline, art history, and might more accurately be described as cultural analytics. So let’s start with your first point, which chimes well with art history, since it posits a tension between statistical processing and other forms of engagement with cultural artifacts.
I don’t think anyone in art history imagines that the rise of DH poses a threat to more traditional ways of analyzing cultural artifacts. At best it is seen as a useful supplement (e.g., digital 3D reconstructions of architecture or fresco cycles, or assistance with media migration and format obsolescence in museum conservation). Close visual analysis is so deeply ingrained in art history departments (at least for my generation and older) that it will continue to provide the benchmark for a long time to come. At the same time, I don’t want to unquestioningly stamp a seal of approval on this approach, because it also has problems (e.g., insufficient attention to art history’s own conventions, to biases within the canon, and to social and political contexts). Plus, it can be extremely boring and formulaic.
But the DH alternative to this, exemplified by the essays that I read in the International Journal for Digital Art History,[1] seems intellectually impoverished and/or bluntly instrumental.[2] The kinds of questions asked in this journal, and the results they produce, come across as crashingly obvious. Statistics are an important part of this banality, but are not the whole story. The overall tone reads like a managerial report from an IT department; it’s all number crunching, and there’s no verbal persuasion.
So does the choice have to be between DH projects driven by statistics/data visualization/modeling, and traditional models of art history, with their own clichés, conventions, and limitations? I’m not sure I’m in a position to answer that question. I’m more interested in thinking through how the dominance of networked technology, especially Google Images, is exerting its own pressures on the study of art history. My colleagues and I increasingly note that students have difficulty in focusing on one image and analyzing it at length. Their attention is more diffuse, and tends to deal with clusters and contexts. The next generation sees and thinks differently—digitally?—and I’m interested in trying to articulate this difference, and to see what this produces in terms of scholarship. Tellingly, none of the so-called digital natives in my department are much interested in DH projects, which they regard as driven by technophilic rather than intellectual questions. At the same time, they have internalized quantity and are adapting their minds to negotiate this.
You have much more familiarity with DH projects than I do. I’d be curious to learn about an example that you think does more complex work with quantitative analysis, but without sacrificing the qualities of verbal narrative.
JD: Sure. First, though, let me clarify a few things. I think many of the scholars who work inside digital projects are at least as skeptical as those outside because they know how the reductive processes work and why they are reductive. A few uncritical champions, such as Lev Manovich, Franco Moretti, and Matt Jockers, are convinced that processing “data” about cultural artifacts produces benefits at scale. My criticisms of their work are fundamental: they are processing digital files that are already radically remediated, and they make claims about these large corpora as if scale bypasses the issues of canon formation. They are also using counting methods, not statistical ones, and so are not even doing sophisticated work from a social sciences perspective. I could go on, but I’ve made these arguments elsewhere at great length. But, for instance, take Manovich’s project on Rothko’s paintings. It’s a disaster from start to finish. The “paintings” are digital files from various and varied sources, and their provenance is not indicated, so we may be looking at a scan of a reproduction from a four-color separation, and so on. The images are emphatically not paintings. If you are going to use positivistic methods, at least use them with rigor. So, a counterexample would be Stephen Murray’s studies of Gothic cathedral architecture in France. He is using digitally rendered drawings to compare features that can actually be adequately rendered in wire frame for the specific purposes to which his analysis puts them—to be able to see contrasts of proportion, scale, and other elements of structure.
My colleague Laura Mandell has been systematically analyzing the problems of data mining in literature—problems that start with imagining that a word is ever self-evident or self-identical—and the specific issues involved in the creation of topic models and other computationally produced outcomes. But she also raises questions about the place of distant reading practices within a longer history of interpretation by asking about the purpose and motivations of such work. The identification of outliers, of deviations from norms, of statistical anomalies—these are things that can be done computationally. You don’t ask a computer to do the things humans do better; rather, you ask it to assist by doing things it does well. So trying to compute resemblance based on feature sets is an interesting problem, while asking a question about “beauty”—as in the example of de la Rosa and Suárez—is not. That would be true whether the questions were asked using digital or analog methods. We can cite plenty of bad work, even by esteemed figures, doing close readings in every mode, from the traditional to the poststructuralist, the psychoanalytic, or critical race studies.
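To make the contrast concrete, here is a minimal sketch, in Python, of what “computing resemblance based on feature sets” can mean in practice; the corpus, the features, and the threshold are all invented for illustration.

```python
# A minimal sketch of resemblance-from-feature-sets: each artifact is
# reduced to a vector of measured features, and items are flagged as
# possible outliers by their low average similarity to the rest of the
# group. All names, features, and values here are invented.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical features: [mean hue, edge density, symmetry score]
corpus = {
    "image_A": [0.62, 0.40, 0.81],
    "image_B": [0.58, 0.44, 0.78],
    "image_C": [0.61, 0.38, 0.84],
    "image_D": [0.05, 0.95, 0.12],  # deviates from the group
}

for name, vec in corpus.items():
    others = [v for n, v in corpus.items() if n != name]
    mean_sim = sum(cosine(vec, o) for o in others) / len(others)
    flag = "  <- possible outlier" if mean_sim < 0.8 else ""
    print(f"{name}: mean similarity {mean_sim:.2f}{flag}")
```

Everything interpretive, of course, sits outside the code: which features to measure, how to weight them, and whether a flagged anomaly is worth attending to remain scholarly decisions.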
As I’ve said elsewhere, I find the work that integrates material sciences into the study of cultural materials to be one promising direction. The work that Eric Johnson and his colleagues have done to rethink the characterization of vellum as “uterine” has not only pushed our understanding of medieval herd management and its relation to manuscript production but also posited a paradigm of human-animal-plant ecologies as a way to think about cultural artifacts. I consider that a major contribution. Early work by Kirk Martini, an engineer and architectural historian, analyzing the way walls crumbled in Pompeii, made use of computational models that contributed to an explanation of the physical forces that had been at work—and thus provided a different history of the structures and their ruins. I could cite other examples—the comparison of pottery fragments, morphic analyses of letterforms and glyphs, noninterventionary reconstruction, and so on—all of which are interpretive tools and/or scholarly engagements. But that’s not my task here. That work exists, and my point would be that dismissing “digital art history” as if it were singular or monolithic does not do justice to the complex variety of work being done and its proven and potential value.
But I want to return to the distinction between digital and digitized work, which I made, but by which I did not intend to suggest that “digital” presentations are not computational. Some of the work that involves computational processing is, as we agree, reductive; but, as I have just argued, not all of it. And making materials available in digital format—to be found online, copied, put into presentations, slide lectures, and so on—is not unproblematic either. Every act of digitization is a remediation; any substantial repository exists in an infrastructure of metadata and information architecture; and every digital file has format specifications that are an abstraction from an analog artifact (or are the features of a born-digital one). Taking these files as if they are simply there, merely the thing, is like imagining that a trout in a case at the supermarket got there on its own. The supply-chain side of infrastructure goes invisible, and all of the intellectual decisions, the interpretive aspects of these actions, are concealed in that most ideological of statements, the presentation of an image as if it “just is.” My basic argument about information visualizations—that they are all reifications of misinformation—holds for all digital files: they are radical remediations that conceal their production in the immediacy and apparency of display.
So, yes, much of the work done with computational processes is reductive, but that is the fault of the researcher and the question being asked, the privileging of certain outcomes, and the uncritical enthusiasm for technique. The same could be said of social art history when it uses selected exceptions as if they were representative, or of psychoanalytic approaches that find the phallus everywhere. That is what you find if that is what you are looking for—or, more importantly, if that is how you are looking. I made the digitized/digital distinction not to suggest that the former is pure and the latter processed, but to note that they are processed to different degrees and with different implications.
But the attack on digital humanities goes beyond defensive reactions to close reading, cultural analytics, and its techniques. The critics make accusations that connect the computational to political values and conditions. That seems ignorant in other ways than the simple misunderstanding of how the technological methods work. So I wonder if we could explore that?
CB: 6/26/2017 Sure—but first I just want to respond to a couple of things you just wrote. I agree with your point about there being bad scholarship in every method. But this usually comes once the method has become conventional and academically entrenched. The writing at the outset of any given method—feminism, psychoanalysis, postcolonial theory, social art history—is by and large thrilling, decisive, and overflowing with urgency. Here, though, we have a new method (“digital art history”) that is largely depoliticized and uncritical. These two characteristics make it very difficult for me to get intellectually excited about many of the DH projects you describe. They seem to me tools for the possibility of analyzing something at some unspecified point in the future—rather than exciting/polemical interpretive proposals in their own right. Material is amassed, possibilities are mooted, but not many killer arguments are put forward. I wonder if this sense of deferral is endemic to the field.
Which brings us on to the politics of DH. Elsewhere I have argued against simplistic morphological analogies between culture and politics (e.g., this work of art has the same organizational structure as democracy, so therefore this work of art is democratic). With DH, so the argument goes, quantitative computational analysis reduces works of art/literature/music/theatre to metrics; one of the hallmarks of neoliberal economics is also the reduction of goods and services to metrics, on the basis of which performance and profit can be assessed and monetized. This is what is so galling about the Research Excellence Framework in the UK: every member of every university department has his or her research graded on a scale of 1 to 5, so that each department can be awarded a number that determines the next five years of funding. The neoliberalization of the university—in this respect more advanced in the UK than here in the United States—hinges on rendering unwieldy, intangible qualities (such as knowledge, truth, meaning, etc.) subject to performance indicators. Many DH projects operate with a similar reduction of cultural complexity to metrics, especially those dealing with literature. The important difference is that they tend not to be done with an eye on monetization.
However, the relationship between DH and neoliberalism is not just based on an apparently mimetic relationship to metrics. It goes deeper: the gathering, categorizing, and ranking of data is often undertaken in precarious labor conditions. The top researcher is rarely doing his or her own tagging and filing; that job falls to the students or, worse, to an outsourced company. Manovich, for example, is unapologetic about using Amazon’s “Mechanical Turk” system—an internet marketplace for human intelligence tasks (HITs) that cannot be accomplished by computer. Mechanical Turk is a quintessentially neoliberal global marketplace: workers are dispersed, non-unionized, and unable to negotiate rates with their employers. They are also set in competition with one another: if one worker doesn’t accept the dismal fee that’s offered, you can bet that someone else will. This imbrication of DH research within exploitative distributions of labor, not to mention the corporate platforms that hold the data being analyzed (e.g., Instagram, Google Maps), is troubling.
Of course, this doesn’t have to be the case: the collaboration necessary to accomplish the capture of data opens up the way for more radical approaches, even though the kind of critiques proposed by #TransformDH tend to focus more on approach (decolonization, social justice, identity issues) than on the process of amassing data.
You’re much more immersed and invested in DH than I am. How do you deal with the arguments about the proximity of DH to the neoliberalization of the university—and everything else? After all, it is telling that both have arisen at the same historical moment and that the similarities are more striking than the differences.
JD: As you know, I have stated repeatedly that I don’t think digital methods have defined a critical breakthrough in any way that is comparable to the impact of theoretical formulations like feminism, deconstruction, postcolonial, queer, or critical race theory. I guess I also don’t think they have to, and the expectation ought to shift. One of the reasons I am interested in working in the context of the MLIS—the Master of Library and Information Science—is that I see the management of cultural resources for access and use as central to digital projects. That said, I think the idea that a new method has to produce immediate results that are staggeringly distinct is ungrounded. Why? To justify its existence? Is novelty the hallmark of good scholarship? Even on those grounds, to dismiss an entire field on the basis of a few bad examples seems as irresponsible as my senior male colleagues telling me in the 1980s that no good scholarship could come of looking at bad art (e.g., work by women). You can resist digital work by saying it has no intellectual excitement, or you can do the work of understanding its intellectual contribution. Very few scholars get excited about metadata and information infrastructure, but that, in fact, is where much of the intellectual benefit lies. If I can now go through the entire corpus of Karl Kraus’s Die Fackel and see how a particular term was used by him to encode a private conversation with his readers through subtle changes in the use of a word over a thirty-seven-year period, that is because the structure of the project contains an intellectual expression of that concept. Again, the dependence of every scholar, teacher, researcher, and student I know on the repositories made by digital projects shows their value and their integral role in our work and practices. Within the community of information professionals, this work is anything but “depoliticized and uncritical”: the debates on the politics of classification and on the inherent social and cultural biases in all systems of information production and management permeate the field. (I can supply references here in preservation, archives, classification, digital access, user analysis, bibliography, and pretty much every aspect of information studies. The fact that digital humanists are often ignorant of this field is an effect of academic silos, but also, frankly, of a classist dismissal of “librarians” as secondary citizens within the university system—an offensive and prevalent practice. One of my major complaints about digital humanities is that it has ignored the library professions’ expertise as if it were merely technical service work, rather than deeply ethical in its outlook and principles.)
The question of the “neoliberalization” of the university has many dimensions, as does the use of quantitative methods. I’ll begin with the second. As we all know, administered culture did not begin in the 1990s or even in the Thatcher-Reagan era, though their deliberate goals of undoing the social welfare systems were rhetorically conspicuous and pointed. The Romans administered a census; the Babylonians had tax rolls, as did the Egyptians; and systems of tithes and the like are recorded in the Bible. The invention of “political arithmetik” (William Petty’s term, dated to 1676, but linked to John Graunt’s work a decade earlier—I have to love their names, Petty and Graunt, as indicators of something resonantly symbolic) in the seventeenth century was followed a century later by the development of visualization methods. But statistical analysis for social management and the reduction of human beings to numbers is neither recent nor attributable to digital humanities. The idea of transforming complex cultural artifacts into metrics, or quantitative information, has its own history in the Annales school as well as in the social sciences throughout the twentieth century. Remember the controversy over Robert Fogel and Stanley Engerman’s Time on the Cross (1974), a statistical approach to the study of slavery as an economic issue. Quantitative methods are problematic in the humanities if they substitute for other engagements, but also, more profoundly, as I have said repeatedly, if an absolute cultural authority accrues to them. Quantitative methods have just as much room for ethical guidelines as qualitative ones, but keeping the ethical issues visible throughout the life cycle of data production is a real challenge.
But the real issue that troubles me (dismissive resistance and stubborn dig-in-the-heels ignorance are simply silly) is this quick slippage: “quantitative computational analysis reduces works of art/literature/music/theatre to metrics; one of the hallmarks of neoliberal economics is also the reduction of goods and services to metrics, on the basis of which performance and profit can be assessed and monetized.” This suggests that DH is somehow more complicit with this condition than other practices, and even causally responsible for it. That is patently false. It also suggests that a “pure” humanities exists that is untainted: a humanities that embraces social good and the highest virtues of humankind without complicity in the institutional frameworks that support it. Let’s see, was that in the first medieval universities, founded by the Church? No ideological complicity there, right? Or in the seventeenth-century academies, founded to control knowledge and professionalize it, criminalizing other scholarship (Richelieu’s intention)? Or in the growth of Enlightenment thought, a philosophically altruistic movement whose pure motives allowed the justification of colonial expansion, slavery, the exploitation of natural and human resources, systematic genocide, and other positive humanistic contributions, all codified in treatises and historical studies? Or in the long-standing British humanistic study that allowed the empire to identify with classical Greece and Rome and thus legitimate its political practices through the rhetoric and history of the past? Or in the nineteenth-century universities that trained clerics, lawyers, and medical men and dosed them with a bit of poetry in keeping with an Arnoldian notion of “the best that has been thought and said”? Or in the early twentieth century, in which the entrepreneurial spirit of American democracy created a new international system of trade, so that learned men, men of sophistication and class, ought to be able to cite their classical and romantic poets, their Shakespeare and their Milton, alongside a bit of Paine, Jefferson, and Lincoln? The American research universities—the kind of environment that supports Richard Grusin so he can write media criticism—were developed in the post–World War II period on the strength of the GI Bill (which poured money into the higher education system) and enormous research grants from the military-industrial complex. Did the humanities departments that lived off this economic stream imagine they were somehow absolved of responsibility—like the family eating dinner in the plantation house and uttering righteous platitudes about Christian piety? I had a friend who was a dean of an art school and raising money for various projects, and her students said they only wanted her to take money from “good” corporations, not bad ones—as if the entire system of creating surplus capital from the exploitation of labor were not a problem. Really, I think the idea that digital humanities is the devil here is a convenient way to ignore the big and complex picture—that we are all complicit, that the humanities have served as part of the system they critique, that they survive on account of that political and economic reality, and that the moral superiority of cultural critics is intellectually suspect and ethically unfounded.
In short, I disagree that DH has justified the neoliberalization of the university. It also carries with it the idea that the digital humanities sucked all the money out of humanities institutions. Because there was so much money going into them? It is true that the NEH and Mellon, among others, engaged digital projects because they saw benefits at scale—access, preservation, and use—which, in fact, as per my earlier statement, is true if you consider the integration of these resources into daily use. As to the exploitation of labor issue, that is another chestnut to take apart. I pay all of my students and always have, as do most of the people working on these projects in the university. Do you think that learning how to do digital work and being paid to do it so that the humanities can thrive in a networked environment is a negative thing? I don’t. And I think giving students skills for work is also positive. Having always had to make a living—as a waitress, typist, print shop worker—I am glad to have had skills that let me survive. That said, the myths that people carry around with regard to digital technology are equally frustrating—that it is immaterial and archival—or that it is somehow ecological. Why don’t we address the ecological costs of all digital technology?
CB: This is a great answer—I love a long historical purview! Just a few brief comebacks:
I don’t think anyone is saying that DH is justifying the neoliberalization of the university. Isn’t it the other way round: that DH is, wittingly or unwittingly, neoliberalism’s justification of the humanities? I think it’s revealing that the only new method to emerge in the humanities in recent years takes up the same operation—the transformation of intangible properties to digital metrics—as neoliberal economics. Obviously there are differences in goal, but in the core principle—quantitative analysis—there is a significant overlap. This doesn’t mean that they are equivalent (that would be a ludicrous claim), but it needs to be borne in mind when we recognize that the humanities is the one area within the university whose recognizable “output” (relative to the investment put into it) is most opaque and inscrutable, and whose measurable “impact” is least accountable.
While I take your points about the impurity of the university, this is also a very American perspective. European state funding at its best keeps corporations and industry at arm’s length; tuition costs very little (in Germany, 300 euros per semester), which allows for great social mobility. It just doesn’t afford the kind of high wages (for faculty) and privileged attention (for students) that are enjoyed here.
Paying students isn’t the issue (that’s great if you have the money; many academics don’t). I feel bad about paying students to do mindless data entry rather than more exploratory intellectual work, even if the results from the latter tend to be uneven and less predictable. I guess that’s because I would always prefer to do the latter.
Finally (I’m not going in any kind of order here), I’m inclined to disagree with your observation that new methods shouldn’t have to be thrilling. I think there’s every reason to expect innovation to be exciting—in academia and in culture, broadly—and it’s one of the reasons why I’m in this game. I guess I’m an unreconstructed avant-gardist.
We’re coming at this from such completely different angles, though, that I’m not sure where we should turn next. It’s already glaringly apparent that I am really not interested in most examples of DH and philosophies of information management. Equally, I’m sure my own preoccupations come across as naïve and misguided, and tend to use DH as a straw man. But just to put some of them on the table: I’m curious to see if and how digital technology can be used performatively to enact arguments about digital technology—either to make an argument that runs parallel (or even counter) to more traditional types of written text or to reinforce the latter by using a different signifying system.
For example, can quantity be mobilized as a critique of an artistic tendency without one having to use quantitative analysis? Can the accumulation and visualization of data be used to produce affective results in the reader/viewer that make an argument about a particular genre? Can social media posts be harnessed to build a commentary about and interpretation of a work of art/performance—one that might exist quite separately from the intentionality of the artist and institution? These questions have arisen in part from my fatigue with conventional art history, in part from a desire to foreground the dependency of my research (and contemporary art in general) on digital technology, and in part as a response to invitations to present my research in theater contexts rather than in university conference rooms.
My points of reference are those artists who use digital presentation tools to present their research, but who mobilize data in subjective and subversive ways. The Lebanese artist Walid Raad,[3] for example, uses PowerPoint to accompany his lecture-performance The Loudest Muttering Is Over: Documents from the Atlas Group Archive, but in a way that both asserts and undercuts the authenticity of his own persona as a lecturer and the information he presents (e.g., the number of casualties from car bombs in Beirut during the Civil War). He offers and quickly withdraws data in order not to say anything about car bombs, but to produce a sly takedown of the media devolution of facts into individualized narratives rather than larger political analyses—but that’s only one of many destabilizing effects of his presentation. Raad’s more recent research project, Scratching on THINGS I COULD DISAVOW, looks at first glance very DH: it purports to visualize the financial and geopolitical relationships around the Artist Pension Trust and their involvement in Middle Eastern politics. Raad’s talk that accompanies this flickering, pulsating graphic display turns out to be anything but clear: rather than being an exposé, it’s a meditation on critique and the problems of over-researching a topic. He offers a narrative driven by curiosity, seduction, and finally resignation as he acknowledges the predictability and impotence of his investigation.
All of which feels like (and is) a very different project from the kinds of questions you’re asking about information visualization. So I’d like to twist the discussion now toward art (especially as you taught art history for fourteen years), and ask if and how you ever think about visual art as having anything to say to the DH.
JD: Lots of artists are doing data-driven work that is interesting, topical, and self-reflexive, as well as critically engaged with technology in various ways—Laura Kurgan comes to mind immediately, for instance.
I originally got interested in digital work in the early 1980s, in graduate school, when I wrote a very speculative piece about the future of writing (as in inscription). All the journals I submitted it to said it was too futuristic—I was asking questions about permanence, iteration, etc. But it was artists who first introduced me to digital methods—Jim Pomeroy, a wonderful conceptual artist who died in a freak accident. But we had been using digital technology for typesetting as early as 1977, so I was interested in the conceptual insights that arose from using these tools—alongside letterpress, which I have been involved with for (!) forty-five years. I think the relationships between technology and creative/intellectual thought are ergonomic as much as theoretical or deterministic. What work does your body enjoy, and how does the pleasure or displeasure of work factor into how we think? Thinking in media and materials is not the same as technodeterminism, as you know.
So the creative practices were my route into digital work, and in the early 1990s I guest-edited an issue of Art Journal on digital art. In art history, the theme of “the body” was just emerging, and I was struck by the essentializing tone of this after all of our Lacanian training—the idea of the body as “real,” in contrast to the attachment to the symbolic that prevails in virtual and even screen spaces, seemed like a step back. Digital art and literature still had very small communities around them, and so it was easy to get a sense of the categories of approaches to conception and production—algorithmic, combinatoric, procedural, or focused on display, ambient projection, interaction, and so on. Early CD projects had no conventions for interface, and their design was not menu-driven and made no use of fixed navigation. Much to talk about here, and to go back into at some point.
But as far as art history goes, I have always found it an extremely conservative field. My interests are in visual epistemology, not art history, but I learned an enormous amount from having to teach across many periods and geographical locations. When we (Todd Presner, Miriam Posner, and I) developed the Beyond the Digitized Slide Library Summer Institutes for the Getty, we felt it was an opportunity to teach concepts and critical issues, not just tools and platforms. This is the way I conceived of DH pedagogy from the outset—when we were trying to build a core curriculum at the University of Virginia in about 2002. (If you are curious, I can send you the DH 101 Coursebook to look at, though the site is currently offline.) When we ran the institutes, the single most interesting exercise was taking the participants from a research question into ways to address it through structured data and analysis. We could see pretty quickly that some projects were mapping projects, some were network analysis, some were repository building, etc. None of that is particularly interesting. The execution in most platforms is so formulaic it feels pointless to me—unthinking and uncustomizable. But the analysis of a complex problem into a data model—that is a really interesting intellectual exercise. Supposing you are interested in identity transformation among diasporic artists for whom country of origin and locations of practice are intermixed in self-conception, style, reception, and work. This was a real project. How do you think about the components of this project in terms of what is tractable, so that it can be modeled and analyzed? That exercise teaches people an enormous amount about how data can be understood theoretically, but also about where the sites of resistance are for work with aesthetic objects. To me, data modeling is a crucial skill to teach—because at its core is the question of how we model interpretation. What is it that you think you are doing with an object or artifact—extracting meaning? Significance? Effect? Social value? Economic value? Historical influence? Etc., etc. The best critical writing gives you ways to engage with an object/image that you would not have had without it. So, creating models of an intellectual project is an exercise that does that. We should try it.
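As a minimal sketch of what a first pass at that modeling exercise could look like, with hypothetical fields and a made-up record (an illustration, not the actual project’s schema):

```python
# A sketch of a data model for the diasporic-artists exercise described
# above. The intellectual work is in the decisions the schema forces:
# origin and practice are plural and time-stamped, and anything like
# "self-conception" can only enter as a sourced, interpreted assertion.
# All field names and the sample record are hypothetical.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Assertion:
    value: str        # the claim, e.g., about identity or self-conception
    source: str       # where the claim comes from (interview, catalog...)
    interpreter: str  # who made the judgment; the model is an argument

@dataclass
class Residency:
    place: str
    start_year: int
    end_year: int | None = None  # None = ongoing

@dataclass
class Artist:
    name: str
    countries_of_origin: list[str]          # deliberately plural
    locations_of_practice: list[Residency]
    self_conception: list[Assertion] = field(default_factory=list)

# A made-up record, for illustration only
artist = Artist(
    name="Example Artist",
    countries_of_origin=["Lebanon"],
    locations_of_practice=[Residency("Beirut", 1975, 1983),
                           Residency("New York", 1983)],
    self_conception=[Assertion(
        value="describes the work as belonging to both places at once",
        source="hypothetical 1994 interview",
        interpreter="project cataloger")],
)
```

The sites of resistance show up immediately: a single “country of origin” field would falsify the problem, and the Assertion type makes explicit that every such entry is an interpretation with a source, not a fact extracted from the object.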
CB: All of those questions, about what we are doing with an artifact or object, are extremely pertinent to art history. And I do think art history desperately needs a new intellectual project, because it rarely speaks to anyone outside of the discipline (and here I’m not just talking about a mass audience but also referring to fellow researchers in other fields). It also continually struggles with the fact that its objects are, for the most part, luxury goods for the 1 percent; there’s a compromise and a complicity fundamental to our work (as you yourself have noted). Even those of us who focus on less commodifiable works of art have to face the fact that all of this work circulates within (and often courtesy of) financial structures we detest. One of the ways in which younger art historians have tried to deal with this is to expand the purview of art history to broader (and often more political) questions of the environment, technology, participation, and so on—but the results are rarely focused on visual or perceptual problems that might be of use to thinkers in other fields. Maybe this is because the image itself is dwindling in importance; as Alex Galloway suggests, “The point of unrepresentability is the point of power. And the point of power today is not the image. The point of power resides in networks, computers, algorithms.”[4]
Despite this diminished power, I would want to retain the importance of visual literacy (which seems increasingly necessary in a world of fake news and media misrepresentations) and of aesthetics, which gets lost if we subsume art history within a post-human world of media theory (e.g., Kittler). Works of art are less important as unique entities of individual expression (that model died years ago) than as symptoms of—and at best, commentators on—patterns of contemporary perception and attention. The subject is still central to these questions, however, and not incompatible with an analysis of power’s unrepresentability.
I want to come back, though, to a point I made earlier: my sense that DH projects seem to provide resources to enable future research, rather than making original arguments in the present and in literary form. Is it enough to provide potential resources rather than reasoned arguments? I can imagine teaching DH as part of a general methods class in art history, but this would not involve teaching methods central to DH (like data modeling). Instead I would show a digital art history project (e.g., Manovich) alongside the substantial critiques made of it.
JD: I don’t feel a need to defend projects in digital art history, but I do think the value of certain projects should be recognized as current contributions to knowledge and methods. This leads to my other final point, which repeats some of what I said earlier, about the importance of digital methods as part of the intellectual skill set of current scholarship. It is easy to dismiss these methods, especially without much knowledge of them. It is harder to engage with them and understand how they are of use pedagogically and theoretically; but that engagement is essential, because this is the world in which we work, and these methods permeate the current environment.
Here are several examples of current projects. Much work is being done in preserving cultural heritage sites that have been destroyed by natural disasters (flood) or human ones (war), or that are at risk. The Dunhuang Cave project creates an immersive experience, using panoptic digital photography to render the monuments in a three-dimensional model at scale; the model contains embedded files that present dance, ritual, and other materials (see the Getty/UNESCO project). The caves hold 480,000 square feet of paintings that would be destroyed by visitors. The automated detection of prehistoric burial mounds is done by digital processing of data and information to make primary discoveries of previously unidentified sites (see Melanie Riley’s work). The Oseberg Viking ship burial mound has been rendered in an augmented reality application so that the site is undisturbed (noninterventionary archaeology) but the contents of the mound can be seen on-screen. This is an amazing conceptual breakthrough, since it leaves the site intact. Then there are resources like the Getty Provenance Index, an enormous scholarly undertaking with complex intellectual modeling built into it. The structure of the records is its own area of study, and the task of translating the information creates a whole host of research topics. Consider the problem of representing currency values from hundreds of years of history, across cultures, fluctuations, and exchange rates, in a way that makes any sense (a sketch of one possible structure follows below). Consider, too, the integration of resources into projects like the Digital Public Library of America or the Getty Portal, so that primary sources are not just findable but searchable. To make these materials useful, search skills, data mining, and various analytics are essential.
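On the currency problem, a minimal sketch of one defensible structure, with invented names and figures: keep the attested amount and unit exactly as given in the document, and attach any modern “equivalent” as a separate, sourced claim rather than a silent conversion.

```python
# A sketch of a record structure for historical prices. The attested
# value is preserved as given; conversions are explicit, methodological,
# and cited. All names, figures, and sources here are invented.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Equivalence:
    value: float
    unit: str
    method: str  # e.g., "silver-content comparison", "wage-basket index"
    source: str  # the scholarship the conversion rests on

@dataclass
class RecordedPrice:
    amount: float
    unit: str                # the unit exactly as given in the document
    year: int
    place: str
    document: str            # citation of the primary record
    equivalences: list[Equivalence] = field(default_factory=list)

# Invented example record
price = RecordedPrice(
    amount=120, unit="florins", year=1688, place="Amsterdam",
    document="hypothetical sale record",
    equivalences=[Equivalence(
        value=1704.0, unit="grams of silver",
        method="silver-content comparison",
        source="hypothetical reference table")],
)
```

The point is that “what a florin was worth” is not a field but a research question, and this kind of structure keeps that question visible.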
This leads me to my second challenge point. The use of text analysis, feature recognition software for images, and other automated processing provides essential insights at scale. These are methods in use now, for scholarly and creative work. (Jonathan Cecil did a terrific project using face recognition software to scan Google satellite images of Los Angeles, finding all of the “faces of LA” that show up in its chance topographic configurations.) Knowing how to use these tools effectively is just like knowing how to use any other method effectively—it adds to the capacity of the scholar to do certain kinds of interpretive work. I go back to the Karl Kraus Die Fackel archive because its structure brings serious linguistic science together with cultural analysis in an integrated environment that is stunningly suited to supporting research into the ways language works/worked in that particular period of the rise of fascism—it provides a tool like no other for doing this work. Or Chris Johanson’s models of public spaces in Republican Rome (whose physical remains are largely inaccessible, buried under the later developments of the empire), built to see how the textual record and the physical models can be reconciled. This work allows insights into that textual record and permits hypothesis testing about long-held assumptions regarding ritual, ceremony, and public spectacle. These processes might use quantitative methods, but they don’t “reduce” texts to numbers in the sense that a fire “reduces” a building to ashes—the texts and artifacts remain. Instead of dismissing these methods, the challenge is to learn how to use them, modify them, and engage with their limitations, in the same way as with other methods. Psychoanalysis can “reduce” every image to phallic symbolism and/or the structuring voyeurism of the male gaze, or it can provide a way to read the workings of imagery through particular interpretive frameworks. Digital methods are no different. All methods have limits and can be used well or poorly. The aggressive dismissal of “the digital” in a blanket statement is simply ignorant; it sounds like the language of someone unwilling to learn. I’m not a proselytizer. If a scholar has no use for digital methods, he or she does not have to use them. But as a teacher, I feel deeply committed to providing students with the skills to understand these methods, use them responsibly, understand their limitations, and work with them as intellectual tools rather than mere technical instruments. Understanding how digital methods work is a survival skill, not a luxury; without it, you are merely at the mercy of their effects. Knowing what tools are useful and why, how they work, and how to read the results of their application and use—this is present-day expertise. The students want to know these things: they are interested in what happens in the “black box” of processing, in the production of data and its life cycle, and in the reading of outcomes in whatever form—lists, tables, spreadsheets, visualizations. I am as interested in what I have learned from engaging with these tools and methods as I was in what I learned from my successive encounters with theory and methods at every point—textual analysis, close reading, semiotics, structuralism, poststructuralism, deconstruction, feminist theory, postcolonial theory, Marxism, critical theory, critical race studies, queer theory, bibliography, thick reading, and so on—because each refracts a work and a project differently.
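Underneath the Fackel example is a mechanical layer that is easy to sketch and, on its own, trivial: counting a term’s occurrences year by year across a dated corpus. Here is a minimal sketch in Python, with an invented stand-in corpus; everything that makes the actual archive valuable (lemmatization, variant forms, the editorial modeling of context) sits on top of this counting step and is far harder.

```python
# Count how often a term appears, year by year, in a dated corpus.
# The corpus below is an invented stand-in: a real project would map
# years to the full text of that year's issues.
from __future__ import annotations
import re
from collections import Counter

def term_counts_by_year(corpus: dict[int, str], term: str) -> Counter:
    """Map year -> number of whole-word occurrences of term."""
    pattern = re.compile(rf"\b{re.escape(term)}\b", re.IGNORECASE)
    return Counter({year: len(pattern.findall(text))
                    for year, text in corpus.items()})

# Invented stand-in corpus, for illustration only
corpus = {
    1899: "Die Presse, die Presse, immer wieder die Presse ...",
    1914: "Der Krieg und die Presse ...",
    1933: "Zum Thema war diesmal wenig zu sagen ...",
}
for year, n in sorted(term_counts_by_year(corpus, "Presse").items()):
    print(year, n)
```

The counting is the easy part; deciding what counts as “the same word” over thirty-seven years is where the intellectual structure of such a project lives.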
I think digital methods should be a part of every student’s experience and, at the graduate level, integrated into the standard methods classes.
Notes
2. See Claire Bishop, “Against Digital Art History,” https://humanitiesfutures.org/papers/digital-art-history/.
4. Alex Galloway, The Interface Effect (Oxford: Polity Press, 2012), 92.